How Embeddings Shape Graph Neural Networks: Classical vs Quantum-Oriented Node Representations
2026-04-16 • Machine Learning
AI summary
The authors compared different ways to turn graph nodes into useful numbers for machine learning, including traditional methods and newer quantum-inspired ones. They tested all methods fairly by using the same learning setup and data splits across multiple datasets. Their results show that quantum-inspired methods work better on graphs focused on structure, while classic methods still do well on social graphs with fewer node details. The study helps understand the trade-offs between how easy embeddings are to train, how well they fit the problem, and their stability. This creates a consistent benchmark for choosing embeddings in graph-based learning.
graph neural networks, node embeddings, quantum-inspired embeddings, graph classification, benchmarking, variational embeddings, inductive bias, trainability, TU datasets, QM9 dataset
Authors
Nouhaila Innan, Antonello Rosato, Alberto Marchisio, Muhammad Shafique
Abstract
Node embeddings act as the information interface for graph neural networks, yet their empirical impact is often reported under mismatched backbones, splits, and training budgets. This paper provides a controlled benchmark of embedding choices for graph classification, comparing classical baselines with quantum-oriented node representations under a unified pipeline. We evaluate two classical baselines alongside quantum-oriented alternatives, including a circuit-defined variational embedding and quantum-inspired embeddings computed via graph operators and linear-algebraic constructions. All variants are trained and tested with the same backbone, stratified splits, identical optimization and early stopping, and consistent metrics. Experiments on five different TU datasets and on QM9 converted to classification via target binning show clear dataset dependence: quantum-oriented embeddings yield the most consistent gains on structure-driven benchmarks, while social graphs with limited node attributes remain well served by classical baselines. The study highlights practical trade-offs between inductive bias, trainability, and stability under a fixed training budget, and offers a reproducible reference point for selecting quantum-oriented embeddings in graph learning.
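The abstract mentions two protocol details that fix the comparison across embedding variants: QM9's continuous regression targets are converted to class labels by binning, and all variants share the same stratified splits. As a rough illustration of what such a step could look like (the paper does not publish this code; the bin count, quantile scheme, and function names below are assumptions), here is a minimal sketch:

```python
import numpy as np

# Hypothetical sketch of the target-binning and split protocol described
# in the abstract: map a continuous property (as in QM9) to quantile-based
# class labels, then draw one stratified train/test split that every
# embedding variant reuses. Bin count and split fraction are assumptions.

def bin_targets(y, n_bins=3):
    """Map continuous targets to integer class labels via quantile edges."""
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(y, edges)  # labels in {0, ..., n_bins - 1}

def stratified_split(labels, test_frac=0.2, seed=0):
    """Return train/test index arrays preserving per-class proportions."""
    rng = np.random.default_rng(seed)
    train, test = [], []
    for c in np.unique(labels):
        idx = rng.permutation(np.where(labels == c)[0])
        n_test = max(1, int(round(test_frac * len(idx))))
        test.extend(idx[:n_test])
        train.extend(idx[n_test:])
    return np.array(train), np.array(test)

# Stand-in for one QM9 regression target over 100 graphs.
y = np.random.default_rng(1).normal(size=100)
labels = bin_targets(y, n_bins=3)
tr, te = stratified_split(labels)
```

Fixing the seed and reusing `tr`/`te` for every embedding variant is what makes the reported differences attributable to the embedding rather than to the split.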