学習とニューラルネットワークのための特徴付き再生核バナッハ空間 (Featured Reproducing Kernel Banach Spaces)
Featured Reproducing Kernel Banach Spaces for Learning and Neural Networks
Translated: 2026/3/15 13:03:57
Japanese Translation
arXiv:2602.07141v1 発表タイプ: 新規
要約: 再生核ヒルベルト空間は、正則化問題および補間問題が古典的な表現定理を通じて有限次元解を持つという、核ベースの学習の基礎的な枠組みを提供します。しかし、非二次ノルムを備えた固定アーキテクチャのニューラルネットワークをはじめとする多くの現代の学習モデルは、この枠組みの外にある非ヒルベルト的な幾何を自然に生じさせます。バナッハ空間においては、点評価汎関数の連続性だけでは、特徴表現や核ベースの学習の定式化を保証するには不十分です。本稿では、特徴付き再生核バナッハ空間の概念に基づき、バナッハ空間における学習のための関数解析的枠組みを発展させます。特徴写像、核の構成、および表現定理型の結果がヒルベルト的な設定を超えて回復され得るための正確な構造的条件を特定します。この枠組みでは、教師あり学習は最小ノルム補間問題または正則化問題として定式化され、存在結果と条件付き表現定理が確立されます。さらに、理論をベクトル値の特徴付き再生核バナッハ空間へ拡張し、固定アーキテクチャのニューラルネットワークがそのような空間の特別な実例を自然に誘導することを示します。これにより、核法とニューラルネットワークに対する統一的な関数空間の視点が与えられ、核ベースの学習原理が再生核ヒルベルト空間を超えて拡張される条件が明確になります。
Original Content
arXiv:2602.07141v1 Announce Type: new
Abstract: Reproducing kernel Hilbert spaces provide a foundational framework for kernel-based learning, where regularization and interpolation problems admit finite-dimensional solutions through classical representer theorems. Many modern learning models, however -- including fixed-architecture neural networks equipped with non-quadratic norms -- naturally give rise to non-Hilbertian geometries that fall outside this setting. In Banach spaces, continuity of point-evaluation functionals alone is insufficient to guarantee feature representations or kernel-based learning formulations. In this work, we develop a functional-analytic framework for learning in Banach spaces based on the notion of featured reproducing kernel Banach spaces. We identify the precise structural conditions under which feature maps, kernel constructions, and representer-type results can be recovered beyond the Hilbertian regime. Within this framework, supervised learning is formulated as a minimal-norm interpolation or regularization problem, and existence results together with conditional representer theorems are established. We further extend the theory to vector-valued featured reproducing kernel Banach spaces and show that fixed-architecture neural networks naturally induce special instances of such spaces. This provides a unified function-space perspective on kernel methods and neural networks and clarifies when kernel-based learning principles extend beyond reproducing kernel Hilbert spaces.
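For readers less familiar with the formulations the abstract refers to, the two learning problems take the following standard form; the notation below (space $\mathcal{B}$, data $(x_i, y_i)$, loss $L$, kernel $K$) is generic and not taken from the paper itself:

```latex
% Minimal-norm interpolation over a Banach space B of functions:
\min_{f \in \mathcal{B}} \; \|f\|_{\mathcal{B}}
\quad \text{subject to} \quad f(x_i) = y_i, \;\; i = 1, \dots, n.

% Regularized counterpart with loss L and parameter \lambda > 0:
\min_{f \in \mathcal{B}} \; \sum_{i=1}^{n} L\bigl(f(x_i), y_i\bigr) + \lambda \|f\|_{\mathcal{B}}.

% In the Hilbertian case (B an RKHS with kernel K), the classical
% representer theorem gives a finite-dimensional solution:
f^{*} = \sum_{i=1}^{n} c_i \, K(\cdot, x_i), \qquad c_i \in \mathbb{R}.
```

The paper's "conditional representer theorems" concern when a finite-dimensional representation of this last kind survives once the quadratic Hilbert-space norm is replaced by a general Banach-space norm.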