Hyperparameter Landscape Geometry in High-Dimensional Neural Training
Keywords:
Loss Landscape Geometry, Hyperparameter Interaction, Training Stability
Abstract
The geometry of the optimization landscape plays a central role in determining stability and
generalization in high-dimensional neural network training. Rather than acting independently,
hyperparameters collectively influence curvature structure, trajectory continuity, and the connectivity
of converged minima. This work presents a geometric interaction framework that analyzes how
learning rate, batch size, momentum, and weight regularization jointly shape the training pathway
across the loss surface. Through curvature approximation, trajectory displacement analysis, and
effective energy contour mapping, we differentiate flat, wide basins associated with robust
generalization from sharp, narrow minima linked to performance fragility. Results show that
geometry-aligned hyperparameter configurations promote smooth, connected convergence regions,
whereas aggressive or unbalanced settings fragment the landscape and induce unstable optimization
dynamics. These findings support a shift from empirical tuning toward geometry-aware
hyperparameter design, where training stability emerges from structured parameter interplay rather
than isolated parameter choices.
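
To make the curvature approximation mentioned above concrete, the sketch below estimates the largest Hessian eigenvalue of the training loss by power iteration on Hessian-vector products: a small top eigenvalue indicates a flat, wide basin, while a large one indicates a sharp minimum. This is an illustrative sketch only; the PyTorch dependency, the function name top_hessian_eigenvalue, and the iteration count are assumptions for exposition, not the procedure used in this work.

import torch

def top_hessian_eigenvalue(loss, params, iters=20):
    # Keep only parameters that participate in the computation graph.
    params = [p for p in params if p.requires_grad]
    # First-order gradients with create_graph=True so we can differentiate them again.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    # Start from a random unit-norm direction in parameter space.
    v = [torch.randn_like(p) for p in params]
    norm = torch.sqrt(sum((x ** 2).sum() for x in v))
    v = [x / norm for x in v]
    eig = 0.0
    for _ in range(iters):
        # Hessian-vector product: differentiate <grads, v> with respect to params.
        gv = sum((g * x).sum() for g, x in zip(grads, v))
        hv = torch.autograd.grad(gv, params, retain_graph=True)
        # Rayleigh quotient v^T H v gives the current eigenvalue estimate.
        eig = sum((h * x).sum() for h, x in zip(hv, v)).item()
        # Renormalize the direction for the next power-iteration step.
        norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
        v = [h / norm for h in hv]
    return eig

In use, such an estimate would be evaluated at converged solutions obtained under different hyperparameter configurations; configurations that reach flat, wide basins should yield noticeably smaller top eigenvalues than those that converge to sharp, narrow minima.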