Decision Boundary Geometry Under Structural Pruning in Deep Neural Networks

Authors

  • Marina Velcroft

Keywords

Neural Pruning, Decision Boundary Geometry, Model Compression, Robustness, Lottery-Ticket Subnetworks, Structured Sparsity

Abstract

Neural pruning, the selective removal of parameters from trained neural networks, has become a central method for model compression and efficiency optimization. However, its impact extends beyond parameter count and latency reduction, influencing the structure and stability of the decision boundaries that determine class separability. This study examines how different pruning strategies (magnitude-based unstructured pruning, structured neuron and filter removal, and lottery-ticket subnetwork identification) affect decision boundary geometry across multiple neural architectures. Experimental results show that moderate pruning preserves margin width and cluster separability, while aggressive sparsification increases decision surface curvature and fragmentation, reducing robustness to perturbations. Structured and lottery-ticket-based methods were found to maintain smoother boundaries than unstructured pruning, highlighting the importance of representational alignment in preserving classifier stability. These findings demonstrate that pruning must be evaluated not only in terms of computational efficiency but also in terms of its geometric implications for reliability in real-world inference environments.
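
To make the comparison concrete, the sketch below illustrates (under stated assumptions, not as the study's actual code) the two pruning styles named in the abstract using PyTorch's stock torch.nn.utils.prune utilities, together with a crude probe of decision-boundary distance along a random direction. The toy network (SmallNet), the probe (boundary_distance), and all numeric settings are illustrative assumptions.

```python
# Minimal sketch (not the study's code): unstructured vs. structured pruning
# with torch.nn.utils.prune, plus a crude decision-boundary distance probe.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


class SmallNet(nn.Module):
    """Toy classifier standing in for the architectures examined in the study."""

    def __init__(self, in_dim=32, hidden=64, n_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


model = SmallNet()

# Structured pruning: drop 25% of fc1's output neurons (rows) by L2 norm.
prune.ln_structured(model.fc1, name="weight", amount=0.25, n=2, dim=0)

# Magnitude-based unstructured pruning: zero the 50% smallest-|w| weights in fc2.
prune.l1_unstructured(model.fc2, name="weight", amount=0.5)

# Fold the pruning masks into the weight tensors permanently.
prune.remove(model.fc1, "weight")
prune.remove(model.fc2, "weight")


@torch.no_grad()
def boundary_distance(model, x, step=0.01, max_steps=200):
    """Walk along a random unit direction until the predicted class flips;
    the distance travelled is a rough, single-direction margin estimate."""
    direction = torch.randn_like(x)
    direction = direction / direction.norm()
    base_label = model(x).argmax(dim=-1)
    for k in range(1, max_steps + 1):
        if model(x + k * step * direction).argmax(dim=-1) != base_label:
            return k * step
    return float("inf")  # no class flip found within the search radius


x = torch.randn(1, 32)
print("estimated boundary distance:", boundary_distance(model, x))
```

Averaging such distance estimates over many inputs and directions, before and after pruning, gives a simple proxy for the margin-width and boundary-fragmentation effects described in the abstract.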

Published

2026-02-05

How to Cite

Marina Velcroft. (2026). Decision Boundary Geometry Under Structural Pruning in Deep Neural Networks. Turquoise International Journal of Educational Research and Social Studies, 7(2), 21–25. Retrieved from https://theeducationjournals.com/index.php/tijer/article/view/410

Section

Articles