Biological Activation Constraints Enforce Robust Population Coding in Deep Neural Networks

Authors

Temel, Z.

DOI:

https://doi.org/10.22105/kmisj.v2i3.108

Keywords:

Robust neural coding, Numerical stability, Biologically inspired neural networks, Sensory processing models, Activation function constraints

Abstract

Biological sensory systems achieve robust and stable representations despite noise and variability, a property often lacking in artificial neural networks. We study robustness in neural coding from a numerical analysis perspective, focusing on how activation function constraints influence stability. By analyzing network Jacobians and population-level representation geometry, we compare unconstrained models with biologically inspired bounded and saturating activations. Numerical simulations show that activation constraints significantly reduce sensitivity to input perturbations. Constrained networks preserve representational structure while maintaining feature selectivity. These results suggest that simple biologically motivated nonlinearities provide an effective mechanism for robust sensory processing.
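The Jacobian-based sensitivity analysis described above can be illustrated with a minimal sketch. The snippet below (an illustrative reconstruction, not the authors' code; the network sizes, weight scale, and layer count are assumptions) compares the worst-case input sensitivity, measured as the spectral norm of the input-output Jacobian, of a random deep network with a bounded, saturating nonlinearity (tanh) against an unconstrained one (ReLU):

```python
import numpy as np

rng = np.random.default_rng(0)

def jacobian_norm(activation, d=32, layers=3, scale=1.5):
    """Spectral norm of the input-output Jacobian of a random
    deep network, evaluated at a random input point."""
    x = rng.standard_normal(d)
    J = np.eye(d)
    for _ in range(layers):
        W = scale * rng.standard_normal((d, d)) / np.sqrt(d)
        pre = W @ x
        if activation == "tanh":            # bounded, saturating
            x = np.tanh(pre)
            D = np.diag(1.0 - x**2)          # tanh'(pre) = 1 - tanh(pre)^2
        else:                                # unconstrained ReLU
            x = np.maximum(pre, 0.0)
            D = np.diag((pre > 0).astype(float))
        J = D @ W @ J                        # chain rule through the layer
    return np.linalg.norm(J, 2)              # largest singular value = worst-case gain

# Saturation shrinks local derivatives, so the bounded network
# typically shows a much smaller worst-case input sensitivity.
print("tanh:", jacobian_norm("tanh"))
print("relu:", jacobian_norm("relu"))
```

Averaged over random draws, the saturating network's Jacobian norm is markedly smaller, consistent with the paper's claim that bounded activations reduce sensitivity to input perturbations.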

References

Pouget, A., Dayan, P., & Zemel, R. (2000). Information processing with population codes. Nature reviews neuroscience, 1(2), 125-132. https://doi.org/10.1038/35039062

Averbeck, B. B., Latham, P. E., & Pouget, A. (2006). Neural correlations, population coding and computation. Nature reviews neuroscience, 7(5), 358-366. https://doi.org/10.1038/nrn1888

DiCarlo, J. J., Zoccolan, D., & Rust, N. C. (2012). How does the brain solve visual object recognition? Neuron, 73(3), 415-434. https://doi.org/10.1016/j.neuron.2012.01.010

Sokolić, J., Giryes, R., Sapiro, G., & Rodrigues, M. R. (2017). Robust large margin deep neural networks. IEEE transactions on signal processing, 65(16), 4265-4280. https://doi.org/10.1109/TSP.2017.2708039

Chung, S., Lee, D. D., & Sompolinsky, H. (2018). Classification and geometry of general perceptual manifolds. Physical review X, 8(3), 031003. https://doi.org/10.1103/PhysRevX.8.031003

Stringer, C., Pachitariu, M., Steinmetz, N., Carandini, M., & Harris, K. D. (2019). High-dimensional geometry of population responses in visual cortex. Nature, 571(7765), 361-365. https://doi.org/10.1038/s41586-019-1346-5

Dayan, P., & Abbott, L. F. (2005). Theoretical Neuroscience: Computational and mathematical modeling of neural systems. MIT Press. https://boulderschool.yale.edu/sites/default/files/files/DayanAbbott.pdf

Sompolinsky, H., Crisanti, A., & Sommers, H. J. (1988). Chaos in random neural networks. Physical review letters, 61(3), 259. https://doi.org/10.1103/PhysRevLett.61.259

Yamins, D. L., & DiCarlo, J. J. (2016). Using goal-driven deep learning models to understand sensory cortex. Nature neuroscience, 19(3), 356-365. https://doi.org/10.1038/nn.4244

Bietti, A., & Mairal, J. (2019). On the inductive bias of neural tangent kernels. 33rd conference on neural information processing systems (pp. 1-12). Curran Associates, Inc. https://proceedings.neurips.cc/paper_files/paper/2019/file/c4ef9c39b300931b69a36fb3dbb8d60e-Paper.pdf

He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on imagenet classification. Proceedings of the IEEE international conference on computer vision (pp. 1026-1034). IEEE. https://doi.org/10.1109/ICCV.2015.123

Lee, J., Bahri, Y., Novak, R., Schoenholz, S. S., Pennington, J., & Sohl-Dickstein, J. (2017). Deep neural networks as gaussian processes. arXiv preprint arXiv:1711.00165. https://doi.org/10.48550/arXiv.1711.00165

Published

2025-09-11

How to Cite

Temel, Z. (2025). Biological Activation Constraints Enforce Robust Population Coding in Deep Neural Networks. Karshi Multidisciplinary International Scientific Journal, 2(3), 152-159. https://doi.org/10.22105/kmisj.v2i3.108
