Paper 2026/253
Cryptanalytic Extraction of Deep Neural Networks with Non-Linear Activations
Abstract
Deep neural networks (DNNs) are today's central machine learning engines, yet their parameters represent valuable intellectual property exposed to extraction through black-box queries. While existing cryptanalytic attacks have primarily targeted ReLU-based architectures, this work extends model-stealing techniques to a broad class of non-linear activation functions, including GELU, SiLU, SELU, Sigmoid, and others. We present the first universal black-box attack capable of recovering both weights and biases from networks whose activations converge to linear behavior outside narrow non-linear regions. Our method generalizes prior geometric approaches by leveraging higher-order derivatives and adjacent linear zone analysis, bypassing the need for non-differentiability. We show that, for several activations, neuron signatures can be recovered more easily than in the ReLU case, and we further demonstrate that activation functions themselves can be identified when not publicly known. Our results broaden the scope of cryptanalytic model extraction, revealing that the secrecy of activation functions or smoothness of nonlinearities does not provide effective protection against black-box recovery attacks.
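As a minimal numerical sketch of the property the abstract refers to (not the paper's algorithm, and with all weights, dimensions, and query points chosen hypothetically for illustration): activations such as GELU converge to linear behavior outside a narrow non-linear region, so a network queried far from every neuron's non-linear zone behaves like an affine map whose Jacobian leaks through simple finite-difference queries.

```python
# Illustration only, assuming a toy one-hidden-layer GELU network with
# hypothetical positive weights so a large query point pushes every neuron
# into its asymptotically linear regime.
import numpy as np
from math import erf, sqrt

def gelu(x: float) -> float:
    """Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))."""
    return 0.5 * x * (1.0 + erf(x / sqrt(2.0)))

rng = np.random.default_rng(0)
W = rng.uniform(0.5, 1.5, size=(4, 3))   # hypothetical hidden weights (positive for the demo)
b = rng.normal(size=4)                   # hypothetical hidden biases
v = rng.normal(size=4)                   # hypothetical output weights

def f(x: np.ndarray) -> float:
    """Toy network evaluated as a black box."""
    return float(v @ np.array([gelu(z) for z in (W @ x + b)]))

# Query far from every neuron's non-linear zone: with x0 large and W positive,
# each pre-activation is large and positive, where GELU(z) is essentially z.
x0 = np.full(3, 50.0)
eps = 1e-3
grad = np.array([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps) for e in np.eye(3)])

print(grad)   # finite-difference gradient of the black box at x0
print(v @ W)  # composite linear map; the two agree to high precision
```

In this regime the black box is indistinguishable from an affine function, which is the starting point the abstract's "adjacent linear zone analysis" builds on; the paper's actual attack additionally exploits higher-order derivatives near the non-linear zones to separate and sign individual neurons.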
Metadata
- Available format(s): PDF
- Category: Attacks and cryptanalysis
- Publication info: Preprint.
- Keywords: DNN, neural network, ReLU, GeLU
- Contact author(s): roderick asselineau @ airbus com, patrick derbez @ inria fr, pierre-alain fouque @ inria fr, brice minaud @ inria fr
- History
- 2026-02-16: approved
- 2026-02-13: received
- Short URL: https://ia.cr/2026/253
- License: CC BY
BibTeX
@misc{cryptoeprint:2026/253,
  author = {Roderick Asselineau and Patrick Derbez and Pierre-Alain Fouque and Brice Minaud},
  title = {Cryptanalytic Extraction of Deep Neural Networks with Non-Linear Activations},
  howpublished = {Cryptology {ePrint} Archive, Paper 2026/253},
  year = {2026},
  url = {https://eprint.iacr.org/2026/253}
}