Mutual Information Scaling for Tensor Network Machine Learning
- PMID: 35211672
- PMCID: PMC8862112
- DOI: 10.1088/2632-2153/ac44a9
Abstract
Tensor networks have emerged as promising tools for machine learning, inspired by their widespread use as variational ansätze in quantum many-body physics. It is well known that the success of a given tensor network ansatz depends in part on how well it can reproduce the underlying entanglement structure of the target state, with different network designs favoring different scaling patterns. We demonstrate here how a related correlation analysis can be applied to tensor network machine learning, and explore whether classical data possess correlation scaling patterns similar to those found in quantum states, which might indicate the best network to use for a given dataset. We utilize mutual information as a measure of correlations in classical data, and show that it can serve as a lower bound on the entanglement needed for a probabilistic tensor network classifier. We then develop a logistic regression algorithm to estimate the mutual information between bipartitions of data features, and verify its accuracy on a set of Gaussian distributions designed to mimic different correlation patterns. Using this algorithm, we characterize the scaling patterns in the MNIST and Tiny Images datasets, and find clear evidence of boundary-law scaling in the latter. This quantum-inspired classical analysis offers insight into the design of tensor networks best suited for specific learning tasks.
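The logistic-regression estimator mentioned in the abstract can be illustrated with the standard classifier-based density-ratio trick: train a logistic regression to distinguish joint samples from samples whose features have been decorrelated, and average the classifier's log-odds over the joint samples to estimate the mutual information. The sketch below is a minimal illustration under assumed details (a bivariate Gaussian, quadratic features, hand-rolled gradient descent), not the authors' exact algorithm; for a Gaussian with correlation ρ the estimate can be checked against the closed form I = −½ log(1 − ρ²).

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
n = 20000

# Joint samples from a bivariate Gaussian with correlation rho.
cov = np.array([[1.0, rho], [rho, 1.0]])
joint = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Break the correlation by shuffling one coordinate: same marginals,
# but distributed as the product p(x)p(y).
indep = joint.copy()
indep[:, 1] = rng.permutation(indep[:, 1])

def features(z):
    # Quadratic features suffice here: for Gaussians the exact log
    # density ratio log p(x,y)/[p(x)p(y)] is linear in x^2, y^2, xy.
    x, y = z[:, 0], z[:, 1]
    return np.column_stack([x * x, y * y, x * y, np.ones(len(z))])

X = np.vstack([features(joint), features(indep)])
t = np.concatenate([np.ones(n), np.zeros(n)])

# Plain gradient-descent logistic regression (label 1 = joint sample).
w = np.zeros(X.shape[1])
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))
    w -= 0.1 * X.T @ (p - t) / len(t)

# Density-ratio trick: the classifier's log-odds approximate the log
# density ratio; their mean over joint samples estimates I(X;Y).
mi_est = float(np.mean(features(joint) @ w))
mi_true = -0.5 * np.log(1.0 - rho**2)  # closed form, ~0.51 nats
```

In the paper's setting the two coordinates would be replaced by two blocks of data features (a bipartition), with the shuffle applied to one whole block; the same mean-log-odds estimate then lower-bounds the entanglement a probabilistic tensor network classifier must support across that cut.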
Similar articles
- Cheng S, Chen J, Wang L. "Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines." Entropy (Basel). 2018 Aug 7;20(8):583. doi: 10.3390/e20080583. PMID: 33265672. Free PMC article.
- Hou YY, Li J, Xu T, Liu XY. "A hybrid quantum-classical classification model based on branching multi-scale entanglement renormalization ansatz." Sci Rep. 2024 Aug 9;14(1):18521. doi: 10.1038/s41598-024-69384-6. PMID: 39122811. Free PMC article.
- Nezami S, Walter M. "Multipartite Entanglement in Stabilizer Tensor Networks." Phys Rev Lett. 2020 Dec 11;125(24):241602. doi: 10.1103/PhysRevLett.125.241602. PMID: 33412058.
- Arceci L, Silvi P, Montangero S. "Entanglement of Formation of Mixed Many-Body Quantum States via Tree Tensor Operators." Phys Rev Lett. 2022 Jan 28;128(4):040501. doi: 10.1103/PhysRevLett.128.040501. PMID: 35148155.
- Schetakis N, Aghamalyan D, Griffin P, Boguslavsky M. "Review of some existing QML frameworks and novel hybrid classical-quantum neural networks realising binary classification for the noisy datasets." Sci Rep. 2022 Jul 13;12(1):11927. doi: 10.1038/s41598-022-14876-6. PMID: 35831369. Free PMC article. Review.