
PCA Meets RG

Published in: Journal of Statistical Physics

Abstract

A system with many degrees of freedom can be characterized by a covariance matrix; principal components analysis focuses on the eigenvalues of this matrix, hoping to find a lower dimensional description. But when the spectrum is nearly continuous, any distinction between components that we keep and those that we ignore becomes arbitrary; it then is natural to ask what happens as we vary this arbitrary cutoff. We argue that this problem is analogous to the momentum shell renormalization group. Following this analogy, we can define relevant and irrelevant operators, where the role of dimensionality is played by properties of the eigenvalue density. These results also suggest an approach to the analysis of real data. As an example, we study neural activity in the vertebrate retina as it responds to naturalistic movies, and find evidence of behavior controlled by a nontrivial fixed point. Applied to financial data, our analysis separates modes dominated by sampling noise from a smaller but still macroscopic number of modes described by a non-Gaussian distribution.
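
To make the analogy concrete, the sketch below (ours, not code from the paper; the function names, the Gaussian toy data, and the choice to keep a fixed fraction of modes at each step are illustrative assumptions) diagonalizes the covariance matrix, keeps only the leading principal components, and repeats, so one can watch how the retained spectrum behaves as the arbitrary cutoff moves.

```python
import numpy as np

def covariance_spectrum(X):
    """Eigenvalues (descending) and eigenvectors of the sample covariance of X.

    X has shape (T, N): T samples of N variables."""
    C = np.cov(X, rowvar=False)            # N x N covariance matrix C_ij
    lam, u = np.linalg.eigh(C)             # eigh returns ascending order
    return lam[::-1], u[:, ::-1]           # largest eigenvalue first

def coarse_grain(X, keep_fraction=0.5):
    """One 'momentum shell' step: re-express the data in the leading
    principal components only, discarding the low-variance modes."""
    lam, u = covariance_spectrum(X)
    n_keep = max(1, int(keep_fraction * X.shape[1]))
    return X @ u[:, :n_keep]               # shape (T, n_keep)

# Toy data: iterate the projection and watch the retained spectrum
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 128))
for step in range(4):
    lam, _ = covariance_spectrum(X)
    print(step, X.shape[1], lam[:3])
    X = coarse_grain(X, keep_fraction=0.5)
```

In this toy example the projection simply preserves the leading eigenvalues; the interesting question in the text is how the joint distribution of the retained variables changes as the cutoff moves.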


Notes

  1. The idea of PCA goes back at least to the start of the twentieth century [1]. For a brief modern summary, see Ref. [2].

  2. An alternative formulation treats the smallest eigenvalue separately, as with a mass term in field theory, measuring all eigenvalues by their distance from this minimum. Then \(\rho(\lambda)\) would always have, as \(N\rightarrow \infty\), support near \(\lambda = 0\).
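
In symbols (our notation; the shifted eigenvalue \(\tilde{\lambda}\) and the density definition below are not spelled out in the text), this alternative formulation reads:

```latex
\tilde{\lambda}_\mu \equiv \lambda_\mu - \lambda_{\min} , \qquad
\rho(\tilde{\lambda}) = \frac{1}{N} \sum_{\mu=1}^{N}
  \delta\!\left( \tilde{\lambda} - \tilde{\lambda}_\mu \right) ,
```

so that as \(N\rightarrow\infty\) the support of \(\rho\) always reaches down to \(\tilde{\lambda} = 0\), with \(\lambda_{\min}\) playing the role of the mass term.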

  3. In the analytic discussion of model distributions, above, the natural quantities were the eigenvalues and eigenvectors of the matrix \(K_{ij}\). As noted, we don’t have access to this matrix when we are confronted with real data, so we analyze the matrix \(C_{ij}\) instead. To emphasize that what we are doing with the data is in the same spirit as the analysis of the models, we abuse notation slightly and recycle the symbols \(\{\lambda_\mu, \, u_i(\mu)\}\).
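
A hypothetical numerical illustration of the distinction (not from the paper; the chosen spectrum of \(K_{ij}\), the system size, and the number of samples are arbitrary): draw a finite set of samples from a Gaussian with known \(K_{ij}\) and compare its spectrum to that of the empirical \(C_{ij}\).

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 200

# A "true" covariance K_ij with a slowly decaying spectrum (purely illustrative)
true_lam = 1.0 / np.arange(1, N + 1)
Q = np.linalg.qr(rng.standard_normal((N, N)))[0]   # random orthogonal basis
K = Q @ np.diag(true_lam) @ Q.T

# Finite sample: T draws, then the empirical covariance C_ij
x = rng.multivariate_normal(np.zeros(N), K, size=T)
x = x - x.mean(axis=0)
C = x.T @ x / T

# The empirical eigenvalues lambda_mu differ from the true ones by sampling noise
lam = np.sort(np.linalg.eigvalsh(C))[::-1]
print(np.c_[true_lam[:5], lam[:5]])
```

With \(T\) not much larger than \(N\), the empirical eigenvalues differ noticeably from the true ones; this is the sampling noise referred to in the abstract and in note 4.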

  4. In some contexts it would be more natural to look at the distribution of eigenvalues, searching for modes that emerge clearly from a “bulk” that might be ascribed to sampling noise. Plotting eigenvalues vs their rank, as we do here, provides a representation of the cumulative distribution of eigenvalues, and does not require us to make bins along the eigenvalue axis. Rather than plotting from smallest to largest, we plot from largest to smallest, so that the spectra are more directly comparable to a plot of the susceptibility or propagator \(G(k)\) vs momentum \(k\) in the usual statistical physics examples.
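
A minimal sketch of this rank-ordered representation (the double-logarithmic axes and the random test matrix are our choices, not specified in the note):

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_rank_ordered_spectrum(C, ax=None):
    """Plot the eigenvalues of a covariance matrix C versus their rank,
    largest first, as described in the note above."""
    lam = np.sort(np.linalg.eigvalsh(C))[::-1]     # descending eigenvalues
    rank = np.arange(1, len(lam) + 1)
    if ax is None:
        ax = plt.gca()
    ax.loglog(rank, lam, "o")
    ax.set_xlabel("rank (largest eigenvalue first)")
    ax.set_ylabel(r"eigenvalue $\lambda_\mu$")
    return ax

# Example with the covariance of random test data
x = np.random.default_rng(2).standard_normal((500, 100))
plot_rank_ordered_spectrum(np.cov(x, rowvar=False))
plt.show()
```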

References

  1. Pearson, K.: On lines and planes of closest fit to systems of points in space. Philos. Mag. 2, 559–572 (1901)

  2. Shlens, J.: A tutorial on principal components analysis. arXiv:1404.1100 [cs.LG] (2014)

  3. Hopfield, J.J.: Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA 79, 2554–2558 (1982)

  4. Wilson, K.G.: Problems in physics with many scales of length. Sci. Am. 241, 158–179 (1979)

  5. Jona-Lasinio, G.: The renormalization group: a probabilistic view. Il Nuovo Cimento 26B, 99–119 (1975)

  6. Wilson, K.G., Kogut, J.: The renormalization group and the \(\epsilon\) expansion. Phys. Rep. 12, 75–200 (1974)

  7. Kadanoff, L.P.: Scaling laws for Ising models near \(T_c\). Physics 2, 263–272 (1966)

  8. Kadanoff, L.P.: From simulation model to public policy: an examination of Forrester’s “Urban Dynamics”. Simulation 16, 261–268 (1971)

  9. Kadanoff, L.P., Weinblatt, H.: Public policy conclusions from urban growth models. IEEE Trans. Syst. Man Cybern. SMC-2, 139–165 (1972)

  10. Bensimon, D., Kadanoff, L.P., Liang, S., Shraiman, B.I., Tang, C.: Viscous flows in two dimensions. Rev. Mod. Phys. 58, 977–999 (1986)

  11. Halsey, T.C., Jensen, M.H., Kadanoff, L.P., Procaccia, I., Shraiman, B.I.: Fractal measures and their singularities: the characterization of strange sets. Phys. Rev. A 33, 1141–1151 (1986); erratum 34, 1601 (1986)

  12. Constantin, P., Kadanoff, L.P.: Singularities in complex interfaces. Philos. Trans. R. Soc. Lond. Ser. A 333, 379–389 (1990)

  13. Bertozzi, A., Brenner, M., Dupont, T.F., Kadanoff, L.P.: Singularities and similarities in interface flows. In: Sirovich, L.P. (ed.) Trends and Perspectives in Applied Mathematics. Springer Verlag Applied Math Series, Vol. 100, pp. 155–208 (1994)

  14. Coppersmith, S.N., Blank, R.D., Kadanoff, L.P.: Analysis of a population genetics model with mutation, selection, and pleiotropy. J. Stat. Phys. 97, 429–459 (1999)

  15. Povinelli, M.L., Coppersmith, S.N., Kadanoff, L.P., Nagel, S.R., Venkataramani, S.C.: Noise stabilization of self-organized memories. Phys. Rev. E 59, 4970–4982 (1999)

  16. Kadanoff, L.P.: More is the same: mean field theory and phase transitions. J. Stat. Phys. 137, 777–797 (2009)

  17. Kadanoff, L.P.: Relating theories via renormalization. Stud. Hist. Philos. Sci. B 44, 22–39 (2013)

  18. Kadanoff, L.P.: Reflections on Gibbs: from statistical physics to the Amistad. J. Stat. Phys. 156, 1–9 (2014)

  19. Kadanoff, L.P.: Innovations in statistical physics. Annu. Rev. Cond. Matter Phys. 6, 1–14 (2015)

  20. Wilson, K.G., Fisher, M.E.: Critical exponents in 3.99 dimensions. Phys. Rev. Lett. 28, 240–243 (1972)

  21. Amit, D.J., Martin-Mayor, V.: Field Theory, the Renormalization Group, and Critical Phenomena: Graphs to Computers, 3rd edn. World Scientific, Singapore (2005)

  22. Binder, K.: Finite size scaling analysis of Ising model block distribution functions. Z. Phys. B 43, 119–140 (1981)

  23. Tkačik, G., Marre, O., Amodei, D., Schneidman, E., Bialek, W., Berry II, M.J.: Searching for collective behavior in a large network of sensory neurons. PLoS Comput. Biol. 10, e1003408 (2014)

  24. Abarbanel, H.D.I., Brown, R., Sidorowich, J.J., Tsimring, L.S.: The analysis of observed chaotic data in physical systems. Rev. Mod. Phys. 65, 1331–1392 (1993)

  25. Mora, T., Bialek, W.: Are biological systems poised at criticality? J. Stat. Phys. 144, 268–302 (2011)

  26. Tkačik, G., Mora, T., Marre, O., Amodei, D., Palmer, S.E., Berry II, M.J., Bialek, W.: Thermodynamics and signatures of criticality in a network of neurons. Proc. Natl. Acad. Sci. USA 112, 11508–11513 (2015)

  27. Marsili, M.: Dissecting financial markets: sectors and states. Quant. Financ. 2, 297–302 (2002)

  28. Lillo, F., Mantegna, R.N.: Variety and volatility in financial markets. Phys. Rev. E 62, 6126–6134 (2000)

  29. Bouchaud, J.P., Potters, M.: Financial applications. In: Akemann, G., Baik, J., Di Francesco, P. (eds.) The Oxford Handbook of Random Matrix Theory. Oxford University Press, Oxford (2011). arXiv:0910.1205 [q-fin.ST] (2009)

  30. Bun, J., Allez, R., Bouchaud, J.P., Potters, M.: Rotational invariant estimator for general noisy matrices. arXiv:1502.06736 [cond-mat.stat-mech] (2015)

  31. Bun, J., Bouchaud, J.-P., Potters, M.: Cleaning large correlation matrices: tools from random matrix theory. arXiv:1610.08104 [cond-mat.stat-mech] (2016)

  32. Aygün, E., Erzan, A.: Spectral renormalization group theory on networks. J. Phys. Conf. Ser. 319, 012007 (2011)

  33. Castellana, M.: Real-space renormalization group analysis of a non-mean-field spin-glass. EPL 95, 47014 (2011)

  34. Angelini, M.C., Parisi, G., Ricci-Tersenghi, F.: Ensemble renormalization group for disordered systems. Phys. Rev. B 87, 134201 (2013)

  35. Angelini, M.C., Biroli, G.: Spin glass in a field: a new zero-temperature fixed point in finite dimensions. Phys. Rev. Lett. 114, 095701 (2015)

  36. Brown, K.S., Hill, C.C., Calero, G.A., Myers, C.R., Lee, K.H., Sethna, J.P., Cerione, R.A.: The statistical mechanics of complex signaling networks: nerve growth factor signaling. Phys. Biol. 1, 184–195 (2004)

  37. Waterfall, J.J., Casey, F.P., Gutenkunst, R.N., Brown, K.S., Myers, C.R., Brouwer, P.W., Elser, V., Sethna, J.P.: Sloppy model universality class and the Vandermonde matrix. Phys. Rev. Lett. 97, 150601 (2006)

  38. Gutenkunst, R.N., Waterfall, J.J., Casey, F.P., Brown, K.S., Myers, C.R., Sethna, J.P.: Universally sloppy parameter sensitivities in systems biology. PLoS Comput. Biol. 3, e189 (2007)

  39. Transtrum, M.K., Machta, B.B., Sethna, J.P.: Geometry of nonlinear least squares with applications to sloppy models and optimization. Phys. Rev. E 83, 036701 (2011)

  40. Machta, B.B., Chachra, R., Transtrum, M.K., Sethna, J.P.: Parameter space compression underlies emergent theories and predictive models. Science 342, 604–607 (2013)

  41. Mehta, P., Schwab, D.J.: An exact mapping between the variational renormalization group and deep learning. arXiv:1410.3831 [stat.ML] (2014)


Acknowledgements

We thank D Amodei, MJ Berry II, and O Marre for making available the data of Ref. [23] and M Marsili for the data of Ref. [27]. We are especially grateful to G Biroli, J–P Bouchaud, MP Brenner, CG Callan, A Cavagna, I Giardina, MO Magnasco, A Nicolis, SE Palmer, G Parisi, and DJ Schwab for helpful discussions and comments on the manuscript. Work at CUNY was supported in part by the Swartz Foundation. Work at Princeton was supported in part by Grants from the National Science Foundation (PHY-1305525, PHY-1451171, and CCF-0939370) and the Simons Foundation.

Author information

Authors and Affiliations

  1. Initiative for the Theoretical Sciences, The Graduate Center, City University of New York, 365 Fifth Ave., New York, NY, 10016, USA

    Serena Bradde & William Bialek

  2. Joseph Henry Laboratories of Physics, and Lewis–Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ, 08544, USA

    William Bialek


Corresponding author

Correspondence to William Bialek.

