Modeling Winner-Take-All Competition in Sparse Binary Projections

  • Conference paper
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12457)


Abstract

With both theoretical and practical significance, the study of sparse binary projection models has attracted considerable recent research attention. These models project dense input samples into a higher-dimensional space and, after a Winner-Take-All competition, output sparse binary representations, subject to the constraint that the projection matrix itself is also sparse and binary. Following this line of work, we developed a supervised WTA model that applies when training samples with both input and output representations are available; after proper relaxation, the optimal projection matrix can be obtained with a simple, efficient, yet effective algorithm. We further extended the model and the algorithm to an unsupervised setting where only the input representation of the samples is available. In a series of empirical evaluations on similarity search tasks, the proposed models achieved significantly better results than state-of-the-art methods in both search accuracy and running speed. These results give us strong confidence that the work provides a useful tool for industrial applications.
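To make the projection concrete, here is a minimal sketch of a sparse binary projection followed by a Winner-Take-All competition, as described above. Note the assumptions: the projection matrix `W` here is drawn at random (each output unit connecting to \(c = \lfloor 0.1 \times d \rfloor\) inputs, following footnote 1), whereas the paper's contribution is to *learn* an optimal `W` from data; the dimensions and `k` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m = 50, 200   # input dimension d, expanded output dimension m (m > d)
k = 8            # number of "winner" units kept active per output vector

# Sparse binary projection matrix: each of the m output units connects to
# c = floor(0.1 * d) randomly chosen input coordinates (cf. footnote 1).
# The paper learns this matrix instead of sampling it randomly.
c = int(0.1 * d)
W = np.zeros((m, d), dtype=np.int8)
for i in range(m):
    W[i, rng.choice(d, size=c, replace=False)] = 1

def wta_project(x):
    """Project a dense vector into the expanded space, then keep only the
    k largest activations as 1s (the Winner-Take-All competition)."""
    a = W @ x                     # dense activations in the expanded space
    winners = np.argsort(a)[-k:]  # indices of the k strongest units
    y = np.zeros(m, dtype=np.int8)
    y[winners] = 1                # sparse binary output: exactly k ones
    return y

x = rng.random(d)     # a dense input sample
y = wta_project(x)    # its sparse binary representation (hash length k)
```

With the FLY/LIFTING/WTA convention of footnote 4, the "hash length" of `y` is its number of ones, i.e. exactly `k`; similarity search then compares these sparse binary codes instead of the dense inputs.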


Notes

  1. As in [7], c is set to \(\left\lfloor 0.1\times d \right\rfloor \) in this paper.

  2.
  3.

  4. As in [7, 19], the hash length is defined as the number of ones in each output vector for the FLY, LIFTING and WTA algorithms. For other algorithms, it is defined as the output dimension.

  5. We set \(\gamma = 1\) without fine-tuning the parameter.

References

  1. Ailon, N., Chazelle, B.: The fast Johnson-Lindenstrauss transform and approximate nearest neighbors. SIAM J. Comput. 39(1), 302–322 (2009)

  2. Arbib, M.: The Handbook of Brain Theory and Neural Networks. MIT Press, Cambridge (2003)

  3. Baeza-Yates, R., Ribeiro-Neto, B.: Modern Information Retrieval, vol. 463. ACM Press, Cambridge (1999)

  4. Bingham, E., Mannila, H.: Random projection in dimensionality reduction: applications to image and text data. In: Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 245–250. ACM (2001)

  5. Caron, S., Ruta, V., Abbott, L., Axel, R.: Random convergence of olfactory inputs in the Drosophila mushroom body. Nature 497(7447), 113 (2013)

  6. Charikar, M.: Similarity estimation techniques from rounding algorithms. In: Proceedings of the 34th Annual ACM Symposium on Theory of Computing, pp. 380–388. ACM (2002)

  7. Dasgupta, S., Stevens, C., Navlakha, S.: A neural algorithm for a fundamental computing problem. Science 358(6364), 793–796 (2017)

  8. Frank, M., Wolfe, P.: An algorithm for quadratic programming. Naval Res. Logist. 3(1–2), 95–110 (1956)

  9. Gionis, A., Indyk, P., Motwani, R.: Similarity search in high dimensions via hashing. In: Proceedings of the 25th International Conference on Very Large Data Bases, vol. 99, pp. 518–529 (1999)

  10. Gong, Y., Lazebnik, S., Gordo, A., Perronnin, F.: Iterative quantization: a procrustean approach to learning binary codes for large-scale image retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 35(12), 2916–2929 (2012)

  11. Guyon, I., Elisseeff, A.: An introduction to variable and feature selection. J. Mach. Learn. Res. 3, 1157–1182 (2003)

  12. Heo, J., Lee, Y., He, J., Chang, S., Yoon, S.: Spherical hashing: binary code embedding with hyperspheres. IEEE Trans. Pattern Anal. Mach. Intell. 37(11), 2304–2316 (2015)

  13. Jaggi, M.: Revisiting Frank-Wolfe: projection-free sparse convex optimization. In: Proceedings of the 30th International Conference on Machine Learning, pp. 427–435 (2013)

  14. Jain, A., Murty, N., Flynn, P.: Data clustering: a review. ACM Comput. Surv. 31(3), 264–323 (1999)

  15. Johnson, W., Lindenstrauss, J.: Extensions of Lipschitz mappings into a Hilbert space. Contemporary Mathematics 26(189–206), 1 (1984)

  16. Knuth, D.: The Art of Computer Programming, Sorting and Searching, vol. 3. Addison-Wesley, Reading (1998)

  17. Kohonen, T.: The self-organizing map. Proc. IEEE 78(9), 1464–1480 (1990)

  18. Kong, W., Li, W.: Isotropic hashing. In: Advances in Neural Information Processing Systems, pp. 1646–1654 (2012)

  19. Li, W., Mao, J., Zhang, Y., Cui, S.: Fast similarity search via optimal sparse lifting. In: Advances in Neural Information Processing Systems, pp. 176–184 (2018)

  20. Lynch, N., Musco, C., Parter, M.: Winner-take-all computation in spiking neural networks. arXiv preprint arXiv:1904.12591 (2019)

  21. Maass, W.: On the computational power of winner-take-all. Neural Comput. 12(11), 2519–2535 (2000)

  22. MacQueen, J.: Some methods for classification and analysis of multivariate observations. In: Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, Oakland, CA, USA, vol. 1, pp. 281–297 (1967)

  23. Manning, C., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)

  24. Olsen, S., Bhandawat, V., Wilson, R.: Divisive normalization in olfactory population codes. Neuron 66(2), 287–299 (2010)

  25. Omondi, A., Rajapakse, J.: FPGA Implementations of Neural Networks, vol. 365. Springer, Heidelberg (2006). https://doi.org/10.1007/0-387-28487-7

  26. Panousis, K., Chatzis, S., Theodoridis, S.: Nonparametric Bayesian deep networks with local competition. In: Proceedings of the 36th International Conference on Machine Learning, pp. 4980–4988 (2019)

  27. Pehlevan, C., Sengupta, A., Chklovskii, D.: Why do similarity matching objectives lead to Hebbian/anti-Hebbian networks? Neural Comput. 30(1), 84–124 (2018)

  28. Pennington, J., Socher, R., Manning, C.: GloVe: global vectors for word representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, pp. 1532–1543 (2014)

  29. Powell, M.: On search directions for minimization algorithms. Math. Program. 4(1), 193–201 (1973)

  30. Russakovsky, O., Deng, J., Su, H., et al.: ImageNet large scale visual recognition challenge. Int. J. Comput. Vision 115(3), 211–252 (2015)

  31. Stevens, C.: What the fly's nose tells the fly's brain. Proc. Natl. Acad. Sci. 112(30), 9460–9465 (2015)

  32. Turner, G., Bazhenov, M., Laurent, G.: Olfactory representations by Drosophila mushroom body neurons. J. Neurophysiol. 99(2), 734–746 (2008)

  33. Zheng, Z., Lauritzen, S., Perlman, E., Robinson, C., et al.: A complete electron microscopy volume of the brain of adult Drosophila melanogaster. Cell 174(3), 730–743 (2018)


Acknowledgment

We thank the anonymous reviewers for their insightful comments and suggestions. The work is supported by Shenzhen Fundamental Research Fund (JCYJ201703061410-38939, KQJSCX20170728162302784, JCYJ20170410172341657).

Author information

Authors and Affiliations

  1. The Chinese University of Hong Kong, Shenzhen, China

    Wenye Li

  2. Shenzhen Research Institute of Big Data, Shenzhen, China

    Wenye Li

Authors
  1. Wenye Li


Corresponding author

Correspondence to Wenye Li.

Editor information

Editors and Affiliations

  1. Albert-Ludwigs-Universität, Freiburg, Germany

    Frank Hutter

  2. TU Darmstadt, Darmstadt, Germany

    Kristian Kersting

  3. Ghent University, Ghent, Belgium

    Jefrey Lijffijt

  4. Saarland University, Saarbrücken, Germany

    Isabel Valera


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Li, W. (2021). Modeling Winner-Take-All Competition in Sparse Binary Projections. In: Hutter, F., Kersting, K., Lijffijt, J., Valera, I. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2020. Lecture Notes in Computer Science, vol. 12457. Springer, Cham. https://doi.org/10.1007/978-3-030-67658-2_26
