Part of the book series: Lecture Notes in Computer Science (LNAI, volume 12457)
Abstract
With both theoretical and practical significance, the study of sparse binary projection models has attracted considerable recent research attention. These models project dense input samples into a higher-dimensional space and output sparse binary representations after a Winner-Take-All (WTA) competition, subject to the constraint that the projection matrix is itself sparse and binary. Following this line of work, we developed a supervised WTA model for the setting where training samples are available with both input and output representations; after a proper relaxation, the optimal projection matrix can be obtained with a simple, efficient, yet effective algorithm. We further extended the model and the algorithm to an unsupervised setting where only the input representation of the samples is available. In a series of empirical evaluations on similarity search tasks, the proposed models significantly improved over state-of-the-art methods in both search accuracy and running speed. These results give us strong confidence that the work provides a useful tool for industrial applications.
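The pipeline the abstract describes can be sketched in a few lines of NumPy. Note this is only an illustration of the generic WTA sparse binary projection scheme, not the paper's learned model: the sparse binary projection matrix here is random rather than optimized, and the function name, `density` parameter, and `k = 0.1 * m` hash length (echoing Note 1's \(c = \lfloor 0.1\times d \rfloor\)) are illustrative assumptions.

```python
import numpy as np

def wta_sparse_binary_projection(X, m, density=0.1, k=None, seed=0):
    """Illustrative sketch: lift dense samples X (n x d) to sparse binary
    codes (n x m) via a random sparse binary projection matrix followed
    by a Winner-Take-All step that keeps the top-k activations per row."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if k is None:
        k = int(0.1 * m)  # assumed hash length, mirroring c = floor(0.1*d)
    # Sparse binary projection matrix: each entry is 1 with prob. `density`.
    W = (rng.random((d, m)) < density).astype(np.float64)
    A = X @ W  # project into the higher-dimensional space
    # WTA competition: the k largest activations win (set to 1), rest are 0.
    H = np.zeros_like(A)
    winners = np.argpartition(-A, k - 1, axis=1)[:, :k]
    np.put_along_axis(H, winners, 1.0, axis=1)
    return H

# Usage: lift 5 dense 64-d samples to 512-d sparse binary codes.
X = np.random.default_rng(1).normal(size=(5, 64))
H = wta_sparse_binary_projection(X, m=512)
```

Each output row then contains exactly \(k\) ones, so the codes are both sparse and binary, which is what makes Hamming-style similarity search on them fast.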
Notes
- 1.
As in [7], c is set to \(\left\lfloor 0.1\times d \right\rfloor \) in this paper.
- 2.
More details are available at: http://mypage.cuhk.edu.cn/academics/wyli/research/ecml20.html.
- 3.
- 4.
- 5.
We set\(\gamma = 1\) without fine-tuning the parameter.
References
Ailon, N., Chazelle, B.: The fast Johnson-Lindenstrauss transform and approximate nearest neighbors. SIAM J. Comput. 39(1), 302–322 (2009)
Arbib, M.: The Handbook of Brain Theory and Neural Networks. MIT Press, Cambridge (2003)
Baeza-Yates, R., Ribeiro-Neto, B.: Modern Information Retrieval, vol. 463. ACM Press, Cambridge (1999)
Bingham, E., Mannila, H.: Random projection in dimensionality reduction: applications to image and text data. In: Proceedings of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 245–250. ACM (2001)
Caron, S., Ruta, V., Abbott, L., Axel, R.: Random convergence of olfactory inputs in the Drosophila mushroom body. Nature 497(7447), 113 (2013)
Charikar, M.: Similarity estimation techniques from rounding algorithms. In: Proceedings of the 34th Annual ACM Symposium on Theory of Computing, pp. 380–388. ACM (2002)
Dasgupta, S., Stevens, C., Navlakha, S.: A neural algorithm for a fundamental computing problem. Science 358(6364), 793–796 (2017)
Frank, M., Wolfe, P.: An algorithm for quadratic programming. Naval Res. Logist. 3(1–2), 95–110 (1956)
Gionis, A., Indyk, P., Motwani, R.: Similarity search in high dimensions via hashing. In: Proceedings of the 25th International Conference on Very Large Data Bases, vol. 99, pp. 518–529 (1999)
Gong, Y., Lazebnik, S., Gordo, A., Perronnin, F.: Iterative quantization: a procrustean approach to learning binary codes for large-scale image retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 35(12), 2916–2929 (2012)
Guyon, I., Elisseeff, A.: An introduction to variable and feature selection. J. Mach. Learn. Res. 3, 1157–1182 (2003)
Heo, J., Lee, Y., He, J., Chang, S., Yoon, S.: Spherical hashing: binary code embedding with hyperspheres. IEEE Trans. Pattern Anal. Mach. Intell. 37(11), 2304–2316 (2015)
Jaggi, M.: Revisiting Frank-Wolfe: projection-free sparse convex optimization. In: Proceedings of the 30th International Conference on Machine Learning, pp. 427–435 (2013)
Jain, A., Murty, N., Flynn, P.: Data clustering: a review. ACM Comput. Surv. 31(3), 264–323 (1999)
Johnson, W., Lindenstrauss, J.: Extensions of Lipschitz mappings into a Hilbert space. Contemp. Math. 26, 189–206 (1984)
Knuth, D.: The Art of Computer Programming, Sorting and Searching, vol. 3. Addison-Wesley, Reading (1998)
Kohonen, T.: The self-organizing map. Proc. IEEE 78(9), 1464–1480 (1990)
Kong, W., Li, W.: Isotropic hashing. In: Advances in Neural Information Processing Systems, pp. 1646–1654 (2012)
Li, W., Mao, J., Zhang, Y., Cui, S.: Fast similarity search via optimal sparse lifting. In: Advances in Neural Information Processing Systems, pp. 176–184 (2018)
Lynch, N., Musco, C., Parter, M.: Winner-take-all computation in spiking neural networks. arXiv preprint arXiv:1904.12591 (2019)
Maass, W.: On the computational power of winner-take-all. Neural Comput. 12(11), 2519–2535 (2000)
MacQueen, J.: Some methods for classification and analysis of multivariate observations. In: Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, Oakland, CA, USA, vol. 1, pp. 281–297 (1967)
Manning, C., Raghavan, P., Schütze, H.: Introduction to Information Retrieval. Cambridge University Press, Cambridge (2008)
Olsen, S., Bhandawat, V., Wilson, R.: Divisive normalization in olfactory population codes. Neuron 66(2), 287–299 (2010)
Omondi, A., Rajapakse, J.: FPGA Implementations of Neural Networks, vol. 365. Springer, Heidelberg (2006). https://doi.org/10.1007/0-387-28487-7
Panousis, K., Chatzis, S., Theodoridis, S.: Nonparametric Bayesian deep networks with local competition. In: Proceedings of the 36th International Conference on Machine Learning, pp. 4980–4988 (2019)
Pehlevan, C., Sengupta, A., Chklovskii, D.: Why do similarity matching objectives lead to Hebbian/anti-Hebbian networks? Neural Comput. 30(1), 84–124 (2018)
Pennington, J., Socher, R., Manning, C.: GloVe: global vectors for word representation. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, pp. 1532–1543 (2014)
Powell, M.: On search directions for minimization algorithms. Math. Program. 4(1), 193–201 (1973)
Russakovsky, O., Deng, J., Su, H., et al.: ImageNet large scale visual recognition challenge. Int. J. Comput. Vision 115(3), 211–252 (2015)
Stevens, C.: What the fly’s nose tells the fly’s brain. Proc. Natl. Acad. Sci. 112(30), 9460–9465 (2015)
Turner, G., Bazhenov, M., Laurent, G.: Olfactory representations by Drosophila mushroom body neurons. J. Neurophysiol. 99(2), 734–746 (2008)
Zheng, Z., Lauritzen, S., Perlman, E., Robinson, C., et al.: A complete electron microscopy volume of the brain of adult Drosophila melanogaster. Cell 174(3), 730–743 (2018)
Acknowledgment
We thank the anonymous reviewers for their insightful comments and suggestions. The work is supported by the Shenzhen Fundamental Research Fund (JCYJ20170306141038939, KQJSCX20170728162302784, JCYJ20170410172341657).
Author information
Authors and Affiliations
The Chinese University of Hong Kong, Shenzhen, China
Wenye Li
Shenzhen Research Institute of Big Data, Shenzhen, China
Wenye Li
Corresponding author
Correspondence to Wenye Li.
Editor information
Editors and Affiliations
Albert-Ludwigs-Universität, Freiburg, Germany
Frank Hutter
TU Darmstadt, Darmstadt, Germany
Kristian Kersting
Ghent University, Ghent, Belgium
Jefrey Lijffijt
Saarland University, Saarbrücken, Germany
Isabel Valera
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Li, W. (2021). Modeling Winner-Take-All Competition in Sparse Binary Projections. In: Hutter, F., Kersting, K., Lijffijt, J., Valera, I. (eds.) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2020. Lecture Notes in Computer Science, vol. 12457. Springer, Cham. https://doi.org/10.1007/978-3-030-67658-2_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-67657-5
Online ISBN: 978-3-030-67658-2