Curriculum learning

From Wikipedia, the free encyclopedia
Technique in machine learning

Curriculum learning is a technique in machine learning in which a model is trained on examples of increasing difficulty, where the definition of "difficulty" may be provided externally or discovered as part of the training process. This is intended to attain good performance more quickly, or to converge to a better local optimum if the global optimum is not found.[1][2]

Approach


Most generally, curriculum learning is the technique of successively increasing the difficulty of the examples in the training set presented to a model over multiple training iterations. Under some circumstances this can produce better results than exposing the model to the full training set immediately, most typically when the model is able to learn general principles from easier examples and then gradually incorporate more complex and nuanced information as harder examples, such as edge cases, are introduced. This has been shown to work in many domains, most likely as a form of regularization.[3]
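As an illustration, the loop described above can be sketched in a few lines. This is a minimal sketch, not any particular published method: the `model.update` per-example training step and the linear widening schedule are assumptions made for the example.

```python
import random

def train_with_curriculum(model, examples, difficulty, epochs=10):
    """Curriculum loop: sort examples once by difficulty, then widen the
    training pool each epoch from the easiest toward the hardest."""
    ordered = sorted(examples, key=difficulty)
    for epoch in range(epochs):
        # The fraction of data made available grows linearly with the epoch.
        cutoff = max(1, len(ordered) * (epoch + 1) // epochs)
        pool = ordered[:cutoff]
        random.shuffle(pool)       # still shuffle within the current pool
        for example in pool:
            model.update(example)  # hypothetical per-example training step
    return model
```

Because the curriculum only changes which examples are visible in each epoch, the inner training step is unchanged from ordinary training; by the final epoch the model sees the full training set.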

There are several major variations in how the technique is applied:

  • A concept of "difficulty" must be defined. This may come from human annotation[4][5] or an external heuristic; for example, in language modeling, shorter sentences might be classified as easier than longer ones.[6] Another approach is to use the performance of another model, with examples accurately predicted by that model being classified as easier (providing a connection to boosting).
  • Difficulty can be increased steadily[7] or in distinct epochs,[8] and according to a deterministic schedule or a probability distribution. This may also be moderated by a requirement for diversity at each stage, in cases where easier examples are likely to be disproportionately similar to each other.[9]
  • Applications must also decide the schedule for increasing the difficulty. Simple approaches may use a fixed schedule, such as training on easy examples for half of the available iterations and then all examples for the second half.[3] Other approaches use self-paced learning to increase the difficulty in proportion to the performance of the model on the current set.[10]
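Two of the variations above can be illustrated with a short sketch. The sentence-length heuristic follows the language-modeling example in the text; `self_paced_pool` and its loss threshold are hypothetical names for a self-paced selection step, in which raising the threshold as training proceeds admits progressively harder examples.

```python
def sentence_length_difficulty(sentence):
    # Heuristic from the text: shorter sentences are treated as easier.
    return len(sentence.split())

def self_paced_pool(examples, losses, threshold):
    """Self-paced selection: keep only the examples the model currently
    finds easy enough (per-example loss below the threshold); raising
    the threshold over training admits progressively harder examples."""
    return [ex for ex, loss in zip(examples, losses) if loss < threshold]
```

In the self-paced case the schedule is driven by the model itself: the pool of eligible examples is recomputed from the current losses, rather than from a fixed external ordering.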

Since curriculum learning only concerns the selection and ordering of training data, it can be combined with many other techniques in machine learning. The success of the method assumes that a model trained for an easier version of the problem can generalize to harder versions, so it can be seen as a form of transfer learning. Some authors also consider curriculum learning to include other forms of progressively increasing complexity, such as increasing the number of model parameters.[11] It is frequently combined with reinforcement learning, such as learning a simplified version of a game first.[12]

Some domains have shown success with anti-curriculum learning: training on the most difficult examples first. One example is the ACCAN method for speech recognition, which trains on the examples with the lowest signal-to-noise ratio first.[13]

History


The term "curriculum learning" was introduced by Yoshua Bengio et al. in 2009,[14] with reference to the psychological technique of shaping in animals and structured education for humans: beginning with the simplest concepts and then building on them. The authors also note that the application of this technique in machine learning has its roots in the early study of neural networks such as Jeffrey Elman's 1993 paper Learning and development in neural networks: the importance of starting small.[15] Bengio et al. showed good results for problems in image classification, such as identifying geometric shapes with progressively more complex forms, and language modeling, such as training with a gradually expanding vocabulary. They conclude that, for curriculum strategies, "their beneficial effect is most pronounced on the test set", suggesting good generalization.

The technique has since been applied to many other domains, including word representation learning,[16] intent detection,[17] recurrent language models,[18] machine translation,[19][20] speech recognition,[21] language model pretraining,[22] face recognition,[23] object detection,[24] reinforcement learning,[25] graph learning,[26][27] and matrix factorization.[28]

References

  1. ^ Guo, Sheng; Huang, Weilin; Zhang, Haozhi; Zhuang, Chenfan; Dong, Dengke; Scott, Matthew R.; Huang, Dinglong (2018). "CurriculumNet: Weakly Supervised Learning from Large-Scale Web Images". arXiv:1808.01097 [cs.CV].
  2. ^ "Competence-based curriculum learning for neural machine translation". Retrieved March 29, 2024.
  3. ^ a b Bengio, Yoshua; Louradour, Jérôme; Collobert, Ronan; Weston, Jason (2009). "Curriculum Learning". Proceedings of the 26th Annual International Conference on Machine Learning. pp. 41–48. doi:10.1145/1553374.1553380. ISBN 978-1-60558-516-1. Retrieved March 24, 2024.
  4. ^ "Curriculum learning of multiple tasks". Retrieved March 29, 2024.
  5. ^ Ionescu, Radu Tudor; Alexe, Bogdan; Leordeanu, Marius; Popescu, Marius; Papadopoulos, Dim P.; Ferrari, Vittorio (2016). "How Hard Can It Be? Estimating the Difficulty of Visual Search in an Image" (PDF). 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). pp. 2157–2166. doi:10.1109/CVPR.2016.237. ISBN 978-1-4673-8851-1. Retrieved March 29, 2024.
  6. ^ "Baby Steps: How "Less is More" in unsupervised dependency parsing" (PDF). Retrieved March 29, 2024.
  7. ^ "Self-paced learning for latent variable models". 6 December 2010. pp. 1189–1197. Retrieved March 29, 2024.
  8. ^ Tang, Ye; Yang, Yu-Bin; Gao, Yang (2012). "Self-paced dictionary learning for image classification". Proceedings of the 20th ACM international conference on Multimedia. pp. 833–836. doi:10.1145/2393347.2396324. ISBN 978-1-4503-1089-5. Retrieved March 29, 2024.
  9. ^ "Curriculum learning with diversity for supervised computer vision tasks". Retrieved March 29, 2024.
  10. ^ "Self-paced Curriculum Learning". Retrieved March 29, 2024.
  11. ^ Soviany, Petru; Ionescu, Radu Tudor; Rota, Paolo; Sebe, Nicu (2021). "Curriculum Learning: A Survey". arXiv:2101.10382 [cs.LG].
  12. ^ Narvekar, Sanmit; Peng, Bei; Leonetti, Matteo; Sinapov, Jivko; Taylor, Matthew E.; Stone, Peter (January 2020). "Curriculum Learning for Reinforcement Learning Domains: A Framework and Survey". The Journal of Machine Learning Research. 21 (1): 181:7382–181:7431. arXiv:2003.04960. Retrieved March 29, 2024.
  13. ^ "A Curriculum Learning Method for Improved Noise Robustness in Automatic Speech Recognition". Retrieved March 29, 2024.
  14. ^ Bengio, Yoshua; Louradour, Jérôme; Collobert, Ronan; Weston, Jason (2009). "Curriculum Learning". Proceedings of the 26th Annual International Conference on Machine Learning. pp. 41–48. doi:10.1145/1553374.1553380. ISBN 978-1-60558-516-1. Retrieved March 24, 2024.
  15. ^ Elman, J. L. (1993). "Learning and development in neural networks: the importance of starting small". Cognition. 48 (1): 71–99. doi:10.1016/0010-0277(93)90058-4. PMID 8403835.
  16. ^ "Learning the Curriculum with Bayesian Optimization for Task-Specific Word Representation Learning". Retrieved March 29, 2024.
  17. ^ Gong, Yantao; Liu, Cao; Yuan, Jiazhen; Yang, Fan; Cai, Xunliang; Wan, Guanglu; Chen, Jiansong; Niu, Ruiyao; Wang, Houfeng (2021). "Density-based dynamic curriculum learning for intent detection". Proceedings of the 30th ACM International Conference on Information & Knowledge Management. pp. 3034–3037. arXiv:2108.10674. doi:10.1145/3459637.3482082. ISBN 978-1-4503-8446-9. Retrieved March 29, 2024.
  18. ^ "Visualizing and understanding curriculum learning for long short-term memory networks". Retrieved March 29, 2024.
  19. ^ "An empirical exploration of curriculum learning for neural machine translation". Retrieved March 29, 2024.
  20. ^ "Reinforcement learning based curriculum optimization for neural machine translation". Retrieved March 29, 2024.
  21. ^ "A curriculum learning method for improved noise robustness in automatic speech recognition". Retrieved March 29, 2024.
  22. ^ Zhang, Yang; Mohamed, Amr; Abdine, Hadi; Shang, Guokan; Vazirgiannis, Michalis (2025). "Beyond Random Sampling: Efficient Language Model Pretraining via Curriculum Learning". arXiv:2506.11300 [cs.CL].
  23. ^ Huang, Yuge; Wang, Yuhan; Tai, Ying; Liu, Xiaoming; Shen, Pengcheng; Li, Shaoxin; Li, Jilin; Huang, Feiyue (2020). "CurricularFace: Adaptive Curriculum Learning Loss for Deep Face Recognition". 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). pp. 5900–5909. arXiv:2004.00288. doi:10.1109/CVPR42600.2020.00594. ISBN 978-1-7281-7168-5.
  24. ^ "Curriculum self-paced learning for cross-domain object detection". Retrieved March 29, 2024.
  25. ^ "Automatic curriculum graph generation for reinforcement learning agents". 4 February 2017. pp. 2590–2596. Retrieved March 29, 2024.
  26. ^ Gong, Chen; Yang, Jian; Tao, Dacheng (2019). "Multi-modal curriculum learning over graphs". ACM Transactions on Intelligent Systems and Technology. 10 (4): 1–25. doi:10.1145/3322122. Retrieved March 29, 2024.
  27. ^ Qu, Meng; Tang, Jian; Han, Jiawei (2018). Curriculum learning for heterogeneous star network embedding via deep reinforcement learning. pp. 468–476. doi:10.1145/3159652.3159711. hdl:2142/101634. ISBN 978-1-4503-5581-0. Retrieved March 29, 2024.
  28. ^ Self-paced learning for matrix factorization. MIT Press. 25 January 2015. pp. 3196–3202. ISBN 978-0-262-51129-2. Retrieved March 29, 2024.
