
Large Language Model for Multiobjective Evolutionary Optimization

  • Conference paper

Abstract

Multiobjective evolutionary algorithms (MOEAs) are a major approach to solving multiobjective optimization problems (MOPs). However, designing efficient and powerful MOEA operators usually requires a tedious trial-and-error process involving domain experts. This paper investigates a novel approach that leverages large language models (LLMs) to design MOEA operators. First, with suitable prompt engineering, we directly employ an LLM as a black-box search operator for a decomposition-based MOEA in a zero-shot manner, removing the time-consuming manual operator design process. We then design a white-box operator that interprets and approximates the behaviour of the LLM, yielding a new version of the decomposition-based MOEA, termed MOEA/D-LMO. This white-box operator improves generalization and reduces runtime by eliminating costly LLM interaction. Experimental studies on different test benchmarks show that the proposed method achieves performance competitive with widely used MOEAs. Furthermore, operators learned from the LLM on only a few instances generalize well to unseen problems with different patterns and settings.
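The abstract only outlines the mechanism, so the following is a minimal sketch of how an LLM could slot into a decomposition-based loop with Tchebycheff aggregation. Everything here is an illustrative assumption rather than the authors' implementation: the names (`llm_operator`, `moead_llm`, the ZDT1-like test function) are hypothetical, and `llm_operator` replaces the actual prompt-and-parse round trip with a random blend of parent solutions so the sketch runs offline.

```python
# Hypothetical sketch: an LLM used as a black-box search operator inside a
# minimal decomposition-based MOEA (MOEA/D-style). Not the paper's code.
import numpy as np

def tchebycheff(f, weight, z_star):
    """Tchebycheff aggregation g(x | w, z*) = max_i w_i * |f_i - z*_i|."""
    return np.max(weight * np.abs(f - z_star))

def llm_operator(parents):
    """Placeholder for the zero-shot LLM call: in the paper's setting the
    prompt would list parent solutions and fitness values, and the reply
    would be parsed into a new decision vector. Here we substitute a random
    convex blend of parents plus a small perturbation so the sketch runs."""
    w = np.random.dirichlet(np.ones(len(parents)))
    child = sum(wi * p for wi, p in zip(w, parents))
    return child + 0.01 * np.random.randn(*child.shape)

def moead_llm(objective, n_var, n_sub=20, n_gen=50, t_size=5):
    """Minimal bi-objective MOEA/D loop with a pluggable search operator."""
    w1 = np.linspace(0.01, 0.99, n_sub)            # weight vectors (2 objectives)
    W = np.stack([w1, 1.0 - w1], axis=1)
    pop = np.random.rand(n_sub, n_var)             # decision vars in [0, 1]
    F = np.array([objective(x) for x in pop])
    z_star = F.min(axis=0)                         # running ideal-point estimate
    dist = np.linalg.norm(W[:, None, :] - W[None, :, :], axis=2)
    neigh = np.argsort(dist, axis=1)[:, :t_size]   # neighboring subproblems
    for _ in range(n_gen):
        for i in range(n_sub):
            idx = np.random.choice(neigh[i], 2, replace=False)
            child = np.clip(llm_operator(pop[idx]), 0.0, 1.0)
            fc = objective(child)
            z_star = np.minimum(z_star, fc)
            for j in neigh[i]:                     # neighborhood replacement
                if tchebycheff(fc, W[j], z_star) < tchebycheff(F[j], W[j], z_star):
                    pop[j], F[j] = child, fc
    return pop, F

# Example usage on a simple ZDT1-like bi-objective problem.
def zdt1(x):
    g = 1.0 + 9.0 * np.mean(x[1:]) if len(x) > 1 else 1.0
    return np.array([x[0], g * (1.0 - np.sqrt(x[0] / g))])

pop, F = moead_llm(zdt1, n_var=10, n_sub=20, n_gen=30)
```

In the paper's actual method, the LLM reply would be parsed into decision vectors, and the proposed white-box operator would replace `llm_operator` with a cheap learned approximation of the LLM's behaviour; the abstract does not specify either component, so both are stubbed above.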



Acknowledgments

The work described in this paper was supported by the Research Grants Council of the Hong Kong Special Administrative Region, China (GRF Project No. CityU11215723) and the Natural Science Foundation of China (Project No. 62276223).

Author information

Authors and Affiliations

  1. City University of Hong Kong, Hong Kong, China

    Fei Liu, Xi Lin, Shunyu Yao & Qingfu Zhang

  2. The City University of Hong Kong Shenzhen Research Institute, Shenzhen, China

    Qingfu Zhang

  3. Southern University of Science and Technology, Shenzhen, China

    Zhenkun Wang

  4. Huawei Noah’s Ark Lab, Shenzhen, China

    Xialiang Tong & Mingxuan Yuan

Authors
  1. Fei Liu
  2. Xi Lin
  3. Shunyu Yao
  4. Zhenkun Wang
  5. Xialiang Tong
  6. Mingxuan Yuan
  7. Qingfu Zhang

Corresponding author

Correspondence to Fei Liu.

Editor information

Editors and Affiliations

  1. University of New South Wales, Canberra, NSW, Australia

    Hemant Singh

  2. University of New South Wales, Canberra, NSW, Australia

    Tapabrata Ray

  3. SLB Cambridge Research, Cambridge, UK

    Joshua Knowles

  4. RMIT, Melbourne, VIC, Australia

    Xiaodong Li

  5. University of Warwick, Coventry, Warwickshire, UK

    Juergen Branke

  6. University of New South Wales, Canberra, NSW, Australia

    Bing Wang

  7. Japan Aerospace Exploration Agency, Tokyo, Japan

    Akira Oyama


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Liu, F. et al. (2025). Large Language Model for Multiobjective Evolutionary Optimization. In: Singh, H., et al. (eds.) Evolutionary Multi-Criterion Optimization. EMO 2025. Lecture Notes in Computer Science, vol. 15513. Springer, Singapore. https://doi.org/10.1007/978-981-96-3538-2_13



