
Performance Analysis Strategies for Software Variants and Versions

  • Chapter
  • Open Access

Abstract

This chapter is devoted to the performance analysis of configurable and evolving software. Both configurability and evolution imply a high degree of software variation, that is, a large space of software variants and versions, which challenges state-of-the-art analysis techniques. We give an overview of strategies to cope with software variation, most of which focus either on configurability (variants) or on evolution (versions). Interestingly, we identify several directions in which research on variants and research on versions can profit from one another.
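To make the kind of analysis discussed here concrete, the following minimal sketch fits a performance-influence model, in the spirit of Siegmund et al. (ESEC/FSE 2015), to measurements of all but one variant of a tiny configurable system and then predicts the performance of the held-out variant. It is an illustrative assumption throughout: the option names, the synthetic measurement function, and the sampling choice are made up for this sketch and are not taken from the chapter.

    # Minimal sketch of a performance-influence model:
    #   performance(c) = b0 + sum_i b_i * c_i + sum_{i<j} b_ij * c_i * c_j,
    # where c_i in {0, 1} says whether option i is selected.
    # All concrete names and numbers below are illustrative assumptions.
    import itertools
    import numpy as np

    OPTIONS = ("encryption", "compression", "caching")  # hypothetical binary options

    def measure(config):
        # Stand-in for benchmarking one variant; the numbers are made up.
        enc, comp, cache = config
        return 10.0 + 4.0 * enc + 2.5 * comp - 3.0 * cache + 1.5 * enc * comp

    def features(config):
        # Encode a configuration as [1, options..., pairwise interactions...].
        pairs = [a * b for a, b in itertools.combinations(config, 2)]
        return [1.0, *config, *pairs]

    # Measure all variants except one "unseen" variant and fit the model by least squares.
    unseen = (1, 1, 1)
    sampled = [c for c in itertools.product([0, 1], repeat=len(OPTIONS)) if c != unseen]
    X = np.array([features(c) for c in sampled])
    y = np.array([measure(c) for c in sampled])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Predict the performance of the unmeasured variant from the learned influences.
    predicted = float(np.array(features(unseen)) @ coef)
    print(f"predicted {predicted:.2f} vs. measured {measure(unseen):.2f}")

For realistic systems with hundreds of options (and many versions over time), measuring every variant is infeasible; the sampling, learning, and model-based strategies surveyed in this chapter address exactly this gap.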


Author information

Authors and Affiliations

  1. Institute for Software Engineering and Automotive Informatics, TU Braunschweig, Brunswick, Germany

    Thomas Thüm, Matthias Kowal & Ina Schaefer

  2. Institute of Software Technology, University of Stuttgart, Stuttgart, Germany

    André van Hoorn

  3. Chair of Software Engineering I, Department of Informatics and Mathematics, University of Passau, Passau, Germany

    Sven Apel

  4. Real-Time Systems Lab, Department of Electrical Engineering and Information Technology, Technische Universität Darmstadt, Darmstadt, Germany

    Johannes Bürdek & Malte Lochau

  5. Department of Computer Science, Johann-von-Neumann-Haus, Humboldt-Universität zu Berlin, Berlin, Germany

    Sinem Getir

  6. Institute for Program Structures and Data Organization, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany

    Robert Heinrich

  7. Software Engineering Group, Department of Computer Science, Kiel University, Kiel, Germany

    Reiner Jung

  8. Chair of Computer Science II, Universität Würzburg, Würzburg, Germany

    Jürgen Walter

Authors
  1. Thomas Thüm
  2. André van Hoorn
  3. Sven Apel
  4. Johannes Bürdek
  5. Sinem Getir
  6. Robert Heinrich
  7. Reiner Jung
  8. Matthias Kowal
  9. Malte Lochau
  10. Ina Schaefer
  11. Jürgen Walter

Corresponding author

Correspondence to Thomas Thüm.

Editor information

Editors and Affiliations

  1. Institute for Program Structures and Data Organization, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany

    Ralf Reussner

  2. paluno, Universität Duisburg-Essen, Essen, Germany

    Michael Goedicke

  3. Software Engineering Group, Department of Computer Science, Kiel University, Kiel, Germany

    Wilhelm Hasselbring

  4. Institute of Automation and Information Systems, Technische Universität München, Garching, Germany

    Birgit Vogel-Heuser

  5. Institute for Program Structures and Data Organization, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany

    Jan Keim

  6. Institute for Programming and Reactive Systems, Technische Universität Braunschweig, Braunschweig, Germany

    Lukas Märtin

Rights and permissions

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.


Copyright information

© 2019 The Author(s)

About this chapter


Cite this chapter

Thüm, T. et al. (2019). Performance Analysis Strategies for Software Variants and Versions. In: Reussner, R., Goedicke, M., Hasselbring, W., Vogel-Heuser, B., Keim, J., Märtin, L. (eds) Managed Software Evolution. Springer, Cham. https://doi.org/10.1007/978-3-030-13499-0_8
