
Russian Technological Journal


Framework for experimental evaluation of software solutions in a virtual environment

https://doi.org/10.32362/2500-316X-2022-10-5-16-27

Abstract

Objectives. Ready-made information technology solutions used in software development have various characteristics that, depending on the research objectives, may need to be determined experimentally. While the selection of appropriate technologies and software tools used in experimental software engineering can be time-consuming, experimental complexity can be reduced by providing the researcher with domain-specific tools. The aim of the study is to design and develop a domain-specific software framework for experimental evaluation of the characteristics of information technology solutions in a virtual environment.

Methods. To determine the required characteristics of the software framework, an analysis of software tools for conducting experimental studies to evaluate the characteristics of information technology solutions in a virtual environment was conducted. Methods of decomposition, structural design, and software development were applied to design and develop the framework.

Results. A software framework for conducting experimental research has been developed. The design results, the key features of the framework, and a description of its functionality are presented. The implementation of the framework comprises commands for managing virtual machines and commands for scaffolding. A technique for conducting experimental studies using the framework is proposed.

Conclusions. The developed domain-specific software framework addresses shortcomings of existing tools to reduce labor costs when conducting experiments to evaluate information technology solutions. The developed framework and proposed methodology allow the number of programming and markup languages required to set up a software experiment to be reduced from three to one.
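The abstract does not reproduce the framework's actual command or configuration syntax. Purely as a hypothetical illustration of the kind of single-language, infrastructure-as-code experiment setup the Conclusions describe, a minimal Vagrant-style configuration (Vagrant being one of the tools discussed in the cited literature) might look as follows; the box name, hostname, resource limits, and provisioning step are all assumptions, not details taken from the article:

```ruby
# Hypothetical sketch only: one virtual machine for an experiment,
# declared entirely in a single language (Ruby, via a Vagrantfile).
Vagrant.configure("2") do |config|
  # Base image for the experiment node (assumed name).
  config.vm.box = "ubuntu/focal64"
  config.vm.hostname = "experiment-node"

  # Fixed resource limits help keep measurements reproducible
  # (the specific values here are assumptions).
  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048
    vb.cpus = 2
  end

  # Provisioning step that would install and launch the software
  # solution under evaluation.
  config.vm.provision "shell",
    inline: "echo 'install the software under test here'"
end
```

A setup of this kind replaces a mix of shell scripts, markup-based configuration, and host-specific instructions with one declarative file, which is the general labor-saving effect the article attributes to its framework.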

About the Author

D. Ilin
https://www.researchgate.net/profile/Dmitry-Ilin-2
MIREA - Russian Technological University
Russian Federation

Dmitry Ilin - Cand. Sci. (Eng.), Associate Professor, Department of Data Processing Digital Technologies, Institute of Cybersecurity and Digital Technologies, MIREA - Russian Technological University.

78, Vernadskogo pr., Moscow, 119454.

ResearcherID J-7668-2017, Scopus Author ID 57203848706


Competing Interests:

The author declares no conflicts of interest.



References

1. Barrett E., Bolz-Tereick C.F., Killick R., Mount S., Tratt L. Virtual machine warmup blows hot and cold. Proc. ACM Program. Lang. 2017;1(OOPSLA):52:1-52:27. https://doi.org/10.1145/3133876

2. Eismann S., Bezemer C.-P., Shang W., Okanovic D., van Hoorn A. Microservices: A performance tester's dream or nightmare? In: Proceedings of the ACM/SPEC International Conference on Performance Engineering. Edmonton, AB, Canada: ACM; 2020. P. 138-149. https://doi.org/10.1145/3358960.3379124

3. Curino C., Godwal N., Kroth B., Kuryata S., Lapinski G., Liu S., et al. MLOS: An infrastructure for automated software performance engineering. In: Proceedings of the Fourth International Workshop on Data Management for End-to-End Machine Learning. 2020:1-5. https://doi.org/10.1145/3399579.3399927

4. Jiang Z.M., Hassan A.E. A survey on load testing of large-scale software systems. IEEE Transactions on Software Engineering. 2015;41(11):1091-1118. https://doi.org/10.1109/TSE.2015.2445340

5. Alankar B., Sharma G., Kaur H., Valverde R., Chang V. Experimental setup for investigating the efficient load balancing algorithms on virtual cloud. Sensors. 2020;20(24):7342. https://doi.org/10.3390/s20247342

6. Spanaki P., Sklavos N. Cloud Computing: security issues and establishing virtual cloud environment via Vagrant to secure cloud hosts. In: Daimi K. (Ed.). Computer and Network Security Essentials. Springer, Cham; 2018. P. 539-553. https://doi.org/10.1007/978-3-319-58424-9_31

7. Saingre D., Ledoux T., Menaud J.-M. BCTMark: a framework for benchmarking blockchain technologies. In: 2020 IEEE/ACS 17th International Conference on Computer Systems and Applications (AICCSA). Antalya, Turkey: IEEE; 2020. P. 1-8. https://doi.org/10.1109/AICCSA50499.2020.9316536

8. Potdar A.M., Narayan D.G., Kengond S., Mulla M.M. Performance evaluation of Docker container and virtual machine. Procedia Computer Science. 2020;171:1419-1428. https://doi.org/10.1016/j.procs.2020.04.152

9. Kucek S., Leitner M. An empirical survey of functions and configurations of open-source Capture the Flag (CTF) environments. J. Network Comput. Appl. 2020;151:102470. https://doi.org/10.1016/j.jnca.2019.102470

10. Chirigati F., Rampin R., Shasha D., Freire J. ReproZip: Computational reproducibility with ease. In: Proceedings of the 2016 International Conference on Management of Data. New York, USA: Association for Computing Machinery; 2016. P. 2085-2088. https://doi.org/10.1145/2882903.2899401

11. Steeves V., Rampin R., Chirigati F. Using ReproZip for reproducibility and library services. IASSIST Quarterly. 2018;42(1):14-14. https://doi.org/10.29173/iq18

12. Jimenez I., Sevilla M., Watkins N., Maltzahn C., Lofstead J., Mohror K., et al. The Popper convention: making reproducible systems evaluation practical. In: 2017 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW). 2017. P. 1561-1570. https://doi.org/10.1109/IPDPSW.2017.157

13. Papadopoulos A.V., Versluis L., Bauer A., Herbst N., von Kistowski J., Ali-Eldin A., et al. Methodological principles for reproducible performance evaluation in cloud computing. IEEE Trans. Software Eng. 2021;47(8):1528-1543. https://doi.org/10.1109/TSE.2019.2927908

14. Artac M., Borovssak T., Di Nitto E., Guerriero M., Tamburri D.A. DevOps: introducing infrastructure-as-code. In: 2017 IEEE/ACM 39th International Conference on Software Engineering Companion (ICSE-C). 2017. P. 497-498. https://doi.org/10.1109/ICSE-C.2017.162

15. Marquardson J. Infrastructure tools for efficient cybersecurity exercises. Information Systems Education Journal. 2018;16(6):23-30.

16. Simec A., Drzanic B., Lozic D. Isolated environment tools for software development. In: 2018 International Conference on Applied Mathematics & Computer Science (ICAMCS). 2018. P. 480-484. https://doi.org/10.1109/ICAMCS46079.2018.00016

17. Stillwell M., Coutinho J.G.F. A DevOps approach to integration of software components in an EU research project. In: Proceedings of the 1st International Workshop on Quality-Aware DevOps. New York, USA: Association for Computing Machinery; 2015. P. 1-6. https://doi.org/10.1145/2804371.2804372

18. Magomedov S., Ilin D., Nikulchev E. Resource analysis of the log files storage based on simulation models in a virtual environment. Appl. Sci. 2021;11(11):4718. https://doi.org/10.3390/app11114718

19. Staubitz T., Brehm M., Jasper J., Werkmeister T., Teusner R., Willems C., et al. Vagrant virtual machines for hands-on exercises in massive open online courses. In: Uskov V.L., Howlett R.J., Jain L.C. (Eds.). Smart Education and e-Learning 2016. Springer, Cham; 2016. P. 363-373. https://doi.org/10.1007/978-3-319-39690-3_32

20. Berger O., Gibson J.P., Lecocq C., Bac C. Designing a virtual laboratory for a relational database MOOC. In: Proceedings of the 7th International Conference on Computer Supported Education. Lisbon, Portugal: SCITEPRESS - Science and Technology Publications; 2015. P. 260-268. https://doi.org/10.5220/0005439702600268

21. Hobeck R., Weber I., Bass L., Yasar H. Teaching DevOps: a tale of two universities. In: Proceedings of the 2021 ACM SIGPLAN International Symposium on SPLASH-E, New York, USA: Association for Computing Machinery; 2021. P. 26-31. https://doi.org/10.1145/3484272.3484962

22. Shah J., Dubaria D., Widhalm J. A survey of DevOps tools for networking. In: 2018 9th IEEE Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON). 2018. P. 185-188. https://doi.org/10.1109/UEMCON.2018.8796814

23. Sandobalin J., Insfran E., Abrahao S. On the effectiveness of tools to support infrastructure as code: Model-driven versus code-centric. IEEE Access. 2020;8:17734-17761. https://doi.org/10.1109/ACCESS.2020.2966597


Supplementary files

1. Links to the configuration files of the project for conducting an experimental study
Type: Research tools
  • Shortcomings of using existing tools for conducting experimental research aimed to evaluate the characteristics of information technology solutions in a virtual environment were identified.
  • A domain-specific software framework was designed and developed. The key features of the framework that differentiate it from the incorporated software tools were presented.
  • A technique for conducting experimental studies using the framework was proposed.


For citations:


Ilin D. Framework for experimental evaluation of software solutions in a virtual environment. Russian Technological Journal. 2022;10(5):16-27. https://doi.org/10.32362/2500-316X-2022-10-5-16-27



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2782-3210 (Print)
ISSN 2500-316X (Online)