The High-Performance Linpack (HPL) Evaluation on MIHIR High Performance Computing Facility at NCMRWF
Research Paper | Journal Paper
Vol.13, Issue.3, pp.1-8, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.18
Abstract
The National Centre for Medium Range Weather Forecasting (NCMRWF) operates the MIHIR High Performance Computing (HPC) facility, with a total computing capacity of 2.8 Petaflops, to run Numerical Weather Prediction (NWP) models, enabling accurate and timely weather forecasting. These models require computations at the PFLOPS (Peta Floating Point Operations Per Second) scale. The HPC nodes are interconnected by the high-speed, low-latency Cray Aries network. High Performance Linpack (HPL) version 2.3 was compiled and installed on the system for this study. The purpose of running HPL is to demonstrate the current computing performance of the HPC system and to assess its efficiency by comparing the actual measured performance (Rmax) against the theoretical peak performance (Rpeak, derived from the system specifications). We present a performance evaluation of the HPL benchmark on MIHIR, conducting a detailed analysis of HPL parameters to optimize performance. The aim is to identify the best-optimized parameter values for MIHIR and to determine the maximum achievable performance of the compute nodes, utilizing the up to 300 nodes available for research.
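The Rmax/Rpeak efficiency measure described in the abstract can be sketched in a few lines. The node count, cores per node, clock, and FLOPs-per-cycle figures below are illustrative assumptions for a hypothetical 300-node run, not MIHIR's published specification.

```python
# Sketch: estimating theoretical peak (Rpeak) and HPL efficiency.
# All hardware figures here are illustrative assumptions.

def rpeak_tflops(nodes, cores_per_node, ghz, flops_per_cycle):
    """Theoretical peak in TFLOPS: nodes x cores x clock (GHz) x FLOPs/cycle."""
    return nodes * cores_per_node * ghz * flops_per_cycle / 1000.0

def hpl_efficiency(rmax, rpeak):
    """Efficiency: ratio of measured performance (Rmax) to theoretical peak (Rpeak)."""
    return rmax / rpeak

# Hypothetical run: 300 nodes, 36 cores/node, 2.1 GHz, 32 FLOPs/cycle (AVX-512 FMA).
rpeak = rpeak_tflops(300, 36, 2.1, 32)
print(f"Rpeak = {rpeak:.1f} TFLOPS")
print(f"Efficiency at an assumed Rmax of 500 TFLOPS: {hpl_efficiency(500.0, rpeak):.1%}")
```

Rpeak follows directly from the hardware specification, while Rmax comes out of the HPL run itself; their ratio is the efficiency the study analyzes.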
Key-Words / Index Term
HPL, HPC, Aries interconnect, MIHIR, NCMRWF, PFLOPS
References
[1] A. Petitet, R. C. Whaley, J. Dongarra, A. Cleary, “HPL - A Portable Implementation of the High-Performance Linpack Benchmark for Distributed-Memory Computers,” Innovative Computing Laboratory, University of Tennessee, pp.1-10, 2018.
[2] A. Smith, B. Johnson, "Performance Analysis of HPL on Intel's Icelake Architecture," Proceedings of the International Conference on High-Performance Computing Systems, pp.123-130, 2021.
[3] C. Lawson, R. Hanson, D. Kincaid, and F. Krogh, “Basic Linear Algebra Subprograms for Fortran usage,” ACM Transactions on Mathematical Software, Vol.5, Issue.4, pp.308–323, 1979.
[4] I. M. Jelas, N. A. W. A. Hamid, M. Othman, “The High-Performance Linpack (HPL) Benchmark on the Khaldun Sandbox Cluster,” Journal of High-Performance Computing, pp.1-15, 2013.
[5] Intel Corporation, "Intel® Math Kernel Library: Reference Manual," 2020.
[6] J. J. Dongarra, J. R. Bunch, C. B. Moler, G. W. Stewart, “LINPACK Users' Guide,” Society for Industrial and Applied Mathematics (SIAM), USA, pp.1-250, 1979.
[7] J. J. Dongarra, P. Luszczek, A. Petitet, “The LINPACK Benchmark: Past, Present, and Future,” Concurrency and Computation: Practice & Experience, Vol.15, pp.803-820, 2003.
[8] J. J. Dongarra, H. W. Meuer, E. Strohmaier, "TOP500 Supercomputer Sites," International Journal of High Performance Computing Applications, Vol.11, No.3, pp.90-94, 1997.
[9] Khang T. Nguyen, "Performance Comparison of OpenBLAS and Intel oneAPI Math Kernel Library in R," International Journal of Computational Science and Engineering, Vol.5, No.3, pp.123-130, 2020.
[10] M. Snir, S. Otto, S. Huss-Lederman, D. Walker, J. J. Dongarra, “MPI: The Complete Reference,” MIT Press, USA, pp.1-500, 1996.
[11] M. Fatica, "Accelerating Linpack with CUDA on Heterogeneous Clusters," Proceedings of the 2nd Workshop on General-Purpose Processing on Graphics Processing Units (GPGPU-2), pp.46-51, 2009.
[12] Wong Chun Shiang, Izzatdin Abdul Aziz, Nazleeni Samiha Haron, Jafreezal Jaafar, Norzatul Natrah Ismail, Mazlina Mehat, “The High-Performance Linpack (HPL) Benchmark Evaluation on UTP High-Performance Cluster Computing,” Jurnal Teknologi, Vol.78, Issue.9, pp.21–30, 2016.
Citation
Shivali Gangwar, B. Athiyaman, Preveen Kumar D., "The High-Performance Linpack (HPL) Evaluation on MIHIR High Performance Computing Facility at NCMRWF," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.1-8, 2025.
AI’s Transformative Role in Healthcare Data Management: Enhancing Governance, Security, and Interoperability
Research Paper | Journal Paper
Vol.13, Issue.3, pp.9-15, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.915
Abstract
Artificial intelligence is revolutionizing health data management by strengthening governance, security, and interoperability. With the explosion of data in medical treatment, AI-driven solutions greatly accelerate data processing, reduce errors, and ensure compliance with standards. By automating quality-control processes, AI is transforming data governance. AI-supported security mechanisms such as access tokens block unwanted access to network assets, complemented by VPNs and anomaly detection systems. AI also enables dialogue between incompatible healthcare systems, allowing them to interact even when one system cannot recognize the commands or parameters sent by another, achieving communication within heterogeneous environments. Furthermore, through real-time clinical decision support, AI addresses problems that arise when integrating data from multiple sources or standardizing it, contributing to better patient care outcomes. For all these reasons, the potential of AI to build a resilient, future-ready healthcare ecosystem emerges clearly.
Key-Words / Index Term
Artificial Intelligence (AI), Healthcare, Data Management, Data Governance, Security and Privacy, Regulatory Compliance
References
[1] Topol E. J. High-performance medicine: the convergence of human and artificial intelligence, Nature Medicine, Vol.25, Issue.1, pp.44–56, 2019. https://doi.org/10.1038/s41591-018-0300-7
[2] Picard R. W., Affective Computing, MIT Press, 1997.
[3] Rieke N., Hancox J., Li W., et al. The future of digital health with federated learning, npj Digital Medicine, Vol.3, No.119, 2020. https://doi.org/10.1038/s41746-020-00323-1
[4] Mandl K.D., Kohane I.S., Time for a patient-driven health information economy?, New England Journal of Medicine, Vol.374, pp.595–598, 2016. https://doi.org/10.1056/NEJMp1511931
[5] Wang F., Casalino L. P., Khullar D., Deep learning in medicine promise, progress, and challenges, Circulation: Cardiovascular Quality and Outcomes, Vol.11, Issue.10, 2018. https://doi.org/10.1161/CIRCOUTCOMES.118.004723
[6] Van der Schaar M., et al. How artificial intelligence is changing clinical development, The Lancet Digital Health, Vol.3, Issue.11, pp.e599–e610, 2021. https://doi.org/10.1016/S2589-7500(21)00170-3
[7] Wong T.Y., Bressler N. M., Artificial intelligence in ophthalmology: A review, Progress in Retinal and Eye Research, Vol.72, 2019, https://doi.org/10.1016/j.preteyeres.2019.04.003
[8] Koller D., Friedman N., Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009.
[9] Johan H., Federated Learning in Healthcare: Decentralized Intelligence for Data Privacy, In Proceedings of the 2022 IEEE International Conference on Healthcare Informatics, Singapore, pp.112–118, 2022.
[10] McCradden M. D., et al. Ethical concerns around use of AI in health care, Canadian Medical Association Journal (CMAJ), Vol.191, Issue.9, pp.E257–E258, 2019. https://doi.org/10.1503/cmaj.181947
Citation
Ravikumar Vallepu, "AI’s Transformative Role in Healthcare Data Management: Enhancing Governance, Security, and Interoperability," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.9-15, 2025.
Evaluating The Digital Performance and Accessibility of IIT Library Websites Using Google Lighthouse
Research Paper | Journal Paper
Vol.13, Issue.3, pp.16-23, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.1623
Abstract
This study presents a comprehensive performance analysis of the library websites of 23 Indian Institutes of Technology (IITs) using the Lighthouse tool. IIT libraries are essential information hubs that support the academic and research needs of students, faculty, and external researchers. To assess the digital performance of these websites, Lighthouse, a popular open-source tool by Google, was employed to evaluate key metrics such as performance, accessibility, best practices, and SEO. The analysis revealed significant variation in the quality of IIT library websites: top performers like IIT Indore and IIT Mandi excelled in performance and accessibility, while IIT Tirupati and IIT (BHU) Varanasi lagged behind, especially in loading speed and adherence to best practices. The study highlights areas for improvement, such as optimizing resource management, enhancing accessibility for users with disabilities, and improving search engine optimization (SEO) to increase discoverability. Actionable recommendations are provided to help IIT libraries improve their digital presence, ensuring a more efficient, accessible, and user-friendly experience.
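Lighthouse reports each category score (performance, accessibility, best practices, SEO) as a value between 0 and 1 in its JSON output. A minimal sketch of the kind of aggregation such a study performs, using invented sample scores:

```python
import json

# Sketch: summarizing a Lighthouse JSON report (e.g. `lighthouse <url> --output=json`).
# The structure shown (categories -> {id: {score}}) follows Lighthouse's JSON
# output, with scores in the 0-1 range; the sample numbers are invented.

sample_report = json.loads("""{
  "categories": {
    "performance":    {"score": 0.42},
    "accessibility":  {"score": 0.91},
    "best-practices": {"score": 0.75},
    "seo":            {"score": 0.88}
  }
}""")

def weak_categories(report, threshold=0.9):
    """Return category ids scoring below threshold (Lighthouse's 'green' band starts at 0.9)."""
    return sorted(cid for cid, cat in report["categories"].items()
                  if cat["score"] < threshold)

print(weak_categories(sample_report))  # categories needing attention for this site
```

Running this over each of the 23 library sites and comparing the flagged categories is, in essence, the comparison the study reports.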
Key-Words / Index Term
IIT Libraries, Google Lighthouse, Web Performance, Accessibility, SEO, Best Practices, Digital Libraries, Website Analysis.
References
[1] O. Alhadreti, “Accessibility, performance and engagement evaluation of Saudi higher education websites: A comparative study of state and private institutions”, Universal Access in the Information Society, Vol.23, No.4, pp.1671–1688, 2024.
[2] M. L. Lehat, N. F. Abu Bakar, A. A. Jamil, C. M. N. Mohd Shafee, P. Shamala, and S. Rosnan, “Assessing web performance of Malaysian university website”, 2023 IEEE 8th International Conference on Recent Advances and Innovations in Engineering (ICRAIE), India, pp.1–5, 2023.
[3] E. Ogbuju, B. Ayodeji, and A. Azeez, “Performance and accessibility evaluation of university websites in Nigeria”, 2022 5th Information Technology for Education and Development (ITED), Nigeria, pp.1–7, 2022.
[4] S. Sumedrea, C. I. Maican, I. B. Chițu, E. Nichifor, A. S. Tecău, R. C. Lixăndroiu, and G. Brătucu, “Sustainable digital communication in higher education—A checklist for page loading speed optimization”, Sustainability, Vol.14, No.16, pp.10135, 2022.
[5] A. Giannakoulopoulos, N. Konstantinou, D. Koutsompolis, M. Pergantis, and I. Varlamis, “Academic excellence, website quality, SEO performance: Is there a correlation?”, Future Internet, Vol.11, No.11, pp.242, 2019.
[6] S. Sahoo and K. C. Panda, “Web content analysis of Indian Institute of Technology (IIT) library websites: An evaluative study,” Library Philosophy and Practice (e-journal), No.3949, 2019.
[7] S. F. Verkijika and L. De Wet, “Accessibility of South African university websites”, Universal Access in the Information Society, Vol.19, No.1, pp.201–210, 2018.
[8] T. Jiang, J. Yang, C. Yu, and Y. Sang, “A clickstream data analysis of the differences between visiting behaviors of desktop and mobile users”, Data and Information Management, Vol.2, No.3, pp.130–140, 2018.
[9] K. Devi and A. K. Sharma, “Framework for evaluation of academic website”, International Journal of Computer Sciences and Engineering, Vol.3, No.2, pp.1–5, 2016.
[10] N. Menzi-Çetin, E. Alemdağ, H. Tüzün, and Y. M. Merve, “Evaluation of a university website’s usability for visually impaired students”, Universal Access in the Information Society, Vol.16, No.1, pp.151–160, 2017.
Citation
Vikrant Dubey, Shilpi Verma, "Evaluating The Digital Performance and Accessibility of IIT Library Websites Using Google Lighthouse," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.16-23, 2025.
The Dual Edge of Backdoors: Accuracy Analysis and Preventive Strategies for Secure Systems
Research Paper | Journal Paper
Vol.13, Issue.3, pp.24-32, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.2432
Abstract
While digital transformation's benefits are widespread, rapid technological development also brings vulnerabilities, among which malware is one of the biggest dangers to digital security: harmful software that can disrupt, damage, or infiltrate computer systems without permission. In this article, we use Kali Linux to implement backdoor attacks, since backdoor vulnerabilities have emerged as a critical threat to cybersecurity, with recent reports indicating a 45% increase in backdoor-related incidents over the past year. Even with free online tools such as VirusTotal and Hybrid Analysis, detection remains challenging, with an average detection rate of only 72% for sophisticated backdoors. Backdoors are covert methods by which attackers access systems while bypassing typical security barriers, and they represent a major weakness for the integrity, confidentiality, and availability of information systems. This paper describes the implementation of a backdoor and analyzes existing mitigation techniques. It also introduces a holistic approach that combines anomaly detection and code analysis, showing how we implemented the backdoor using two operating systems. It covers methodologies for monitoring insider activities, detecting anomalous behavior (with the help of free tools) that may indicate the presence of backdoors, and protective actions to reduce the threat posed by trusted users. We focus on insiders and their backdoor exploitation capabilities, discussing real-world scenarios in which insiders exploited backdoors for data exfiltration, sabotage, or espionage.
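The "detecting anomalous behavior" step mentioned in the abstract can be illustrated with a minimal z-score detector over per-hour event counts. The counts and the 2.5-sigma threshold below are invented for illustration, not taken from the paper's methodology.

```python
import statistics

# Sketch: flag hours whose event count deviates sharply from the host's baseline,
# the kind of signal that can hint at backdoor activity (e.g. data exfiltration).
# Data and threshold are illustrative assumptions.

def anomalies(counts, z_threshold=2.5):
    """Return indices whose count deviates more than z_threshold std devs from the mean."""
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []  # perfectly uniform activity: nothing to flag
    return [i for i, c in enumerate(counts)
            if abs(c - mean) / stdev > z_threshold]

# Hourly outbound-connection counts for one host; hour 5 spikes suspiciously.
hourly = [12, 9, 11, 10, 13, 240, 12, 10]
print(anomalies(hourly))
```

A real deployment would feed such a detector from logs (process spawns, logins, network flows) rather than a hand-written list, but the flagging logic is the same idea.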
Key-Words / Index Term
Backdoor, Malware, Hackers, Implementation, Cyber Attacks
References
[1] Kaung Myat Thu, "Types of Cyber Attacks and Incident Responses," presented at the 37th Semi-Annual Dr. Janet Liou-Mark Honors & Undergraduate Research Poster Presentation, December 1, 2022.
[2] Orson Mengara, Anderson R. Avila, and Tiago H. Falk, "Backdoor Attacks to Deep Neural Networks: A Survey of the Literature, Challenges, and Future Research Directions," IEEE Access, Vol.12, pp.29004–29023, 2024.
[3] Aniruddha Saha, Akshayvarun Subramanya, and Hamed Pirsiavash, "Hidden Trigger Backdoor Attacks," AAAI Conference on Artificial Intelligence, pp.11957–11965, 2020.
[4] Baoyuan Wu, Hongrui Chen, Mingda Zhang, Zihao Zhu, Shaokui Wei, Danni Yuan, and Chao Shen, "BackdoorBench: A Comprehensive Benchmark of Backdoor Learning," Neural Information Processing Systems (NeurIPS), 2022.
[5] Georgios Syros, Gökberk Yar, Simona Boboila, Cristina Nita-Rotaru, and Alina Oprea, "Backdoor Attacks in Peer-to-Peer Federated Learning," ACM Transactions on Privacy and Security, Vol.28, No.1, pp.1–28, 2025.
[6] Robin Buchta, George Gkoktsis, Felix Heine, and Carsten Kleiner, "Advanced Persistent Threat Attack Detection Systems: A Review of Approaches, Challenges, and Trends," Digital Threats: Research and Practice, Vol.5, No.4, 2024.
[7] Rashid Hussain Khokhar, Windhya Rankothge, Leila Rashidi, Hesamodin Mohammadian, Brian Frei, Shawn Ellis, Iago Freitas, and Ali Ghorbani, "A Survey on Supply Chain Management: Exploring Physical and Cyber Security Challenges, Threats, Critical Applications, and Innovative Technologies," International Journal of Supply and Operations Management, Vol.11, No.3, pp.250–283, 2024.
[8] Mohammed Saadoon and Suhad Faisal, "Malware Detection Using Machine Learning Techniques: A Review," Basrah Journal of Sciences, Vol.42, No.2, 2024.
[9] Ghazaleh Shirvani, Saeid Ghasemshirazi, and Behzad Beigzadeh, "Federated Learning: Attacks, Defenses, Opportunities, and Challenges," arXiv preprint, March 2024.
[10] Antonio Emanuele Cinà, Kathrin Grosse, Sebastiano Vascon, Ambra Demontis, Battista Biggio, Fabio Roli, and Marcello Pelillo, "Backdoor Learning Curves: Explaining Backdoor Poisoning Beyond Influence Functions," International Journal of Machine Learning and Cybernetics, 2024.
[11] M. D’Onghia, F. Di Cesare, L. Gallo, M. Carminati, M. Polino, and S. Zanero, "Lookin' Out My Backdoor! Investigating Backdooring Attacks Against DL-driven Malware Detectors," ACM Workshop on Artificial Intelligence and Security (AISec), pp.209–220, 2023.
[12] Xiaobo Yu, Weizhi Meng, Yining Liu, and Fei Zhou, "TridentShell: An Enhanced Covert and Scalable Backdoor Injection Attack on Web Applications," Journal of Network and Computer Applications, Vol.223, 2024.
[13] Congcong Chen, Lifei Wei, Lei Zhang, Yuxiang Peng, and Jianting Ning, "DeepGuard: Backdoor Attack Detection and Identification Schemes in Privacy-Preserving Deep Neural Networks," Security and Communication Networks, Vol.2022, 2022.
[14] Shuai Zhao, Meihuizi Jia, Zhongliang Guo, Leilei Gan, Xiaoyu Xu, Xiaobao Wu, Jie Fu, Yichao Feng, Fengjun Pan, and Luu Anh Tuan, "A Survey of Recent Backdoor Attacks and Defenses in Large Language Models," arXiv preprint, June 2024.
[15] Quentin Le Roux, El Mahdi Bourbao, Yannick Teglia, and Karim Kallas, "A Comprehensive Survey on Backdoor Attacks and Their Defenses in Face Recognition Systems," IEEE Access, Vol.12, pp.47433–47468, 2024.
[16] Ryan Williams, Carla P. Gomes, and Bart Selman, "Backdoors to Typical Case Complexity," International Joint Conference on Artificial Intelligence (IJCAI), pp.1173–1178, 2003.
[17] Claude Crépeau and Alain Slakmon, "Simple Backdoors for RSA Key Generation," Topics in Cryptology — CT-RSA 2003, Lecture Notes in Computer Science, Vol. 2612, pp.403–416, 2003.
[18] Zhou Yang, Bowen Xu, Jie M. Zhang, Hong Jin Kang, Jieke Shi, Junda He, and David Lo, "Stealthy Backdoor Attack for Code Models," arXiv preprint, January 2023.
[19] Johannes Klaus Fichte, Arne Meier, and Irena Schindler, "Strong Backdoors for Default Logic," ACM Transactions on Computational Logic, Vol.25, No.3, 2024.
[20] Jimmy K. W. Wong, Ki Ki Chung, Yuen Wing Lo, Chun Yin Lai, and Steve W. Y. Mung, "Practical Implementation of Federated Learning for Detecting Backdoor Attacks in a Next-word Prediction Model," Scientific Reports, Vol.15, No.1, pp.2328, 2025.
[21] Xiaoyu Yi, Gaolei Li, Wenkai Huang, Xi Lin, Jianhua Li, and Yuchen Liu, "LateBA: Latent Backdoor Attack on Deep Bug Search via Infrequent Execution Codes," Asia-Pacific Symposium on Internetware, pp.427–436, 2024.
[22] Wenkai Yang, Yunzhuo Hao, and Yankai Lin, "Exploring Backdoor Vulnerabilities of Chat Models," International Conference on Computational Linguistics (COLING 2025), pp.933–946, 2025.
Citation
Arushi Gupta, Safdar Tanweer, Syed Sibtain Khalid, Naseem Rao, "The Dual Edge of Backdoors: Accuracy Analysis and Preventive Strategies for Secure Systems," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.24-32, 2025.
A Bridging Platform for Students and their Alumni using a Social Media Platform
Research Paper | Journal Paper
Vol.13, Issue.3, pp.33-40, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.3340
Abstract
Alumni Interconnect is a dynamic social networking platform designed to connect students, alumni, and faculty, fostering mentorship, career guidance, and professional networking. It bridges the gap between students and successful alumni, enabling knowledge-sharing and industry insights. The platform offers an intuitive user experience with features like profile management, an interactive feed, messaging, and video meetings for seamless engagement. A robust authentication system ensures security by preventing fake accounts, while admins regulate content to maintain professionalism. By integrating these features, Alumni Interconnect enhances alumni-student engagement, promoting career growth, mentorship, and a strong institutional community. Its focus on authenticity, security, and interactive learning makes it a premier hub for academic and professional development.
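The fake-account prevention the abstract attributes to the authentication system can be sketched as an institutional-email check at signup. The domain names and regular expression below are illustrative assumptions, not the platform's actual implementation.

```python
import re

# Sketch: accept registrations only from approved institutional domains,
# one simple way to keep fake accounts off an alumni network.
# The domains listed are invented examples.

ALLOWED_DOMAINS = {"students.example.edu", "alumni.example.edu"}

def is_institutional(email):
    """True only for a well-formed address on an approved institutional domain."""
    m = re.fullmatch(r"[A-Za-z0-9._%+-]+@([A-Za-z0-9.-]+)", email)
    return bool(m) and m.group(1).lower() in ALLOWED_DOMAINS

print(is_institutional("jane@alumni.example.edu"))  # True
print(is_institutional("jane@gmail.com"))           # False
```

In practice a platform would pair such a check with email verification links and admin review, as the abstract's mention of admin content regulation suggests.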
Key-Words / Index Term
User authentication, Content moderation, Profile management, Real-time messaging, Interactive feed, Data Security and Privacy, Mentorship and Career Guidance, Video Calling Integration.
References
[1] M. Bhuvana, J. Hemalatha, S. Abinaya, R. Hemavarshini, “Reconnectify: An Alumni Association Platform,” International Journal of All Research Education and Scientific Methods (IJARESM), ISSN: 2455-6211, Vol.12, Issue.11, pp.1917-1923, 2024.
[2] P. P. Sawai, P. V. Chambhare, A. N. Jaysingpure, A. G. Karhe, D. Rathod, V. S. Gulhane, “Alumni Connect Hub: A Comprehensive Alumni Management System,” International Journal of Ingenious Research, Invention and Development, Vol.3, pp.56-64, 2024.
[3] M. Arora, A. Negi, Md. S. Khan, “Student-Alumni Network Web App,” International Journal of Advance Research, Ideas and Innovations in Technology, Vol.8, Issue.1, pp.3171-3181, 2022.
[4] S. Kathane, E. Dandge, V. Sahu, K. Shapane, J. Zaidi, R. Mathane, “AlumniConnect: Building Bridges Between Alumni and Students,” International Journal of Research Publication and Reviews, Vol.5, Issue.4, pp.3639-3643, 2024.
[5] N. Barudwale, C. Pandey, A. Wagh, G. Bhasme, M. Khade, “Survey on Alumni Connect Forum,” International Research Journal of Modernization in Engineering Technology and Science, Vol.5, Issue.5, pp.3201-3205, 2023.
[6] R. Singh, H. Shukla, P. Flodya, S. Singh, R. Rastogi, “An Efficient System for Interconnecting Alumni after Their Studies: A Multipurpose App for Public Assimilation,” 5th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N), pp.420-425, 2023.
[7] Akash J, Harshavardhan J, Janani B., “Campus Connect to Facilitate Student-Alumni Engagement,” International Journal of Progressive Research in Science and Engineering, Vol.4, Issue.21, pp.150-156, 2023.
[8] R. Malhotra, M. Massoudi, R. Jindal, “An Alumni-Based Collaborative Model to Strengthen Academia and Industry Partnership: The Current Challenges and Strengths,” Education and Information Technologies, Vol.28, Issue.15, pp.2263-2289, 2022.
[9] A. Patil, P. Kamble, S. Patil, V. Kesarkar, M. A. Pardesi, “Alumni Management System,” International Research Journal of Modernization in Engineering Technology and Science, Vol.3, Issue.5, pp.4540-4543, 2025.
[10] P. Veluvali, J. Surisetti, “Alumni Engagement in Higher Education Institutions: Perspectives from India,” in Higher Education - Reflections From the Field, pp.1-8, 2023.
[11] P. Deshwal, P. Ghuli, “DevOps: Concept, Technology and Tools,” International Journal of Computer Sciences and Engineering, Vol.8, Issue.6, pp.73-78, 2020.
[12] M. Patil, P. Mote, S. Nanaware, V. Todkar, R. Phuge, “Developing a Comprehensive Website for ‘Gore English School’: An Analytical Study,” International Journal of Computer Sciences and Engineering, Vol.13, Issue.1, pp.33-40, 2025.
Citation
Leena Patil, Aayush Vaibhaw, Shaurya Tripathi, Ananya Ambade, Mansi Sonekar, Vaishali Rajak, "A Bridging Platform for Students and their Alumni using a Social Media Platform," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.33-40, 2025.
Enhancing Chronic Diseases Prediction through Machine Learning and Data Pre-Processing Strategies
Research Paper | Journal Paper
Vol.13, Issue.3, pp.41-48, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.4148
Abstract
Leveraging machine learning for the early detection and prevention of chronic diseases, including diabetes, stroke, cancer, cardiovascular conditions, kidney failure, and hypertension, holds significant promise as emphasized by the WHO. This review systematically examines the application of machine learning techniques to predict these conditions using medical records and general health checkup data, with a focus on enhancing prediction accuracy through meticulous error minimization. Critical to this endeavor is the quality of input data, where challenges such as outlier detection, missing value imputation, feature selection, data normalization, and class imbalance pose substantial obstacles to model performance. Effective data preprocessing is thus paramount, ensuring high-quality inputs that facilitate robust model selection. Techniques explored encompass supervised learning, ensemble learning, deep learning, and reinforcement learning. Performance evaluation utilizes metrics like accuracy, recall, precision, and F1-score to gauge model efficacy. Furthermore, this study identifies open research challenges and proposes future directions to improve prediction performance via advanced preprocessing and machine learning methodologies, aiming to optimize data-driven approaches for improved healthcare outcomes.
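Two of the steps the review surveys, data normalization and the precision/recall/F1 metrics used for model evaluation, can be written out in plain Python. The feature values and confusion-matrix counts below are illustrative, not from any dataset in the review.

```python
# Sketch: min-max normalization of one feature and the standard
# classification metrics, with invented example values.

def min_max(values):
    """Scale a feature to [0, 1]: (x - min) / (max - min)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def prf1(tp, fp, fn):
    """Precision, recall, and F1-score from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# e.g. a blood-glucose-like feature scaled before model training
print(min_max([50, 100, 150, 200]))  # -> [0.0, 0.333..., 0.666..., 1.0]

# e.g. a screening model that found 80 true cases, 20 false alarms, 40 misses
p, r, f1 = prf1(tp=80, fp=20, fn=40)
print(f"precision={p:.2f} recall={r:.3f} f1={f1:.3f}")
```

The review's point about class imbalance shows up directly in these numbers: accuracy alone can look high on an imbalanced dataset, which is why recall and F1 are reported alongside it.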
Key-Words / Index Term
Chronic Disease Prediction, Machine Learning, Data Preprocessing, Feature Selection, Model Evaluation, Healthcare Analytics, Medical Data, Supervised Learning, Deep Learning, Ensemble Learning, Outlier Detection, Missing Value Imputation, Data Normalization, Class Imbalance, Reinforcement Learning.
References
[1] R. Ghorbani and R. Ghousi, ‘‘Predictive data mining approaches in medical diagnosis: A review of some diseases prediction,’’ Int. J. Data Netw. Sci., Vol.3, No.2, pp.47–70, 2019.
[2] F. Gorunescu, Data Mining: Concepts, Models and Techniques. India: Springer, 2011.
[3] H. C. Koh and G. Tan, ‘‘Data mining applications in healthcare,’’ J. Healthc. Inf. Manag., Vol.19, No.2, pp.65, 2011.
[4] I. Kavakiotis, O. Tsave, A. Salifoglou, N. Maglaveras, I. Vlahavas, and I. Chouvarda, ‘‘Machine learning and data mining methods in diabetes research,’’ Comput. Struct. Biotechnol. J., Vol.15, pp.104–116, 2017.
[5] B. S. Ahamed, M. S. Arya, and A. O. V. Nancy, ‘‘Diabetes mellitus disease prediction using machine learning classifiers with oversampling and feature augmentation,’’ Adv. Hum.-Comput. Interact., Vol.2022, pp.1–14, 2022.
[6] P. Theerthagiri, A. U. Ruby, and J. Vidya, ‘‘Diagnosis and classification of diabetes using machine learning algorithms,’’ Social Netw. Comput. Sci., Vol.4, No.1, pp.72, 2022.
[7] R. R. Kadhim and M.Y. Kamil, ‘‘Comparison of machine learning models for breast cancer diagnosis,’’ IAES Int. J. Artif. Intell. (IJ-AI), Vol.12, No.1, pp.415, 2023.
[8] G. Kumawat, S. K. Vishwakarma, P. Chakrabarti, P. Chittora, T. Chakrabarti, and J. C.-W. Lin, ‘‘Prognosis of cervical cancer disease by applying machine learning techniques,’’ J. Circuits, Syst. Comput., Vol.32, No.1, 2023.
[9] R. Huang, J. Liu, T. K. Wan, D. Siriwanna, Y. M. P. Woo, A. Vodenicarevic, C. W. Wong, and K. H. K. Chan, ‘‘Stroke mortality prediction based on ensemble learning and the combination of structured and textual data,’’ Comput. Biol. Med., Vol.155, 2023.
[10] P. B. Dash, ‘‘Efficient ensemble learning based CatBoost approach for early-stage stroke risk prediction,’’ in Ambient Intelligence in Health Care: Proceedings of ICAIHC 2022. Singapore: Springer, pp.475–483, 2022.
[11] W. Chang, Y. Liu, Y. Xiao, X. Yuan, X. Xu, S. Zhang, and S. Zhou, ‘‘A machine-learning-based prediction method for hypertension outcomes based on medical data,’’ Diagnostics, Vol.9, No.4, pp.178, 2019.
[12] M. A. J. Tengnah, R. Sooklall, and S. D. Nagowah, ‘‘A predictive model for hypertension diagnosis using machine learning techniques,’’ in Telemedicine Technologies. Mauritius: Academic, pp.139–152, 2019.
[13] S. Revathy, ‘‘Chronic kidney disease prediction using machine learning models,’’ Int. J. Eng. Adv. Technol., Vol.9, No.1, pp.6364–6367, 2019.
[14] K. R. A. Padmanaban and G. Parthiban, ‘‘Applying machine learning techniques for predicting the risk of chronic kidney disease,’’ Indian J. Sci. Technol., Vol.9, No.29, pp.1–6, 2016.
[15] I. V. Stepanyan, ‘‘Comparative analysis of machine learning methods for prediction of heart disease,’’ J. Mach. Manuf. Reliab., Vol.51, No.8, pp.789–799, 2022.
[16] P. S. Patil, ‘‘Heart disease prediction using machine learning techniques,’’ in Proc. Int. Conf. Commun. Signal Process. Control (ICCSPC), pp.1–6, 2022.
[17] M. A. Almustafa, M. A. Alrahim, and H. A. Aljamaan, ‘‘An efficient missing value imputation using fuzzy c-means clustering for diabetes disease prediction,’’ J. Healthc. Eng., Vol.2022, pp.1–11, 2022.
[18] S. Muthulakshmi and M. S. Parveen, ‘‘Heart disease prediction using machine learning techniques,’’ in Proc. 3rd Int. Conf. Intell. Commun. Technol. Virtual Mobile Netw. (ICICV), pp.1024–1028, 2021.
[19] M. A. Almustafa, M. A. Alrahim, and H. A. Aljamaan, ‘‘Handling class imbalance problem for predicting chronic kidney disease using machine learning,’’ J. Healthc. Eng., Vol.2022, pp.1–10, 2022.
[20] N. G. Ramadhan and A. N. Romadhony, ‘‘Imbalanced data handling in diabetes mellitus prediction using random forest algorithm,’’ in Proc. Int. Conf. Inf. Technol. Syst. Innov. (ICITSI), pp.1–6, 2021.
Citation
D.J. Samatha Naidu, A. Venkatesh, "Enhancing Chronic Diseases Prediction through Machine Learning and Data Pre-Processing Strategies," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.41-48, 2025.
Stress Management Strategies: An In-Depth Study of IT Professionals in Chennai
Research Paper | Journal Paper
Vol.13, Issue.3, pp.49-55, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.4955
Abstract
In the swiftly advancing information technology (IT) industry, stress management is essential for maintaining productivity, mental health, and overall job satisfaction. This study examines the stress levels of IT professionals in Chennai, a prominent technological hub, to understand the issues they encounter in this high-pressure industry. The study utilizes primary data obtained from structured surveys and secondary research from existing literature to identify significant stressors, including excessive workloads, prolonged working hours, and inadequate work-life balance. It investigates the efficacy of diverse stress management strategies, such as mindfulness practices, consistent physical activity, meditation, and yoga, in mitigating stress. The analysis also examines the impact of organizational support, including flexible work schedules and counseling services, on mitigating workplace stress. The results underscore the imperative of adopting customized stress management measures in corporate settings to cultivate a healthier and more efficient workforce.
Key-Words / Index Term
IT Professionals, Stress Management, Workplace Stress, Employee Productivity, Mental Health, Stress Reduction Techniques, Mindfulness Practices, Work-life Balance, Exercise and Relaxation, Employee Welfare, Organizational Support, Counseling Services, Chennai IT Sector, Job Performance Enhancement, Employee Well-being.
References
[1] P. Deepa, “A Study on Stress Management of Employees in Workplace,” Jay Ushin Limited, Vol.4, Issue.8, 2015.
[2] J. M. Maheshkumar and M. Soundarapandian, “Stress Level and Stress Management Ability Among the Professionals of Information Technology in Chennai,” International Journal of Science and Research Archive, Vol.12, Issue.1, pp.795-799, 2024.
[3] S. Sujatha et al., “Stress and Stress Management,” Indian Journal of Natural Sciences, Vol.13, Issue.73, 2022.
[4] Dr. A. Shameem and S. Arun Kumar, “A Study on Stress Management Among IT Professionals in Chennai,” International Journal for Innovative Research in Multidisciplinary Field, Vol.3, Issue.4, pp.248-254, 2017.
[5] Maha Lakshmi, “Stress and Stress Management: A Review,” Indian Journal of Natural Sciences, Vol.13, Issue.73, 2022.
[6] R. S. Lazarus and S. Folkman, Stress, Appraisal, and Coping, Springer Publishing Company, 1984.
[7] B. S. McEwen, “Physiology and Neurobiology of Stress and Adaptation: Central Role of the Brain,” Physiological Reviews, Vol.87, No.3, pp.873–904, 2007.
[8] M. Padhy, K. Chelli, and R. A. Padiri, “Occupational Stress and Job Satisfaction Among IT Professionals,” Journal of Psychology, Vol.6, No.1, pp.53-60, 2015.
[9] N. Sharma and G. Kaur, “Stress Management in the Workplace: An Assessment of the Role of Employee Wellness Programs,” Journal of Occupational Health, Vol.62, No.1, pp.101-110, 2020.
[10] A. K. Srivastava and S. Sinha, “Analyzing Stress Management Techniques in Information Technology Professionals: A Comparative Study,” Indian Journal of Health & Well-being, Vol.12, No.4, pp.360-365, 2021.
[11] L. Kravitz, “The Role of Exercise in Stress Management,” IDEA Fitness Journal, Vol.4, No.6, pp.74-82, 2007.
[12] J. Kabat-Zinn, “Mindfulness-based Interventions in Context: Past, Present, and Future,” Clinical Psychology: Science and Practice, Vol.10, No.2, pp.144-156, 2003.
[13] D. S. Black and G. M. Slavich, “Mindfulness Meditation and the Immune System: A Systematic Review of Randomized Controlled Trials,” Annals of the New York Academy of Sciences, Vol.1373, No.1, pp.13-24, 2016.
[14] V. Khera and A. Malik, “Predictive Modelling of Stress in IT Professionals Using Machine Learning Techniques,” International Journal of Computer Applications, Vol.176, No.30, pp.1-5, 2020.
[15] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd ed., Springer, 2009.
[16] C. M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
Citation
Kundan Mahadev Ayare, Sarthak Agarwal, Deepajothi S., "Stress Management Strategies: An In-Depth Study of IT Professionals in Chennai," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.49-55, 2025.
Machine Learning for Linux Kernel Optimization: Current Trends and Future Directions
Review Paper | Journal Paper
Vol.13 , Issue.3 , pp.56-64, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.5664
Abstract
The integration of Machine Learning into Linux Kernel optimization has revolutionized system performance by enabling dynamic resource allocation, adaptive scheduling, and intelligent power management. This paper explores current trends and future directions in machine learning-driven kernel optimization, highlighting key applications such as reinforcement learning for CPU scheduling, predictive memory management, and ML-based congestion control in networking. We analyse the advantages of ML over traditional rule-based methods, demonstrating how data-driven optimization enhances efficiency and responsiveness. However, challenges such as interpretability, real-time constraints, and computational overhead pose significant barriers to widespread adoption. To address these, we discuss emerging solutions, including Explainable AI (XAI), federated learning for privacy-preserving model training, and AutoML for automated performance tuning. This study provides a comprehensive review of machine learning’s role in optimizing the Linux Kernel and outlines future research directions to maximize its potential in next-generation operating systems.
Key-Words / Index Term
Linux Kernel Optimization, Machine Learning in Operating Systems, Reinforcement Learning for CPU Scheduling, Memory Management using ML, Predictive Congestion Control, Explainable AI (XAI) in Kernel Optimization.
References
[1] H. Fingler, I. Tarte, H. Yu, A. Szekely, B. Hu, A. Akella, and C. J. Rossbach, "Towards a Machine Learning-Assisted Kernel with LAKE," in Proc. 28th ACM Int. Conf. Architectural Support for Programming Languages and Operating Systems, pp.846-861, 2023.
[2] H. Malallah, S. R. Zeebaree, R. R. Zebari, M. A. Sadeeq, Z. S. Ageed, I. M. Ibrahim, H. M. Yasin, and K. J. Merceedi, "A comprehensive study of kernel (issues and concepts) in different operating systems," Asian Journal of Research in Computer Science, Vol.8, No.3, pp.16-31, 2021.
[3] S. Krishnapriya and Y. Karuna, "A survey of deep learning for MRI brain tumor segmentation methods: Trends, challenges, and future directions," Health and Technology, Vol.13, No.2, pp.181-201, 2023.
[4] H. Lee, S. Jung, and H. Jo, "STUN: reinforcement-learning-based optimization of kernel scheduler parameters for static workload performance," Applied Sciences, Vol.12, No.14, pp.7072, 2022.
[5] H. Martin, M. Acher, J. A. Pereira, L. Lesoil, J.-M. Jézéquel, and D. E. Khelladi, "Transfer learning across variants and versions: The case of linux kernel size," IEEE Trans. Software Eng., Vol.48, No.11, pp.4274-4290, 2021.
[6] A. Hayat, "A Load-Balanced Task Scheduler for Heterogeneous Systems based on Machine Learning," M.S. thesis, Capital University, 2021.
[7] D. Singh, V. Bhalla, and N. Garg, "Load balancing algorithms with the application of machine learning: a review," MR Int. J. Eng. Technol., Vol.10, No.1, 2023.
[8] T. A. Rahmani, F. Daham, G. Belalem, and S. A. Mahmoudi, "HBalancer: A machine learning based load balancer in real time CPU-GPU heterogeneous systems," in Proc. 2022 Int. Conf. Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), IEEE, pp.674-679, 2022.
[9] Y. Qiu, H. Liu, T. Anderson, Y. Lin, and A. Chen, "Toward reconfigurable kernel datapaths with learned optimizations," in Proc. Workshop on Hot Topics in Operating Systems, pp.175-182, 2021.
[10] R. Mosaner, D. Leopoldseder, W. Kisling, L. Stadler, and H. Mössenböck, "Machine-Learning-Based Self-Optimizing Compiler Heuristics," in Proc. 19th Int. Conf. Managed Programming Languages and Runtimes, pp.98-111, 2022.
[11] Y. Kojima, R. Kazama, H. Abe, and C. Lee, "RNN-based Congestion Control in the Linux Kernel," in Proc. 2024 Twelfth Int. Symp. Computing and Networking Workshops (CANDARW), IEEE, pp.130-136, 2024.
[12] H. Qiu, W. Mao, C. Wang, H. Franke, Z. T. Kalbarczyk, T. Basar, and R. K. Iyer, "On the promise and challenges of foundation models for learning-based cloud systems management," in Workshop on Machine Learning for Systems at NeurIPS, Dec. 2023.
[13] S. Bian, C. Li, Y. Fu, Y. Ren, T. Wu, G. P. Li, and B. Li, "Machine learning-based real-time monitoring system for smart connected worker to improve energy efficiency," J. Manuf. Syst., Vol.61, pp.66-76, 2021.
[14] I. U. Akgun, A. S. Aydin, A. Shaikh, L. Velikov, and E. Zadok, "A machine learning framework to improve storage system performance," in Proc. 13th ACM Workshop on Hot Topics in Storage and File Systems, pp.94-102, 2021.
[15] V. K. Rayi, S. P. Mishra, J. Naik, and P. K. Dash, "Adaptive VMD based optimized deep learning mixed kernel ELM autoencoder for single and multistep wind power forecasting," Energy, Vol.244, pp.122585, 2022.
[16] V. Shankar, M. M. Deshpande, N. Chaitra, and S. Aditi, "Automatic detection of acute lymphoblastic leukemia using image processing," in Proc. 2016 IEEE Int. Conf. Advances in Computer Applications (ICACA), Coimbatore, India, pp.186-189, 2016. doi: 10.1109/ICACA.2016.7887948
[17] B. Herzog, F. Hügel, S. Reif, T. Hönig, and W. Schröder-Preikschat, "Automated selection of energy-efficient operating system configurations," in Proc. 12th ACM Int. Conf. Future Energy Systems, pp.309-315, 2021.
[18] V. Shankar, "Edge AI: A Comprehensive Survey of Technologies, Applications, and Challenges," in Proc. 2024 1st Int. Conf. Advanced Computing and Emerging Technologies (ACET), Ghaziabad, India, pp.1-6, 2024. doi: 10.1109/ACET61898.2024.10730112.
[19] J. Chen, S. S. Banerjee, Z. T. Kalbarczyk, and R. K. Iyer, "Machine Learning for Load Balancing in the Linux Kernel," in Proc. 11th ACM SIGOPS Asia-Pacific Workshop on Systems (APSys '20), pp.67-74, 2020. doi: 10.1145/3409963.3410492.
[20] C. Wang and J. Mou, "Linux Kernel Autotuning," in Proc. Linux Plumbers Conf., 2023.
[21] H. Dong, J. Appavoo, and S. Arora, "Tuning Linux Kernel Policies for Energy Efficiency with Machine Learning," Red Hat Research, 2023.
Citation
Vasuki Shankar, "Machine Learning for Linux Kernel Optimization: Current Trends and Future Directions," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.56-64, 2025.
Transforming Healthcare with Generative Artificial Intelligence: A Comprehensive Review
Review Paper | Journal Paper
Vol.13 , Issue.3 , pp.65-69, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.6569
Abstract
In today's world, the role of Artificial Intelligence (AI) has become very dominant across all sectors, and it is disrupting applications in many fields. The medical domain is among the most important of these: AI is being used to improve efficiency, predict diseases, and improve patient outcomes. One of the most significant recent developments is the rise of Generative Artificial Intelligence (GenAI). This cutting-edge technology uses sophisticated models such as transformers and diffusion models to revolutionize various systems in the healthcare sector. This article reviews the transformative impact of generative AI applications on healthcare, highlighting their immense potential, identifying existing challenges, and outlining future research directions.
Key-Words / Index Term
Artificial Intelligence, Healthcare Industry, Generative AI, GenAI, Prediction.
References
[1] Razzak, M. I., Naz, S., & Zaib, A., “Deep learning for medical image processing: Overview, challenges and the future”, In Lecture Notes in Computational Vision and Biomechanics, pp.323–350, 2017.
[2] Chiu, T. K. F., “The impact of Generative AI (GenAI) on practices, policies and research direction in education: a case of ChatGPT and Midjourney”, Interactive Learning Environments, pp.1–17., 2023.
[3] Kanbach, D. K., Heiduk, L., Blueher, G., Schreiter, M., & Lahmann, A., “The GenAI is out of the bottle: generative artificial intelligence from a business model innovation perspective”, Review of Managerial Science, Vol.18, Issue.4, pp.1189–1220, 2023.
[4] Kirova, V. D., Ku, C. S., Laracy, J. R., & Marlowe, T. J., “The Ethics of Artificial intelligence in the era of Generative AI”, Journal of Systemics, Cybernetics, and Informatics/Journal of Systemics Cybernetics and Informatics, Vol.21, Issue.4, pp.42–50, 2023.
[5] Xu, R., & Wang, Z., “Generative artificial intelligence in healthcare from the perspective of digital media: Applications, opportunities and challenges”, Heliyon, Vol.10, Issue.12, pp.e32364, 2024.
[6] Ferrara, E., “Fairness and Bias in Artificial Intelligence: A brief survey of sources, impacts, and mitigation strategies”, Sci, Vol.6, Issue.1, pp.3, 2023.
[7] Kumar, D., Dhalwala, R., & Chaudhary, A., “Exploring the ethical implications of generative AI in healthcare”, In Advances in Computational Intelligence and Robotics Book Series, pp.180–195, 2024.
[8] Small, W. R., Wiesenfeld, B., Bradfield-Harvey, B., Jonassen, Z., Mandal, S., Stevens, E. R., Major, V., “Large Language Model–Based responses to patients’ In-Basket messages”, JAMA Network Open, Vol.7, Issue.7, pp.e2422399, 2024.
[9] Agarwal, P., “Med Bot?: A GenAI based Chatbot for Healthcare”, International Journal of Scientific Research in Engineering and Management, Vol.8, Issue.6, pp.1–5, 2024.
[10] Seiferth, C., Vogel, L., Aas, B., Brandhorst, et al., “How to e-mental health: a guideline for researchers and practitioners using digital technology in the context of mental health”, Nature Mental Health, Vol.1, Issue.8, pp.542–554, 2023.
[11] David Ademola Oyemade, Diseimokumor Favour Seregbe, "A Machine Learning Model for the Classification of Human Emotions," International Journal of Computer Sciences and Engineering, Vol.12, Issue.4, pp.17-23, 2024.
[12] M.T. Stow, "Hybrid Deep Learning Approach for Predictive Maintenance of Industrial Machinery using Convolutional LSTM Networks," International Journal of Computer Sciences and Engineering, Vol.12, Issue.4, pp.1-11, 2024.
[13] Reddy, S., “Generative AI in healthcare: an implementation science informed translational path on application, integration and governance”, Implementation Science, Vol.19, Issue.1, pp.27, 2024.
[14] Moulaei, K., Yadegari, A., Baharestani, M., Farzanbakhsh, S., Sabet, B., & Reza Afrash, M., “Generative artificial intelligence in healthcare: A scoping review on benefits, challenges and applications”, International Journal of Medical Informatics, 2024.
[15] Chen, Y., & Esmaeilzadeh, P., “Generative AI in Medical Practice: In-Depth Exploration of Privacy and Security Challenges”, Journal of Medical Internet Research, Vol.26, pp.e53008, 2024.
[16] Yim, D., Khuntia, J., Parameswaran, V., & Meyers, A., “Preliminary Evidence of the Use of Generative AI in Health Care Clinical Services: Systematic Narrative Review”, JMIR Medical Informatics, Vol.12, pp.e52073, 2024.
[17] Zhang, P., & Kamel Boulos, M. N., “Generative AI in Medicine and Healthcare: Promises, Opportunities and Challenges”, Future Internet, Vol.15, pp.286, 2023.
[18] Templin, T., Perez, M. W., Sylvia, S., Leek, J., & Sinnott-Armstrong, N., “Addressing 6 challenges in generative AI for digital health: A scoping review”, PLOS Digital Health, Vol.3, Issue.5, pp.e0000503, 2024.
[19] Bhuyan, S. S., Sateesh, V., Mukul, N., Galvankar, A., Mahmood, A., Nauman, M., Rai, A., Bordoloi, K., Basu, U., & Samuel, J., “Generative Artificial Intelligence Use in Healthcare: Opportunities for Clinical Excellence and Administrative Efficiency”, Journal of Medical Systems, Vol.49, Issue.1, pp.10, 2025.
Citation
G. Arutjothi, K. Geetha, J. Nagapriya, "Transforming Healthcare with Generative Artificial Intelligence: A Comprehensive Review," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.65-69, 2025.
Advancements in AI-Based Compiler Optimization Techniques for Machine Learning Workloads
Review Paper | Journal Paper
Vol.13 , Issue.3 , pp.70-77, Mar-2025
CrossRef-DOI: https://doi.org/10.26438/ijcse/v13i3.7077
Abstract
This paper primarily explores the application of AI-driven compiler optimization techniques for machine learning (ML) workloads, with a focus on reinforcement learning (RL) and neural architecture search (NAS). It compares the performance of traditional compilers with AI-optimized compilers leveraging various ML models, including CNNs, RNNs, FNNs, and transformers. The results indicate that AI-driven compilers, particularly those using a hybrid RL + NAS approach, outperform traditional compilers in energy consumption, memory usage, execution time, and hardware utilization. Additionally, the findings suggest that AI-based optimization techniques can streamline ML pipeline development, enhancing efficiency and performance for both resource-constrained environments and large-scale applications.
Key-Words / Index Term
AI-based compilers, reinforcement learning, neural architecture search, machine learning, compiler optimization.
References
[1] M. Sponner, B. Waschneck, and A. Kumar, “AI-driven performance modeling for AI inference workloads,” Electronics, Vol.11, No.15, pp.2316, 2022. DOI: 10.3390/electronics11152316.
[2] M. K. Sheikh, “A Machine Learning Based Compiler Optimization Technique,” Sukkur IBA Journal of Emerging Technologies, Vol.7, No.1, pp.37-47, 2024.
[3] M. Trofin, Y. Qian, E. Brevdo, Z. Lin, K. Choromanski, and D. Li, "MLGO: A Machine Learning Guided Compiler Optimizations Framework," arXiv preprint, arXiv:2101.04808, 2021.
[4] C. Metz, “Towards Sustainable Artificial Intelligence Systems: Enhanced System Design with Machine Learning-Based Design Techniques,” Ph.D. dissertation, Universität Bremen, Germany, 2024.
[5] A. N. Mazumder, J. Meng, H. A. Rashid, U. Kallakuri, X. Zhang, J. S. Seo, and T. Mohsenin, “A survey on the optimization of neural network accelerators for micro-ai on-device inference,” IEEE Journal on Emerging and Selected Topics in Circuits and Systems, Vol.11, No.4, pp.532-547, 2021. DOI: 10.1109/JETCAS.2021.3120032.
[6] F. Ponzina, “Hardware-Software Co-Design Methodologies for Edge AI Optimization,” Ph.D. dissertation, EPFL, Switzerland, 2023.
[7] P. Gonzalez-Guerrero, A. Butko, G. Michelogianniakis, and J. Shalf, “AI-Enabled Analysis and Control for Enhancing Data Transition and Movement,” in Position Papers for the ASCR Workshop on Reimagining Codesign, Mar. 2021.
[8] H. Bouzidi, “Efficient Deployment of Deep Neural Networks on Hardware Devices for Edge AI,” Ph.D. dissertation, Université Polytechnique Hauts-de-France, France, 2024.
[9] I. Hidalgo, F. Fernández-de Vega, J. Ceberio, O. Garnica, J. M. Velasco, J. C. Cortés, R. Villanueva, and J. Díaz, “Sustainable Artificial Intelligence Systems: An Energy Efficiency Approach,” [Preprint - Not Accepted for Final Publication], Authorea Preprints, 2023.
[10] K.K. Balasubramanian, M. Di Salvo, W. Rocchia, S. Decherchi, and M. Crepaldi, “Designing RISC-V Instruction Set Extensions for Artificial Neural Networks: An LLVM Compiler-Driven Perspective,” IEEE Access, 2024. DOI: 10.1109/ACCESS.2024.3290706.
[11] Ashouri, A. H., Manzoor, M. A., Vu, D. M., Zhang, R., Wang, Z., Zhang, A., ... & Gao, Y., “ACPO: AI-Enabled Compiler-Driven Program Optimization,” arXiv preprint arXiv:2312.09982, 2023.
[12] J. A. H. Klein, “Exploring High-Performance and Energy-Efficient Architectures for Edge AI-Enabled Applications,” Ph.D. dissertation, EPFL, Switzerland, 2024.
[13] S. S. Gill, M. Golec, J. Hu, M. Xu, J. Du, H. Wu, G. K. Walia, S. S. Murugesan, B. Ali, M. Kumar, and K. Ye, “Edge AI: A taxonomy, systematic review and future directions,” Cluster Computing, Vol.28, No.1, pp.1-53, 2025. DOI: 10.1007/s10586-024-04057-9.
[14] E. Kakoulli, “Latest Innovations in Intelligent Network-on-Chip Architectures: A Systematic Review,” 2024 17th IEEE/ACM International Workshop on Network on Chip Architectures (NoCArc), IEEE, Nov., pp.1-6, 2024.
[15] Wang, H., Tang, Z., Zhang, C., Zhao, J., Cummins, C., Leather, H., & Wang, Z., “Automating Reinforcement Learning Architecture Design for Code Optimization,” in Proceedings of the 31st ACM SIGPLAN International Conference on Compiler Construction, Mar., pp.129-143, 2022.
[16] Mammadli, R., Jannesari, A., & Wolf, F., “Static Neural Compiler Optimization via Deep Reinforcement Learning,” in 2020 IEEE/ACM 6th Workshop on the LLVM Compiler Infrastructure in HPC (LLVM-HPC) and Workshop on Hierarchical Parallelism for Exascale Computing (HiPar), Nov., pp.1-11, 2020.
[17] D. Alsadie, “A comprehensive review of AI techniques for resource management in fog computing: Trends, challenges and future directions,” IEEE Access, 2024. DOI: 10.1109/ACCESS.2024.3284783.
[18] V. Shankar, “Edge AI: A Comprehensive Survey of Technologies, Applications, and Challenges,” 2024 1st International Conference on Advanced Computing and Emerging Technologies (ACET), IEEE, Ghaziabad, India, pp.1-6, 2024. DOI: 10.1109/ACET61898.2024.10730112.
[19] Ashouri, A. H., Manzoor, M. A., Vu, D. M., Zhang, R., Wang, Z., Zhang, A., ... & Gao, Y., “ACPO: AI-Enabled Compiler-Driven Program Optimization,” arXiv preprint arXiv:2312.09982, 2023.
[20] V. Shankar, M. M. Deshpande, N. Chaitra, and S. Aditi, “Automatic detection of acute lymphoblastic leukemia using image processing,” 2016 IEEE International Conference on Advances in Computer Applications (ICACA), IEEE, Coimbatore, India, pp.186-189, 2016. DOI: 10.1109/ICACA.2016.7887948.
[21] Zhu, S., Yu, T., Xu, T., Chen, H., Dustdar, S., Gigan, S., ... & Pan, Y., “Intelligent Computing: The Latest Advances, Challenges, and Future,” Intelligent Computing, Vol.2, pp.0006, 2023.
Citation
Vasuki Shankar, "Advancements in AI-Based Compiler Optimization Techniques for Machine Learning Workloads," International Journal of Computer Sciences and Engineering, Vol.13, Issue.3, pp.70-77, 2025.