TY - GEN
T1 - Leakagedetector 2.0
T2 - 41st IEEE International Conference on Software Maintenance and Evolution, ICSME 2025
AU - Truong, Owen
AU - Zhang, Terrence
AU - Marchareddy, Arnav
AU - Lee, Ryan
AU - Busold, Jeffery
AU - Socas, Michael
AU - Alomar, Eman Abdullah
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
AB - In software development environments, code quality is crucial. This study aims to assist Machine Learning (ML) engineers in enhancing their code by identifying and correcting Data Leakage issues within their models. Data Leakage occurs when information from the test dataset is inadvertently included in the training data while preparing a data science model, resulting in misleading performance evaluations. ML developers must carefully separate their data into training, evaluation, and test sets to avoid introducing Data Leakage into their code. In this paper, we develop a new Visual Studio Code (VS Code) extension, called leakagedetector, that detects Data Leakage (mainly Overlap, Preprocessing, and Multi-test leakage) in Jupyter Notebook files. Beyond detection, we include two correction mechanisms: a conventional approach, known as a quick fix, which manually fixes the leakage, and an LLM-driven approach that guides ML developers toward best practices for building ML pipelines. The plugin and its source code are publicly available on GitHub at https://github.com/SE4AIResearch/DataLeakage_JupyterNotebook_Fall2024. The demonstration video can be found on YouTube: https://youtu.be/7YiYVBiID_8. The website can be found at https://leakage-detector.vercel.app/.
KW - data leakage
KW - machine learning
KW - quality
UR - https://www.scopus.com/pages/publications/105022506862
U2 - 10.1109/ICSME64153.2025.00102
DO - 10.1109/ICSME64153.2025.00102
M3 - Conference contribution
AN - SCOPUS:105022506862
T3 - Proceedings - 2025 IEEE International Conference on Software Maintenance and Evolution, ICSME 2025
SP - 895
EP - 899
BT - Proceedings - 2025 IEEE International Conference on Software Maintenance and Evolution, ICSME 2025
Y2 - 7 September 2025 through 12 September 2025
ER -