Investigating potentials and pitfalls of knowledge distillation across datasets for blood glucose forecasting

Hadia Hameed, Samantha Kleinberg

Research output: Contribution to journal › Conference article › peer-review
Abstract

Individuals with Type 1 diabetes (T1D) must frequently monitor their blood glucose (BG) and deliver insulin to regulate it. New devices like continuous glucose monitors (CGMs) and insulin pumps have helped reduce this burden by enabling closed-loop technologies like the artificial pancreas (AP), which delivers insulin automatically. As more people use AP systems, which rely on a CGM and insulin pump, there has been a dramatic increase in the availability of large-scale patient-generated health data (PGHD) in T1D. This data can potentially be used to train robust, generalizable models for accurate BG forecasting, which can then make real-time forecasts on smaller datasets like OhioT1DM. In this work, we investigate the potential and pitfalls of using knowledge distillation to transfer knowledge from a model learned on one dataset to another, and compare it with the baseline case of using either dataset alone. We show that using a pre-trained model for BG forecasting on OhioT1DM from CGM data only (a univariate setting) performs comparably to training on OhioT1DM itself. Using a single-step, univariate recurrent neural network (RNN) trained on OhioT1DM data alone, we achieve an overall RMSE of 19.21 and 31.77 mg/dl for prediction horizons (PH) of 30 and 60 minutes, respectively.
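The abstract does not spell out the distillation objective; for regression tasks like BG forecasting, knowledge distillation is commonly framed as training the student against a weighted blend of the ground-truth error and the teacher's predictions. The sketch below illustrates that idea, together with the RMSE metric reported above. The function names and the `alpha` weighting are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    """Regression-style distillation: blend of ground-truth MSE and
    teacher-matching MSE. alpha=1.0 recovers ordinary supervised
    training; alpha=0.0 trains the student purely to mimic the teacher.
    (Illustrative sketch; the paper's exact objective may differ.)"""
    mse_true = np.mean((student_pred - target) ** 2)
    mse_teacher = np.mean((student_pred - teacher_pred) ** 2)
    return alpha * mse_true + (1.0 - alpha) * mse_teacher

def rmse(pred, target):
    """Root-mean-square error, in the same units as the signal (mg/dl)."""
    return float(np.sqrt(np.mean((pred - target) ** 2)))
```

For example, with `alpha=0.5` the student is penalized equally for deviating from the measured CGM values and from the pre-trained teacher's forecasts, which is one way a model trained on a large PGHD corpus could guide training on the smaller OhioT1DM dataset.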

Original language: English
Pages (from-to): 85-89
Number of pages: 5
Journal: CEUR Workshop Proceedings
Volume: 2675
State: Published - 2020
Event: 5th International Workshop on Knowledge Discovery in Healthcare Data, KDH 2020 - Virtual, Santiago de Compostela, Spain
Duration: 29 Aug 2020 - 30 Aug 2020
