TY - JOUR
T1 - Low-Latency Privacy-Preserving Outsourcing of Deep Neural Network Inference
AU - Tian, Yifan
AU - Njilla, Laurent
AU - Yuan, Jiawei
AU - Yu, Shucheng
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2021/3/1
Y1 - 2021/3/1
N2 - Efficiently supporting deep neural network (DNN) inference tasks on resource-constrained Internet-of-Things (IoT) devices has been an outstanding challenge for emerging smart systems. To mitigate the burden on IoT devices, one prevalent solution is to outsource DNN inference tasks to the public cloud. However, such 'cloud-backed' solutions can cause privacy breaches, since the outsourced data may contain sensitive information. For privacy protection, the research community has resorted to advanced cryptographic primitives that support DNN inference over encrypted data. Nevertheless, these attempts fall short of real-time performance due to the heavy computational overhead that cryptographic primitives impose on IoT devices. In this article, we propose an edge computing-assisted framework that boosts the efficiency of DNN inference tasks on IoT devices while also protecting the privacy of the outsourced IoT data. In our framework, the most time-consuming DNN layers are outsourced to edge computing devices, and the IoT device only processes compute-efficient layers and fast encryption/decryption. Thorough security analysis and numerical analysis are carried out to show the security and efficiency of the proposed framework. Our analysis results indicate a 99%+ outsourcing rate of DNN operations for IoT devices. Experiments on AlexNet show that our scheme can speed up DNN inference by 40.6× with a 96.2% energy saving for IoT devices.
AB - Efficiently supporting deep neural network (DNN) inference tasks on resource-constrained Internet-of-Things (IoT) devices has been an outstanding challenge for emerging smart systems. To mitigate the burden on IoT devices, one prevalent solution is to outsource DNN inference tasks to the public cloud. However, such 'cloud-backed' solutions can cause privacy breaches, since the outsourced data may contain sensitive information. For privacy protection, the research community has resorted to advanced cryptographic primitives that support DNN inference over encrypted data. Nevertheless, these attempts fall short of real-time performance due to the heavy computational overhead that cryptographic primitives impose on IoT devices. In this article, we propose an edge computing-assisted framework that boosts the efficiency of DNN inference tasks on IoT devices while also protecting the privacy of the outsourced IoT data. In our framework, the most time-consuming DNN layers are outsourced to edge computing devices, and the IoT device only processes compute-efficient layers and fast encryption/decryption. Thorough security analysis and numerical analysis are carried out to show the security and efficiency of the proposed framework. Our analysis results indicate a 99%+ outsourcing rate of DNN operations for IoT devices. Experiments on AlexNet show that our scheme can speed up DNN inference by 40.6× with a 96.2% energy saving for IoT devices.
KW - Deep neural network (DNN) inference
KW - Internet of Things (IoT)
KW - edge computing
KW - privacy-preserving outsourcing
UR - http://www.scopus.com/inward/record.url?scp=85101663558&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85101663558&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2020.3003468
DO - 10.1109/JIOT.2020.3003468
M3 - Article
AN - SCOPUS:85101663558
VL - 8
SP - 3300
EP - 3309
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 5
M1 - 9120239
ER -