TY - JOUR
T1 - Latency Comparison of Cloud Datacenters and Edge Servers
AU - Charyyev, Batyr
AU - Arslan, Engin
AU - Gunes, Mehmet Hadi
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020
Y1 - 2020
AB - Edge computing has recently emerged as an approach to bring computing resources closer to end-users. While offline processing and aggregate data reside in the cloud, edge computing is promoted for latency-critical and bandwidth-hungry tasks. In this direction, it is crucial to quantify the expected latency reduction when edge servers are preferred over cloud locations. In this paper, we performed an extensive measurement study to assess the latency characteristics of end-users with respect to edge servers and cloud data centers. We also evaluated the impact of the capacity limitations of edge servers on latency under various user workloads. We measured latency from 8,456 end-users to 6,341 Akamai edge servers and 69 cloud locations. The latency measurements show that while 58% of end-users can reach a nearby edge server in less than 10 ms, only 29% of end-users obtain similar latency from a nearby cloud location. Additionally, we observe that the latency distribution of end-users to edge servers follows a power-law distribution, which emphasizes the need for non-uniform server deployment and load balancing by edge providers.
KW - Cloud computing
KW - Edge computing
KW - Fog computing
KW - Latency measurement
UR - http://www.scopus.com/inward/record.url?scp=85100403149&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85100403149&partnerID=8YFLogxK
U2 - 10.1109/GLOBECOM42002.2020.9322406
DO - 10.1109/GLOBECOM42002.2020.9322406
M3 - Conference article
AN - SCOPUS:85100403149
SN - 2334-0983
JO - Proceedings - IEEE Global Communications Conference, GLOBECOM
JF - Proceedings - IEEE Global Communications Conference, GLOBECOM
M1 - 9322406
T2 - 2020 IEEE Global Communications Conference, GLOBECOM 2020
Y2 - 7 December 2020 through 11 December 2020
ER -