Understanding the latency benefits of multi-cloud webservice deployments

Zhe Wu, Harsha V. Madhyastha
Appears in: CCR April 2013

To minimize user-perceived latencies, webservices are often deployed across multiple geographically distributed data centers. The premise of our work is that webservices deployed across multiple cloud infrastructure services can serve users from more data centers than is possible when using a single cloud service, and hence offer lower latencies to users.

Public Review By: 
Katerina Argyraki

One of the benefits of hosting a webservice in the cloud is closer proximity to the end user: since a cloud typically consists of multiple datacenters in different locations, a webservice provider can serve each client from the physically closest datacenter and hence reduce user-perceived latency. This paper tells us that hosting a webservice in multiple clouds can further improve this benefit: by switching from a single-cloud to a multi-cloud deployment, 20-50% of IP prefixes would reduce their latency to the closest datacenter by more than 20%. These numbers are based on measurements from 265 PlanetLab nodes spanning multiple countries and all continents (with the exception of Africa). The authors report two reasons for this improvement: (i) cloud provider A may have a datacenter in a particular region, whereas cloud provider B does not; (ii) the routing toward cloud provider A's datacenter in a particular region may be significantly worse than the routing toward cloud provider B's datacenter in the same region. Whether these findings push webservice providers toward multi-cloud deployments or motivate ISPs to improve their routing, the reviewers agree that they are useful and, to some extent, unexpected. Moreover, the reviewers welcome more work on how to deal with latency fluctuations and how to improve quality of service in regions that are poorly served by all cloud providers.
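The kind of comparison the review describes can be illustrated with a small sketch: for a given IP prefix, the single-cloud baseline is the best latency to any datacenter of one chosen provider, while the multi-cloud latency is the best across all providers' datacenters. All provider names and latency values below are made up for illustration; this is not the paper's actual methodology or data.

```python
# Illustrative sketch of per-prefix latency improvement from a
# multi-cloud deployment. Cloud names, datacenter names, and latency
# values (in ms) are hypothetical, not the paper's measurements.
latencies = {
    "cloud_a": {"us-east": 40, "eu-west": 120},
    "cloud_b": {"us-west": 70, "ap-south": 95},
}

def improvement_over(latencies, baseline_cloud):
    """Relative latency reduction for one prefix when switching from a
    single-cloud deployment on `baseline_cloud` to using all clouds."""
    # Best latency achievable within the baseline cloud alone.
    single = min(latencies[baseline_cloud].values())
    # Best latency across every datacenter of every cloud.
    multi = min(l for dcs in latencies.values() for l in dcs.values())
    return (single - multi) / single

# A provider deployed only on cloud_b would cut this prefix's latency
# from 70 ms to 40 ms by going multi-cloud.
print(f"{improvement_over(latencies, 'cloud_b'):.0%}")  # → 43%
```

Aggregating such per-prefix improvements over many vantage points yields the kind of distribution the review cites (20-50% of prefixes improving by more than 20%).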
