Speaker:

Jorge Murillo

Abstract:

The rise of novel computing applications such as social media, generative AI, and IoT devices is fueling an increase in datacenter energy consumption, which is expected to represent between 12% and 27% of total energy production in the United States by 2028. This positions cloud and edge computing as key contributors to climate change, underscoring the need for solutions to address their environmental impact. In the last decade, these solutions have focused on lowering computing's energy consumption, either through the use of more energy-efficient hardware or by lowering the Power Usage Effectiveness (PUE) of datacenters. Unfortunately, the energy efficiency gains of hardware have started to plateau and datacenters have reached the practical limits of PUE, so new approaches to sustainability are needed.

In recent years, a new approach to decarbonization has emerged that optimizes for carbon emissions rather than energy consumption. By exploiting the differences in carbon intensity across electric grids, computing systems can make intelligent resource allocation and scheduling decisions to lower their carbon emissions and environmental impact. This thesis explores new spatial shifting techniques that reduce carbon emissions while maintaining performance. First, I propose CDN-Shifter, a spatial shifting algorithm for greening Content Delivery Networks (CDNs) that considers electricity costs and maintains latency constraints. CDN-Shifter is implemented as a multi-objective optimization technique and relies on two types of shifting: load shifting and capacity shifting. For load shifting, CDN-Shifter redistributes the incoming load among the edge datacenters constituting the CDN, exploiting the differences in carbon intensity between grids worldwide while respecting a user-specified limit on latency increase. Capacity shifting, in turn, works in tandem with load shifting and redeploys the virtual machines in a CDN to allow for greater carbon savings. I evaluated CDN-Shifter using a workload from the Akamai CDN and found that it decreased emissions in the USA by 35.5% while increasing latency by only 60 ms, with savings reaching 57% when incorporating capacity shifting.
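The load-shifting idea can be sketched in a few lines. The snippet below is an illustrative toy, not the actual CDN-Shifter algorithm (which is a multi-objective optimization): it greedily moves load from high-carbon-intensity edge sites to lower-intensity ones, subject to each site's capacity and a cap on added latency. All names and numbers are hypothetical.

```python
def shift_load(sites, latency_limit_ms):
    """Greedy carbon-aware load shifting (illustrative only).

    sites: list of dicts with keys:
      load       -- current load served by the site (requests/s)
      capacity   -- maximum load the site can serve
      intensity  -- carbon intensity of the site's grid (gCO2/kWh)
      latency_ms -- extra latency clients incur when served from this site
    """
    # Candidate destinations, greenest grid first.
    greenest = sorted(sites, key=lambda s: s["intensity"])
    # Drain the dirtiest sources first.
    for src in sorted(sites, key=lambda s: -s["intensity"]):
        for dst in greenest:
            if dst is src or dst["intensity"] >= src["intensity"]:
                continue
            # Only shift if the destination respects the latency budget.
            if dst["latency_ms"] > latency_limit_ms:
                continue
            movable = min(src["load"], dst["capacity"] - dst["load"])
            src["load"] -= movable
            dst["load"] += movable
    return sites
```

For example, with one coal-heavy site and one hydro-powered site within the latency budget, the greedy pass moves as much load as the green site's spare capacity allows and leaves the remainder at the dirty site.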

With Go with the Flow, I investigate the use of load shifting for a specific use case: video streaming. Comprising more than 50% of Internet traffic, video streaming is a ripe target for decarbonization techniques. Previous work has focused on compute-intensive workloads and has thus ignored the effect that transferring data over the network has on carbon emissions. Using an analytical model that incorporates the emissions of the server, network path, and playback devices of a video stream, I show how a naive application of carbon-aware load shifting can lead to lower, or even negative, carbon savings. Go with the Flow's key takeaway is that holistic approaches are needed for smart decarbonization of specific computing applications: streaming from the greenest or the closest region is not always the best approach.
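The takeaway can be illustrated with a toy per-stream emissions estimate in the spirit of the model above; the real model and its coefficients differ, and all values here are hypothetical. Emissions come from three parts: the serving datacenter, the network path, and the playback device.

```python
def stream_emissions(server_kwh, kwh_per_hop, hops,
                     device_kwh, grid_intensity, device_intensity):
    """Estimated gCO2 for one stream (illustrative only).

    grid_intensity applies to the server and network energy (gCO2/kWh);
    device_intensity is the carbon intensity of the viewer's local grid.
    """
    server = server_kwh * grid_intensity
    network = kwh_per_hop * hops * grid_intensity
    playback = device_kwh * device_intensity
    return server + network + playback

# A "green" but distant region can lose to a dirtier nearby one once the
# longer network path is counted (hypothetical numbers):
far_green = stream_emissions(0.05, 0.02, 20, 0.10, 50, 300)
near_dirty = stream_emissions(0.05, 0.02, 2, 0.10, 200, 300)
```

With these numbers, the faraway green region ends up with higher total emissions than the nearby dirty one, which is precisely the failure mode of naive carbon-aware shifting.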

Finally, building on these findings, I propose a carbon-aware video streaming system that decides in real time the optimal datacenter to stream from and uses overlay networks to influence the data transfer path to reduce carbon emissions.

Advisor:

Prashant Shenoy