Article ID: 445663
Journal: Computer Communications
Published Year: 2016
Pages: 11
File Type: PDF
Abstract

Computational offloading is the standard approach to running computationally intensive tasks on resource-limited smart devices while reducing the local footprint, i.e., the local resource consumption. The natural candidate for computational offloading is the cloud, but recent results point out the hidden costs of cloud reliance in terms of latency and energy. Strategies relying on local computing power have been proposed that enable fine-grained, energy-aware code offloading from a mobile device to a nearby piece of infrastructure. However, even state-of-the-art cloud-free solutions are centralized and suffer from a lack of flexibility, because computational offloading is tied to the presence of a specific piece of computing infrastructure. We propose AnyRun Computing (ARC), a system that dynamically selects the most adequate piece of local computing infrastructure. With ARC, code can run anywhere and be offloaded not only to nearby dedicated devices, as in existing approaches, but also to peer devices. We present a detailed system description and a thorough evaluation of ARC under a wide variety of conditions. We show that ARC matches the performance of the state-of-the-art solution (MAUI) in reducing the local footprint under stationary network topology conditions, and outperforms it by up to one order of magnitude under more realistic topological conditions.
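To illustrate the kind of decision the abstract describes, the following is a minimal, hypothetical sketch (not the authors' actual ARC algorithm) of how an offloading runtime might dynamically choose among the local CPU, a nearby dedicated device, and peer devices by comparing estimated local energy costs. All class and field names here are illustrative assumptions.

```python
# Hypothetical sketch of an ARC-style offloading decision: choose the execution
# target (local, dedicated nearby device, or peer device) that minimizes the
# estimated local footprint, i.e. energy spent on the mobile device itself.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Target:
    name: str               # e.g. "cloudlet-1" or "peer-A" (illustrative labels)
    reachable: bool         # current connectivity to this target
    tx_energy_j: float      # estimated local energy to ship code/state to the target
    wait_energy_j: float    # estimated local energy spent waiting for the result


@dataclass
class Task:
    name: str
    local_energy_j: float   # estimated energy to run the task on the device itself


def pick_target(task: Task, targets: List[Target]) -> Optional[Target]:
    """Return the reachable target with the lowest estimated local energy cost,
    or None if local execution is cheapest (or no target is reachable)."""
    best: Optional[Target] = None
    best_cost = task.local_energy_j
    for t in targets:
        if not t.reachable:
            continue
        cost = t.tx_energy_j + t.wait_energy_j
        if cost < best_cost:
            best, best_cost = t, cost
    return best


if __name__ == "__main__":
    task = Task("face_detection", local_energy_j=8.0)
    candidates = [
        Target("cloudlet-1", reachable=True, tx_energy_j=1.5, wait_energy_j=0.8),
        Target("peer-A", reachable=True, tx_energy_j=2.0, wait_energy_j=1.5),
    ]
    choice = pick_target(task, candidates)
    print(f"offload to {choice.name}" if choice else "run locally")
```

Under this simplified model, a topology change (e.g. the dedicated device becoming unreachable) simply removes a candidate, and the selection falls back to a peer or to local execution, which is the flexibility the abstract attributes to ARC over infrastructure-bound approaches.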

Related Topics
Physical Sciences and Engineering Computer Science Computer Networks and Communications