Vol. 10 No. 5 (2023): Applied AI Solutions on Edge Devices


Traditionally, AI solutions were driven by cloud or high-performance computing platforms, which supplied the powerful hardware needed to run deep-learning and machine-learning workloads and to scale resources effortlessly. However, this approach requires offloading data to external computing systems, which degrades latency, raises communication costs, increases energy consumption, and triggers privacy concerns. Moreover, inference is computationally less demanding than training, yet it is the stage at which latency matters most for delivering real-time results. Although the majority of inference is still performed in the cloud or on dedicated servers, the increasing diversity of AI applications is challenging this centralized training and inference paradigm.

Published: 2023-01-05