Edge Computing

[ɛdʒ kəmˈpjutɪŋ]
AI Infrastructure
Last updated: December 9, 2024

Definition

Processing data near the source of data generation rather than in a centralized data center or cloud.

Detailed Explanation

Edge computing moves computation and data storage closer to the location where it is needed, reducing latency and bandwidth usage. It enables real-time processing and decision-making by distributing computing resources across a network of edge devices and local servers. Edge computing architectures typically involve layers of processing capabilities, from IoT devices through edge nodes to central cloud resources, with each layer handling appropriate levels of computation based on latency requirements, processing needs, and power constraints.
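The tier-selection logic described above can be sketched in a few lines. The tier names, latencies, and capacities below are hypothetical illustrations, not values from any specific platform: the function simply picks the lowest (closest) tier whose round-trip latency fits the task's latency budget and whose capacity covers its compute demand.

```python
def choose_tier(latency_budget_ms: float, compute_units: float) -> str:
    """Pick the closest processing tier that meets both constraints.

    Tiers are ordered from nearest (IoT device) to farthest (cloud).
    Each entry is (name, typical round-trip latency in ms, capacity in
    arbitrary compute units) -- illustrative numbers only.
    """
    tiers = [
        ("device", 1, 10),        # on the IoT device itself
        ("edge_node", 20, 100),   # nearby edge server
        ("cloud", 150, 10_000),   # central data center
    ]
    for name, rtt_ms, capacity in tiers:
        if rtt_ms <= latency_budget_ms and compute_units <= capacity:
            return name
    raise ValueError("no tier satisfies the given constraints")
```

For example, a lightweight sensor-filtering task with a tight latency budget lands on the device, while a heavy batch job with a relaxed budget is pushed to the cloud; real schedulers weigh additional factors such as power draw and bandwidth cost.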

Use Cases

Real-time video analytics, autonomous vehicles, industrial IoT, smart cities, augmented reality applications, mobile AI applications

Related Terms