What is Edge Computing?
Definition
A distributed computing paradigm that processes data closer to its source rather than in a centralized data center. Edge computing reduces latency for real-time applications like IoT sensors, autonomous vehicles, and content delivery.
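The latency benefit can be illustrated with a toy propagation-delay model. The distances and fiber speed below are illustrative assumptions, not measurements; real round-trip times also include routing, queuing, and processing delays.

```python
# Toy model: round-trip latency dominated by propagation delay in fiber.
# All numbers are illustrative assumptions, not measurements.

SPEED_IN_FIBER_KM_S = 200_000  # light travels at roughly 2/3 c in fiber

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

edge = round_trip_ms(50)      # assumed nearby edge node
cloud = round_trip_ms(4000)   # assumed distant centralized data center

print(f"edge:  {edge:.1f} ms")   # 0.5 ms
print(f"cloud: {cloud:.1f} ms")  # 40.0 ms
```

Even under this simplified model, moving compute from a distant data center to a nearby edge node cuts the physical floor on round-trip time by nearly two orders of magnitude, which is what makes sub-millisecond control loops feasible.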
Related Terms
Content Delivery Network (CDN)
A geographically distributed network of servers that caches and delivers web content from the location nearest to each user. CDNs reduce page load times, handle traffic spikes, and improve availability for global audiences.
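The "nearest location" idea can be sketched as a distance-based lookup. The city coordinates and PoP names below are hypothetical; real CDNs steer users via DNS resolution or anycast routing rather than client-side geometry.

```python
# Toy nearest-PoP selection using great-circle distance.
# PoP names and coordinates are hypothetical examples.
import math

POPS = {  # hypothetical points of presence: (latitude, longitude)
    "frankfurt": (50.1, 8.7),
    "virginia": (38.9, -77.5),
    "singapore": (1.35, 103.8),
}

def haversine_km(a: tuple, b: tuple) -> float:
    """Great-circle distance between two (lat, lon) points in km."""
    la1, lo1, la2, lo2 = map(math.radians, (*a, *b))
    h = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_pop(lat: float, lon: float) -> str:
    """Return the PoP closest to the user's location."""
    return min(POPS, key=lambda name: haversine_km((lat, lon), POPS[name]))

print(nearest_pop(48.8, 2.3))  # a user near Paris resolves to "frankfurt"
```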
Serverless Computing
A cloud execution model in which the provider dynamically manages server allocation, and billing is based on actual compute time consumed. Developers write functions without provisioning infrastructure, though cold-start latency and vendor lock-in are trade-offs.
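A minimal sketch of what "writing a function without provisioning infrastructure" looks like, using the AWS Lambda Python handler convention as one example. The `(event, context)` signature follows Lambda's convention; deployment details (trigger, permissions, packaging) are omitted, and the function can be invoked locally as plain Python.

```python
# Minimal serverless function sketch in the AWS Lambda Python style.
# The provider invokes handler(event, context) on demand and bills
# only for the time the function actually runs.
import json

def handler(event, context):
    """Return a JSON greeting built from the incoming event."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation for testing; no cloud provider needed.
print(handler({"name": "edge"}, None))
```

Because the platform, not the developer, decides when and where instances run, idle periods cost nothing, but the first request after an idle period may pay a cold-start penalty while a new instance is initialized.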
Cloud Computing
The delivery of computing resources (servers, storage, databases, networking) over the internet on a pay-as-you-go basis. Cloud computing eliminates the need for organizations to own and maintain physical data centers.