Bringing computing resources closer to the ‘edge’

The rise of AI, IoT (Internet of Things), and future 6G networks is pushing computation away from a purely centralised model. This architectural shift involves moving data processing to the network edge to reduce latency, conserve bandwidth, and protect privacy. In practice, this means AI models need to run on-site on specialised hardware for real-time edge computing tasks such as field robotics (where a high degree of autonomy is required) or patient monitoring.
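
As a minimal sketch of what this can look like in practice (assuming a model already exported to the ONNX format and the onnxruntime package installed on the edge device; the model file name, input shape, and sensor-reading helper are illustrative placeholders, not a prescribed setup):

    # On-device inference sketch: the raw sensor stream never leaves the device,
    # only the (much smaller) inference result does.
    import numpy as np
    import onnxruntime as ort

    # Compact model exported for edge deployment (hypothetical file name).
    session = ort.InferenceSession("patient_monitor.onnx")
    input_name = session.get_inputs()[0].name

    def read_sensor_frame() -> np.ndarray:
        # Placeholder for reading one frame of local sensor data.
        return np.random.rand(1, 16).astype(np.float32)

    frame = read_sensor_frame()
    prediction = session.run(None, {input_name: frame})[0]
    print("local prediction:", prediction)

Because inference happens where the data is produced, the round trip to a remote data centre disappears from the latency budget, and privacy-sensitive raw data can stay on site.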

Concurrently, society increasingly relies on and expects complex, multimodal workflows that combine text, images, and sensor data. These workflows demand new, distributed architectures that seamlessly integrate edge devices for data collection and processing, cloud platforms for large-scale storage, and HPC backends for intensive AI model training and advanced data analysis.
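
A rough sketch of one hop in such a workflow, assuming a hypothetical cloud ingest endpoint and the widely available requests and numpy packages (authentication, batching, and error handling are omitted for brevity):

    # Edge-to-cloud hop: reduce raw sensor data locally, upload only a compact
    # summary; cloud storage accumulates summaries for later HPC-scale training.
    import numpy as np
    import requests

    UPLOAD_URL = "https://storage.example.org/ingest"  # hypothetical endpoint

    def summarise(raw_frames: np.ndarray) -> dict:
        # Local processing: keep only what downstream analysis needs,
        # saving uplink bandwidth.
        return {
            "mean": raw_frames.mean(axis=0).tolist(),
            "max": raw_frames.max(axis=0).tolist(),
            "n_frames": int(raw_frames.shape[0]),
        }

    raw = np.random.rand(1000, 16)        # locally collected sensor data
    requests.post(UPLOAD_URL, json=summarise(raw), timeout=5)

The design choice is the division of labour: edge devices handle collection and first-pass processing, the cloud platform provides durable storage and aggregation, and the HPC backend later pulls the accumulated data in bulk for model training and advanced analysis.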

Impact

Education

  • Students and staff alike need exposure to edge AI and network-driven architectures, either by revising existing modules and courses or by creating focused new content.

Research

  • The architectural shift will enable new experimental approaches in robotics, medicine, and environmental monitoring.

Operations

  • This architectural shift is set to drive investment in IoT, network infrastructure, and local compute.
More info about Cloud Computing?
Visit surf.nl