The rise of AI, the Internet of Things (IoT), and future 6G networks is pushing computation away from a purely centralised model. This architectural shift moves data processing to the network edge to reduce latency, conserve bandwidth, and protect privacy. In practice, AI models must run on-site on specialised hardware for real-time edge tasks such as field robotics (where a high degree of autonomy is required) or patient monitoring.
Concurrently, society increasingly relies on complex, multimodal workflows that combine text, images, and sensor data. These workflows demand new distributed architectures that seamlessly integrate edge devices for data collection and processing, cloud platforms for large-scale storage, and HPC backends for intensive AI model training and advanced data analysis.
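The edge tier's role in such an architecture — condensing raw sensor traffic locally so that only compact summaries and anomalies travel upstream — can be sketched as follows. This is a minimal illustration, not a specific framework: `SensorReading`, `edge_summarise`, and the alert threshold of 40.0 are hypothetical names and values chosen for the example.

```python
import statistics
from dataclasses import dataclass

@dataclass
class SensorReading:
    """One raw measurement produced by an edge sensor (hypothetical schema)."""
    device_id: str
    value: float

def edge_summarise(readings, threshold=40.0):
    """Aggregate raw readings on the edge device, returning a compact
    summary plus any anomalous readings to forward to the cloud tier.
    Everything else stays local, conserving uplink bandwidth."""
    values = [r.value for r in readings]
    summary = {
        "count": len(values),
        "mean": statistics.mean(values),
        "max": max(values),
    }
    anomalies = [r for r in readings if r.value > threshold]
    return summary, anomalies

# Usage: 1,000 raw readings collapse to one summary dict plus only
# the readings that exceeded the alert threshold.
readings = [SensorReading("probe-1", 20.0 + (i % 50)) for i in range(1000)]
summary, anomalies = edge_summarise(readings)
```

The design point is that the cloud (and, downstream, any HPC training pipeline) receives orders of magnitude less data than the sensors produce, while time-critical anomaly detection happens on-site with no round-trip latency.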