ALASCA Tech-Talk #14


ALASCA Tech-Talk #14 Explores DevOps Approach for AI Applications at the Edge

ALASCA Tech-Talks continue to provide a vital platform for discussions centered around projects aimed at enhancing the sovereignty of digital infrastructures and cloud services. These talks not only shed light on the latest developments but also delve into crucial use cases that rely on digitally sovereign infrastructures and services.

Looking ahead to the upcoming Tech-Talk in March 2024, participants can anticipate an insightful presentation titled "A DevOps Approach for AI Applications at the Edge" by Danilo Ardagna, an Associate Professor at Politecnico di Milano's Department of Electronics, Information, and Bioengineering.


Exploring AI-SPRINT's Framework for AI Applications at the Edge

In an era where Artificial Intelligence (AI) integration is rapidly becoming ubiquitous, the availability of resources at the network's edge is paramount. While cloud computing provides the processing power needed to handle vast datasets, edge computing plays an equally critical role: positioned where data is generated, it processes that data in real time, ensuring low latency and security.

This forthcoming talk will delve into the innovative tools developed under the AI-SPRINT H2020 European project. AI-SPRINT has established a comprehensive framework tailored for crafting AI applications across computing continua. This framework strikes a delicate balance between performance metrics, such as end-to-end latency and throughput, and the precision of AI models, all while upholding stringent security and privacy standards.

Key highlights of the AI-SPRINT toolkit include:

  • Simplified Programming Models: Designed to make developing AI software in computing continua more accessible to developers, thereby lowering barriers to entry.
  • Specialised AI Building Blocks: Offering components for distributed training, privacy preservation, and employing advanced machine learning models, these blocks significantly expedite the time-to-market for AI applications.
  • Automated Deployment and Dynamic Reconfiguration Policies: Engineered to reduce the operational costs of AI software, these policies ensure that applications remain efficient and adaptable amid evolving workloads.


Stay tuned for a discussion on the convergence of DevOps and AI at the edge, and discover how these advancements are shaping the future of AI applications. Find more information on the ALASCA website!