Researchers at IMDEA Networks have developed a machine learning algorithm called AMR² to optimize the performance of edge-deployed IoT sensors. AMR² uses an edge computing infrastructure to decide which deep-learning tasks to assign to which sensors while meeting specific quality-of-service requirements, such as latency and inference accuracy. The technology is still at an early stage of development, but it is a promising example of how AI can be used to optimize IoT applications.
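The article does not detail how AMR² itself makes these decisions, but the kind of QoS-constrained choice it describes can be illustrated with a minimal sketch: given several candidate model variants for a sensor's inference task, pick the most accurate one that still fits within a latency budget. Everything below, including the `ModelVariant` structure and the greedy `assign_variant` heuristic, is an assumed illustration rather than the published AMR² scheduler.

```python
# Hypothetical sketch: a greedy QoS-aware assignment, NOT the actual AMR2 algorithm.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ModelVariant:
    name: str
    accuracy: float     # expected inference accuracy, 0-1
    latency_ms: float   # expected end-to-end latency on the edge node


def assign_variant(variants: List[ModelVariant],
                   latency_budget_ms: float) -> Optional[ModelVariant]:
    """Pick the most accurate variant that still meets the latency budget."""
    feasible = [v for v in variants if v.latency_ms <= latency_budget_ms]
    if not feasible:
        return None  # no assignment satisfies the QoS requirement
    return max(feasible, key=lambda v: v.accuracy)


if __name__ == "__main__":
    catalog = [
        ModelVariant("mobilenet-small", accuracy=0.71, latency_ms=12.0),
        ModelVariant("mobilenet-large", accuracy=0.80, latency_ms=35.0),
        ModelVariant("resnet50",        accuracy=0.88, latency_ms=90.0),
    ]
    choice = assign_variant(catalog, latency_budget_ms=40.0)
    print(choice.name if choice else "no feasible variant")
```

A learned scheduler such as AMR² would presumably replace the fixed per-variant latency and accuracy figures with predictions conditioned on current edge load and network conditions; the sketch only shows the shape of the trade-off being optimized.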