Accelerate Artificial Intelligence IoT Use Cases with Storage Tiering and Shared Storage at the Edge

webinar

Author(s)/Presenter(s):

Joey Parnell

M K Jibbe

Library Content Type

Presentation

Library Release Date

Focus Areas

Abstract

Transmitting data from Internet of Things (IoT) edge devices to core data centers to perform resource-intensive artificial intelligence (AI) use cases is costly in terms of network bandwidth and latency. Alternatively, placing hardware resources such as GPUs at the edge to perform those operations locally and reduce network congestion and latency can be prohibitively expensive in both cost and power. Through the use of shared storage presented by the IoT device, compute-intensive iterative refinement training can instead be performed in the core data center or in cloud analytics platforms, and updated AI inference data transmitted back to the IoT devices, providing customized training that improves reliability and reduces false-positive rates.

By adding flash storage and extending the shared storage between IoT devices, distributed applications can divide work among idle or under-utilized devices, store the results locally, and send metadata to the originating device indicating where to read the results. This allows an IoT device to coordinate and complete AI tasks that exceed the computing capabilities or latency requirements of a single device, while minimizing the data traveling to and from data centers and clouds.

Finally, critical IoT data transmitted to core data centers can be regularly archived to the cloud or protected by disaster recovery solutions in the cloud. Minimize risk without significantly increasing cost by selectively using cloud resources: store data on premises and mount that data as a target from the cloud via a gateway to perform analytics and transmit only the results back to the data center, or archive the data for long-term retention. Satisfy real-time IoT AI use cases with higher fidelity, and without significantly increasing cost, by providing shared storage in the IoT device so that other devices can perform work and remotely update the local inference models. Add a layer of flash to IoT devices for data tiering to perform AI tasks in a distributed fashion that utilizes idle resources. Protect valuable IoT data at lower cost by selectively using cloud resources for archive and disaster recovery.
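As a rough illustration of the distributed-work pattern described above, the following Python sketch shows how an originating device might drop a task descriptor onto an idle peer's shared export and later read back only small result metadata rather than bulk data. The mount point (/mnt/peer42/shared), directory layout, and helper names (submit_task, wait_for_result) are assumptions made for illustration, not the presenters' implementation; in practice the shared storage could be exposed over NFS, NVMe-oF, or a similar protocol.

```python
import json
import time
import uuid
from pathlib import Path

# Hypothetical shared-storage layout: an idle peer exports a flash-backed
# directory that this device has mounted locally. Path is an assumption.
PEER_EXPORT = Path("/mnt/peer42/shared")


def submit_task(frames_path: Path, model: str) -> str:
    """Offload an inference task by dropping a descriptor into the peer's inbox.

    Only the descriptor travels; the raw sensor data stays on the
    originating device and is referenced by location.
    """
    task_id = uuid.uuid4().hex
    descriptor = {
        "task_id": task_id,
        "model": model,
        "input_uri": f"device-self:{frames_path}",
        "result_dir": str(PEER_EXPORT / "results" / task_id),
    }
    inbox = PEER_EXPORT / "inbox"
    inbox.mkdir(parents=True, exist_ok=True)
    (inbox / f"{task_id}.json").write_text(json.dumps(descriptor))
    return task_id


def wait_for_result(task_id: str, timeout_s: float = 30.0) -> dict:
    """Poll the peer's export for result metadata describing where the
    (locally stored) results can be read, instead of copying them back."""
    meta_file = PEER_EXPORT / "results" / task_id / "metadata.json"
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if meta_file.exists():
            return json.loads(meta_file.read_text())
        time.sleep(0.5)
    raise TimeoutError(f"task {task_id} did not complete in {timeout_s}s")


if __name__ == "__main__":
    tid = submit_task(Path("/data/camera0/batch_0091"), model="defect-detector-v3")
    result = wait_for_result(tid)
    print("peer wrote results to:", result.get("result_uri"))
```

The design choice worth noting is that the originating device exchanges only descriptors and metadata over the shared mounts; the heavy inputs and outputs remain on whichever device holds them, which is what keeps traffic to data centers and clouds low.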

Learning Objectives

Show how providing shared storage in IoT devices facilitates real-time AI use cases.
Show how the solution reduces bandwidth consumption and cost.
Add a fast storage tier to IoT devices and distribute workloads to satisfy new use cases.
Increase resiliency and/or satisfy retention requirements with minimal cost and complexity by mounting on-premises data from cloud applications via a gateway (see the sketch after this list).
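As a hedged sketch of the gateway-mount pattern in the last objective: a cloud analytics job could read on-premises IoT telemetry through a storage gateway mount and write back only a compact summary, so the raw data never has to be copied into the cloud or back out. The mount paths, file names, and column names below are hypothetical and stand in for whatever the gateway actually exposes.

```python
import json
from pathlib import Path

import pandas as pd

# Hypothetical mounts, for illustration only: a storage gateway surfaces the
# on-premises telemetry read-only in the cloud environment, plus a small
# results share through which only summaries travel back to the data center.
ONPREM_DATA = Path("/mnt/gateway/iot/telemetry")
RESULTS_SHARE = Path("/mnt/gateway/iot/results")


def summarize_telemetry() -> dict:
    """Scan telemetry files over the gateway mount and build a compact summary."""
    frames = [pd.read_csv(p) for p in sorted(ONPREM_DATA.glob("*.csv"))]
    telemetry = pd.concat(frames, ignore_index=True)
    return {
        "rows_scanned": int(len(telemetry)),
        "devices": int(telemetry["device_id"].nunique()),
        "anomaly_rate": float((telemetry["anomaly_score"] > 0.9).mean()),
    }


if __name__ == "__main__":
    summary = summarize_telemetry()
    RESULTS_SHARE.mkdir(parents=True, exist_ok=True)
    (RESULTS_SHARE / "daily_summary.json").write_text(json.dumps(summary, indent=2))
    print("transmitted back only the summary:", summary)
```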