
Monday, July 11, 2022

Intro to Edge Computing from NVIDIA

 Looks to be useful.  

An Introduction to Edge Computing: Common Questions and Resources for Success

By Troy Estes, NVIDIA

Vision / Video Analytics, edge computing, Fleet Command, News, Webinar

With the convergence of IoT and AI, organizations are evaluating new approaches to computing to keep up with larger data loads and more complicated use cases. For many, edge computing provides the right environment to successfully operate AI applications that ingest data from distributed IoT devices. 

But many organizations are still working to understand edge computing. Partners and customers often ask what edge computing is, why it has become popular in the AI space, and how its use cases compare with cloud computing. 

NVIDIA recently hosted the webinar Edge Computing 101: An Introduction to the Edge. The session introduced edge computing, outlined the different types of edge and the benefits of edge computing, and covered when and why to use it, and more. 

During the webinar, we surveyed the audience to understand their biggest questions about edge computing and how we could help. 

Below we provide answers to those questions, along with resources that could help you along your edge computing journey.  

What stage of your edge computing journey are you in? 

About 51% of the audience answered that they are in the “learning” phase of their journey. This is not surprising given that the webinar was an introductory session: most attendees would naturally be learning rather than implementing or scaling. It also reflects how new many of the tools in the edge market still are, meaning many vendors have much to gain from learning more as well. 

To help with the learning journey, refer to Considerations for Deploying AI at the Edge. This overview covers the major decision points for choosing the right components of an edge solution, security tips for edge deployments, and how to evaluate where edge computing fits into your existing environment. 

What is the top benefit you hope to gain by deploying applications at the edge? 

There are many benefits to deploying AI applications in edge computing environments, including real-time insights, reduced bandwidth, data privacy, and improved efficiency. Among the participants in the session, 42% responded that reduced latency (or real-time insights) was the top benefit they hoped to gain from deploying applications at the edge.

Figure 1. Benefits of edge AI include reduced latency and bandwidth requirements, improved data sovereignty, and increased automation.

Improving latency is a major benefit of edge computing since the processing power for an application sits physically closer to where data is collected. For many use cases, the low latency provided by edge computing is essential for success. 

For example, an autonomous forklift operating in a manufacturing environment has to be able to react instantaneously to its dynamic environment. It needs to be able to turn around tight corners, lift and deliver heavy loads, and stop in time to avoid colliding with moving workers in the facility. If the forklift is unable to make  .... 
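To make the latency budget concrete, here is a minimal back-of-the-envelope sketch in Python. All of the timing numbers (inference times, network round trip, reaction budget) are assumptions chosen only for illustration, not figures from the webinar; the point is simply that the network round trip to a remote data center can consume more of a machine's reaction budget than running the model on the device does.

# Back-of-the-envelope latency comparison for an edge vs. cloud deployment.
# All numbers below are illustrative assumptions, not measurements.

EDGE_INFERENCE_MS = 20    # assumed on-device inference time for a vision model
CLOUD_INFERENCE_MS = 10   # assumed inference time on larger cloud GPUs
NETWORK_RTT_MS = 80       # assumed round trip from the factory floor to a cloud region
REACTION_BUDGET_MS = 50   # example reaction budget for a fast-moving machine


def edge_latency_ms() -> int:
    """Total latency when the model runs next to the sensor."""
    return EDGE_INFERENCE_MS


def cloud_latency_ms() -> int:
    """Total latency when frames are shipped to a remote data center."""
    return NETWORK_RTT_MS + CLOUD_INFERENCE_MS


if __name__ == "__main__":
    for name, latency in (("edge", edge_latency_ms()), ("cloud", cloud_latency_ms())):
        verdict = "within" if latency <= REACTION_BUDGET_MS else "exceeds"
        print(f"{name}: {latency} ms ({verdict} the {REACTION_BUDGET_MS} ms budget)")

With these assumed numbers, the edge path stays within the reaction budget while the cloud path exceeds it, which is the kind of trade-off the forklift example is meant to illustrate.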
