Machine Vision

“The techniques, technologies and applications have advanced tremendously since 2019”

Interview with Jeff Bier, Founder of the Edge AI and Vision Alliance and Organizer of the Embedded Vision Summit

May 10, 2022 – Ahead of the Embedded Vision Summit, David Löh, editor-in-chief of the machine vision magazine inspect, talked with Jeff Bier, organizer of the embedded vision event in California, about the Summit, the industry, and trends in embedded vision.

As president of the engineering consulting firm BDTI (Berkeley Design Technology, Inc.), Jeff Bier founded the Edge AI and Vision Alliance, formerly the Embedded Vision Alliance, in 2011. Today the Alliance has more than 100 member companies and has organized the Embedded Vision Summit since 2012. This year’s event is special because it is the first in three years. “The techniques, technologies, and applications have advanced tremendously since 2019,” says Bier. He expects over 1,000 participants, who can listen to around 100 lectures and visit a good 60 exhibitor stands. For those not yet planning to attend, more information and registration are available here: https://embeddedvisionsummit.com

inspect: What do you think is different from the last on-site Embedded Vision Summit in 2019?

Jeff Bier: Three years is a long time in our field. The techniques, technologies, and applications have advanced tremendously since 2019. The 2022 Embedded Vision Summit will showcase impressive new applications, innovative building-block technologies and leading-edge techniques – many of which are new compared to 2019.

What can visitors expect?

Bier: The entire focus of the Embedded Vision Summit is on providing system and application developers with the practical knowledge and connections they need to successfully incorporate embedded vision into their products. The two main elements of the Summit are the conference program with over 100 lectures and the exhibits with more than 60 booths this year. This carefully designed, high-quality program has enabled the Summit to consistently achieve a 98 percent approval rating from attendees, which we are very happy about. The full list of sessions can be found on our website.

Similarly, the exhibits are mainly focused on building-block technologies like processors, algorithms, and software tools, which system and application developers can use to build their solutions. The current list of exhibitors is also available online.

What are the highlights of the Embedded Vision Summit?

Bier: The biggest highlight in my eyes is the perfect size: the Summit is large enough to provide a wide range of session topics and exhibitors, and plenty of opportunity to make new professional connections – but small enough that attendees can make very efficient use of their time and have plenty of one-on-one conversations with other attendees. But in terms of topics and technologies, some of the highlights of this year’s Summit include:

  • Neuromorphic (bio-inspired) sensing and processing: the future of perceptual AI?  
  • Low-code/no-code edge ML development: Arguably, a shortage of skilled engineers is the main bottleneck to developing edge computer vision/machine learning solutions to solve many real-world problems. Are low-code/no-code development tools the answer? Many companies think they are at least part of the answer, and there’s lots happening in this space.
  • MLOps: Implementing processes to more efficiently and reliably develop and deploy ML-based applications.
  • Combining multiple sensor types for perceptual AI: Humans and other creatures utilize a variety of senses, while most of today’s systems get by with just one (or one at a time). What are the advantages and challenges of incorporating audio as well as video into a perceptual AI system?
  • Making deep neural networks more reliable.
  • Making edge computer vision and machine learning ubiquitous: Rapid progress on processors and tools is enabling edge CV and ML to be deployed even in very power- and cost-constrained devices, and bringing cloud-like performance to edge devices.

Now for the Covid question: What do participants need to consider in terms of hygiene and safety?

Bier: We are of course monitoring Covid trends closely and adjusting our public health protocols as the situation evolves. At present, the main points of our plan are:

  • Attendees must be vaccinated
  • Masks are encouraged and provided
  • Hand sanitizer will be freely available everywhere

Additionally, we have a color-coding scheme on badges with which every attendee can signal to other attendees in a simple way whether they are comfortable with close contact.

Sales of embedded vision systems are increasing by a double-digit percentage every year. Even the Covid-19 pandemic did not slow down this growth. Do you expect this momentum to stay with us in the long term, and why?

Bier: Yes, I expect this growth to continue and accelerate, for two complementary reasons: First, embedded vision systems are increasingly delivering value in every industry, for an incredible variety of applications, including medical diagnostics, retail store operations, and automotive safety features.

Second, building-block technologies, such as sensors, algorithms, processors and development tools are improving at an unprecedented pace. This is making it possible for embedded vision to be deployed in cost-, power- and size-constrained applications (e.g., doorbell cameras), and making it possible for a wider range of product development groups to incorporate vision into their systems.

Will embedded vision systems eventually replace conventional PC-based vision systems completely?

Bier: PCs and their close cousins have some real advantages for applications where they fit the performance, cost, power consumption, and I/O requirements. But as the range of deployed vision applications grows, other classes of systems will vastly outnumber PCs.

Vision systems are now being used in a large and fast-growing range of applications and markets, and these applications are very diverse. For example, I use a 35-US-dollar vision-enabled security camera to monitor the front door of my home. At the other end of the spectrum, we have the Amazon Go checkout-free retail stores, which process data from hundreds of cameras and other sensors.

Author
David Löh, Editor-in-chief of inspect

Contact

Edge AI and Vision Alliance

1646 North California Blvd., Suite 220
Walnut Creek, CA 94596
United States

+1 925 954 14 11
+1 925 954 14 23

