For 30+ years, Sensors Expo & Conference has established itself as North America’s premier event focused exclusively on sensors and sensor-integrated systems. Join us in the heart of Silicon Valley – June 26-28, 2018 at the McEnery Convention Center in San Jose, California – to be a part of THE only event where you can find solutions for your current sensing needs while exploring the technologies driving tomorrow’s solutions.
It’s more than the industry’s leading tradeshow. And this June, there’s more opportunity than ever to connect with the industry and the technologies driving it. Will you be there?
Register with the promo code YOLE100
Ask for a meeting and meet Yole Développement’s team at booth 343.
Feel free to attend our presentations:
– “The perceptual era of IoT has begun, mobile phones first” on June 27 at 10AM
Guillaume Girardin, Photonics, Sensing and Display Head of Division at Yole Développement
Abstract: The move to the perceptual era signals a new era for sensing. It is driving major changes in the way depth-sensing cameras, sensors, APUs, and neural networks are combined and work together to bring enhanced awareness to physical objects. The iPhone X is the most advanced smartphone in this respect so far. This session will review the past of sensing in this market and envision what its future could be... More here
– “Autonomous Vehicle Sensors and Sensor Fusion” on June 26 – Automotive Workshop
Dr. Yohann Tschudi, Software & Market Analyst at Yole Développement
& Guillaume Girardin, Photonics, Sensing and Display Head of Division
Abstract: Autonomous vehicles are showing early signs of maturity, and several companies have launched commercial products for use on dedicated or private roads. Robotic taxis are set to hit the road by 2020, on the order of 100,000 units. To achieve this, sensors and sensor fusion are mandatory and must be ready. Some of these technologies are already used in ADAS for the detection of obstacles and the recognition of signs, traffic lights, cars, pedestrians and other objects. The images come from a bank of cameras arranged on and around the car, while training is performed in datacenters on dedicated computing machines, and the inference algorithm is embedded either in an ECU, in the case of semi-autonomous cars, or in a complete computer, in the case of robotic taxis. We will give an overview of both markets, sensors and processing/software, and discuss what the future of transportation should look like... More here