Join Us!

Many thanks for your interest in ADAS Sensors 2022! It was a fantastic online conference with 210+ participants and we look forward to seeing you at our upcoming online events.

You can still register to access the recorded presentations and slides in electronic format from ADAS Sensors 2022. Click here to register and then you’ll receive a link to download the conference materials. For more information or if you have any questions, please contact Jessica Ingram at jessica@microtechventures.com.


Speakers and Discussion Leaders

(listed in alphabetical order, by speaker’s company name; detailed abstracts are below)

Driver Monitoring System Evaluation and Rainfall Induced Degradation of ADAS Performance
Matthew Lum
Automotive Technical Engineer
AAA

LIDAR for ADAS Applications: Performance Optimization and Design for Manufacturability
Hod Finkelstein, PhD
Chief R&D Officer
AEye

Optimizing Cameras for Challenging ADAS and AV Computer Vision Applications
Dave Tokic
VP, Marketing and Strategic Partnerships
Algolux

New ADAS Features Give Way to ODD Expansion
Tom Toma
Head of Sales and Product
Arriver

The Future of Automotive LIDAR: High Performance in a Small Form Factor
Jim Kane
Vice President Automotive Engineering
Baraja

Sensors for Robotaxis: Emerging Marketplace and Pricing Dynamics
Itay Michaeli
Director, US Autos and Auto Parts
Citi Research

Multi-Sensor Safety Calibration for ADAS Applications
Mohammad Musa
CEO and Co-Founder
Deepen

Emerging LIDAR and Navigation Technologies for ADAS and AV Applications
Lucas Benney
Program Manager
Draper

Ensuring ADAS Reliability with Ground Penetrating Radar
Byron Stanley
CTO
GPR (formerly WaveSense)

Overview of LIDAR Concepts and Optical Components
Jake Li
Business Development Manager
Hamamatsu Photonics

How Far Should Automotive LIDAR See and How to Extend Its Range?
Dima Sosnovsky
Principal System Architect
Huawei

Automotive Radar: Analysis of Trends and Developments
James Jeffs, PhD
Technology Analyst
IDTechEx

ADAS Sensors Data Edge Case Detection and Training
Sudeep George
Vice President of Engineering
iMerit

Automotive LIDAR: Key Patent Owners and Intellectual Property Strategies
Paul Leclaire, PhD
Technology and Patent Analyst
Knowmade

How Strategic Venture Capital Drives ADAS Innovation
Sean Simpson
Director, Technology and Investments
Magna

How Radar Could Displace LIDAR in ADAS and AV Applications
Harvey Weinberg
Director, Sensor Technologies
Microtech Ventures

ADAS Sensors for Trucks and Buses: Current Status and Emerging Trends
Srini Gowda
Vice President, Autonomous Vehicles
Navistar

Next-Generation MEMS Mirrors for LIDAR Applications
Eric Aguilar
CEO
Omnitron Sensors

Data Labeling for ADAS Applications: Challenges and Solutions
Renata Walker
Sr. Product Manager
Sama

Silicone-Based Solutions for ADAS and Automotive Sensors
Chad Kobylanski
Western Regional Manager
Shin-Etsu Silicones of America

Emerging Radar Technologies for ADAS Applications
Cameron Gieda
Vice President of Business Development
Spartan Radar

Precise Positioning and Theft Detection Enabled by MEMS Sensors
Raed Shatara, PhD
Sr. Product Marketing, Analog MEMS and Sensors
STMicroelectronics

AI-Based Vision Perception Software for ADAS/AV Applications
Sunny Lee
Chief Operating Officer
StradVision

Vision-Only ADAS: Are Camera-Based Solutions Good Enough?
Mark Fitzgerald
Director, Autonomous Vehicle Service
Strategy Analytics

Micro-Optics for Smart Lighting Applications
Davide Lomellini
Area Sales Manager
SUSS MicroOptics

Why Do Your ADAS and AV Sensors Need More Sensors?
Brandon Minor
CEO
Tangram Vision

Technology Solutions for Reliable ADAS Sensors Manufacturing
Jacques Renaud
MEMS Process Integration Manager
Teledyne

Beyond the Line of Sight: Fundamental Optical Sensing Limits for ADAS
Sebastian Bauer, PhD
CEO
Ubicept

AI-Enabled Touch User Interfaces for Smart Surfaces in Automotive Applications
Mo Maghsoudnia
Founder and CEO
UltraSense

AI Clustering of Sensor Data for ADAS Applications
Kathrin Trueller
Project Manager for AD, Cybersecurity, and EVs
Volkswagen

M&A and Investment Trends in the ADAS Sensor Market
Rudy Burger
Managing Partner
Woodside Capital

Thermal and SWIR Imaging for ADAS Applications
Axel Clouet, PhD
Technology and Market Analyst -- Imaging
Yole Développement

LIDAR for ADAS Applications: Emerging Trends and Developments
Pierrick Boulay
Lighting Expert
Yole Développement

Cost-Effective Radar Solutions for ADAS Applications: Comparison of Key Factors and Considerations
Jimmy Wang
CTO
Zendar


Past Speakers

Many thanks to our speakers from ADAS Sensors 2021

Repeatable Testing Strategies for ADAS and AV Development Utilizing Simulation and Real-World Testing
Mike McPike
Managing Director
AB Dynamics

Designing Cameras for Challenging ADAS/AV Computer Vision Edge Cases
Dave Tokic
VP, Marketing and Strategic Partnerships
Algolux

Physically-Based Sensor Simulation to Shorten Development Lifecycles
Javier Salado
Technical Product Manager
Anyverse

Automotive Safety and ADAS: Sensors and Infrastructure Working Together
Doug Campbell
President
Automotive Safety Council

Safety and Security of Automotive Sensors for Highly Automated Vehicles
Manjeet Singh Bilra
Functional Safety Expert, Autonomous Vehicles
BMW

Automotive Sensor Data Collection for Analytics and Machine Learning
Johann Prenninger
Head of Analytics
BMW

Insuring the Modern Vehicle: The Impact of ADAS on Automotive Insurance
Don Hendriks
Actuarial Scientist
Carfax

Leveraging Simulations to Validate ADAS Sensors
Honghao Tan
Principal Engineer
Changan Motors

Sensor Fusion in AV and ADAS Simulation
Danny Atsmon
CEO
Cognata

Sensors in ADAS Applications: Comparison of the Leading Automakers
Kelly Funkhouser
Head of Connected and Automated Vehicles
Consumer Reports

Multi-Sensor Safety Collaboration for L2+ Applications
Mohammad Musa
CEO and Founder
Deepen

Cost-Effective LIDAR for ADAS Applications Using GaN Devices
Alex Lidow
CEO
Efficient Power Conversion (EPC)

Emerging AI Technologies for ADAS Applications
David Ostrowski
Data Scientist, Product Analytics
Ford

Precise GNSS Localization for Automated Vehicles
Curtis Hay
Technical Fellow
General Motors

Emerging Sensing Technologies for ADAS Applications
Reza Zarringhalam
Global Technical Specialist, ADAS Lateral Controls
General Motors

Emerging Technologies for ADAS Applications: Imaging Radar, Thermal Sensors, and Lower-Cost LIDAR
Sam Abuelsamid
Principal Analyst, E-Mobility
Guidehouse Insights

Photodetectors for LIDAR: Challenges, Trends, and Selection Criteria
Jake Li
Business Development Manager
Hamamatsu

Manufacturing Perspectives on Balancing Cost, Reliability, and Performance of Lasers for ADAS Applications
Owen Wu, Ph.D.
Director of Sales Engineering
Hitronics Technologies

Comparison of ADAS and AV Sensor Types: Current Differences, Evolution of Applications, and Potential Convergence
Phil Amsrud
Senior Principal Analyst, Automotive Semiconductors
IHS Markit

Precision Temperature Compensation for ADAS Applications
Thomas Hall
Sales Engineer
Innovative Sensor Technology

Predicting Pedestrian Behavior: Potential ADAS Use Cases and Applications
Assam Alzookery
CEO
Intvo

LIDAR Optical Components: Recent Trends and Developments
Susan Wells
Sales Manager, IR Components, APDs, and PLDs
Laser Components

Sensor Fusion with V2X and 5G Connectivity: Emerging ADAS Applications
Radovan Miucic, PhD
Director, Connectivity Systems Engineering
Lear

Sensing Modalities, Perception, and Fusion Technology Trends for ADAS Applications
Pierre Olivier
CTO
LeddarTech

Luneburg Lens Radar Technology for ADAS Applications
Hao Xin
CTO
Lunewave

Impact of 5G on the Connected and Automated Vehicle Landscape
Gary Streelman
Director Advanced Engineering and New Concepts
Marelli

Sensors and System Interface Specifications for ADAS Applications
Peter Lefkin
Managing Director
MIPI Alliance

Practical Advice and Considerations for Faster Radar Testing
Arturo Vargas Mercado
ADAS and C-V2X Marketing Manager
NI

Light Sources and Sensing: Key Aspects for Successful Integration in ADAS Applications
Fotis Konstantinidis
Vice President
O-Net Communications

Automotive-Grade, Long-Range LIDAR
Mark Donovan
Vice President, Advanced Optics
Opsys Tech

Improved Safety with 3D Thermal Ranging for ADAS Applications
Chuck Gershman
President and CEO
Owl Autonomous Imaging

Accurate Speed and Position Anywhere for ADAS Applications
Wesley Hulshof
Principal Engineer
RaceLogic

Making Cars Safer with Non-Visual Sensors
Stuart Feffer
CEO
Reality AI

Specialty Thermoplastic Solutions for ADAS Sensors
Jeff Xu, PhD
Business Manager
SABIC

Getters for Hermetic Packaging of ADAS Sensors
Marco Moraja
Business Manager
SAES Group

AI Training Data Challenges for ADAS and AV Applications
Sveta Kostinsky
Director, Sales Engineering
Sama

Importance of mmWave Absorbers for Radar Applications
Hiro Masumoto
Sr. Manager, Business Development
Sekisui

Driving the Evolution of ADAS and Automotive Sensors with Silicone
Chad Kobylanski
Western Regional Manager
Shin-Etsu Silicones

Micro-Mirrors and IMUs for Laser Beam Scanning and SPAD Receivers
Stuart Ferguson
Automotive MEMS and Sensors Marketing Manager
STMicroelectronics

Market Opportunities for Sensors in ADAS Applications
Mark Fitzgerald
Associate Director
Strategy Analytics

Seeing Beyond the Visible: SWIR Imaging for ADAS Applications
Avi Bakal
CEO
TriEye

Digital Code Modulation Radar for ADAS Applications
Manju Hegde, PhD
CEO and Co-Founder
Uhnder

Radars and AI for Occupancy Monitoring in Cars
George Shaker, PhD
Associate Professor
University of Waterloo

Robust Implementation of LIDAR for Pedestrian Automatic Emergency Braking
Rajeev Thakur
Director Automotive Programs
Velodyne Lidar

Managing ADAS Scalability from Supervised to Unsupervised Systems
Tom Toma
Senior Global Product Manager, Vision Systems
Veoneer

Improving Sensor Performance with Advancements in Optical Bandpass Technology
Scott Rowlands
Program Manager, Automotive Programs
VIAVI Solutions

Leveraging Connectivity for ADAS: Practical Examples for Increased Traffic Safety
Jonas Fenn
Strategic Business Development, R&D
Volvo

Time-Stretch LIDAR as a Spectrally Scanned Time-of-Flight Ranging Camera
Cathy Jiang, PhD
Hardware Engineer
Waymo

LIDAR Performance Benchmarks for ADAS and AV Applications
Dmitry Solomentsev, PhD
Head of LIDAR
Yandex

The Increasing Need for Data Fusion in ADAS Enabled Vehicles
Pierrick Boulay
Lighting Expert
Yole Développement

Maintaining RADAR Performance for In-Body Integration
Max Himmel
Sensor Modelling and Simulation Engineer
ZF

Driving Robustness and Efficiency in ADAS Validation
David Wilson
Director, Data and Analytics Monetization
ZF

Venture Capital Panel Discussion: ADAS Investment and M&A Trends

Don Hendriks
Managing Director and Co-Founder
Autotech Ventures

Zhe Huang
Investment Manager, Corporate Ventures
Denso Ventures

Shahin Farschi
Partner
Lux Capital

Josh Berg
Director, Innovation Ventures
Magna Ventures

Sasha Ostojic
Operating Partner
Playground Global

Many thanks to our speakers from ADAS Sensors 2019

ADAS Sensors: Emerging Trends and Market Outlook
Phil Amsrud
Senior Principal Analyst
IHS Markit

Early ADAS applications became effective by using image, ultrasonic, and radar sensors, and the evolution from L1 to L2 saw an increase in the use of these sensors. The evolution from L3 through L5 will see not only an increase in the number of sensors, but also increases in their capabilities, as well as greater use of LIDAR and perhaps other sensor technologies like NIR and FIR. Although self-driving cars are getting a lot of attention today, most of the cars in production will continue to be L1 and L2 for the foreseeable future. However, even in limited production, self-driving cars will usher in a new segment to support ride-hailing applications and will change the vehicle’s architecture from being ECU based to being domain controller based. How will the sensor suite morph to support these changes? Will it be more of the same, or will LIDAR be the one technology to enable everything? Are there other sensor technologies that will have a place within the self-driving car? Will these technologies be in addition to, or will they displace, existing technologies? And finally, what’s the impact of the upcoming ride-hailing model? We’ll discuss the answers to these questions and show how these answers are reflected in our market outlook.

Biography: Phil Amsrud is a senior principal analyst in IHS Markit's automotive semiconductor research area, with a special focus on advanced driver-assistance systems (ADAS) and autonomous driving technologies. Phil began his career in automotive electronics as a design engineer at GM working on ABS systems. From there he joined Motorola's Semiconductor Products Sector supporting Delphi Electronics; that sector was later spun off from Motorola as Freescale Semiconductor, where he managed the field sales and applications engineers supporting Continental Automotive. After obtaining a Master’s degree in business, Phil joined ON Semiconductor and was responsible for the field sales and applications team in the Americas region supporting Continental Automotive. Phil also served as new business development manager for the Americas at BAE Systems/Fairchild Imaging prior to joining IHS Markit. Phil holds a Bachelor of Science in Electrical Engineering and a Master of Science in Business from the University of Wisconsin-Madison, US.


ADAS Sensors Market: M&A and Venture Capital Outlook
Rudy Burger, PhD
Managing Partner
Woodside Capital

With advances across all sensor modalities for ADAS and autonomous vehicles, which sensors are going to dominate, and which ones will become niche market players? The market is segmenting into near-term opportunities involving “robotaxis” (General Motors’ Cruise is planning on launching a robotaxi service in San Francisco this year) and longer-term opportunities involving mass market vehicles targeting consumers. This talk will provide perspective on where investments are being made and, more specifically, which new ADAS and autonomous vehicle sensor companies have been attracting the most venture capital funding and which sensor categories have generated the most M&A activity over the past few years. In addition to the leading automotive OEMs in the US, Europe, and Japan, the three largest technology firms in China (Baidu, Alibaba, and Tencent – collectively known as the BAT) and a number of very well-funded startups are getting into the self-driving car business. This talk will also review the impact these companies may have on the global market.

Biography: For over 25 years, Dr. Rudy Burger has worked with computer vision, digital imaging, and embedded camera technologies as a founder, operating executive, and advisor. He has developed both deep technical expertise and an awareness of market opportunity dynamics in these sectors, which he leverages to guide his clients toward strategic successes. He currently focuses on the advanced driver assistance systems (ADAS) sector. Rudy is the Managing Partner of Woodside Capital Partners and works with growth-stage technology companies to execute local and cross-border M&A transactions and private placements. His professional experience also includes executive roles with NEC, Visioneer, and Xerox, as well as founding the MIT Media Lab Europe. He currently serves on the board of Seeing Machines plc (AIM: SEE). Rudy holds degrees from Yale (BSc and MSc in EE) and Cambridge (PhD in Digital Imaging).


Automotive Computer Vision Startups: Key Investment Factors
Tony Cannestra
Director of Corporate Ventures
DENSO

Over the past three years, there have been significant advances in computer vision technologies. Major technical advancements in camera, radar, and LIDAR technologies have provided the transportation industry with unique potential combinations of sensors to deliver complete computer vision around the vehicle for ADAS and autonomous driving applications. Although much has been achieved already by startups working in the computer vision space, there will continue to be impactful changes over the next five years as the auto industry begins deploying these technologies. We will provide our perspective on the important criteria that our venture investment group examines when considering an investment in a vision sensor startup company. Topics will include expected sensor performance, the overall market opportunity, the competitive landscape, and how to work within the auto industry to achieve success. We will also provide remarks on what successful startups usually do well, as well as areas where many startups can improve.

Biography: Tony Cannestra is the Director of Corporate Ventures for DENSO International America (DIAM). DIAM is the U.S. subsidiary of DENSO Corporation, one of the world’s largest auto parts and systems suppliers. Tony joined DENSO in early 2014, and he leads DENSO’s Corporate Venture Capital Group, which is primarily responsible for identifying, and investing in, startup companies working on new technologies that are of strategic interest to DENSO’s future product roadmap. During his tenure at DENSO, Tony has made 12 investments and led a team that made an acquisition of a startup company. DENSO’s CVC Group focuses on investments in the areas of autonomous driving, connected vehicles, cybersecurity, electrification, and new mobility. He currently serves as a Board Director for ThinCI, Canatu, DellFer, and MetaWave, all of which are DENSO portfolio companies. Tony also continues to assist with DENSO’s startup M&A efforts. Prior to joining DENSO, Tony worked as an independent consultant helping clients with startup technology discovery in Silicon Valley. Prior to his work as an independent consultant, Tony was Executive Vice President of Ignite Group, a venture capital company based in Silicon Valley. Tony received a BA in International Economics from the University of California at Berkeley. He also earned an MBA, with a Certificate in Management of Technology, from the University of California at Berkeley.


The Road to Mobile Robots: Emerging Opportunities and Megatrends
Jim DiSanto
Managing Director
Motus Ventures

In the past five years, many of us in the venture capital community have witnessed an explosion of startup companies developing mobile robots for a myriad of applications. Typically, these robots are designed to displace a human worker. Examples include self-driving vehicles, drones, and cleaning, agricultural, delivery, and food preparation robots. We see this as only the tip of the iceberg: roughly 3.2 billion jobs consisting mostly of manual labor could be replaced by robots in the coming years and decades. This talk will focus on the following topics and questions: (i) when we can realistically expect to see these robots in operation, (ii) how much AI these robots need, (iii) how to achieve adequate levels of perception and cognition, and (iv) what the “low hanging fruit” use cases and applications are. We will also discuss some of the leading startups, as well as leading academic R&D groups that will soon bring their technologies to market.

Biography: Jim DiSanto is the Co-Founder and Managing Director of Motus Ventures, an early-stage venture capital firm funding businesses focusing on AI, robotics, and IoT. Jim serves, or has served, as a Board Member, Observer, and Advisor to companies including Waze, Autotalks, Quanergy, Doorman, Metawave, Argus Cyber Security, and others. Prior to and during his tenure at Motus, Jim has formed extensive executive- and technical-level relationships with auto OEMs, Tier-1 suppliers, logistics firms, global industrials, fleet management providers, telecommunications operators and suppliers, as well as university and government research organizations. Additionally, Jim built collaborative partnerships with research groups within the UC system (UCSD, UC Berkeley, UC Riverside), Stanford, and the University of Michigan, and with private research institutions (e.g. PARC), resulting in the commercialization of technology and the founding of new businesses. Jim earned his Bachelor’s degrees in aerospace and computer engineering at the University of Michigan, and an MBA at Stanford University. Jim resides in Portola Valley and has two daughters: one at Stanford and one at Michigan. He spends some of his spare time snowboarding, sailing, golfing, and tending his vineyard.


Localization for the Next Generation of Autonomous Vehicles
Joel Gibson
Executive Vice President of Automotive
Swift Navigation

This talk will provide a brief background on GNSS technology and explain why smartphone GPS does not deliver the accuracy required for autonomous vehicles (e.g., finding a restaurant is not the same as knowing one’s precise location down to a centimeter in a lane on the road). We will also discuss the role of GNSS and inertial sensors in the two primary tasks of autonomous vehicles: obstacle avoidance (to not hit other objects) and localization (to precisely navigate). In early applications, these tasks were primarily achieved using cameras and LIDAR. These optical sensors work well for the first task of obstacle avoidance but have significant limitations for precision localization. Visual navigation systems also suffer from a number of “corner” cases. Additionally, they can have problems navigating in areas where there is ambiguity, such as when road markings are not present or are unclear. They can also have challenges in certain inclement weather conditions, and they have limits in terms of robustness and safety, as was seen in at least one recent fatality where a visual navigation system incorrectly detected a lane boundary. By better understanding the “corner” cases where optical sensors in autonomous vehicles can fail, it becomes clearer how important it is to integrate high-precision GNSS into the sensor suite. We as an industry find ourselves at a technology inflection point where all the enablers for high-precision localization are available. It is now possible to deliver centimeter-level accuracy through GNSS solutions and cloud correction platforms at an affordable, fleet-friendly price ready for adoption by automakers.
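
To make the fusion idea concrete, here is a textbook-style 1-D sketch (not Swift Navigation's algorithm) of a complementary filter: drift-free but noisy GNSS fixes anchor smooth but drifting inertial dead reckoning. Production systems use full Kalman or factor-graph estimators; all names below are illustrative.

```python
def fuse_step(prev_pos, prev_vel, imu_accel, gnss_pos, dt, alpha=0.98):
    """One 1-D complementary-filter step: IMU dead reckoning is smooth
    but drifts; GNSS is noisy but drift-free. Blend the two.
    (Illustrative only -- real systems use Kalman or factor-graph filters.)"""
    vel = prev_vel + imu_accel * dt                   # integrate acceleration
    pos_pred = prev_pos + vel * dt                    # dead-reckoned position
    pos = alpha * pos_pred + (1 - alpha) * gnss_pos   # pull toward the GNSS fix
    return pos, vel

# Example: a biased IMU alone would drift quadratically; the GNSS term
# continually pulls the estimate back toward the fix, so error grows
# far more slowly than with dead reckoning alone.
pos, vel = 0.0, 0.0
for _ in range(100):
    pos, vel = fuse_step(pos, vel, imu_accel=0.05, gnss_pos=0.0, dt=0.1)
```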

Biography: Joel Gibson is Executive Vice President of Automotive at Swift Navigation and heads Swift’s Detroit, Michigan office. Joel is responsible for automotive strategy, enabling automated driving, precision navigation, and ADAS and safety applications. Previously, he served a long tenure with Magna Electronics, a company that provides innovative electronic systems to the automotive industry. Over the course of 15 years with Magna, Joel fulfilled many roles in both Germany and Michigan and launched a camera product line that grew to make Magna the largest automotive camera Tier-1 supplier worldwide. He holds a BS in Systems Engineering from Oakland University.


Robo-Taxis and the Race for the Future of Networked Mobility
Itay Michaeli
Director, U.S. Autos and Auto Parts
Citi Research

How close are we to ditching our personal cars? While we may not be up to personal flying taxis yet, it does seem that reality may finally be catching up with the hype. A handful of companies are pursuing various robo-taxi services (where the car is totally in control and humans are just passengers) to build urban rideshare networks in the coming one to three years. These are being planned for cities and surrounding suburbs, and the race to launch and commercialize these robo-taxis is all about building a powerful network effect. This network effect is determined by who can introduce and scale safe, reliable, fast, and low-cost urban robo-taxi fleets. We estimate the U.S. robo-taxi addressable market alone could exceed $350 billion, with high margins for the network leaders, yielding a nearly $1 trillion enterprise value. We also see the market for Tier-1 suppliers in advanced driver-assistance systems (ADAS) and autonomous vehicles rising to more than $100 billion by 2030 from $5-6 billion today. This talk will explore these topics and share key insights from our most recent research.

Biography: Itay Michaeli is a Director at Citi covering the U.S. Autos & Auto Parts sector, having joined the firm in 2001. Itay also heads the Global Auto sector for Citi Research. In recent years Itay has ranked Runner-up in the Institutional Investor All-America Research Survey. In 2012, Business Insider named Itay one of “The 36 Best Analysts on Wall Street”. In 2016 Itay was ranked #1 for stock-picking accuracy in the Thomson Reuters Analyst Awards, after also achieving a #1 ranking in the 2014 StarMine awards.


ADAS Sensors for Military Vehicle Applications
Robert Sadowski, PhD
Chief Roboticist
U.S. Army CCDC Ground Vehicle Systems Center

ADAS sensors and components are critical for future military manned, optionally manned, and unmanned vehicles. Tactical truck convoys were under constant threat from improvised explosive devices during recent overseas operations. This led the U.S. Army to develop optionally manned “Leader Follower” technology, initially as an add-on feature. The Leader Follower system allows a leader vehicle to control up to three unmanned follower vehicles at tactically relevant distances and speeds. Most Army tactical wheeled vehicles are legacy platforms that lack many modern automotive features, from anti-lock brakes to cruise control, leading to some unique integration challenges. Active safety and driving aids included within the upgraded package are heavily reliant on commercial ADAS sensors and controllers. Later this year, the U.S. Army is issuing sixty optionally manned, heavy tactical logistics platforms to soldiers for operational evaluation. This talk explores the fusion of camera, radar, LIDAR, GPS, and INS data, along with V2V integration, in the Leader Follower system, and how those components enable limited semi-autonomy. We’ll also discuss the unique challenges of operating without a priori knowledge or infrastructure to support autonomous behavior.

Biography: Dr. Robert W. Sadowski is a member of the Scientific and Professional (ST) cadre of the Senior Executive Service and serves as the United States Army’s Senior Scientist for Robotics within the Research, Technology, and Integration Directorate at the United States Army CCDC Ground Vehicle Systems Center in Warren, Michigan. Dr. Sadowski previously served 29 years of active United States Army duty, culminating as the Electrical Engineering Program Director and Academy Professor at the United States Military Academy in the Department of Electrical Engineering and Computer Science, where he continues serving as adjunct faculty. Dr. Sadowski has over 40 months of operational experience in Southwest Asia in a variety of leadership, staff, and engineering positions, including Iraq and Afghanistan. He currently chairs RDECOM’s Community of Practice in Robotics and is guiding the development of robotics and autonomous systems technology as part of the third offset strategy. Dr. Sadowski is a graduate of the United States Military Academy with a Bachelor of Science in Electrical Engineering (BSEE) and received his MS and PhD in electrical engineering from Stanford University as a Fannie and John Hertz Fellow. He also holds a Master’s degree in Strategic Studies from the United States Army War College.


Analog AI Technologies for ADAS Applications
David Schie
CEO
AIStorm

Artificial intelligence for ADAS applications is a hot topic, with numerous sensor technologies vying to provide improved performance, lower power, and minimum latency. On the imager side, SPAD, SiPM, and CIS devices are improving our ability to widen the field of view, perceive depth, and react quickly to external factors. Unfortunately, the quantity of information and the nature of the data, including extremely short duration pulses, can be challenging for existing solutions. These current solutions generally follow the input sensor with a mux/TIA combination, multiple high-performance ADCs, and a DSP subsystem. Alternatives to such systems are emerging -- these are analog solutions, which can accept data directly from the sensor and perform the required analysis completely in the analog domain. The result is reduced cost, power, and latency, along with a reduced risk of missing or misreading pulses during digitization. This talk will include an overview of emerging imaging solutions, including various approaches to LIDAR and details of CIS/optics, as well as how analog AI can be used to improve the processing of the imaging solutions by reducing the chance of missing key data, pruning data in real time, and eliminating expensive components such as ADCs. The talk will also highlight several leading companies focused on analog AI technologies, as well as several academic R&D groups.

Biography: David Schie is the CEO of AIStorm, a Silicon Valley based analog AI startup which recently emerged from stealth. AIStorm is focused on bringing the advantages of its AI-in-sensor technology to the market, providing standard AI architectures without the requirement for digitizing vast amounts of input data. AIStorm provides fully integrated solutions including sensors and AFE (e.g. SPAD, SiPM, CIS, mirror driver, mirror, biasing, etc.) which are provided as semi-custom solutions for its customers. Key investors in AIStorm include Egis Technology, Tower Semiconductor, Meyer Corporation, and Linear Dimensions Semiconductor. David was educated at the University of Toronto in analog IC design and has held several high-level management positions up to Senior VP, with responsibilities including product definition, engineering, and P&L, at leading companies including Maxim, Micrel, and Semtech.


V2X Technologies: Complementary Enhancement for ADAS Applications
Gary Streelman
Director Advanced Engineering and New Concepts
Magneti Marelli

Vehicles today are safer than they were in the past due in part to the addition of advanced driver-assistance systems, or ADAS. These systems “look” around the vehicle to determine where there might be something that would cause an unsafe situation, and then are able to alert the driver. Another approach to improving safety is to have vehicles communicate wirelessly with each other and with other objects, which is called vehicle-to-vehicle (V2V) or vehicle-to-everything (V2X) communication. In a way, V2X can act as another sensor input that adds to the current suite of sensors used for ADAS applications. Some in the industry even argue, perhaps a bit extremely, that if all vehicles were equipped with V2X, the need for other sensors could be substantially reduced. Consider, for example, a vehicle following a truck that blocks the view of the road ahead. If the vehicle in front of the truck starts to brake, the driver behind the truck has no way of knowing this from an ADAS system alone, since its sensors cannot see around the truck. If the vehicles were talking to each other using V2V, the braking vehicle in front of the truck would use V2V communications to let the vehicle behind the truck know that it may need to be ready to brake. In this session, we will discuss how V2X technology works and why this technology is a game changer in improving safety. V2X will give vehicles the information to understand when driving conditions are potentially becoming unsafe and then enable corrective action to be implemented that will mitigate these unsafe situations.
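
As a rough illustration of the “V2X as another sensor input” idea, the sketch below uses a hypothetical, heavily simplified subset of the fields in an SAE J2735 Basic Safety Message; it is not an implementation of the standard, and all names are made up for clarity.

```python
from dataclasses import dataclass

@dataclass
class BasicSafetyMessage:
    """Illustrative, simplified subset of SAE J2735 BSM-style fields."""
    sender_id: str
    lat_deg: float
    lon_deg: float
    speed_mps: float
    hard_braking: bool   # event flag set by the transmitting vehicle

def should_warn(msg: BasicSafetyMessage, dist_ahead_m: float,
                horizon_m: float = 300.0) -> bool:
    """Raise a forward warning when a vehicle ahead on our path reports
    hard braking within the alert horizon -- even if a truck hides it
    from our own camera and radar."""
    return msg.hard_braking and 0.0 < dist_ahead_m < horizon_m

# The vehicle two cars ahead, invisible behind the truck, brakes hard:
bsm = BasicSafetyMessage("veh-17", 42.33, -83.05, speed_mps=3.0, hard_braking=True)
assert should_warn(bsm, dist_ahead_m=120.0)
```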

Biography: Gary Streelman is Director Advanced Engineering and New Concepts for Magneti Marelli Electronics Systems. In this role he works with global teams to develop infotainment systems and vehicle connectivity programs, including connected services, media applications, and new technologies. He has developed wireless safety and mobility applications used in vehicle communications. Prior to his current role he served as Program Manager at Magneti Marelli Electronics Systems. Gary received his BS and MS degrees in engineering from the University of Michigan.


ADAS Sensors and Functional Safety: From Concept to Post-Production
Tom Tasky
Director, Smart Vehicle Team
FEV North America

Realizing functional safety in ADAS/AD systems requires that the system mitigate any action that could cause harm or damage. The ever-evolving complexity of such systems depends on the development and integration of cutting-edge ADAS/AD sensors, as well as on the challenge of ensuring that these sensors remain safe, secure, and functional. Never have functional safety and cybersecurity been more important than with the sensors used in today’s ADAS platforms and the rapidly approaching roll-out of autonomous vehicles. Several of the world’s major automakers have pegged 2021 as the launch year for new models with fully autonomous capabilities, and ensuring that these vehicles operate without fault will be vital. Functional safety includes addressing emerging challenges stemming from new sensor technologies, sensor fusion, advanced mobility trends, increasing connectivity, and new cyber threats. The optimized designs required by these systems and their sensors must be complemented by supervisory safety and threat protection measures in order to yield the desired results for autonomous driving. For example, machine learning techniques are used to enhance the detection performance of ADAS sensors such as radar, LIDAR, and cameras. However, these techniques have the potential to introduce systematic faults during development through which safety is compromised. Additionally, the verification plans for ADAS sensor technologies include testing corner cases, where a malfunction could present harm to the vehicle or to other traffic participants. This talk will cover some of the challenges of achieving functional safety in the development of ADAS systems and autonomous vehicles.

Biography: Tom Tasky is the Director of the Smart Vehicle Team at FEV North America. His responsibilities include ADAS/autonomous development, vehicle connectivity, functional safety, and cybersecurity for a wide range of industries and applications. Tom has over 28 years of industry experience spanning powertrain systems development, vehicle diagnostics and networking, and electrical systems development and integration for prototype and series production programs for gasoline, battery electric, hybrid, and automated vehicles.


Technologies for Autonomous Cars: Musings on Missing Pieces and Proposed Solutions
Harvey Weinberg
Division Technologist, Automotive Business Unit
Analog Devices

Autonomous cars have been promised as being “right around the corner” for several years. But several significant roadblocks still stand in the way, including sparse legislation governing autonomous vehicles, inadequate infrastructure to support autonomous driving, a lack of standards, and several technical gaps that keep Level 4 and 5 vehicles out of reach. This talk will touch on several of these gaps, with particular attention to outstanding technical challenges. In particular, we will focus on two topics: (1) how to achieve cm-level localization in largely all situations, and (2) how to get reliable perception (i.e. camera, LIDAR, radar data) in poor visibility environments, with some proposed solutions for each topic. We will also briefly discuss a few startups which have recently come onto the marketplace with potential solutions to these technical challenges.

Biography: Harvey Weinberg is the Division Technologist for the Automotive Business Unit at Analog Devices. Over the past few years, he has been working on long-time-horizon technology identification as it pertains to automotive; recently, this has been principally LIDAR. Prior roles at ADI include System Application Engineering Manager for the Automotive BU and, before that, leader of the Applications Engineering group for MEMS inertial sensors. He has 10 US patents in technologies varying from ultrasonic airflow measurement, to inertial sensor applications, to LIDAR systems. Harvey has been at ADI for 22 years. Before ADI, he worked for 12 years as a circuit and systems designer specializing in process control instrumentation. He holds a Bachelor of Electrical Engineering degree from Concordia University in Montreal, Canada.


(2019 Technology Showcase speakers – listed alphabetically, by company name)

Dan Dempsey
Sr. Director of Business Development
ACEINNA

Farooq Ibrahim, PhD
CTO
CalmCar Vision Systems

Thomas Carmody
Business Development Manager
Cambridge Consultants

Angus Mackay
Director, Marketing and Communications
Immervision

Steve Lemke
Principal Engineer, Advanced Automotive Platforms
LG Electronics

Dhiraj Bora
Executive
Silitronics Solutions

Eduard Sterzer, PhD
Strategy and Sales
Tau Industrial Robotics


(2019 Startup Showcase speakers – listed alphabetically, by company name)

Yonatan Dishon
Computer Vision and Machine Learning Group Manager
AdaSky

Florian Petit
Co-Founder
Blickfeld

Bo Zhu, PhD
CTO and Co-Founder
BlinkAI

Rony Ohayon, PhD
CTO, GM of Automotive, and Co-Founder
LiveU

Jason Corso
CEO
Voxel51


Many thanks to our speakers from ADAS Sensors 2018

Compute requirements for emerging automotive architectures
Soshun Arai
Director of ADAS/Automated Driving Platform Strategy
Arm

Many automotive OEMs and start-ups are developing autonomous vehicles for 2020, but their prototype systems use several thousand watts of power, a level of consumption that is not acceptable for a deployable system. However, new sensors and compute technologies are rapidly transforming the capabilities of our cars, and the fusing of data from technologies such as LIDAR, radar, and cameras will drive autonomy. Autonomous capabilities will soon radically change the perception of our vehicles, both in terms of purpose and usage. In this session, we will examine these new capabilities and their applications, as well as how hypervisor technologies work to enable the consolidation of automotive ECUs by combining different functions into the same processor. We will also discuss emerging trends in the capabilities of next-generation processors and compute architectures, such as security features and functional safety, and examine how the power consumption required for autonomous driving can be reduced. This presentation will also provide a brief overview of emerging automotive architectures and the key companies in the platform ecosystem.

Biography: Soshun Arai is a Director of ADAS/Automated Driving Systems and Strategy at ARM. Mr. Arai has been working at ARM for more than 8 years and has international work experience in Japan, the UK, and the US. He works directly with car OEMs and Tier-1 suppliers to propose automotive systems in terms of processor technologies, and he develops the ARM ecosystem for the automotive segment. Earlier in his career, Mr. Arai worked at Infineon as a Senior Application Engineer for automotive, and he has more than 15 years of experience with automotive applications such as body, EV, chassis, IVI, and ADAS. Mr. Arai holds a Master’s degree in Electrical Engineering from Tokyo University of Science.


Packaging and assembly challenges for ADAS applications
Linda Bal
Senior Analyst
TechSearch International

Package choice, design, and materials impact the performance of ADAS sensors and the sensor fusion processors used to analyze sensor input. This presentation will discuss the types of semiconductor packages used for radar, camera, LIDAR, and sensor fusion functions in ADAS. Camera sensor packages include BGAs and WLPs. Future options include stacked die formats. The differences in design compared to similar types of packages found in other applications will be discussed. Radar module packages have migrated from QFNs and QFPs to fan-out wafer-level packages (FO-WLPs) and FC-CSPs. Radar modules from companies such as Infineon, NXP, and Calterah use FO-WLP, but package designs differ from those used in smartphones. TI and others use flip chip. Many sensor fusion processors are packaged in FC-PBGAs, but the materials used to fabricate the substrate and the underfill materials differ from packages used for other applications. Power dissipation requires thermal materials and solutions that can also meet automotive reliability specifications. Advantages of each package type will be discussed, and packaging options and trends will be presented. Packaging challenges in meeting automotive specifications will also be discussed.

Biography: Linda Bal is a senior analyst with TechSearch International, Inc., which has provided market research and technology trend analysis in semiconductor packaging since 1987. Linda has more than 25 years of experience in the design, test, and manufacturing of electronic packaging for semiconductors and systems while at Freescale, Motorola, Microelectronics and Computer Technology Corporation (MCC), and Eastman Kodak. Linda has authored and co-authored numerous publications and has experience in packaging for ADAS sensors. She is a co-author of “New Frontiers in Automotive Electronics Packaging,” a major research study published in 2018 by TechSearch International. Linda is a member of IEEE and IMAPS. She has been a member of the IMAPS technical committee since 2007, chaired the FC/WLP Track for the DPC in 2013 and 2014, co-chaired the FC track in 2009 and 2010, and has been a moderator for technical panels. Linda received her BSEE degree from Purdue University in 1985.


Emerging role of sensors in the transition towards autonomous vehicles
Frederic Bruneteau
Managing Director
Ptolemus Consulting Group

Leveraging our market research, we will discuss in this presentation the different technological deployment scenarios available to OEMs to build ADAS-enabled and automated vehicles. We will evaluate the technical and economic trade-offs to be made between radars, LIDARs, cameras, UWB sensors, high-definition maps, high-accuracy GNSS, ultrasonic sensors, and other types of technologies. The session will also highlight how redundancy can be obtained by combining multiple layers of independent sensors in different road environments and weather conditions. In addition, we will discuss how high-definition maps contribute to determining the position of the vehicle. Finally, the most likely deployment scenarios for the different levels of automation will be covered, as well as their impact on the sensor fusion process and the computational requirements of the system. The session will also present forecasts of the expected vehicle volumes for the different automation levels and regions of the world. We will also provide an ecosystem map of the key players, as well as an overview of the most interesting and notable ADAS sensor startups.

Biography: Frederic Bruneteau has accumulated over 20 years of experience in the mobility domain. He has become an authority in the domain of connected and autonomous mobility and is often interviewed on the subject by the Wall Street Journal, the Financial Times, and The Economist. As the firm's founder, he has helped many clients define and execute their strategy, including Allianz, HERE, Kapsch, Liberty Mutual, Michelin, Qualcomm, Scania, Toyota, Vodafone, and WEX. He has led over 80 connected and autonomous mobility assignments. For example, he helped CNES, the French space agency, evaluate potential opportunities created by connected and autonomous vehicles for space technologies and markets. He also assisted a consortium of OEMs and map makers in identifying the market potential for HD maps ready for automated vehicles. Mr. Bruneteau co-authored the recent Autonomous Vehicle Global Study. He is also the President of The Autonomous Club (TAC).


Automated and connected vehicle technology: trends, promises, and perils
Eric Dennis
Systems and Policy Analyst
Center for Automotive Research (CAR)

Connected and automated vehicles continue to attract attention, and despite all the hype, there is real progress being made. This presentation will provide a broad overview of trends in the deployment of automated and connected vehicle systems and a discussion of future directions. Popular narratives imagine a near-future transportation system centered around autonomous vehicles, providing cheap, reliable, on-demand mobility for all. While the rapid development of automated and connected vehicle technologies certainly has the potential to revolutionize road travel, the transportation network is a complex adaptive system influenced by legal, social, economic, and other factors. This presentation will also provide an overview of the policy and planning frameworks evolving in response to emerging automated and connected vehicle technologies. Chief among regulatory considerations is the emerging patchwork of regulatory approaches among the various states and the U.S. DOT, as well as the potential for overriding federal legislation.

Biography: Eric Paul Dennis is a Transportation Systems Analyst at the Center for Automotive Research (CAR) in Ann Arbor. He is a Michigan-licensed Professional Engineer with a background in economic development and policy analysis. Eric has been at CAR for over five years, researching and consulting for government and industry regarding the interaction of emerging transportation technologies and public policy. Eric has Master’s degrees from the University of Michigan in Urban and Regional Planning, as well as Environmental Engineering. He also has a Bachelor’s degree from Michigan State University in Civil Engineering.


AUTOSAR: an automotive software platform for intelligent mobility
John Gonzaga
Project Leader
General Motors

Automated driving creates completely new requirements for the software platforms of in-vehicle computers (IVCs), which are evolving from isolated on-board units into broadly connected systems with high-performance computing capabilities. AUTOSAR, as the worldwide leading standardization organization for in-vehicle software platforms, takes on this challenge and paves the way toward making the car an intelligent and adaptive vehicle. But how can a standard be flexible and fast enough to adapt to the continuously evolving needs of those applications and use cases? AUTOSAR has developed a completely new approach to cope with this challenging environment in order to make vehicles intelligent and adaptive. This new platform enables dynamic deployment of customer applications, provides an environment for applications that require high-end computing power, and connects deeply embedded and non-AUTOSAR systems seamlessly. This presentation will explain the challenges of, and approaches to, mastering the requirements for next-generation cars and how AUTOSAR achieves fast reactions to market needs. Furthermore, this presentation will outline the current status and the plans for upcoming AUTOSAR releases.
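
The “dynamic deployment” idea can be pictured with a generic service-discovery pattern: applications find services at runtime rather than being wired together at build time. The sketch below is a language-neutral illustration of that pattern only; it is not the AUTOSAR Adaptive ara::com API, and all names are hypothetical.

```python
class ServiceRegistry:
    """Generic runtime service discovery, the key idea behind adaptive
    platforms. (Illustrative only; not the AUTOSAR ara::com API.)"""
    def __init__(self):
        self._services = {}

    def offer(self, name, handler):
        self._services[name] = handler   # a service instance comes up

    def find(self, name):
        return self._services.get(name)  # a client binds at runtime

registry = ServiceRegistry()
registry.offer("ObjectFusion/FrontRadar", lambda frame: f"tracks for {frame}")

proxy = registry.find("ObjectFusion/FrontRadar")  # late binding, no static wiring
if proxy is not None:
    print(proxy("frame-042"))  # -> tracks for frame-042
```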

Biography: John Gonzaga is currently the Vehicle Communications Network Architect and the AUTOSAR Project Leader at General Motors. John has been with General Motors for more than 10 years and has been the AUTOSAR Project Leader for 3 years. John has a background in Computer Engineering and is experienced in embedded systems design and development of consumer and automotive products.


Hardware-in-the-loop implementation and validation techniques for autonomous vehicles
Adit Joshi
Research Engineer
Ford

The objectives of our research work are two-fold. The first objective focuses on a real-time powertrain-based hardware-in-the-loop (HIL) implementation and validation of an SAE Level 2 autonomous vehicle. The second objective focuses on studying the performance of SAE Level 2 autonomous vehicles during takeover scenarios due to subsystem faults. To accomplish these objectives, an acceleration-based adaptive cruise control (ACC) was combined with a path-following lateral control, along with supervisory control for system mode transitions due to system deactivations and faults. This research presents system modes in which longitudinal-only and lateral-only control are engaged as fallback states when the full autonomous system is faulted by lateral-control and longitudinal-control failures, respectively. Simulations were conducted to evaluate the performance of the autonomous controls when subjected to these faults. A powertrain subsystem representative of the 2017 Ford Fusion Hybrid was used as the hardware simulation platform, using a dSPACE HIL simulator and CarSim RT.
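
The supervisory mode logic described in the abstract can be pictured as a small state machine. The sketch below is illustrative only, with hypothetical state and fault names; it is not the authors' implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    FULL_AUTO = auto()          # ACC + lane keeping both active
    LONGITUDINAL_ONLY = auto()  # lateral subsystem faulted; ACC remains
    LATERAL_ONLY = auto()       # longitudinal subsystem faulted; steering remains
    MANUAL = auto()             # driver takeover required

def next_mode(mode: Mode, lateral_fault: bool, longitudinal_fault: bool) -> Mode:
    """Supervisory transition: degrade to the surviving subsystem on a
    single fault, and hand control back to the driver on a double fault."""
    if lateral_fault and longitudinal_fault:
        return Mode.MANUAL
    if mode is Mode.FULL_AUTO:
        if lateral_fault:
            return Mode.LONGITUDINAL_ONLY
        if longitudinal_fault:
            return Mode.LATERAL_ONLY
    elif mode is Mode.LONGITUDINAL_ONLY and longitudinal_fault:
        return Mode.MANUAL
    elif mode is Mode.LATERAL_ONLY and lateral_fault:
        return Mode.MANUAL
    return mode

assert next_mode(Mode.FULL_AUTO, lateral_fault=True,
                 longitudinal_fault=False) is Mode.LONGITUDINAL_ONLY
```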

Biography: Mr. Adit Joshi is a Research Engineer in Automated Driving HIL Simulation in the Autonomous Vehicles and Controls department at Ford Motor Company. His current responsibilities include supporting the HIL simulation of Ford’s Autonomous Vehicle Platform alongside supporting CAE simulation activities related to electrification and connectivity. Mr. Joshi’s previous experience includes a role as an Engineering Specialist working on HIL simulation and test automation at General Motors. Before joining the automotive industry, Mr. Joshi graduated with a Master of Science in Mechanical Engineering specializing in Vehicle Dynamics and Controls, and a Bachelor of Science in Electrical Engineering specializing in Controls, both from The Ohio State University. He is passionate about autonomy, electrification, and connectivity. He has also authored three SAE International technical papers on HIL simulation.


The pursuit of the perfect ADAS and autonomous sensor set
Peter Labaziewicz
Director of Engineering
Texas Instruments

One of the most critical functions a vehicle will have in the age of ADAS and autonomous driving is to accurately perceive and understand the world around it. This ability to accurately sense and analyze surroundings in various conditions will be critical to the safe and successful implementation of higher levels of driver assistance and, eventually, to practical fully autonomous driving. To accomplish this, the vehicle will need a set of sensors of different modalities (camera, radar, lidar, and others) and varying fields of view. The need to operate at high speeds also drives an increase in sensor resolution, to be able to sense objects at greater distances. Once the world around the vehicle is sensed, the vast amounts of sensor data need to be processed and fused into a single world view. The next step of making sense of all the data, extracting and classifying objects using both traditional machine vision and newer deep learning techniques, involves massive amounts of compute and data movement. This is driving massive investment and breakthroughs in new compute cores and data management, as well as new SoC and system architectures.

Biography: Peter Labaziewicz is Director of Engineering for imaging and vision for automotive processors at Texas Instruments. He has over 20 years of industry experience in consumer product and industrial applications of camera systems, video, and vision technologies, including 12 years of expat experience in Japan and China. Peter started his career at Eastman Kodak, working on the development of some of the first consumer digital cameras. Joining Texas Instruments in 2007, Peter worked on imaging, video, and vision SoC solutions for consumer and industrial markets. He is currently directing imaging and machine vision technology development at TI, focusing on automotive advanced driver assistance and autonomous driving.


Emerging LIDAR concepts and sensor technologies for autonomous vehicles
Jake Li
Marketing Specialist, LiDAR and SiPM
Hamamatsu Photonics

Sensor technologies such as radar, camera, and LIDAR have become the key enablers for achieving higher levels of autonomous control in vehicles, from fleet to commercial applications. There are, however, still questions remaining: to what extent will radar and camera technologies continue to improve, and which LIDAR concepts will be the most successful? This presentation will provide an overview of the tradeoffs of LIDAR versus competing sensor technologies (camera and radar); this discussion will reinforce the need for sensor fusion. We will also discuss the types of improvements that are necessary for each sensor technology. The presentation will summarize and compare various LIDAR designs -- mechanical, flash, MEMS-mirror based, optical phased array, and FMCW (frequency modulated continuous wave) -- and then discuss each LIDAR concept’s future outlook. Finally, there will be a quick review of guidelines for selecting photonic components such as photodetectors, light sources, and MEMS mirrors.
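
For background (standard textbook relations, not drawn from the talk): direct time-of-flight LIDAR infers range from the round-trip delay of a pulse, while FMCW infers it from the beat frequency between the transmitted and received chirps.

```latex
% Direct time-of-flight: round-trip delay \Delta t at light speed c
R = \frac{c\,\Delta t}{2}

% FMCW: a linear chirp of bandwidth B over duration T returns a beat
% frequency f_b proportional to range
f_b = \frac{2 B R}{c\,T}
\quad\Longrightarrow\quad
R = \frac{c\,T\,f_b}{2 B}
```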

Biography: Jake Q. Li is in charge of research and analysis of various market segments, with a concentration in the automotive LiDAR market. He is knowledgeable about various optical components such as photodetectors -- including the MPPC (a type of silicon photomultiplier, or SiPM), avalanche photodiodes, and PIN photodiodes -- and light emitters that are important parts of LIDAR system designs. He has an expert understanding of the upcoming solid-state technology needs of the autonomous vehicle market. Together with his experience and understanding of the specific requirements of LIDAR systems, he will guide you through the process of selecting the photodetectors and light sources that best fit your individual needs.


ADAS to autonomy: existing and emerging implementation challenges
Roger Smith
Senior Staff System Application Engineer
Infineon

As the automotive industry transitions from traditional ADAS systems to more advanced autonomous drive (AD) topologies, many system-level challenges are introduced. Many of these challenges aren’t discovered until late in the design cycle, leading to increased costs, timeline slips, last-second changes to the requirements, and improvised fixes. This talk will include an overview of the safety advantages of ADAS and AD systems, as well as the architectural differences between traditional ADAS and AD topologies. We will also provide an overview of the required sensor diversity (with an emphasis on LIDAR and radar) and highlight some of the challenges of implementing an AD system, including power supply requirements, power dissipation, EMC, cabling, and vehicle styling. We will also look at the increased demands on micro-controllers and fail-operational requirements. We hope these insights will help OEM and Tier-1 system suppliers identify some of these problems up front and bring these advanced systems to production in a more timely and cost-effective manner.

Biography: Roger Smith is a Senior Staff System Application Engineer supporting braking, EPS, passive safety, and ADAS applications for several Tier1 and OEM customers. He has over 25 years of industry experience with designing, implementing, and debugging automotive electronics. Roger earned a BSEE from Michigan State University and a MSEE from Wayne State University and has worked at Infineon for the last 10 years. In his spare time, Roger enjoys running and personal fitness.


ADAS technology trends: sensing, lighting, safety, personalization and HMI
Rajeev Thakur
Regional Marketing Manager
OSRAM Opto Semiconductors

While the media is in a frenzy over the latest developments around the autonomous car and vague visions of the future, the ADAS market is growing rapidly to meet consumers' current desires for active safety features on the exterior and interior of the car. It has also become painfully apparent that a poorly executed human-machine interface (HMI) can strongly hinder adoption of otherwise useful ADAS technologies. This presentation will attempt to tie together the need for a sensor fusion strategy that is seamless to the driver with a correspondingly effective HMI. For exterior sensing, the talk will discuss the challenges and opportunities faced by LIDAR and long-range infrared camera illumination techniques. For interior sensing, we will discuss how the infrared camera for driver monitoring now has market traction and detail how this technology is moving towards iris recognition. The talk will also discuss head-up displays (HUDs), which may play an increasingly important role in HMI, while biometric sensing and gesture recognition struggle to leap from R&D concepts to mass adoption.

Biography: Rajeev Thakur is currently the Regional Marketing Manager at OSRAM Opto Semiconductors, responsible for infrared product management and business development in the NAFTA automotive market. His current focus is on LIDAR, driver monitoring, night vision, blind spot detection, and other ADAS applications. Mr. Thakur joined OSRAM Opto Semiconductors in 2014. He has prior experience in the Detroit automotive industry dating to the 1990s, working for companies such as Bosch, Johnson Controls, and Chrysler. He has concept-to-launch experience in occupant sensing, seating, powertrain sensors, and infrared sensing technology. He holds a master's degree in Manufacturing Engineering from the University of Massachusetts, Amherst and a bachelor's degree in Mechanical Engineering from Guindy Engineering College in Chennai, India. He is a licensed professional engineer and holds a number of patents on occupant sensing. He is also a member of the SAE Active Safety Standards development committee.


Ensuring GNSS RF and MEMS signal coherence in location based systems
Jeff Warra
Senior Technical Marketing Engineer
Spirent

ADAS applications require a new level of location-based system (LBS) technology that uses fused sensor data from multiple sources to achieve reliable and safe autonomous control. Weak signal strength and multi-path conditions in large cities are among the several challenges with GNSS signals, and the several-meter accuracy of civilian GNSS is not adequate for critical autonomous vehicle functions such as lateral steering control. To address these challenges, multiple sensor sources are required. This presentation will address GNSS and IMU sensor data fusion, and how to use simulation to test how well the data is combined to produce a more precise vehicle position. Technologies used today for active steering control via image recognition are limited to near-sight apertures under driving conditions such as snow, fog, direct sunlight, heavy rain, dirt roads, and new roads. To control the cost of developing repeatable and reliable control, algorithm testing must start at the bench level using hardware-in-the-loop (HIL) simulation to test risky driving maneuvers, corner-case scenarios, and redundant data flows.
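
As a concrete illustration of the GNSS/IMU fusion described above, the sketch below implements a minimal 1-D Kalman filter in which high-rate IMU samples drive the prediction step and each several-meter-accurate GNSS fix bounds the accumulated drift. All names and noise values are illustrative assumptions, not Spirent's implementation.

```python
import numpy as np

# Minimal 1-D GNSS/IMU fusion sketch (illustrative values): high-rate IMU
# acceleration drives the prediction step; each low-rate, several-meter-
# accurate GNSS fix corrects the accumulated drift.
dt = 0.01                             # 100 Hz IMU
F = np.array([[1.0, dt], [0.0, 1.0]]) # transition for [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])   # how acceleration enters the state
H = np.array([[1.0, 0.0]])            # GNSS measures position only
Q = np.diag([1e-4, 1e-3])             # assumed IMU process noise
R = np.array([[9.0]])                 # (3 m)^2 GNSS measurement variance

x = np.zeros((2, 1))                  # state estimate [position; velocity]
P = np.eye(2)                         # state covariance

def predict(accel):
    """Propagate the state with one IMU acceleration sample."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(gnss_pos):
    """Correct the state with one GNSS position fix (standard Kalman update)."""
    global x, P
    y = np.array([[gnss_pos]]) - H @ x      # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# 100 IMU samples per 1 Hz GNSS fix: dead-reckon, then correct.
truth = 0.0                                 # stationary-vehicle demo
for step in range(1000):
    predict(accel=np.random.randn() * 0.05) # noisy IMU reading
    if step % 100 == 99:
        update(gnss_pos=truth + np.random.randn() * 3.0)
print(f"estimated position: {x[0, 0]:+.2f} m (truth 0.00 m)")
```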

Biography: Jeff Warra is a Senior Technical Marketing Engineer at Spirent where he works on developing the next advancements for the automotive, aerospace, and off-highway sectors. Jeff has over 18 years of experience in the engineering field at Tier1, OEM, and test equipment manufacturers. Jeff has become a specialist on advanced technologies and has held various positions in the industry such as development, test, applications, and project engineering. Being focused on safety critical systems early in his career has allowed him to build a solid foundation on electrical and software engineering principles and practices.


(2018 Technology Showcase speakers – listed alphabetically, by company name)

Santanu Basu
Senior Research Associate
Corning

Vivek Moudgal
Vice President of Sales
dSPACE

Nik Tryer
Key Account Manager
ePAK International

Ed Briggs
Regional Sales Manager
Indium

Stuart Ferguson
Automotive MEMS and Sensors Marketing
STMicroelectronics

Jim Hutter
Marketing Manager
Vector North America


(2018 Startup Showcase speakers – listed alphabetically, by company name)

Jeremy Agulnek
Vice President of Connected Car
HAAS Alert

Matthew Weed
Director, Technology Strategy
Luminar

John Xin
CEO
Lunewave

Kris De Meester
VP Sales and Business Development
XenomatiX

Vai Viswanathan
Co-Founder
Zenith Robotics


Many thanks to our speakers from ADAS Sensors 2017.

Sensor fusion: bridging the gap from ADAS to autonomous driving
Eric Balles, ScD
Managing Director, Transport and Energy
Draper

The functionality of ADAS sensors, many of which are based on MEMS components, is nothing short of astounding. In the past five years, vision systems with embedded object recognition and tracking have become commonplace and miniature radar modules are widely available with 3D object tracking capabilities for dozens of simultaneous targets. As both are deployed on more production vehicles, their costs continue to decrease and life-saving features are helping more consumers each year in Level 2 and Level 3 autonomous driving systems. It is only natural then to ask “Can these sensors meet the needs of Levels 4 and 5 autonomous systems?” In the past two years we have explored this question and discovered challenges and opportunities in adapting ADAS technologies to Level 4+ automated driving. At the same time, a new approach has arisen that breaks the current paradigm of using the on-board processing outputs of today’s ADAS sensors. This new approach instead favors pushing “raw” data from each sensor into a centralized high-powered processor where machine-learning algorithms perform all the required fusion tasks. We will review lessons learned from integrating ADAS vision and radar sensors into a prototype Level 4+ autonomous driving system which fuses the “object” tracking data provided by each ADAS sensor into a 360 degree map of cars and pedestrians during urban driving. We will compare this approach with the emerging trend of centralized high performance computing fusion and discuss the impacts both approaches have on future autonomous vehicle architectures.
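
As a toy illustration of the object-level fusion paradigm described above (as opposed to centralized raw-data fusion), the sketch below associates camera and radar object tracks by proximity into a single fused object list. The data structures and the 2 m association gate are invented for illustration, not Draper's design.

```python
import math
from dataclasses import dataclass

@dataclass
class Track:
    """One 'object' output by an ADAS sensor's on-board processing."""
    x: float        # meters, vehicle frame
    y: float
    kind: str       # e.g. 'car' or 'pedestrian'

def fuse(camera_tracks, radar_tracks, gate=2.0):
    """Object-level fusion: pair camera and radar tracks that fall within
    a distance gate; keep unmatched tracks from either sensor."""
    fused, used = [], set()
    for c in camera_tracks:
        best, best_d = None, gate
        for i, r in enumerate(radar_tracks):
            d = math.hypot(c.x - r.x, c.y - r.y)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is not None:
            r = radar_tracks[best]
            used.add(best)
            # Average positions, keep the camera's classification
            # (a common heuristic, assumed here for illustration).
            fused.append(Track((c.x + r.x) / 2, (c.y + r.y) / 2, c.kind))
        else:
            fused.append(c)
    fused.extend(r for i, r in enumerate(radar_tracks) if i not in used)
    return fused

print(fuse([Track(10.1, 0.2, 'pedestrian')], [Track(10.4, 0.0, 'car')]))
```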

Biography: Dr. Eric Balles joined Draper in 2011 to launch and build a commercial business focused on solving the most difficult issues facing the transportation and energy sectors. He has overall responsibility for these commercial areas at Draper including strategic planning, customer relationship, business development, project execution and financial performance. Dr. Balles’ career spans several industry sectors including automotive, electricity and oil & gas. He started his career in the automotive industry and has worked extensively in Europe with auto makers and tier one suppliers. Dr. Balles earned SB, SM and ScD degrees in mechanical engineering from the Massachusetts Institute of Technology.


Cooperative sensing for connected automation
Roger Berg
Vice President North America Research and Development
DENSO International America

Recent key initiatives and programs promoting the research and deployment of active safety features and automated transportation have been grabbing media headlines for the past few years. Most current high-visibility developments for ADAS/automated vehicles focus on using traditional on-board sensing along with advanced machine learning for perception and computation to execute the driving task. This presentation reflects on scenarios where such traditional on-board sensing (multi-spectral devices such as radars, cameras, LIDARs, and ultrasonic transducers, alone or in combination) does not currently always meet the needs of highly automated operation, particularly in denser urban or suburban environments, where cooperative sensing and response might provide additional benefits for the occupants of a highly automated vehicle. We discuss the possible advantages of adding various types of wireless connectivity to optimize the safety and efficiency of onboard and roadside cooperative ADAS/AV systems and their operation in the transportation domain.

Biography: Roger Berg is Vice President of DENSO’s North American Research and Development group. He is responsible for DENSO’s regional Research and Development of connected vehicles, automation and cyber security at their laboratories in both California and Michigan. He also coordinates closely with DENSO’s research and product groups in sister organizations located in the US, Japan, Germany and China. Mr. Berg earned a Bachelor of Science degree in Electrical Engineering from the University of Illinois in Urbana, Ill., and a Master of Science in Electrical Engineering from Illinois Institute of Technology in Chicago. He is the inventor or co-inventor on eight U.S. and international patents.


LIDAR augmentation techniques for affordable ADAS deployment
Raul Bravo
CEO
Dibotics

Autonomous driving and ADAS markets are fueling an unprecedented level of investment and energy focused on developing smaller, cheaper, and better LIDAR sensors, allowing for better perception and understanding of the environment around the car. However, the LIDAR hardware is only part of the equation; real-time 3D perception software for localization, segmentation, and obstacle detection and tracking solves the other part. This presentation will discuss how LIDAR-only data can be used to achieve a high level of reliability and application features for ADAS, de-correlated from other sources of information such as radar, and why a combination of software and hardware is mandatory to achieve both high-performance and low-cost perception capabilities in ADAS applications. We will present how a technique called "sensor augmentation" can help meet the objective of affordable and accurate 3D perception.

Biography: Raul Bravo is CEO and co-founder of Dibotics, creator of the Augmented LIDAR technology. Raul is a serial entrepreneur in mobile robotics and a start-up coach, with an extensive 15-year background in both bootstrapped and VC-backed start-up creation and growth. An engineer from UPC (Barcelona, Spain) with an MBA from Collège des Ingénieurs (Paris, France), he has filed 10 patents and received 27 different awards for his engineering and entrepreneurial career, among them the "MIT Technology Review – Top 10 Innovators Under 35".


Evolution of ADAS architectures, sensor fusion, and the future of the autonomous car
Jeremy Carlson
Principal Analyst and Manager
IHS Markit

The autonomous car figures prominently in the future vision of the automotive and mobility industries. Many OEMs have implemented intermediate milestones through advanced driver assistance (ADAS) functions on the path to a fully autonomous car, with significant growth across nearly all ADAS applications as the market marches from Level 0 into Level 2/3 with the latest vehicles coming to market. Challenges on that path are plentiful, from the relatively simple but cost-effective democratization of features to more complex technical challenges in sensor technologies such as lidar, cost-efficient and computationally scalable sensor fusion computing, and vehicle architectures to connect everything together. This presentation will illustrate how the building blocks of autonomous driving are evolving, with an emphasis on Levels 3 to 5, a focus on key additions such as lidar, sensor fusion ECUs, and in-vehicle architecture, and the changing landscape as autonomous technology shapes personal mobility.

Biography: Jeremy Carlson is a Principal Analyst and Manager with the automotive team at IHS Markit in the areas of autonomous driving, mobility and automotive technology. He has worked in automotive electronics market research and analysis with a focus on driver assistance, sensors, autonomous vehicles and mobility for 9 years in the analyst role. He now leads the Autonomous Driving practice for IHS Markit in addition to being a key contributor to emerging mobility topics. Mr. Carlson’s primary areas of focus include automated and autonomous driving and new mobility services following on years of experience in advanced driver assist systems, technologies and sensors. Complementary research includes technical topics, regulation and legislation, and the deployment of new technologies as they enter and are made more broadly available across the market. He has worked with a number of OEM, supplier and technology companies in supporting both syndicated and custom analysis to support critical decisions that shape the automotive and transportation business landscape.


Critical component for 77 GHz imaging automotive radar: metamaterial frequency-adaptive steerable technology (M-FAST)
Bernard Casse, PhD
Area Manager
PARC, a Xerox company

Today, there is no 77 GHz high-resolution 3D imaging radar: a crucial weatherproof sensor for autonomous driving, capable of sensing, discriminating, and tracking road objects. The state of the art in automotive radar is digital beam forming (DBF), which has overly complex hardware, is computationally intensive (slow), and cannot support high signal-to-noise ratio and high resolution simultaneously; DBF is currently reaching its performance limits. At PARC, we developed M-FAST (metamaterial frequency-adaptive steerable technology), a disruptive technology based on engineered metamaterials that is capable of true analog beamsteering while being free of the limitations of DBF. M-FAST is poised to displace DBF-based technologies and represents the next frontier for 77 GHz imaging radars. In this talk, we will provide an overview of this transformative technology and also discuss Metawave, PARC's VC-backed spinoff chartered to accelerate development of M-FAST for both automotive and 5G communications.
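
For quantitative context on beamsteering, the textbook uniform-linear-array relation below ties the steering angle to the phase progression across radiating elements; it describes phased arrays in general, not the proprietary M-FAST design.

```latex
% Textbook uniform-linear-array relation (general phased-array physics,
% not M-FAST specifics): a progressive phase shift \Delta\phi between
% elements spaced d apart steers the main beam to angle \theta_0 at
% wavelength \lambda.
\[
\Delta\phi = \frac{2\pi d}{\lambda}\sin\theta_0
\qquad\Longleftrightarrow\qquad
\theta_0 = \arcsin\!\left(\frac{\lambda\,\Delta\phi}{2\pi d}\right)
\]
% At 77 GHz, \lambda = c/f \approx 3.9\,\mathrm{mm}, so half-wavelength
% element spacing is d \approx 1.95\,\mathrm{mm}.
```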

Biography: Dr. Bernard Casse manages the Metamaterial Devices and Applications R&D area at PARC. He is responsible for making strategic investments in early-stage technology platforms, overseeing a portfolio of multi-year, multi-million-dollar projects, supporting applied R&D operations, defining the strategic agenda for emerging technologies, and leading a team of world-class performers. Prior to his role at PARC, he was a program manager at Physical Sciences Inc. (PSI), a defense contractor, where he led and managed U.S. Government-sponsored programs focused on developing disruptive technologies and advanced manufacturing. In earlier years, Bernard was a research scientist at the Electronic Materials Research Institute at Northeastern University and, in parallel, a qualified cleanroom user of the Harvard Center for Nanoscale Systems (CNS) and the Center for Functional Nanomaterials (CFN) at Brookhaven National Laboratory. Bernard holds a PhD in Physics from the National University of Singapore, and was a member of the Technical Staff at the Singapore Synchrotron Light Source (SSLS).


Designing sensor algorithms for the automobile environment
Tony Gioutsos
Director of Sales and Marketing
TASS International

The difference between the automobile environment and other environments when designing an algorithm is substantial. Most important is the need for safety: when a vehicle is traveling at high speed, any error can produce tragic results. Because automobiles are expected to survive 15 years in all kinds of conditions, it is practically impossible to test sensor algorithms against every scenario that could be encountered, and sensor variation over the life of the automobile adds further difficulty. In this presentation, we outline a generic approach to designing sensor algorithms that are robust in the real world, looking back on techniques and testing methods used in other automotive sensing system algorithms. Areas of discussion include sensor noise, Monte Carlo techniques, the combination of real-world testing, test-track testing, and simulation, intelligent outside-the-box design, and concatenated events.
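
A minimal sketch of the Monte Carlo idea mentioned above: randomize sensor gain drift and noise over many trials and measure how often a simple threshold algorithm misfires on a benign event. The detector and the distributions are invented stand-ins for illustration only.

```python
import random

def crash_detector(samples, threshold=8.0):
    """Toy threshold algorithm: flag a crash if any sample exceeds the
    threshold (a stand-in for a real detection algorithm)."""
    return max(samples) > threshold

def monte_carlo(trials=10_000):
    """Estimate the false-alarm rate of the detector over randomized
    sensor gain drift and noise levels (all distributions assumed)."""
    false_alarms = 0
    for _ in range(trials):
        gain = random.gauss(1.0, 0.05)   # part-to-part / aging gain drift
        sigma = abs(random.gauss(0.0, 0.3)) + 0.1  # per-trial noise level
        # Benign 'rough road' event (peak ~6 g): should NOT trigger.
        samples = [gain * 6.0 * random.random() + random.gauss(0.0, sigma)
                   for _ in range(100)]
        if crash_detector(samples):
            false_alarms += 1
    return false_alarms / trials

print(f"false-alarm rate over sensor variation: {monte_carlo():.4%}")
```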

Biography: Mr. Tony Gioutsos has been involved with automotive safety systems since 1990. As Director of Electronics R&D for both Takata and Breed Technologies, he was at the forefront of the safety revolution. His cutting-edge work on passive safety algorithm design and testing led him to start the first automotive algorithm company in 1994. After receiving both his BSEE and MSEE (specializing in communications and signal processing) from the University of Michigan, Mr. Gioutsos worked on satellites and radar imaging for defense applications before joining Takata. He has been a consultant for various companies in areas such as biomedical applications, gaming software, legal expert advisory, and numerous automotive systems. Mr. Gioutsos is currently Director of Sales and Marketing in the Americas for TASS International, where he has continued to define active safety algorithm testing requirements while working on various other state-of-the-art approaches to enhance automated and connected car robustness. He has been awarded over 20 patents and presented over 75 technical papers.


Photonic technologies for LIDAR in autonomous and ADAS applications
Jake Li
Marketing Engineer
Hamamatsu

From fleet to commercial vehicles, a growing number of new and existing technologies are important for the development of a fully autonomous vehicle. Aside from traditional sensors such as cameras, ultrasonic sensors, and radar, LIDAR technologies are becoming the key enabler in the fusion of sensors needed to achieve higher levels of autonomous control (Levels 4/5). Today, there are already multiple designs of LIDAR systems whose key components are photonic devices such as light sources, photodetectors, and MEMS mirrors. This presentation will provide an overview of the tradeoffs of LIDAR versus competing sensor technologies (camera, radar, and ultrasonic) that reinforce the need for sensor fusion, as well as summarize and compare various mechanical and solid-state LIDAR designs. Guidelines for selecting photonic components such as photodetectors, light sources, and MEMS mirrors will also be discussed.

Biography: Jake Q. Li is a marketing engineer at Hamamatsu Corporation, in charge of research and analysis for various market segments, with a concentration in the LIDAR-automotive market. He is knowledgeable about different optical components such as photodetectors – including MPPC (a type of silicon photomultiplier), avalanche photodiodes, and PIN photodiodes – and light emitters that are important parts of LIDAR system designs. He has expert understanding of the upcoming solid state technology needs for the autonomous automotive market. Together with his experience and understanding of the specific requirements needed for LIDAR systems, he will guide you through the selection process of the best photodetectors that will fit your individual needs. His expertise will assist in making important decisions with regards to various LIDAR design needs.


Millions of scenarios, thousands of requirements: managing the testing challenge for automated driving
Vivek Moudgal
Vice President of Sales
dSPACE

The success of future AD-capable vehicles on the road rests, to a large extent, on the capability and availability of the sensing technology. Once the algorithms have been developed, the software must be put through a battery of tests covering every imaginable scenario to confirm its operation before it is deployed in production vehicles. Executing such tests in a vehicle on test tracks and public roads is impossible in a reasonable time frame, given the sheer number of scenarios and environmental conditions that must be accounted for. A simulation-based approach is necessary, one that allows for flexibility, reproducibility, and rigorous testing. Simulation-based testing offers a spectrum of capabilities spanning MIL, SIL, and HIL testing. The SIL approach complements HIL testing by allowing such testing to be maximized in an offline simulation environment, using arrays of computers operating in parallel and crunching through the testing scenarios 24/7. This presentation will address the capabilities and fidelity needed in sensor and plant modeling to test ADAS/AD features in both offline and real-time environments while reducing the need for on-road testing.
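
A schematic sketch of the SIL pattern described above: independent scenario simulations are farmed out to a pool of worker processes and pass/fail results are collected. The braking-distance "plant" is a stand-in invented for illustration, not dSPACE tooling.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def run_scenario(params):
    """Stand-in for one offline (SIL) simulation run. A real run would
    execute the controller software against plant and sensor models for
    this parameter combination; here a toy braking model decides pass/fail."""
    speed, friction, rain = params
    effective_mu = friction * (0.7 if rain else 1.0)   # rain degrades grip
    braking_distance = speed ** 2 / (2 * 9.81 * effective_mu)
    return params, braking_distance < 60.0             # pass: stop within 60 m

if __name__ == "__main__":
    # Sweep the scenario space: speed (m/s), road friction, rain on/off.
    grid = list(product([10, 20, 30], [0.3, 0.6, 0.9], [False, True]))
    with ProcessPoolExecutor() as pool:                # workers run in parallel
        for params, passed in pool.map(run_scenario, grid):
            print(params, "PASS" if passed else "FAIL")
```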

Biography: Vivek Moudgal is the Vice President of Sales for dSPACE, responsible for sales operations in the company's North American market since 2003. Vivek joined dSPACE in 1993 as a technical support engineer and spent his first 10 years performing various roles for the engineering department, including supporting, executing, and managing software development projects. Throughout his tenure with the company, he has gained expertise in the application of model-based development tools for control software development and validation.


What is the role of inertial sensors in autonomous vehicles?
Ville Nurmiainen
Business Development Manager
Murata Electronics

Inertial sensors are widely used today in agricultural applications to improve GPS positioning down to sub-inch accuracy by compensating for antenna position when the vehicle is driving on a slope. In autonomous vehicles, inertial sensors can also be used to support accurate localization, but the concept differs somewhat from typical agricultural applications. The GPS signal alone does not provide sufficient localization accuracy: signals can be blocked or disturbed in "urban canyon" environments, and when driving at slow speeds or in stop-and-go traffic, GPS cannot track heading and velocity changes well. Inertial sensors can also provide a back-up system for the vehicle in case of visual sensor malfunction; for example, they can guide a vehicle to a safe stop if the visual sensors (camera, radar, LIDAR) do not provide reliable information. We have investigated what level of localization accuracy can be achieved when using state-of-the-art ESC (electronic stability control) sensors to compensate for GPS non-idealities and outages. Can existing ESC sensors be a reasonable-cost solution that provides the required accuracy level for localization in a high-definition map environment? To demonstrate these sensors' capabilities and limitations, we have developed and tested in-house sensor fusion algorithms that combine data from our 6-DOF IMU (ESC sensor based) with GPS and wheel speed information. In this presentation, we will share the test setup and the results of the actual driving test.
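
To make the localization discussion concrete, the sketch below shows the kind of dead-reckoning step an ESC-grade sensor set enables: integrating gyro yaw rate and wheel speed to carry the pose through a GPS outage. The update rule and values are illustrative assumptions, not Murata's fusion algorithm.

```python
import math

def dead_reckon(x, y, heading, wheel_speed, yaw_rate, dt):
    """Propagate a 2-D pose from wheel speed (m/s) and gyro yaw rate (rad/s).
    This is what carries the vehicle through a GPS outage; gyro bias makes
    the error grow with time, which the GPS fixes must then bound."""
    heading += yaw_rate * dt
    x += wheel_speed * math.cos(heading) * dt
    y += wheel_speed * math.sin(heading) * dt
    return x, y, heading

# 10 s outage at 15 m/s through a gentle curve, 100 Hz ESC sensor rate.
pose = (0.0, 0.0, 0.0)
for _ in range(1000):
    pose = dead_reckon(*pose, wheel_speed=15.0, yaw_rate=0.05, dt=0.01)
print(f"position after outage: x={pose[0]:.1f} m, y={pose[1]:.1f} m")
```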

Biography: Ville Nurmiainen has over 20 years' experience providing inertial sensors to the automotive industry. He started his career at VTI, a company that has been the market leader in low-g accelerometers for safety-critical applications since the early 1990s, and joined Murata when the company acquired VTI in 2012. He has worked in various positions at VTI/Murata, from product design to product management and marketing, and has experience working with all the major Tier 1 and OEM companies worldwide. Currently Ville works as a business development manager located in Novi, Michigan, collaborating closely with the automotive industry to identify new business opportunities for sensors, especially in ADAS and autonomous driving applications.


Advanced packaging technologies for ADAS applications
Urmi Ray, PhD
Senior Director, Product Technology Marketing
STATS ChipPAC

Recent trends in the automotive market, such as semi- and fully autonomous vehicles, have spiked consumers' awareness and interest and, in turn, created significant momentum across the semiconductor supply chain to deliver cost-effective technology that promotes comfort and safety. A long list of semiconductor solution providers (such as makers of processors, sensors, and supporting chipsets) are playing a critical role in enabling reliable hardware for the ADAS software stack. In this presentation, we will take a look at the history of packaging solutions for ADAS sensor devices such as radars, LIDARs, and cameras. Then we will fast-forward to today and review how ADAS packaging technologies have advanced to enable high-performance automotive radar and optical solutions. We will also discuss the latest packaging platforms, which enable multi-chipset integration as a system-in-package (SiP), offering added value for ADAS device makers.

Biography: Dr. Urmi Ray is a Senior Director of Product and Technology Marketing for STATS ChipPAC, focusing on advanced system–in-package (SiP) solutions. Before joining STATS ChipPAC, she spent 11 years at Qualcomm, developing advanced technologies such as 2.5D and 3D solutions, multi-chip package integration and interactions. Prior to this, she worked at Lucent Technologies Bell Labs in multiple areas related to electronics assembly and reliability. Dr. Ray received her PhD from Columbia University in New York City. She has been actively involved in the semiconductor industry with several conference publications as well as patented inventions in the field of advanced packaging. She is also currently serving on the board of directors for the International Microelectronics Assembly and Packaging Society (IMAPS) as the Vice President of Technology.


Solid-state VCSEL illumination modules for flash LIDAR in ADAS and automated driving applications
Luke Smithwick
Chief Marketing Officer
TriLumina

LIDAR has been recognized as a pivotal technology which can be used in sensor fusion architectures with vision and radar to greatly increase the accuracy of object and free-space recognition. However, most existing LIDAR systems are too expensive, require rapidly spinning mirrors or other similar moving parts and are physically large. This necessitates the development of low-cost, solid-state, small form factor LIDAR systems providing sufficient optical output power while remaining eye safe. Although a few solid-state illumination alternatives have been proposed, they have not come to fruition in a meaningful way due to multiple limitations of the technologies. This presentation will compare current LIDAR illumination technologies including proven, cost-effective, solid-state illumination solutions based on back-emitting VCSEL arrays enabling eye-safe, flash LIDAR systems for multiple use cases in the vehicle. These same scalable VCSEL illumination modules can also be used to enable in-cabin monitoring and rear camera illumination.

Biography: Luke Smithwick has over 25 years of experience in business and technical leadership spanning semiconductors, software, hardware, and core R&D, with more than a decade focused on the automotive industry. He is currently the Chief Marketing Officer of TriLumina, the industry's leading provider of solid-state VCSEL illumination modules for automotive applications. He joined TriLumina from Silicon Valley start-up CloudCar, where he was VP of Marketing, Partnerships and Business Development. Prior to that, he was Director of Automotive Infotainment Products at Qualcomm, driving product management and focusing engineering on achieving automotive qualification of high-complexity application processors. Luke started his automotive career at Freescale (now NXP) as Director of Software and Solution Technologies, where he started an automotive professional services P&L and founded Freescale's automotive software and solutions team. This evolved into Luke taking leadership of the P&L, product marketing, solutions, strategy, and vision for Freescale's Automotive Driver Information and Infotainment business as Global Operations and Business Manager. Earlier in his career, Luke focused on complex IP licensing at Aware as VP and GM of Licensing, was CTO of NetBind, and held a number of senior marketing and sales positions at GlobeSpan Semiconductor. Luke was an advanced data communications technology researcher at AT&T Bell Laboratories, responsible for breakthroughs in VDSL technology. He did post-graduate work at Princeton University and Columbia University and holds a BSEE and an MS in electrical engineering from the University of Florida. He holds 14 technology patents and has published multiple industry and technical papers.


ADAS to autonomous: ongoing evolution of LIDAR and infrared camera sensors
Rajeev Thakur
Regional Marketing Manager, Infrared Business Unit
OSRAM Opto Semiconductors

The future of LIDAR and infrared cameras in the automotive market continues to evolve rapidly. Low-resolution flash LIDAR is being pushed into headlamps and tail lamps, as the corners of the car are premium locations for lighting and sensing. High-resolution long-range LIDAR is bifurcating into scanning and flash (i.e., 3D camera) approaches. Lasers are selected for wavelength, emitted power, driver, package, efficiency, and other factors, and must be matched to receivers with compatible peak wavelength sensitivity, gain, input voltage, packaging, etc. The selection of such key components drives the design into a time and resource tunnel of more than a year and a few million dollars. As camera technology is replicated around the car for 360-degree coverage, it still lacks range (80 to 200 m), resolution (1.2 to 8 MP), and sensitivity in low light (visible to IR). Concepts that use the visible and infrared spectrum in the camera to extract information are being developed, and mechanical IR filters are giving way to newer software filters and hardware processing chips. This presentation will examine some of these rapidly evolving technologies bringing LIDAR and infrared camera technology to market for ADAS and autonomous applications.

Biography: Rajeev Thakur is currently the Product Marketing Manager at OSRAM Opto Semiconductors, responsible for infrared product management and business development in the NAFTA automotive market. His current focus is on LIDAR, driver monitoring, night vision, blind spot detection and other ADAS applications. Thakur joined OSRAM Opto Semiconductor in 2014. He has prior experience in the Detroit automotive industry since 1990 – working for companies such as Bosch, Johnson Controls, and Chrysler. He has concept-to-launch experience in occupant sensing, seating and power train sensors. He holds a Masters degree in Manufacturing Engineering from the University of Massachusetts, Amherst and a Bachelors degree in Mechanical Engineering from Guindy Engineering College in Chennai, India. He is a licensed professional engineer and holds a number of patents on occupant sensing. He is also a member of the SAE Active Safety Standards development committee.


Making sense of the automotive LIDAR marketplace
Harvey Weinberg
Division Technologist for Automotive Business Unit
Analog Devices

The development of affordable LIDAR parallels that of radar. It took roughly 50 years for radar system designers to develop technology that eventually standardized system approaches for particular applications. LIDAR technology is experiencing this same learning curve today, but trailing radar by over a decade. The result is a widely varying, and confusing, assortment of design approaches being proposed for automotive LIDAR. Fortunately, the principles of operation of the "whiz-bang" technologies being proposed for automotive LIDAR have already been examined by the very early technology adopters (military and aerospace), and the underlying physics is well understood. As usual, the new crop of LIDAR technologies promises improved engineering solutions built on these well-known principles of operation. This talk aims to help the system designer or user understand what principles of operation are available and, via physics and math, explain what a well-engineered LIDAR system based on each particular principle of operation is capable of.
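
One example of the well-understood physics referred to above is the standard direct-detection lidar link budget for a diffuse target, which bounds what any principle of operation can achieve:

```latex
% Textbook direct-detection lidar link budget for a diffuse (Lambertian)
% target of reflectivity \rho at range R, where A_r is the receiver
% aperture area, \eta the combined optical efficiency, and T_{atm} the
% one-way atmospheric transmission.
\[
P_{rx} \;=\; P_{tx}\,\eta\,T_{atm}^{2}\,\frac{\rho}{\pi}\,\frac{A_r}{R^{2}}
\]
% The 1/R^2 falloff is why doubling the range costs roughly 4x in emitted
% power or receiver sensitivity, regardless of scanning architecture.
```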

Biography: Harvey Weinberg is the Division Technologist for the Analog Devices Automotive Business Unit. Over the past few years he has been working on long-time horizon technology identification as it pertains to automotive. Lately this has been principally LIDAR. Prior roles at Analog Devices have been System Application Engineering Manager for the Automotive BU and before that, leader of the Applications Engineering group for MEMS inertial sensors. He has 8 US patents in technologies varying from ultrasonic airflow measurement, to inertial sensor applications, to LIDAR systems. He has been at Analog for 19 years. Before Analog, he worked for 12 years as a circuit and systems designer specializing in process control instrumentation. He holds a Bachelor of Electrical Engineering degree from Concordia University in Montreal, Canada.


(2017 Technology Showcase speakers – listed alphabetically, by speaker’s last name)

Nidal Abbas
Business Development Manager
Excelitas Technology

Sagor Bhuiyan
Technology Strategist
AutoHarvest

Tony Gioutsos
Director of Sales and Marketing
TASS International

Makoto Motoyoshi
CEO
Tohoku-MicroTec


(2016 speakers – listed alphabetically, by speaker's last name)

GNSS for ADAS applications
Chaminda Basnayake, PhD
Principal Engineer
Renesas Electronics America

GPS/GNSS (global navigation satellite systems) will likely be the core positioning system in most, if not all, ADAS. While other onboard sensors such as radar, vision, and lidar sense and interpret the surrounding environment, GNSS and maps will enable vehicles to see beyond the horizons of the other onboard sensors. In addition, GNSS and vehicle connectivity may be required to enable robust sensing, especially in real-life scenarios where sensor vision is limited by obstructions. Ongoing modernization of GPS (i.e., multiple civilian signals) and deployment of other GNSS constellations such as Galileo add more robustness to the GNSS solutions available to future ADAS implementations. This presentation will review key GNSS use cases in ADAS applications, the GNSS technology options available, and possible limitations and challenges. Discussion will also include how complementary sensor technologies in ADAS can be used with GNSS to design systems that meet the reliability, complexity, and cost constraints of mass deployment.

Biography: Dr. Chaminda Basnayake is a principal engineer with Renesas Electronics America, where he leads all Renesas V2X activities in North America. Prior to that, he was with General Motors R&D and GM OnStar as a senior engineer. He served as the GPS/GNSS subject matter expert for GM and the USDOT-automotive OEM collaboration Crash Avoidance Metrics Partnership (CAMP) consortium during his 10 years at GM. He has authored/co-authored over 20 GM patents and numerous publications and presentations on GNSS, V2X, and telematics systems. His career highlights include winning GM's highest award for excellence and being nominated as one of the 50 Leaders to Watch in the GNSS industry by GPS World magazine. Dr. Basnayake holds a PhD in geomatics engineering from the University of Calgary, Canada.


Optimizing resolution and cost of lidar sensors in a fusion environment
Jean-Yves Deschênes
President
Phantom Intelligence

While the lidar industry is pursuing the very high-density point clouds used in mapping applications, ADAS functions such as collision warning can rely on much less dense distance data, with cameras handling the high density required for obstacle classification. To optimize costs, different densities can be used for different applications (i.e., high density for frontal collision and navigation purposes, and lower pixel densities for side or rear collision warning). New lidar technology alternatives, including solid-state, MEMS-scanned, mechanically-scanned, and phased-array techniques, promise to make lidar sensing even more cost competitive. The problem now is choosing the right technology from the available alternatives. In addition to explaining the nuances of the available and emerging technologies, this presentation will discuss selection criteria for various lidar solutions so that both system designers and users can pursue the most cost-effective one for their design.
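
To put a number on the density argument, a back-of-envelope relation (our illustration, not a Phantom Intelligence specification) for the returns landing on a target is:

```latex
% Approximate number of lidar returns n across a target of width w at
% range R, given horizontal angular resolution \delta\theta (radians,
% small-angle approximation):
\[
n \;\approx\; \frac{w}{R\,\delta\theta}
\]
% Example: a 0.5 m-wide pedestrian at R = 50 m with
% \delta\theta = 0.2^\circ (\approx 3.5\,\mathrm{mrad}) yields
% n \approx 3 returns: enough to warn of an obstacle, while
% classification is left to the camera.
```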

Biography: As co-founder and President of Phantom Intelligence, a Tier One company that develops collision warning sensors and components based exclusively on lidar technology, Jean-Yves Deschênes oversees the definition of strategic orientations and manages the resources required for executing the company's strategic objectives. In this role he contributes to the company's international visibility and to the adoption of its solutions. Deschênes has a computer science degree from Université Laval in Canada and over 30 years of combined software, optics, and now lidar technology experience, as well as extensive experience on projects involving international collaborations and radical new uses of technology.


ADAS sensing: a change in the road ahead?
Mark Fitzgerald​
Associate Director, Global Automotive Practice​
Strategy Analytics

Advanced driver assistance systems (ADAS) applications will continue to be the largest driver of automotive sensor growth through 2022. The growing demand for ADAS for mass-market vehicles comes from: (a) increasing consumer awareness and demand for ADAS technologies, (b) the potential for reduced insurance and repair costs when using ADAS, (c) increasing demand for highly-specified compact segment models, (d) growing OEM competition, using safety features as a means of differentiation, (e) mandates (or pseudo-mandates) requiring ADAS fitment of technologies such as autonomous emergency braking (AEB); these mandates will mean that new solutions are required to meet the cost/performance requirements of the compact model segments. Changes to the existing approach to ADAS sensing will occur based on current suppliers’ continued improvements of their systems as well as new players with innovative ideas. The talk will provide a comprehensive overview of sensor technologies for ADAS applications such as: camera, bolometer, lidar, ultrasonic, infrared, as well as short, medium and long range radar.

Biography: Mark Fitzgerald is the Associate Director for Global Automotive Practice at Strategy Analytics. He manages inquiries and analytical research of the North American market for the Automotive Electronics (AES), and Automotive Multimedia and Communications Service (AMCS) market segments. Mark's experience includes research, forecasting and consulting of automotive electronics and sensor applications in: powertrain control, passenger safety, vehicle stability, and in-vehicle infotainment and connectivity systems. Mr. Fitzgerald is the author of Strategy Analytics' Automotive Sensor Demand Forecast and Outlook report. Prior to joining Strategy Analytics, Mark was a marketing analyst for Pollak Engineered Products, a tier one supplier of sensor and switch products to the automotive industry. He was responsible for performing and managing technology development and competitive market analysis in the area of automotive electronics. Previously, Mark headed the North American Automotive Forecast Database Operations for IHS Automotive. Mr. Fitzgerald holds a BS in Business Management from Providence College.


Solid-state lidar for wide-spread ADAS and automated driving deployment
Axel Fuchs, PhD
Vice President of Business Development
Quanergy

Achieving automotive reliability and cost targets for wide deployment of ADAS requires significant technology advancements. Solid-state lidar and 3D perception have the capability to meet these challenges, offering improvements in range, resolution, accuracy, and speed of object detection. Sensor performance, form factor reduction, and power consumption are also areas needing considerable improvement compared to existing mechanical lidar systems. Integrated solid-state lidar sensors can be small enough to fit in the palm of a hand and mounted behind a grille, inside a bumper, inside a side-view mirror, or behind a rear-view mirror. However, sensing hardware is only part of the system solution; 3D perception software for real-time object detection, tracking, and classification is the other part. The data collected is used to greatly improve the accuracy and reliability of environment perception for advanced driver assistance and active safety systems. This presentation will discuss an integrated solid-state solution for adding value to ADAS systems and enabling safe autonomous driving.

Biography: Dr. Axel Fuchs is Vice President, Business Development at Quanergy Systems. Dr. Fuchs is a technology executive with 30 years of experience in managing business and innovation for Daimler, Motorola, and smaller high-tech companies. Before joining Quanergy, he held senior business development positions at Telenav, EZ4Media, and Universal Electronics. Earlier, Dr. Fuchs led the systems engineering department for Motorola Automotive Group where he collaborated with all major carmakers on integrating innovative electronics systems and worked for Daimler Benz where he was the co-founder of the Daimler-Benz research office in Palo Alto, California. There he pioneered the first Internet connected car and received the Smithsonian Award for Excellence. Dr. Fuchs holds more than 12 system level patents to help ensure technical leadership in digital media, mobile computing, navigation, automotive electronics, communications, and control engineering. Dr. Fuchs holds a Doctorate in electrical engineering from Darmstadt University of Technology, Germany.


Advanced physics based sensor simulation approaches for testing automated and connected vehicles
Tony Gioutsos
Director of Sales and Marketing
TASS International

In order to provide a "due care" testing approach to automated and connected vehicle technology, advanced sensor simulation must be involved. Although real-world (field) tests and test-track testing are required, simulation can provide the bulk of the testing and can also produce tests that cannot be run in the real world or on a test track. Furthermore, to provide the most accurate validation, sensor simulation closest to "raw data" is preferred. Besides the deterministic data a simulation program can produce, it is also important to develop probabilistic models that correlate to real-world data. In this talk, advanced physics-based sensor models with deterministic and probabilistic components will be introduced. The models described include camera, radar, and V2X. These models can be used to produce receiver operating characteristic (ROC) curves and other measures of detection and estimation system performance; using these measures allows for a robust system in real-world operating conditions.
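
Since the abstract mentions ROC curves, the sketch below shows how one is traced from detection scores by sweeping a threshold; the score distributions are stand-ins for real sensor-model output, not TASS data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in detection scores from a probabilistic sensor model:
# targets score higher than clutter on average, with overlap.
clutter = rng.normal(0.0, 1.0, 5000)   # no object present
targets = rng.normal(1.5, 1.0, 5000)   # object present

# Sweep the detection threshold to trace the ROC curve.
thresholds = np.linspace(-4, 6, 101)
pfa = [(clutter > t).mean() for t in thresholds]   # false-alarm rate
pd = [(targets > t).mean() for t in thresholds]    # detection rate

# Area under the curve (trapezoidal); ~0.5 is chance, 1.0 is perfect.
auc = abs(np.trapz(pd, pfa))
print(f"AUC = {auc:.3f}")
for t in (0.0, 0.75, 1.5):
    print(f"threshold {t:+.2f}: Pd={(targets > t).mean():.2f}, "
          f"Pfa={(clutter > t).mean():.2f}")
```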

Biography: Mr. Gioutsos has been involved with automotive safety systems since 1990. As Director of Electronics R&D for both Takata and Breed Technologies, he was at the forefront of the safety revolution. His cutting-edge work on passive safety algorithm design and testing led to a startup company that was purchased by Breed Technologies. After receiving his Master's Degree in Electrical Engineering from the University of Michigan, Mr. Gioutsos worked on satellites and radar imaging for defense applications before joining Takata. He has been a consultant for various companies in areas such as biomedical applications, gaming software, legal expert advisory, and numerous automotive systems. Mr. Gioutsos is currently Director of Sales and Marketing in the Americas for TASS International where he has continued to define active safety algorithm testing requirements as well as working on various other state-of-the-art approaches to enhance automated and connected car robustness. He has been awarded over 20 patents and presented over 40 technical papers.


Sensor fusion: the “holy grail” of advanced safety and driving automation
Moderator: David McNamara
Technology Mining
Magna International

Driving automation, especially Level 3 and 4 systems, challenges today's ADAS systems to be robust under all driving conditions, with 360-degree coverage to detect the "car that came out of nowhere". The New Car Assessment Program rating system will incentivize OEMs to introduce new technology, such as Automatic Emergency Braking (AEB), to prevent intractable accident scenarios such as pedestrian collisions, the "person who came out of nowhere". Our conventional wisdom is that a wide range of sensor technologies will be integrated using sensor fusion: GPS, cameras, ultrasonics, radar, LIDAR, and V2X connectivity. Automation requires sensor data to be fused from this diverse set of sensors to create a complete and trustworthy understanding of the driving environment. Engineers often invoke a "kitchen sink" system by suggesting we integrate all known sensor technologies. This approach, though enticing, raises issues of affordability and the practicality of integrating such a diverse sensor data set. We can easily envision a car with 8-12 vision sensors, two long-range radar sensors, and four LIDARs at the corners, with V2X connectivity. Sensor characterization, system engineering disciplines, and sensor fusion are the enablers for designing a robust, affordable system, one that is available not only on luxury vehicles but offered on mass-market vehicles. This panel of leading industry experts will provide insights into how we build robust Level 3 and 4 automation from a set of affordable, high-performance sensors. We will identify and examine sensor technologies, address development challenges, and examine integration issues. How do we reliably employ connectivity (e.g., V2X) and cloud-based solutions? What system design approaches (centralized versus decentralized control) and what role for cloud-based data should we consider? And what sensor fusion strategies should we adopt?

Biography: David A. McNamara is with Magna International R&D, with responsibility for Technology Mining as part of Magna's Open Innovation Process for identifying and evaluating promising companies and technologies. Innovation is a hallmark of Magna, the leading North American automotive manufacturer. In the growing area of ADAS sensors, Magna recently announced its new vision system, the EYERIS® Generation 3.0 camera system, to enable future advanced safety and driving systems. Dave's career in automotive electronics spans 39 years, with extensive experience in the design and launch of high-volume, robust automotive systems: navigation, ACC radar-based systems, audio/multimedia, device connectivity, V2X, and recently driving automation. His career at Ford Product Development and Ford Research, from 1976 until his retirement in 2006, included Ford firsts: the 97MY Mondeo navigation system, the 2000MY Jaguar radar-based ACC, THX Audiophile for Lincoln, and the 2006 Jaguar Audio Connectivity System, a forerunner of SYNC. From 2006 to 2015, prior to his position at Magna R&D, Dave as an automotive electronics consultant (MTS LLC) advanced a wide range of safety and V2X related projects for clients that included USDOT, automotive OEMs, and suppliers.


The next generation of optical time-of-flight technology: accelerating ADAS deployment with increased performance and affordability
Michael Poulin
Director of Product Management
LeddarTech

Optical time-of-flight sensors are seen as a key enabling technology for ADAS and autonomous cars. However, many challenges have slowed down their adoption for commercial deployments, as affordable optical sensors tend to fall short in performance, while higher-end solutions remain cost prohibitive for mainstream, high-volume implementations. This session will give you a better understanding of how a new generation of optical detection and ranging sensors is emerging based on an advanced optical time-of-flight technology. Packaged into chipsets, this technology enables highly optimized solid-state optical sensor systems that bridge the cost-performance gap, meeting the automotive industry’s stringent requirements for applications that span from cost-sensitive ADAS to high-performance autonomous vehicle systems. Examples of how these compact optical sensors can easily be integrated into standard automotive components, such as headlamps and rear lamps, as well as results from road trials featuring sensors mounted on a vehicle will be presented.
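
As background to the session, the basic direct time-of-flight relations (standard physics, not LeddarTech's implementation) show why timing precision sets range accuracy:

```latex
% Direct time-of-flight: a pulse travels out and back at the speed of
% light c, so range R follows from the round-trip delay \Delta t, and
% range uncertainty \delta R from timing uncertainty \delta t.
\[
R = \frac{c\,\Delta t}{2}
\qquad\qquad
\delta R = \frac{c\,\delta t}{2}
\]
% Example: a return from 30 m arrives after \Delta t = 2R/c \approx 200 ns,
% and 1 ns of timing uncertainty corresponds to \delta R = 15 cm.
```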

Biography: Michael Poulin is Director of Product Management at LeddarTech, a supplier of advanced detection and ranging solutions based on the patented, leading-edge Leddar® sensing technology. He has been with the company since 2010, notably as Director of Product Development and as Project Manager. Specializing in digital signal processing and embedded systems development, his previous work included leading and contributing to system and software engineering for the development of neurosensing and neurostimulation implantable medical devices, motorized prosthetics and embedded systems in telecommunications and other industries.


The role of imaging in ADAS safety systems
Narayan Purohit
Product Line Manager for Automotive Image Sensor Products
ON Semiconductor

Advanced Driver Assistance Systems (ADAS) are crucial to providing drivers with the safety, performance, and comfort demanded in modern cars. The logical next step for this technology is to effectively assist the driver when their course of action is deemed a risk to themselves or other road users. Today, ADAS technology provides a breadth of capability far beyond what was offered five years ago. The distinguishing feature of next-generation ADAS will be the ability to 'decide' when to override driver control, or when to give the human the benefit of the doubt. Conversely, autonomous vehicles do not need to factor in human input beyond very basic start/stop functions and any necessary safety overrides. Autonomous cars are likely to require the fusion of many different sensing technologies, including lidar, radar, camera systems, and others. This presentation will discuss the implications of high-performance, yet cost-effective, image sensing in these next-generation systems.

Biography: Narayan Purohit is currently the Product Line Manager for Automotive Image Sensor products at ON Semiconductor. He has been with ON Semiconductor/Aptina Imaging for over six years in strategic and business development roles, responsible for the automotive image sensor business and product portfolio, from product definition to market rollout of several new image sensor products. Narayan has over 20 years of experience in roles spanning design, product development, applications, product management, business development, and strategic marketing across various high-performance products, including image sensors. He holds an MSEE from Michigan State University.


(2016 Technology Showcase speaker)

AutoHarvest ecosystem welcomes the engineer
Jayson Pankin
Co-founder, President and CEO
AutoHarvest Foundation

AutoHarvest Foundation, a 501(c)(3) nonprofit, created and operates a unique innovation ecosystem led by some of the most highly respected figures in the automotive and manufacturing industries. In 2012, AutoHarvest.org was launched as the world's only truly neutral and global online meeting place for innovators of all types with an interest in advanced manufacturing. The system allows users to showcase capabilities, technologies, and needs system-wide and then privately connect with fellow inventors and commercializers to explore technology and business development opportunities of mutual interest. The AutoHarvest interest group consists of over 300 prominent R&D and manufacturing organizations from industry, government, and academia. Please visit www.autoharvest.org.

Biography: Jayson D. Pankin is a founder, President, and CEO of the nonprofit AutoHarvest Foundation. Jayson and his partner, Dr. David E. Cole, created a unique innovation ecosystem led by some of the most highly respected figures in the automotive and manufacturing industries. In 2012, AutoHarvest.org was launched as the world’s only truly neutral and global on-line meeting place for innovators of all types with an interest in advanced manufacturing intellectual property. From 2003-2010 he led Delphi Automotive’s commercialization activities targeting spin-outs of potentially disruptive technologies into start-up companies. Jayson has been a venture partner specializing in early stage and turnaround situations. He was named by IAM Magazine for two years running as one of the World’s Leading IP Strategists. He earned his BBA in Accounting and MBA in International Business at the George Washington University.


Call for Speakers

If you’d like to participate as a speaker, please call Jessica Ingram at 360-929-0114 or send a brief email with your proposed presentation topic to jessica@microtechventures.com.

Conference scope includes topics related to ADAS and automotive sensors, such as:

  • Business trends, market projections, M&A developments, and startup activity
  • Emerging trends with camera, radar, LIDAR, infrared, and ultrasonic sensing technologies
  • Impacts of enabling technologies such as artificial intelligence and augmented reality
  • Sensor fusion with advanced algorithms and computing power, connectivity and data transmission, contextual awareness and processing, and virtual sensors
  • Emerging HMI technologies and applications
  • Supply chain trends and challenges, government regulations, and mandates
  • Impact of advanced mobility technologies
  • Advanced IMU and GNSS based navigation technologies
  • Fabrication, packaging, and assembly techniques
  • Sensing and seeing beyond 200-300 meters (GPS and cloud data)
  • V2V/V2X connectivity and telematics (including 5G cellular and DSRC)
  • Reliability testing methodologies and techniques
  • Sensors and electronics for telematics applications
  • Component and system packaging technologies
  • Technology transfer, ecosystems and hubs, company formation