What camera technology will power the revolutionary 3D interactive user interface in the new iPhones?

First, let’s remind ourselves that one of the latest camera technologies, the dual camera approach, gained true market traction in April 2016 with Huawei’s P9 introduction, reinforced in September 2016 by the release of the iPhone 7. Now 3D sensing cameras are announced as the next big thing, set to feature in the anniversary edition of the iPhone.

The wave of embedded “3D sensing cameras” is approaching faster than we thought. Indeed, mobile camera modules have been following the general roadmap Yole Développement (Yole) published in its Camera Module report back in September 2015. We’ve since tracked the trend in our ‘Status of the CMOS Image Sensor Industry’ report, published in June 2016, and the ‘3D Imaging and Sensing’ report of March 2017. A few weeks before this technology is likely to appear in Apple’s latest iPhone, it’s worth reviewing what is coming.


Updated Yole best assumptions for camera technology
Source: 3D Imaging and Sensing report 2017, Yole Développement

Until earlier this year everyone thought 3D cameras would appear on the rear of Apple’s new iPhones, as is the case in the Lenovo Phab 2 Pro, in order to enable augmented reality applications. This changed radically when Ming-Chi Kuo, an analyst from KGI Securities, asserted in April 2017 that there would be a front 3D camera. The claim has gained a lot of attention, and rumors suggest the component suppliers include Lumentum for the VCSEL emitter, Viavi for the IR filters, and STMicroelectronics for the receiver chip. Most analysts speculate that this camera will exploit “structured light”, because of PrimeSense’s involvement.

However, Yole maintains its original view of a Time of Flight (ToF) camera on the front, due to the nature of the application: a revolutionary new user interface. The structured light approach would require far too much processing power. Meanwhile, STMicroelectronics is currently investing $1.2B to extend its 12” wafer fab for a ‘special customer project’, and STMicroelectronics is better known for its ToF technology than for the regular CMOS image sensors used in structured light. Either way, the integration of embedded 3D technology is arriving very early: we didn’t think it would be ready until 2020, but giant companies such as Apple can sometimes move mountains.
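To see why ToF is computationally lighter, consider its underlying arithmetic: depth is recovered directly from the round-trip time of the emitted light, with essentially one multiplication per pixel. The sketch below is our own illustration of that principle, not any vendor’s pipeline; the function name and example numbers are hypothetical.

```python
# Direct time-of-flight: depth per pixel from a measured round-trip delay.
# One multiply-divide per pixel - no pattern matching required.
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_seconds):
    """Depth from a round-trip time-of-flight measurement: d = c * t / 2."""
    return C * round_trip_seconds / 2.0

# A round trip of about 6.67 ns corresponds to roughly 1 m of depth.
print(tof_depth(6.67e-9))  # ~1.0 (metres)
```

A structured light system, by contrast, must locate and match a projected pattern across the whole image before any depth can be computed, which is where the extra processing load comes from.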


Time of Flight (ToF) camera for facial recognition
Source: 3D Imaging and Sensing report 2017, Yole Développement

Amid the focus on the front side, everyone seems to have forgotten that the rear should also be equipped with 3D imaging and sensing technology. That may be partly because there is no apparent hardware “slot” linked to the 3D sensing feature: there will be a similar number of units for the dual camera and flash as on the iPhone 7. So where would the 3D sensing capability come from? Yole’s take is that the rear of the phones could use a structured light approach, which presents a few options. One of the receiving cameras could have near-infrared (NIR) capability, either red-green-blue (RGB)-NIR or just black-and-white (BW)-NIR similar to the one Huawei has used, in order to receive the 850nm light of the structured light emitter. Another option would be emission in the visible range, at 650nm for example. All options would imply an emitter module with a lens, confirming the deep involvement of companies such as Himax and ams, since the flash unit would also act as a structured light emitter. Our view is that such companies would be mainly involved on the rear side of the phones, for the emitter part. The great news is that this solution could span the full handset range: the iPhone 7S, 7S Plus and 8/X.
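The reason a structured light setup needs both a dedicated emitter and a receiving camera with the right spectral sensitivity is that depth comes from triangulation: the projected pattern shifts sideways with distance, and that shift (disparity) is converted to depth. A minimal sketch of the geometry, with purely illustrative numbers of our own choosing:

```python
# Structured light / triangulation: depth z = f * b / d, where
# f = focal length in pixels, b = emitter-camera baseline in metres,
# d = observed disparity of the projected pattern in pixels.
def structured_light_depth(focal_px, baseline_m, disparity_px):
    """Triangulated depth from pattern disparity."""
    return focal_px * baseline_m / disparity_px

# Illustrative: 1000 px focal length, 5 cm baseline, 50 px disparity -> 1 m.
print(structured_light_depth(1000.0, 0.05, 50.0))  # 1.0 (metres)
```

The formula also shows why baseline matters: a wider emitter-to-camera spacing gives larger disparities and therefore better depth resolution at range.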


Mobile phone market trend
Source: 3D Imaging and Sensing report 2017, Yole Développement


The structured light approach for the rear camera could also be good news for wafer-level focusing lens manufacturers such as LensVector, poLight, Wavelens and Optilux. The approach means the emitter needs a lens to focus a pattern. A fixed lens would probably be enough at first, but one can envision adaptive lenses in the near future. We will be eagerly awaiting the result in September.



Pierre Cambou - Yole Développement

In 1999 Pierre Cambou joined the imaging industry. He earned an engineering degree from Université de Technologie de Compiègne in parallel with a Master of Science from Virginia Tech in 1998, and more recently graduated from Grenoble Ecole de Management’s MBA program. Pierre held several positions at Thomson TCS, which became Atmel Grenoble in 2001 and e2v Semiconductors in 2006. In 2012 he founded the start-up Vence Innovation (now called Irlynx) to bring a disruptive man-to-machine interaction technology to market. He joined Yole Développement, the “More than Moore” market research and strategy consulting company, as Imaging Activity Leader in 2014.


Source: Yole Développement



3D Imaging and Sensing 2017 - Yole Développement
Beyond its traditional medical and industrial markets, 3D imaging & sensing is ready to conquer the consumer and automotive markets, with an exponential growth pattern starting from $1.3B in 2016 and reaching $9B in 2022.



