Fly-eye phones are coming
This week we’ll be swamped with speculative products from CES as every manufacturer tries to show off their supposed next big thing; here’s some directional guidance on what to look for.
We long ago passed the good-enough point for displays in phones, and have headed into absurd pixel densities that you can only distinguish if your phone is a few inches from your eyes with magnifying lenses in between. No wonder Google Cardboard is popular with manufacturers.
Similarly, the cameras built into phones have reached the limits of useful resolution, and the differences in responsiveness have been competed away too. The next step will be multiple cameras on each side of the phone. I expect we’ll first see two cameras at opposite ends of the phone, so you can take stereoscopic images and videos with natural eye spacing.
However, having simultaneous spaced images means you can extract 3d information from the photo — Google’s camera app has done this for a while, but you need to pan the phone up and down. With depth information you can change depth of field synthetically, giving nicer images by blurring unwanted foreground or background details out. You can also more easily compensate for lens distortion, making faces look less spherical in close-ups. You can even reconstruct 3d objects, scanning smaller ones or panning around a room to derive a more accurate 3d model.
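As a rough sketch of the geometry involved: with two cameras a known distance apart, the depth of a point follows from how far it shifts between the two images (its disparity) — nearer objects shift more. The function and numbers below are illustrative, assuming an idealised pinhole camera, not any real phone’s optics.

```python
# Depth from stereo disparity under a pinhole camera model.
# baseline_m: distance between the two cameras in metres.
# focal_px:   focal length expressed in pixels.
# disparity_px: how far the point shifts between the two images.

def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Depth = baseline * focal_length / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# A point that shifts 100 px between two cameras 6.3 cm apart
# (roughly eye spacing) with a 1000 px focal length:
print(depth_from_disparity(0.063, 1000, 100))  # 0.63 metres away
```

Once you have a depth value per pixel like this, synthetic depth of field is just blurring pixels whose depth falls outside the range you want in focus.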
Once you have an accurate 3d model of the room, doing Augmented Reality becomes much more practical — you can place elements on the walls or floors, and have them pass behind and in front of objects in a more realistic fashion. Think of the gratuitous effects Snapchat could do with that — 3d halos, birds flying around your head.
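The “pass behind and in front” part is a per-pixel depth test: a virtual element is drawn only where it is nearer to the camera than the real scene. Here is a toy sketch of that compositing step; the flat-list image representation and names are purely illustrative.

```python
# Toy occlusion test for AR compositing: a virtual pixel is drawn only
# where it is closer to the camera than the real scene at that pixel.
# Images are flat lists of (colour, depth_in_metres) pairs for simplicity.

def composite(scene, virtual):
    """virtual entries are (colour, depth) or None (no virtual pixel there)."""
    out = []
    for (s_col, s_depth), v in zip(scene, virtual):
        if v is not None and v[1] < s_depth:
            out.append(v[0])   # virtual element is in front: draw it
        else:
            out.append(s_col)  # real scene is nearer: element passes behind
    return out

scene   = [("wall", 3.0), ("head", 1.0), ("wall", 3.0)]
virtual = [("halo", 2.0), ("halo", 2.0), None]
print(composite(scene, virtual))  # ['halo', 'head', 'wall']
```

Note how the halo is drawn over the far wall but hidden where the head is closer — that is the effect a room-scale depth model buys you.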
Doing the same with microphones also makes sense — with multiple microphones you can effectively create direction- and distance-sensitive recordings, removing background noise or separating multiple people or instruments in the scene. In effect we’re substituting processing power and memory bandwidth for expensive optics and acoustics. We’ll be able to record 3d video and audio for VR remote viewing, or for further AR processing of our own.
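One standard way to get direction sensitivity from two microphones is delay-and-sum beamforming: delay one channel by the extra travel time sound takes to reach the farther mic, then average the channels, so sound from the chosen direction adds up while off-axis sound partly cancels. A minimal sketch, with illustrative parameters (14 cm mic spacing, roughly a phone’s length, at 48 kHz):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def steer_delay_samples(spacing_m, angle_deg, sample_rate):
    """Integer sample delay between two mics for a source at angle_deg
    from broadside (0 degrees = straight ahead, so zero delay)."""
    delay_s = spacing_m * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND
    return round(delay_s * sample_rate)

def delay_and_sum(left, right, delay):
    """Shift `right` by a non-negative `delay` in samples, then average
    with `left`: signals aligned for the steering direction reinforce."""
    shifted = [0.0] * delay + right[:len(right) - delay]
    return [(l + r) / 2 for l, r in zip(left, shifted)]

# Mics 14 cm apart, source 30 degrees off axis, 48 kHz sampling:
print(steer_delay_samples(0.14, 30, 48000))  # 10 samples of delay
```

Real implementations use fractional delays and more microphones, but the principle is the same trade the post describes: cheap arithmetic standing in for expensive directional hardware.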
As ever, these features will show up on the flagship high-end devices first, but given the relative costs of CPUs versus precision optics, we can expect them to become more widespread over time.
Originally published at known.kevinmarks.com on January 4, 2016.