Thoughts on the new iOS SDKs

The latest iPhone was released this week. While most of the media focus on the look and hardware of the iPhone X, the iOS 11 update has been largely overlooked. In my view, two additions to the SDK are the most significant parts of this release.

ARKit: Augmented reality is one of the most popular technologies of recent years. With Apple offering this SDK, many amazing apps can be built at low cost. For example:

AR navigation: Google Maps can guide you to your destination, but what would it be like if navigation were integrated with AR? It could overlay signs for the restaurant you want to visit or are passing by, show its Yelp reviews, highlight areas under construction that you need to avoid, or track moving objects for you even if you don’t have a smart car. Once smart cars are available, it might also tell you where to refill your gas on the way to your destination, and so on.
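Just to make the idea concrete, here is a minimal Swift sketch of one small piece of this: aligning an ARKit session to gravity and compass heading, then dropping a marker node in the direction of a destination. The class name, the fixed five-meter distance, and the plain sphere standing in for a destination sign are all illustrative assumptions, not a real navigation stack.

```swift
import ARKit
import CoreLocation
import Foundation
import SceneKit

// Sketch: place a marker node in the compass direction of a destination.
final class ARDirectionMarker {
    let sceneView: ARSCNView

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        // .gravityAndHeading makes -z point true north, so compass bearings
        // can be mapped straight into the AR coordinate space.
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        sceneView.session.run(configuration)
    }

    // Bearing (radians, clockwise from north) from the user to the destination.
    func bearing(from user: CLLocationCoordinate2D,
                 to destination: CLLocationCoordinate2D) -> Double {
        let lat1 = user.latitude * .pi / 180
        let lat2 = destination.latitude * .pi / 180
        let deltaLon = (destination.longitude - user.longitude) * .pi / 180
        let y = sin(deltaLon) * cos(lat2)
        let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(deltaLon)
        return atan2(y, x)
    }

    // Drop a simple sphere a few meters away in the destination's direction.
    func placeMarker(towards bearing: Double, distance: Float = 5) {
        let dx = Float(sin(bearing)) * distance
        let dz = -Float(cos(bearing)) * distance
        let marker = SCNNode(geometry: SCNSphere(radius: 0.1))
        marker.position = SCNVector3(x: dx, y: 0, z: dz)
        sceneView.scene.rootNode.addChildNode(marker)
    }
}
```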

AR with VR headset devices for iPhone: One important thing about ARKit is that it can capture your eye movement and tell whether you are looking up or down. What does that mean? Auto-scrolling and page switching are coming! Selection and direction buttons on the headset will be gone; all you need to do is move your eyes.
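For instance, ARKit’s face tracking on the iPhone X exposes eye-related blend shape coefficients, so a rough sketch of the “scroll with your eyes” idea might look like the following. It assumes the ARSCNView is running an ARFaceTrackingConfiguration with this object set as its delegate; the 0.6 threshold and the print placeholders are made up.

```swift
import ARKit
import SceneKit

// Sketch: react to eye blend shapes reported by ARKit face tracking.
final class GazeScrollDelegate: NSObject, ARSCNViewDelegate {
    // Called whenever the face anchor updates.
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }

        // Blend shape coefficients run from 0 (neutral) to 1 (fully engaged).
        let lookUp = face.blendShapes[.eyeLookUpLeft]?.floatValue ?? 0
        let lookDown = face.blendShapes[.eyeLookDownLeft]?.floatValue ?? 0

        // Hypothetical reaction: scroll when the user looks clearly up or down.
        if lookUp > 0.6 {
            print("scroll up")      // replace with real scrolling logic
        } else if lookDown > 0.6 {
            print("scroll down")
        }
    }
}
```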

AR for product display: The concept appeared years before the products did. With ARKit lowering the cost of development, it should become popular soon: previewing furniture in your house, showing featured products that only exist as design models, simulating how you would interact with a product, or previewing the decoration for events like weddings.
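A sketch of the furniture-preview case, assuming a bundled “chair.scn” asset (a placeholder name): detect horizontal planes, then on a tap, hit-test into the scene and place the model at that point.

```swift
import ARKit
import SceneKit
import UIKit

// Sketch: tap a detected plane to place a 3D model there.
final class FurniturePreviewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap)))
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // find floors and tables
        sceneView.session.run(configuration)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test against planes ARKit has already detected.
        guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first,
              let chairScene = SCNScene(named: "chair.scn"),   // placeholder asset
              let chair = chairScene.rootNode.childNodes.first else { return }

        // Place the model where the tap intersected the plane.
        let t = result.worldTransform.columns.3
        chair.position = SCNVector3(x: t.x, y: t.y, z: t.z)
        sceneView.scene.rootNode.addChildNode(chair)
    }
}
```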

As soon as AR gains easy, low-cost 3D model capture, further applications will become tangible within a few years: integration with medical care, for example helping doctors during surgery by marking exactly where an incision should start or which organ should be removed, or integration with education and training, especially for jobs that demand a lot of experience or training time, such as mechanics, nurses, and doctors.

Machine Learning Kit (Core ML):

The most important thing about bringing machine learning to mobile is that it makes collecting your data much easier, since we can’t leave our phones for even a single day. Machine learning is based on model training, training is based on data, and data comes from interaction. What ideas could come to life? Well, it depends on how the collected data is used.

Say I am taking photos of buildings with my phone while traveling in NYC. My phone could recognize what I am shooting, a countryside view, a city view, people, or animals, and combine that with GPS and an existing trained model to tell whether I am traveling. With this data, an app could suggest where other people go to take an amazing photo of the Empire State Building, suggest a dinner spot matching my taste and budget based on a trained model of my dining habits, remind me on my way home to buy washing powder that is running out at home based on my purchasing behaviour, or tell whether I am happy and pick a playlist for me, and so on.
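The photo-recognition part of this is already possible on-device with Core ML plus Vision in iOS 11. A small sketch, assuming the app bundles some image-classification model; MobileNet below is just a stand-in for whatever model you convert and ship, and the printed message is illustrative.

```swift
import CoreML
import UIKit
import Vision

// Sketch: classify a photo on-device with a bundled Core ML model.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          // MobileNet is a placeholder for the Xcode-generated model class.
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { req, _ in
        // Results are ranked classification labels with confidence scores.
        guard let best = (req.results as? [VNClassificationObservation])?.first else { return }
        print("This looks like a \(best.identifier) (confidence \(best.confidence))")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```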

Previously, an app could only collect data from one angle, which was not enough to provide a truly amazing experience. With these SDKs coming alive, more amazing apps will become possible. I am really looking forward to them, far more than to the iPhone X.