The iPhone X notch has been a divisive piece of hardware design. Some people love the “bunny ears” look paired with the bezel-less display, while others feel the notch is ugly and draws attention away from the rest of the device. Wherever you stand on its appearance, you can’t argue with the functionality: the notch houses all of the front-facing sensors that enable some of the things the iPhone X does best, like Face ID.
Now, artists are playing with the idea of a notch in other Apple products, like the Apple AR Glasses concept shown above. We know that augmented reality is the future of smartphones and will probably advance by leaps and bounds over the next few years. Pokémon Go gave us a very basic look at what AR can accomplish. What if we could build powerful AR devices that made our lives easier, like this Apple AR Glasses concept?
Think about how our lives could change. Face ID sensors could recognize friends or co-workers and give you information about them, like their latest social media updates or when their birthday is. Instead of saying, “Oh, hey Bill, how was your night?” you could say something like, “Hey Bill! I saw your daughter had her big dance recital last night! How did that go?” You and Bill have now made a better social connection, and you can have an actual conversation rather than the usual routine of asking how the other is doing, answering “good” no matter what, and walking your separate ways.
Of course, packing sensors into a device is one thing; using those sensors to their fullest is another. With the A11 Bionic, Apple introduced a chip with dedicated machine learning hardware. In the iPhone X, those cores learn the intricacies of your face and adapt to the changes it goes through over time. What about machine learning for augmented reality? Could devices like the Apple AR Glasses concept use machine learning to become more powerful and useful? Of course! In fact, these devices probably wouldn’t be worth owning if they didn’t. For example, perhaps they could learn to read subtle facial expressions and help you identify the mood people are in. That way, you’d know whether today is the right day to sit down and ask the boss for a raise or whether you should wait until they’re in better spirits. The applications are endless.
If all of this sounds a little like the episode of Black Mirror where everyone has AR-enabled contact lenses and rates each other on every interaction, you’re not far off. These devices have the potential to become ubiquitous and turn into essential parts of our lives. Now that complex facial recognition and machine learning can be built into a handheld device, the real challenge will be implementing that technology in a way that protects privacy and respects personal boundaries. That will ultimately be the true test of AR.
What do you think about the Apple AR Glasses concept and the future of AR in general? Are you excited for what’s to come or worried about how it could change our lives? Let us know your thoughts!