Pokemon, Google and the rise of Augmented Reality wearables

You’re walking in the park, casually swiping around on your phone, looking for the telltale rustling leaves, ignoring the alarming rate at which your data is draining, and checking how many more steps you have before your egg finally hatches.

There.

On your screen is what every Pokemon master dreams of. You’ve been looking for months, but finally, you’ve come across a Mew, only a few steps ahead of you. Then you look up, make eye contact with the person across from you, and you know that they, too, see the Mew. So you start running, frantically trying to capture the Pokemon.

Such is a day in the life during the 2016 Pokemon GO! craze, a time that saw interest in augmented reality spike enormously.

You’ve probably heard of augmented reality (AR), whether that be around Pokemon GO! or around Google’s failed attempt at AR wearables, Google Glass (more on that later). But hearing about something doesn’t necessarily mean that you know what it is, so, first things first: what is augmented reality?

AR in a nutshell

Augmented Reality comes in many different forms, from the Pokemon GO! example to Snapchat filters. But put simply, AR is any computer-generated image that interacts with the physical (“real”) world around it.

Snapchat’s iconic dog filter

How it works

Now, for the technical bit, how does AR work?

AR is only possible because of a lot of sensors. You have gyroscopes (which track rotation), cameras, and accelerometers (which track changes in motion). Based on the data received from these sensors, the image in front of your eyes or on your phone shifts to match your movements and your surroundings.
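
To make that a little more concrete, here’s a tiny, purely illustrative Python sketch (not taken from any real AR SDK; every name and number is made up): each frame, the device folds its gyroscope and accelerometer readings into an updated pose, and the virtual scene gets re-rendered from that new viewpoint.

```python
# Toy illustration (hypothetical, not a real AR SDK): fold fake gyroscope and
# accelerometer readings into an updated camera pose each frame.
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float = 0.0       # rotation around the vertical axis, in degrees
    x: float = 0.0         # position along one axis, in metres
    velocity: float = 0.0  # current speed along that axis, in m/s

def update_pose(pose: Pose, gyro_yaw_rate: float, accel: float, dt: float) -> Pose:
    """Integrate one frame of sensor data into the pose (simple dead reckoning)."""
    yaw = pose.yaw + gyro_yaw_rate * dt    # gyroscope tells us how we rotated
    velocity = pose.velocity + accel * dt  # accelerometer tells us how our speed changed
    x = pose.x + velocity * dt             # speed tells us how far we moved
    return Pose(yaw, x, velocity)

pose = Pose()
# Pretend the user turns at 10 deg/s while accelerating at 0.5 m/s^2, for one 60 fps frame.
pose = update_pose(pose, gyro_yaw_rate=10.0, accel=0.5, dt=1 / 60)
print(pose)  # the virtual scene would now be drawn from this new viewpoint
```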

This is a concept called tracking: it’s how the AR device knows to resize a virtual object based on your distance, or to hide it if you move so that a solid object sits in front of it. Tracking comes in two forms: either an outside-in model or an inside-out model.

  • Outside-in tracking means that the system’s sensors are housed outside of the device, and then the data is transmitted to the AR device. This could mean cameras set up around the room.
  • Inside-out tracking means that the sensors are within the device itself, so it gathers all of its data internally. This is the kind of AR we typically refer to: the kind that uses your phone’s sensors, or even a bulkier wearable headset such as the Microsoft Hololens.
Different tracking types

Even more tracking terminology

Outside-in and inside-out tracking are only the beginning, though. Within those two categories there are, shockingly, even more categories. So let’s dive into it, shall we?

  • Simultaneous Localization and Mapping (aka SLAM): Basically, the AR device builds a map of the space by figuring out the approximate locations of key objects in the room and the spatial relationships between them, while simultaneously keeping track of its own position within that map.
  • Marker-based AR: In this type of tracking, virtual objects are tied to a specific part of the physical world. For example, if you had a virtual coffee mug, it would be anchored to a specific point on the physical table (there’s a small sketch of this idea right after this list). A marker is just any easily recognizable object registered by the AR device.
  • Markerless AR: This type of tracking is basically the opposite of marker-based AR, meaning the virtual object doesn’t need to be anchored to a specific point in the physical world.
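
Here’s a minimal, hypothetical Python sketch of the marker-based idea (all names and numbers are made up for illustration): the virtual mug is always drawn at a fixed offset from wherever the marker is detected in the camera frame, so it stays “attached” to that spot in the physical world.

```python
# A minimal, hypothetical sketch of marker-based AR (all names and numbers are
# made up): the virtual mug is drawn at a fixed offset from wherever the marker
# is detected, so it stays anchored to that spot in the physical world.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Marker:
    x: float  # marker centre in the camera image, in pixels
    y: float

@dataclass
class VirtualObject:
    name: str
    offset_x: float  # where to draw it, relative to the marker
    offset_y: float

def render_position(marker: Marker, obj: VirtualObject) -> Tuple[float, float]:
    """The virtual object's on-screen position always follows the detected marker."""
    return marker.x + obj.offset_x, marker.y + obj.offset_y

mug = VirtualObject("coffee_mug", offset_x=0.0, offset_y=-40.0)  # sits just above the marker
for detected in (Marker(320, 240), Marker(350, 260)):  # the marker moves between frames...
    print(render_position(detected, mug))               # ...and the mug moves with it
```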

But just being able to render a virtual image in real time is not enough. In order to seamlessly blend in with the real world around it, AR needs to follow the same principles that objects in the real world do. This includes:

  • Place: This basically means that unless directly moved, objects need to remain in the same place, even if the user moves around the room.
  • Scale: Essentially, as the user moves closer or farther away, the AR object changes its apparent size to fit the new point of view (there’s a quick sketch of this relationship after the list).
  • Occlusion: This is when an object in the physical world blocks an AR object or vice-versa. The AR has to change its appearance to fit in. For example, if you were looking at an AR painting and there was a physical lamp in front of it, certain portions of the painting would not be visible.

And finally,

  • Lighting: Just like in the real world, depending on the lighting and the amount of light available, the shading on objects appears different.
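
To make the scale rule concrete, here’s a rough Python illustration using the standard pinhole-camera relationship (the focal length and sizes are made-up example values): an object’s on-screen size is proportional to its real size divided by its distance from the viewer.

```python
# Rough illustration of the "scale" rule: on-screen size is proportional to
# real size / distance (pinhole-camera relationship). Values are examples only.
def apparent_size_px(real_height_m: float, distance_m: float,
                     focal_length_px: float = 800.0) -> float:
    """How tall a virtual object should be drawn, in pixels, at a given distance."""
    return focal_length_px * real_height_m / distance_m

virtual_statue_height = 2.0  # metres
for distance in (1.0, 2.0, 4.0):
    # Step closer and the statue must be drawn larger; step back and it shrinks.
    print(distance, "m ->", apparent_size_px(virtual_statue_height, distance), "px")
```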

All of these concepts come together to mesh both physical and virtual aspects to create one enhanced world.

Applications

Now that you have a basic understanding of augmented reality, let’s talk about some cool contributions to the field. In the past few years, there have been a few major players, namely Google and Facebook. We’ll look at what they’re doing:

Google

Google, everyone’s favorite tech giant

Google has worked, and continues to work, on a variety of different projects involving AR. A few of them have really taken off and some of them…not as much.

  • First up is an initiative that intends to link Google search results to AR. With this, you could search for the word ‘shark’, for example, and by clicking a button, your screen would show a shark in AR.
  • Additionally, Google is working on image recognition software called Google Lens that allows users to get information related to whatever their camera is pointed at. This, of course, uses the power of neural networks to figure out exactly what the image is.
An example of Google Lens
  • Another area is ARCore, Google’s open development platform that lets augmented reality developers create their own AR experiences.

Facebook

  • Facebook has created AR Studio, a drag-and-drop style platform that lets users build their own AR experiences while applying complex real-world logic with just a click of a button.

Wearables

Another area that has really taken off is wearables. Wearables are exactly what they sound like: unlike your phone, this is AR that you can, well, wear. Generally, this comes in the form of a headset or, as the tech gets scaled down, even glasses. This is one of the most exciting areas of AR, because one day this tech could become so useful that it permeates nearly every part of our lives, from school to work to play. We’re still a long way from that vision, but here’s what we have now:

Google

The infamous Google Glass itself

Ah, what to say about Google. Google’s first foray into wearables was a 2014 release called Google Glass. The idea behind Glass was that it would be a pair of smart glasses navigated via voice commands and natural language processing (NLP). Eventually, the hope was, the tech would become good enough to replace the need for smartphones altogether. Which is a really cool, industry-changing idea. It’s almost 2020, nearly six years after the original release, so why isn’t everyone walking around with a pair of Google Glasses?

Well, that’s because it turned out to be a total flop.

Google overestimated the public’s willingness to adopt this technology; the world really just wasn’t ready. The issue that overshadowed all of Glass’s potential was privacy.

When Glass was originally released, it had a camera feature that allowed it to record whatever the user saw in front of them…except there was no indication that the device was recording. Yeah, see the issue? People didn’t take very well to the idea of being recorded without their knowledge. Google quickly modified the device to show a red light when the camera was in use, but by that time it was too late- people had already been turned off from using it.

Later, in 2015, Glass was discontinued for public use. In the future, any venture into smart glasses will have to take into account how important privacy is to people and find a way to address it.

That isn't to say Glass failed completely- the tech has found a home in industrial environments, in use cases such as letting the wearer check the specifications of a given object with just a look and a voice command. Remember Google’s image recognition software? It’s all coming together now.

Microsoft

Microsoft has one of the coolest devices for AR developers on the market today- Microsoft Hololens.

Example of Microsoft Hololens

Hololens is a head-mounted display (HMD) that packs literally everything onto the device itself. Remember the concept of tracking? Well, Hololens is the perfect example of inside-out tracking. Anyway, let’s go over some of the coolest features of Hololens:

  • Holograms: One of the coolest things about Hololens is its ability to produce holograms. Yes, you heard that right, holograms…well, kind of. Of course, the whole idea of AR is being able to view virtual images in the real world, but Hololens actually allows you to manipulate those objects via hand gestures. These gestures are recognized by the various sensors and cameras on the device.
  • Eye-tracking: Another thing that makes Hololens stand out is its eye tracking. Each Hololens needs to be calibrated to that specific user’s gaze, which allows the device to tell where you’re focusing and to render detail only in that specific area, just like our eyes naturally do.

The Hololens, for all its cool features, is not the future. It’s bulky, ugly, and isn’t very portable- it’s largely meant for industrial/work use, not day-to-day activity. So let’s look at an option that is more consumer-friendly.

North

Focals by North is an interesting one. The company has been working on smart glasses, but unlike Google Glass and other competitors, they’re doing something different- they’re making their glasses stylish. Oh wow, right? Why is that a big deal exactly? As superficial as it sounds, especially compared to a heavy-hitting device like the Microsoft Hololens, style is extremely important if AR wearables want to become mainstream. As sad as it sounds, it doesn’t matter how cool something is- if people look like dorks wearing it, they are not going to buy the product.

So that’s why what North is doing is so important. Most of the companies interested in AR wearables haven’t focused on this aspect of their products, and that is one reason their products are unlikely to take off.

But that’s not to say North’s glasses are amazing- the tech in them can be likened to that of a smartwatch. They have limited capabilities, few apps, and really are only a heads-up display at this time. But the possibilities of this technology, and the company’s ideas, make North one to look out for.

But why should this matter to you?

Now, I realize that was a lot of information. So, let’s make it clear what all of this actually means. AR and AR wearables are going to completely change how we look at entire industries.

Education: Imagine a world where your classroom is half virtual. I wrote an article on this, check it out here.

Workplace: Instead of relying on your computer for notifications or for writing, imagine just seeing them pop up before your eyes.

Manufacturing: AR is already having a significant effect on this industry, with workplaces becoming a lot safer because workers can view information, such as the specifications of a part, without having to avert their eyes or divert their attention away from the task at hand.

Limiting factors for wearables

So right now what’s keeping us from all rocking our very own computers on our faces? Well, lots of things:

  • Cost: Currently, wearables are expensive, but with emerging technology, cost is always an issue at first. As the tech scales down, the cost will as well.
  • Hardware/Size: The amount of space required for the sensors that inside-out tracking demands is enormous. That currently leaves companies with two options: leave out the awesome stuff we expect from AR, or make a really bulky device like the Microsoft Hololens.

When the sensors and cameras can be scaled down, that’s when the real magic will happen and you can finally toss out that outdated block we call a smartphone.

Key Takeaways

  • AR is still in its early stages, with things like Pokemon GO! and Snapchat filters
  • Tracking is the reason AR is possible
  • Companies like Google and Facebook are making development environments for AR developers
  • Wearables are going to be a big area of AR, and companies like Google, Microsoft, and North are doing cool work in the space
  • AR is disrupting and will disrupt almost every industry in some way
  • Cost and hardware space are two areas that need to be addressed before AR wearables hit the streets

Hi, I’m Hana Samad, an 11th-grade student, and an Innovator at The Knowledge Society. I’m a VR and AR enthusiast with a special interest in AR wearables.

If you’re interested in my progress feel free to:
Add me on LinkedIn!👉 www.linkedin.com/in/hana-samad14
Shoot me an email at: hanasamad14@gmail.com
