Designing for the Next Generation Spectacles

Snap Inc.'s Next Generation Spectacles showcase what building and experiencing augmented reality will look like in the future.

Lauren Cason
Geek Culture

--

Last December I found myself among a group of 7 creators approached by Snap Inc. to be the first to build experiences for the Next Generation Spectacles. The journey I went on with this project changed the way I think about AR and the future of wearables — it took me across New Mexico, chasing daylight and cell service, and involved badass poets, road trips, mysterious hums, a lot of foam core, and only a little yelling at my computer in a highway pullout. So, what did I learn along the way, and what do the Next Generation Spectacles mean for the future of AR?

The Hardware: It all fits in there?

The Spectacles are a mixed reality waveguide headset. If you haven't heard of these before, it's a type of wearable computer that blends the physical and digital worlds, allowing you to see and interact with holograms in the world around you. You may have heard of Magic Leap or HoloLens; these are both mixed reality headsets.

So what makes the Next Generation Spectacles special?

The New Snap Spectacles

Opening the box, I was struck by just how much technology they packed in. Coming in at 134g, the Spectacles have a waveguide display, 6 degrees of freedom tracking, and a 26.3-degree field of view, plus two cameras, four microphones, speakers, and a touchpad. That's a lot for a pair of sunglasses. Now, don't get me wrong, they are quite large sunglasses. I have a small head and they look pretty big on me, but I think some of the other creators like Leighton did make them look pretty cool. Even though they're large on my face, they don't feel big. They're comfortable — I wore them for hours without getting that headset crick in my neck, and the stems of the glasses adjust nicely to the size of my head for a snug fit.

Getting people into them is also refreshingly easy. There are no straps, no calibration to go through, and they turn on when you take them out of the case. I watched friends put them on for the first time over the past few days as the project was announced, and have loved how easy those interactions were. One thing I would change: I wish I could have adjusted for my IPD (the distance between your pupils). I have close-set eyes, so I struggle to focus with devices that rely on an average IPD.

Testing out the Spectacles on site at the Valles Caldera

Another thing that impressed me was just how well they worked outdoors. With 2,000 nits of brightness, I was able to see my experiences clearly on bright days in the intense high desert sun. The tint of the glass helps with this, but it also made them a little strange to work with indoors. Wearing sunglasses inside is weird. A solid trade-off, though: seeing holograms actually working, outside, in the world, was definitely worth having sunglasses on in the studio.

The biggest limits were battery life, thermals, and the field of view. The battery lasted somewhere between 15 and 30 minutes of continuous use, but I would generally hit thermal limits and need to let the device cool down before I drained the battery (my files were FAR from efficient, so take that with a grain of salt). The field of view is small, a little smaller than the original HoloLens, and interestingly, just barely taller than it is wide. I actually liked the vertical orientation; the things I'm looking at in AR, at least for my projects, are often faces, bodies, and posters, and those are generally vertical things. So, this felt pretty good.

The thing that makes this headset really interesting though isn’t the hardware.

The Software And The Ecosystem: The Real Strength Of The Next Generation Spectacles

We’ve seen some sunglasses-form-factor AR glasses in recent years. Nreal comes to mind, and they’re super cool! But what so many of these early headsets lack is a clear, accessible pipeline for creating experiences and getting them into people’s hands. This is where the Spectacles shine — they fit seamlessly into the Snap AR ecosystem, and that has huge implications.

Prototyping for the Spectacles in Lens Studio was lightning fast.

So first, let’s look at Lens Studio. Snap Lens Studio has been around since 2017 — it’s a free tool that lets you build and publish lenses on Snapchat. And it’s kind of ridiculously powerful. It’s like a streamlined version of Unity, just for AR. Iterating with it is so fast. When I’m working on a Unity project, I’ll often sketch it out in Spark (Facebook’s social AR tool) or Lens Studio first, because the time between having an idea and playing with a prototype can be minutes.

This reduced friction made working through ideas on the Spectacles absolutely fly in a way I’ve never experienced with a mixed reality headset. It was just… easy. If I have my Spectacles, MacBook, and phone on the same Wi-Fi network, I can click a button in Lens Studio and send the experience to the device in seconds. The file sizes were small enough that I was able to work off a hotspot in remote locations and push to the device every couple of minutes. The preview in Lens Studio mimicked the waveguide behavior really well. Videos I recorded on the headset just show up on my phone, no need to plug in or download. And with the Snapchat ecosystem, I already have an enormous audience for my work. There are no bespoke native apps to download in order to interact with the hardware, or with the humans wearing the hardware. And it’s easy for people without the Spectacles to see the work on their phones.

To me, this glimpse into a smooth, friendly creation and publishing experience felt way more futuristic than the headset itself. My personal takeaway from working with the Spectacles is much more about what the future of AR creation and platforms looks like than about the future of AR glasses (but the glasses are super cool).

So, what did creating for the Spectacles look like?

Design Exploration: The Future Is… Cubes! Weather Vanes? Road Signs!

I knew from the outset of this project that I wanted to make something tied to an object or a place. I love AR that’s grounded in the physical world: experiences that expand my understanding, appreciation, and love for the world I’m already in, rather than transporting me to entirely new realities.

A fidget toy that teaches square breathing

I went through a couple of ideas before landing on road signs. I started with a mindfulness cube toy that taught square breathing (it looked too much like the Merge Cube).

Weather Vane Sketch

Then I got really interested in weather vanes (I still want to make weather vane art, so hit me up, weather friends!). And then, over a dinner conversation with some very smart people, someone brought up historic road markers, and it clicked.

I’ve always loved historic road markers. Much to the despair of the people who go on road trips with me, I absolutely need to stop at all of them.

These funny signs in highway pullouts came about in the 1930s, when the idea of the great American road trip was just taking off. States were looking for ways to coax visiting motorists off the roadways, hoping they would spend their money and time at local landmarks. And so, they started putting up signs. New Mexico has over 700 signs, and in 2007 the New Mexico Historic Women’s Marker Initiative was started, erecting 75 markers across the state that share the history of women.

The signs have a lot going for them for an AR project — they are big and very high contrast, so they work great for image tracking. (If you aren’t familiar with image tracking, it’s when a device, say your phone or the Spectacles, detects a known 2D image and then places augmented reality content in the world based on that image’s location.)
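
If it helps to picture what that looks like in practice, here’s a rough TypeScript sketch of the core idea: once the device hands you the detected marker’s pose, you place your content as an offset expressed in the sign’s own coordinate frame. The types and function names here are mine, made up for illustration; they are not the Lens Studio API or my project code.

```typescript
// Hypothetical types for illustration -- not the Lens Studio API.
type Vec3 = { x: number; y: number; z: number };
type MarkerPose = { position: Vec3; forward: Vec3; up: Vec3 };

// Place a piece of AR content at an offset from a detected sign,
// expressed in the sign's own coordinate frame (e.g. "1 m in front of
// and 0.5 m above the center of the marker").
function placeRelativeToMarker(pose: MarkerPose, offset: Vec3): Vec3 {
  const right = cross(pose.up, pose.forward);
  return {
    x: pose.position.x + right.x * offset.x + pose.up.x * offset.y + pose.forward.x * offset.z,
    y: pose.position.y + right.y * offset.x + pose.up.y * offset.y + pose.forward.y * offset.z,
    z: pose.position.z + right.z * offset.x + pose.up.z * offset.y + pose.forward.z * offset.z,
  };
}

function cross(a: Vec3, b: Vec3): Vec3 {
  return {
    x: a.y * b.z - a.z * b.y,
    y: a.z * b.x - a.x * b.z,
    z: a.x * b.y - a.y * b.x,
  };
}
```

In a tool like Lens Studio you get this mostly for free by parenting content to the tracked marker, but the mental model is the same: everything is positioned relative to the sign, so if the sign is found, the whole experience lands in the right place.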

The signs are also meant for you to pull over and experience — this is one of my critiques of many image tracking experiences. They happen in thoroughfares, hallways, busy streets, on billboards I’m supposed to be driving past at 75 mph… not super ideal. But these rocky pullouts on state roads? Perfect!

So then, the question is, what happens at these road signs? There were three directions I was interested in exploring.

How do you expand on these signs?

The first was education. How could AR expand the educational content of a sign? Could you label the geography of the landscape it taught you about? See the historic figure next to their plaque? Reconstruct an archeological site?

The second was hidden layers. Many of these markers are, well, kind of boring (don’t come at me, historic marker dads). They might tell me the population of a town, the governor who founded it, and what year. But you know, New Mexico has some seriously cool, weird history that probably isn’t gonna get a government-funded plaque. What if there was an alternate historic marker universe that revealed legends and lore and beautiful strange things about places?

And the third was leaning into the New Mexico Historic Women’s Marker Initiative and treating it as a curatorial project: working with local New Mexican women artists to create site-specific experiences that would elucidate the lives of these amazing historic women through the art of women who live here today.

In the end, I decided to do one of each.

Art Direction: Leaning into Light

Before I started building, I also had some particular things I wanted to explore in terms of art and style.

Waveguide displays work by diffracting and redirecting light to form an image, which is then projected into the eye. The images you see when wearing these types of glasses are made of light.

This means you can’t render black or make something look darker, only lighter. The images are additive, which can be limiting. That said, human perception is a funny thing. Just look at this image:

The grey bar is all one color.

How far could I push this kind of optical illusion to create the feeling of a greater range of luminosity?
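
To make the additive constraint concrete, here’s a toy model in TypeScript of what the eye sees through a tinted waveguide. The numbers, including the tint factor, are made up for illustration; they are not Spectacles specs.

```typescript
// A simplified model of what the eye sees through a tinted waveguide:
// the display can only ADD light on top of the (tinted) world, so a
// "black" pixel is just the world showing through, never darker.
// Values are rough linear luminance in [0, 1]; the tint factor is a
// guess for illustration, not a Spectacles spec.
function perceivedLuminance(worldLuminance: number, displayLuminance: number, tint = 0.3): number {
  const tintedWorld = worldLuminance * tint;           // the sunglasses darken the scene
  return Math.min(1, tintedWorld + displayLuminance);  // the display adds light on top
}

// A "black" hologram pixel (display = 0) looks identical to the background:
console.log(perceivedLuminance(0.8, 0.0)); // 0.24 -> just the tinted world
// A white pixel reads as brighter than its surroundings:
console.log(perceivedLuminance(0.8, 0.6)); // 0.84
```

Because a hologram can only add to the world’s luminance, “dark” has to be faked with contrast against brighter neighboring pixels, which is exactly what illusions like the grey bar above exploit.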

I also wanted to explore a quieter, monochromatic, minimal style. So much AR is psychedelic maximalism (which I love, and did a lot of at Meow Wolf) and I wanted to challenge myself to explore other aesthetics. I settled on using white as my only color, leaning into a ghostly, ephemeral look, like light punching through a veil.

So, now I had my concept and my art direction. Time to start building!

Production: Building the Lenses (and some fake road signs)

The first lens I built was Caldera — the education-based lens. Built around the Valles Grande Historic Marker, it lets you see a topographical map of the Valles Caldera, a 13-mile-wide volcanic caldera formed in an eruption about 1.25 million years ago.

Caldera

As the first lens, this is where a LOT of the learning happened. The first lesson? Testing at home was hard. I needed to stand in front of the sign, walk around it, and back up in order to understand how the lens would feel, and driving out to the signs all the time wasn’t feasible.

My Beautiful Sign.

So, the first thing I did was make a big fake sign out of foam core with the same dimensions as the real marker (if I had thought a little more, I might have made it pretty for everyone on the internet). I was able to set this up at home and iterate reasonably well. I wasn’t going to get the same lighting conditions, but it let me work out things like scale and placement. If I could go back, I would have made sure it was the same color as the historic markers. The darker background on the fake sign definitely made me believe things were working visually that weren’t.

I also found that reading text in world space on the Spectacles wasn’t a great experience. Any sort of lag made reading pretty headachey, and moving around blocks of static text just didn’t feel particularly exciting. I had originally planned to have more text explaining geological features, but went with a voice-over instead that explains the geography as you look at it.
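
The “as you look at it” part is simpler than it sounds. Here’s a minimal TypeScript sketch of one way to do it, assuming you can read the headset’s position and forward direction each frame; every name here is illustrative, not my actual project code or the Lens Studio API.

```typescript
// Minimal sketch of "explain the geography as you look at it":
// compare the headset's forward direction with the direction to each
// labeled feature, and narrate the one the wearer is most closely facing.
type Vec3 = { x: number; y: number; z: number };
type Feature = { name: string; position: Vec3; narrationClip: string };

function normalize(v: Vec3): Vec3 {
  const len = Math.hypot(v.x, v.y, v.z) || 1;
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

function dot(a: Vec3, b: Vec3): number {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Returns the feature the wearer is facing, if any is within ~15 degrees.
function featureInGaze(head: Vec3, forward: Vec3, features: Feature[]): Feature | null {
  const threshold = Math.cos((15 * Math.PI) / 180);
  let best: Feature | null = null;
  let bestDot = threshold;
  for (const f of features) {
    const toFeature = normalize({
      x: f.position.x - head.x,
      y: f.position.y - head.y,
      z: f.position.z - head.z,
    });
    const alignment = dot(normalize(forward), toFeature);
    if (alignment > bestDot) {
      bestDot = alignment;
      best = f;
    }
  }
  return best;
}
```

A small angle threshold plus a short dwell time before playing the narration clip keeps it from firing every time you glance around the landscape.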

Anita

The second lens I built was Anita, which is based around Anita Scott Coleman’s sign in Silver City. Anita Scott Coleman was a poet and essayist who lived in New Mexico at the turn of the century and wrote about the Black experience in the Southwest. For this piece, I worked with local Albuquerque poet Ebony Isis Booth, who performs one of Anita Scott Coleman’s poems, “Portraiture.” When you stand in front of the marker, it’s replaced with a portrait of Anita — a grove of trees fades in, and you hear Ebony’s rendition of the poem. Anita and her work are remarkable in ways that just can’t be fully conveyed on a road sign, and my hope with this lens was to bring more depth to the place where we commemorate her.

With the Anita lens, I learned a lot about the art style. The aesthetic I was going for really leaned into the properties of a waveguide headset, but I had overlooked one of the key things about the Spectacles — the ease with which you can capture and share video! The style relied heavily on the fact that the glasses are tinted, that black isn’t rendered, and that white reads as bright, additive light. The Spectacles’ video capture, though, does render black, doesn’t have that layer of tint from the sunglasses, and the white isn’t brighter than the environment. Luckily, the Snap team worked with me to create a special camera in Lens Studio that allowed me to get captures closer to the real-life experience. I absolutely love the way these lenses feel in the Spectacles, but if I could go back, I would think more about the way many people would experience these pieces, which is as a video or a Snap.

Blocking out the experience with a Scaniverse scan.

Another process thing that made working on the lenses easier was getting a photogrammetry scan of a sign using Scaniverse. I love using Scaniverse in my AR pipeline: I can get good-enough scans off my phone in minutes and start sketching things out in Maya based on the real environment. This helped so much when figuring out things like the placement of the trees for the Anita lens.

Hum

The last lens I built was Hum. Hum lives at the Taos historic marker and is an audio visualizer of the Taos Hum, a legendary low buzzing sound that about 2% of the population say they can hear throughout the town of Taos. The thing was, while I was building Hum, there wasn’t really a way to create an audio-reactive lens in Lens Studio. The workaround I used was pre-rendering the visualization in Maya using MASH and bringing it in as an animation. Lens Studio 4.0 has some very cool new audio analysis tools, so I’ll definitely be revisiting this lens to make it more reactive.
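
When I do revisit it, the core of an audio-reactive version is pretty small. Here’s a rough TypeScript sketch of the idea, assuming you can get a frequency spectrum for the current audio frame from whatever analysis tool you’re using; none of this is the Lens Studio 4.0 API, just the shape of the logic.

```typescript
// Sketch of an audio-reactive visualizer: map the low-frequency energy
// of the hum to a visual parameter each frame.
function bandEnergy(spectrum: number[], loBin: number, hiBin: number): number {
  let sum = 0;
  for (let i = loBin; i <= hiBin && i < spectrum.length; i++) {
    sum += spectrum[i] * spectrum[i];
  }
  return Math.sqrt(sum / Math.max(1, hiBin - loBin + 1));
}

// Smooth the energy so the visual breathes instead of flickering,
// then convert it to a scale factor for the visualizer geometry.
let smoothed = 0;
function visualizerScale(spectrum: number[]): number {
  const energy = bandEnergy(spectrum, 0, 8); // roughly the low "hum" bins
  smoothed = smoothed * 0.9 + energy * 0.1;  // simple exponential smoothing
  return 1 + smoothed * 2;                   // base scale plus a reactive bump
}
```

Feed something like visualizerScale into whatever the visualizer exposes (scale, emission, displacement) and the geometry breathes with the hum instead of playing back a baked animation loop.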

What’s next?

Coming off this project, I see a lot of ways to expand it:

  • Updating signs with AR. One of the things that often disappoints me about historic markers is how many of them honor terrible histories: a conquistador who murdered hundreds, a dam that flooded and destroyed sacred land, a Confederate general. Can we put this technology in the hands of the people on the other sides of those stories, and get their history onto the historic markers?
  • Reconstructing and revealing archeological sites using AR. AR could let me see into a chamber that’s too delicate to excavate, or see what a mural looked like in its full glory 400 years ago.
  • Diving deeper into the New Mexico Historic Women’s Marker Initiative and creating a curated AR art road trip of works by contemporary New Mexican women, based around these historic markers.
  • Expanding the project to other state marker programs or plaques.
  • Exploring what geolocated land acknowledgements would look like in AR. Could AR create a layer to our vision that builds a deeper understanding of the stolen land we live on?

For most of these projects, I lack either the cultural knowledge, the background, the funds, or some combination of the three to undertake them alone. But if you are a person or organization who does have some of those things, and you’d like to explore one of these ideas, let’s talk.

At the end of the day, what makes AR exciting is that it can help you see things differently. It’s not VR; we aren’t going into fantastical new realities. We’re looking at our own flawed, beautiful, messy, real world from new perspectives. To me, at least, that’s what’s special about it. That’s the way I want this technology to fit into our lives in the future. Working with the Next Generation Spectacles made building that future feel a little bit closer.

Thanks for reading!

Lauren Cason is an award-winning Creative Technologist and XR Creative Director based in Santa Fe, New Mexico.

You can follow her on Instagram, Twitter, Snapchat, or contact her through her website.
