PCB Design for Virtual and Augmented Reality Headsets
We’ve come a long way in the AR/VR space, and it seems like we’re going to have this stuff whether we want it at the moment or not. It’s a bit like the Northwest Passage through the ice cap: it’s new, we’re not sure what the end result looks like, but we’re charging ahead toward a virtual and/or augmented future.
A Brief History of Virtual Reality
Set the way-back machine to 1939, the year that both my father and the View-Master stereoscope arrived on the scene. This wasn’t long after Kodachrome was invented, so it was cutting edge at the time. You slide a circular reel into the slot and browse seven different views; by isolating each eye on one of two nearly identical slides, the viewer somehow tricks the brain into seeing depth.
Figure 1. Image Credit: Target - They still make these View-Masters 85 years after introduction.
Back in real reality, this technology still has a lot of room to grow. It was about a decade ago when virtual reality started to bubble up into the lexicon at Google. We knew that a new industry was coming into existence and wanted to at least provide a gateway to the content. A group adjacent to the Chrome team developed a product called “Cardboard” that reminded me of the View-Master.
The difference between the View-Master and Cardboard is that each eye gets its own video stream instead of a slide show. Add in some audio tailored to each ear and presto, you’ve got virtual reality. Note that companies like HTC were already trailblazing the VR space back then; they continue to set the pace with a wide range of products for consumer and enterprise applications.
Figure 2. Image Credit: RoadToVR.com - Just insert a smartphone and you’re in a form of virtual reality.
Creating the content for VR is pretty fascinating: not just a green screen but a green world for the actors to play in. The rigging, lighting and multiple camera placements capture the action in a way that lets the user be immersed in the synthetic worldscape. It can be a little disorienting the first time you wear the headset.
It’s hard to say exactly where all of this is leading, but immersive video games seem to be the essential starting place. While I was working on Daydream VR around 2015, I had the idea of remotely touring real estate as a possible application. Spurred by the pandemic, we now have that as an option in some apartment complexes. I still see virtual tours and remote learning as potential VR opportunities, though Daydream itself was discontinued in 2019.
Why AR/VR is Different in Terms of PCB Layout
Wearable technology has to deal with the human element, and we’re not made out of rectangles. The printed circuit boards that fill the volume of the headset have to adapt to the shape of an average-sized person’s head while providing the means to accommodate both larger and smaller people. Parts of the system have to articulate as necessary.
Another challenge particular to streaming graphics is the sheer amount of computational work it demands of the system. A device that warms your pocket is one thing; warming up your forehead is quite another. The little flat blower fans that inhabit most laptops also find their way into mixed reality headsets. They are about as thick as the printed circuit board and often take up residence in a cut-out that would remind one of a nautilus shell, since those fans use the same golden-ratio spiral to move air. Expect passive heat spreaders in addition to active cooling.
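To see why a fan earns its place on your forehead at all, here is a back-of-envelope sketch of the temperature rise from a mobile SoC pushing graphics through a passive heat spreader. Every number is an assumption for illustration, not a measurement from any real headset.

```python
# Back-of-envelope temperature-rise estimate for a headset SoC.
# All numbers are assumptions for illustration, not measured values.

soc_power_w = 5.0            # sustained SoC + memory dissipation, watts (assumed)
r_junction_to_case = 2.0     # degC per watt, die to package top (assumed)
r_case_to_spreader = 1.5     # degC per watt, TIM plus graphite spreader (assumed)
r_spreader_to_air = 8.0      # degC per watt, passive convection only (assumed)
r_spreader_to_air_fan = 3.0  # degC per watt, with a small blower running (assumed)

ambient_c = 25.0

def junction_temp(r_to_air: float) -> float:
    """Series thermal-resistance model: Tj = Ta + P * sum(R)."""
    return ambient_c + soc_power_w * (r_junction_to_case + r_case_to_spreader + r_to_air)

print(f"Passive only: {junction_temp(r_spreader_to_air):.1f} degC at the junction")
print(f"With blower:  {junction_temp(r_spreader_to_air_fan):.1f} degC at the junction")
```

With these made-up numbers the passive-only case lands north of 80 °C at the die, which is the kind of result that gets a nautilus-shaped cut-out added to the board.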
The main logic board of an AR/VR system has everything a smartphone has other than the actual phone call part. The SoC (System on Chip) that went into the Daydream headset was the same SoC that went into the Pixel 3 phone; I had worked on the phone project just before jumping over to virtual reality. Mobile chips are the thing for obvious reasons. The typical stack-up is going to be 12 layers with microvias available on every layer.
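For readers who want to picture what that looks like, here is a minimal sketch of a hypothetical 12-layer any-layer HDI stack-up expressed as data. The layer ordering and copper weights are placeholder assumptions, not the actual Daydream or Pixel 3 recipe.

```python
# Hypothetical 12-layer any-layer HDI stack-up, top to bottom.
# Layer roles and copper weights are illustrative assumptions only.
stackup = [
    ("L1",  "signal / component side", "0.5 oz + plating"),
    ("L2",  "ground",                  "0.5 oz"),
    ("L3",  "signal",                  "0.5 oz"),
    ("L4",  "power",                   "0.5 oz"),
    ("L5",  "signal",                  "0.5 oz"),
    ("L6",  "ground",                  "0.5 oz"),
    ("L7",  "ground",                  "0.5 oz"),
    ("L8",  "signal",                  "0.5 oz"),
    ("L9",  "power",                   "0.5 oz"),
    ("L10", "signal",                  "0.5 oz"),
    ("L11", "ground",                  "0.5 oz"),
    ("L12", "signal / component side", "0.5 oz + plating"),
]

# "Any-layer" HDI means a laser-drilled microvia can join any adjacent pair of
# layers, and stacked microvias can carry a connection from L1 down to L12.
for name, role, copper in stackup:
    print(f"{name:>4}  {role:<25} {copper}")
```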
Flex Circuits Are Endemic to Mixed Reality
Flex circuits get a lot of use in consumer electronics, and this is especially so with goggles and headsets. The unique anatomy that each one of us has means that the system has to be adjustable in specific ways. Joining the HoloLens team exposed me to even more flexes than all of the ones I had sent off to the Original Design Manufacturer (ODM) while at Google. It was an eye-opening experience; flexes and rigid-flexes can be far more than a purpose-built flat cable.
Think of the number of different frame sizes available at the optometrist. The distance between our eyes, the interpupillary distance (IPD), is a key factor that has to be dialed in for the system to work properly. Meanwhile, our eyes are monitored by separate cameras in order to figure out which way we are looking while wearing the apparatus. The software uses that information to maximize the resolution of whatever virtual thing you’re actively observing, a technique known as foveated rendering.
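As a loose illustration of how eye tracking feeds foveated rendering, here is a minimal sketch that maps a gaze direction onto the patch of the display that deserves full resolution. The panel resolution, field of view and fovea radius below are assumptions, not numbers from any shipping headset.

```python
# Minimal foveated-rendering sketch: map a gaze angle to the patch of the
# display that should be rendered at full resolution. All numbers are assumed.

DISPLAY_W, DISPLAY_H = 2160, 2160   # per-eye panel resolution (assumed)
FOV_DEG = 100.0                     # horizontal/vertical field of view (assumed)
FOVEA_RADIUS_PX = 300               # size of the full-resolution patch (assumed)

def gaze_to_pixel(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Project a gaze angle (relative to straight ahead) onto panel coordinates."""
    half_fov = FOV_DEG / 2.0
    x = (yaw_deg / half_fov + 1.0) * 0.5 * DISPLAY_W
    y = (1.0 - (pitch_deg / half_fov + 1.0) * 0.5) * DISPLAY_H
    return int(max(0, min(DISPLAY_W - 1, x))), int(max(0, min(DISPLAY_H - 1, y)))

# Example: the user looks 10 degrees right and 5 degrees up.
cx, cy = gaze_to_pixel(yaw_deg=10.0, pitch_deg=5.0)
print(f"Render full resolution in a {FOVEA_RADIUS_PX}px circle around ({cx}, {cy})")
```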
It’s All About Control in the Virtual and Augmented Worlds
There are controllers that usually come with the device, and they work in one of two ways. One approach uses at least two base stations in the VR space that sweep the room with infrared laser beams; photodiodes on the headset and controller(s) time those sweeps to work out where they are. This is the Lighthouse tracking system found in HTC products. Meanwhile, Meta goes with infrared LEDs on the controllers and tracks them with cameras on the headset rather than photodetectors; that approach is known as inside-out tracking.
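To make the base-station idea concrete, here is a minimal 2D sketch that locates a tracked point from the bearing angles observed at two stations with known positions. Real Lighthouse tracking works in 3D from precise sweep timing, so treat this purely as an illustration of the underlying geometry; the positions and angles are made up.

```python
import math

# 2D triangulation sketch: two base stations at known positions each observe a
# bearing (angle) to a tracked controller. Intersect the two rays to locate it.
# Positions and angles are illustrative values only.

def triangulate(p1, theta1, p2, theta2):
    """Intersect rays from p1 at angle theta1 and p2 at angle theta2 (radians)."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("Rays are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Stations in opposite corners of an assumed 4 m x 3 m play space.
station_a, station_b = (0.0, 0.0), (4.0, 3.0)
x, y = triangulate(station_a, math.radians(40), station_b, math.radians(200))
print(f"Controller estimated at ({x:.2f} m, {y:.2f} m)")
```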
The new spatial computer from Apple does away with the controllers, preferring hand gestures that you learn for different ways of manipulating items in mixed reality. I recall gestures being used on early Pixel phones, and that wasn’t the most popular user experience back then. Meanwhile, today’s content creators have high praise for the hand tracking on the Vision Pro platform.
Credit no fewer than 12(!) cameras plus a LiDAR unit for the overall performance. That’s even more than a Tesla. What I know about it is just from Apple’s marketing material. Augmented reality adds content to the real world; virtual reality is closer to a video game where the world is 100% special effects.
Either way, if you have to have a controller, you want six degrees of freedom: translation along three axes (up/down, left/right, forward/back) plus rotation about them (pitch, yaw and roll). It should be a simple design, and it has to be robust enough that you can break the wall-mounted television with it. For some reason, that seems to be the most video-worthy failure mode of virtual reality. Does artificial intelligence want you to punch your TV right in the mouth? The less amusing failure mode is neck strain.
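A minimal way to picture six degrees of freedom in code is a pose made of three translations and three rotations. The field names below follow one common convention and are not the API of any particular tracking SDK.

```python
from dataclasses import dataclass

# Six degrees of freedom: three translational axes plus three rotational axes.
@dataclass
class Pose6DoF:
    x: float      # left/right, metres
    y: float      # up/down, metres
    z: float      # forward/back, metres
    roll: float   # rotation about the forward axis, degrees
    pitch: float  # rotation about the left/right axis, degrees
    yaw: float    # rotation about the up/down axis, degrees

# A controller held slightly forward and tilted down toward a virtual table.
controller = Pose6DoF(x=0.1, y=1.2, z=-0.4, roll=0.0, pitch=-30.0, yaw=15.0)
print(controller)
```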
When It Comes to Wider Adoption, Less Is More
It’s not a huge leap to conclude that the only way the industry is really going to take off is to take considerable mass away from the whole system. The various circuits that make up the product have to be further integrated into chipsets that combine more features, and that can be hard to do, especially when it comes to the plethora of sensors.
The interoperability of the various sensors depends on board layout to some extent. Every antenna, especially the unplanned ones, creates a new dimension of coexistence issues. Simply put, one more antenna or noise source doesn’t add to the possible emissions problems, it multiplies them.
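A quick way to see the “multiplies, not adds” point is to count the pairwise interactions: every new radio or noise source has to coexist with everything already on the board, so the number of pairs grows as n(n−1)/2. The sketch below simply enumerates those pairs for an assumed set of sources.

```python
from itertools import combinations

# Every noise source has to coexist with every other one, so the number of
# pairwise interactions grows as n*(n-1)/2. The source list is assumed.
sources = ["WiFi", "Bluetooth", "display driver", "camera MIPI", "DC-DC switcher"]

pairs = list(combinations(sources, 2))
print(f"{len(sources)} sources -> {len(pairs)} coexistence pairs to analyze")
for a, b in pairs:
    print(f"  {a} <-> {b}")

# Add one more source and the pair count jumps from 10 to 15.
pairs_plus = list(combinations(sources + ["60 GHz link"], 2))
print(f"{len(sources) + 1} sources -> {len(pairs_plus)} pairs")
```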
Just combining WiFi and Bluetooth on a single radio chip is a huge challenge. To get the most out of the experience, we want the new WiFi 6 along with the 60 GHz WiGig band (802.11ad/ay) that can replace the HDMI cable. Meanwhile, everything from batteries to memory chips has to shrink down for this to really work for everyone.
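To see why the cable-replacement link needs so much headroom, here is a rough, assumed-numbers estimate of the raw video bandwidth a headset consumes. It is the kind of arithmetic that pushes designs toward 60 GHz links or aggressive, foveated compression.

```python
# Rough uncompressed video bandwidth for a headset. All numbers are assumptions.
width, height = 2160, 2160     # per-eye resolution
eyes = 2
fps = 90                       # refresh rate
bits_per_pixel = 24            # 8-bit RGB

raw_bps = width * height * eyes * fps * bits_per_pixel
print(f"Raw video: {raw_bps / 1e9:.1f} Gbit/s")   # roughly 20 Gbit/s

# Even with 10:1 compression that is ~2 Gbit/s of sustained wireless throughput,
# which is more than typical WiFi can comfortably guarantee in a busy home.
print(f"With 10:1 compression: {raw_bps / 10 / 1e9:.1f} Gbit/s")
```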
In the marketplace, the early adopters are always the ones who pay the price for getting a new product off the launch pad. Count camcorders and DVD players among my early-adopter purchases. Maybe someday we’ll be using a device on our heads to fly through a virtual printed circuit board, looking for trouble from the signal’s perspective. Our virtual future will arrive in one form or another. It will take time, but not another 85 years.