
Making Sense Out of Sensors

The human body, like that of any other animal, is a marvel for its ability to take in data from its surroundings. We can detect light, temperature, pressure, aroma, and sound. Some members of the Earth's fauna can see, feel, taste or hear things beyond our normal input functions. In order to make our machines more useful, we have been imparting these human and superhuman sensory attributes upon them.

Some of the senses we give to our machines:

  • Camera vision

  • Thermometer/infrared detection

  • Hall Effect Sensor

  • Chemical Sniffer

  • Microphone

  • Haptics Sensor

  • Gyroscope

  • Altimeter

  • Radar

  • Sonar

  • LiDAR

  • And much, much more!

Notice that our machines can possess a wider palette of senses than even the most perceptive person. What sets us apart is the organic fusion and processing of our limited set. Programming a machine to learn and condition itself based on sensory input from its environment can be accomplished only in minuscule slices compared to what an organic creature does. The limits of artificial intelligence and machine learning set us people apart - for now.


Man and machine

Image credit: Endouble

A general rule for most of the really small sensors is that they require space equal to nine times their actual footprint. Draw a tic-tac-toe figure with the component taking up the middle square and create a component keep-out with the remaining eight. That's a minimum, and most of these sensors will work better with more space. The rule may apply to components connected to the sensor as well. Consider all inner layers before creating any type of power plane or trace that is not related to the sensor. It should be no surprise that sensors are sensitive.
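That tic-tac-toe rule is easy to turn into a quick sanity check during placement. A minimal sketch (the 3 mm package below is a hypothetical example, not a specific part):

```python
def keepout_mm(width_mm, height_mm):
    """Minimum keep-out for a small sensor: a 3x3 grid of the
    component's own footprint, with the part in the center square,
    i.e. nine times the area. Returns (keepout_w, keepout_h, area)."""
    kw, kh = 3 * width_mm, 3 * height_mm
    return kw, kh, kw * kh

# A hypothetical 3 mm x 3 mm MEMS package needs a 9 mm x 9 mm zone.
print(keepout_mm(3.0, 3.0))  # (9.0, 9.0, 81.0)
```

Remember this is a floor, not a target; the app notes below tighten it further.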

Here are a few layout application notes for a typical sensor.

• No components or vias should be placed at a distance less than 2 mm from the package land area.

• Signal traces connected to pads should be as symmetric as possible. Put dummy traces on the NC pads in order to have the same length of exposed trace for all pads.

• No copper traces should be on the top layer of the PCB under the package.

And so on. Different sensors will have their own special notes. Getting these items to work on the PCB layout was initially frustrating but became fascinating as I took a deeper dive into their inner workings. I’ll share what I found to be most noteworthy.

The camera gains resolution with every iteration. The CCD processes more colors and with truer contrast to provide rich images in real time. Combined with night-vision technology to solve the seeing-in-the-dark problem, we can perceive our surroundings with unnatural acuity. A big enough lens can pull distant celestial bodies right into our lap. An array of cameras can be used for motion capture and 3D rendering. The MIPI interface is the common transport for camera data, while DisplayPort is used for projecting the images to our screens and headsets.
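As a toy illustration of how a camera array recovers 3D, the classic pinhole-stereo relation converts the pixel shift of a feature between two views into depth. The focal length, baseline, and disparity below are made-up numbers for a hypothetical rig:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic two-camera stereo relation: Z = f * B / d.
    focal_px: focal length in pixels; baseline_m: camera separation;
    disparity_px: horizontal pixel shift of a feature between views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 10 cm baseline, 16 px disparity.
print(depth_from_disparity(800, 0.10, 16))  # 5.0 meters
```

The inverse relationship is why nearby objects (large disparity) are ranged far more precisely than distant ones.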

Thermocouples and infrared "guns" detect relative temperatures either by touching the hot spot or by observing the temperature gradient from beyond the surface. Thermistors are, as the name implies, resistors that react to temperature. Generally, as things get hotter, the thermistor's reading is used to throttle the device so that its junction temperatures never exceed the safe range. An infrared sensor is basically the same as a camera but takes in a wider spectrum.
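The throttling decision usually starts with the standard beta model for an NTC thermistor. A minimal sketch, assuming a generic 10 kΩ/3950 part and an 85 °C limit (both illustrative values, not from any particular datasheet):

```python
import math

def ntc_temp_c(r_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Beta-model temperature for an NTC thermistor:
    1/T = 1/T0 + (1/beta) * ln(R/R0), with T in kelvin."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / beta
    return 1.0 / inv_t - 273.15

def should_throttle(r_ohms, limit_c=85.0):
    """Firmware-style check: throttle when the sensed temperature
    exceeds an assumed safe limit."""
    return ntc_temp_c(r_ohms) > limit_c

print(round(ntc_temp_c(10_000.0), 1))  # 25.0 at the nominal resistance
```

Because the part is NTC, resistance falls as temperature rises, so a low reading is the hot-side warning.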

Hall effect sensors operate by detecting magnetic fields. While not true magnetometers, they are excellent for deciding, for instance, whether your car's door is ajar or your laptop is open. The Hall effect can be used to illuminate a warning light or power up the device. In this binary function, they are far more reliable than a mechanical switch. Like many sensors, they are fussy about their surroundings and their power supplies. We used a "prox-flex" to mount one near the edge of the screen, where it would couple magnetically with the other half of the case when it was closed. That would trigger the Chromebook light show.
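Part of what makes the binary function so reliable is hysteresis: separate trip and release thresholds keep a field reading hovering near the switch point from chattering the way a mechanical contact might. A sketch of that lid-open logic, with threshold values in millitesla chosen purely for illustration:

```python
class LidSensor:
    """Hall-effect-style lid detector with hysteresis: the field must
    rise above close_mT to register 'closed' and fall below open_mT
    to register 'open' again. Thresholds are illustrative."""
    def __init__(self, close_mT=5.0, open_mT=2.0):
        self.close_mT = close_mT   # strong field: magnet nearby, lid closed
        self.open_mT = open_mT     # weak field: magnet gone, lid open
        self.closed = False

    def update(self, field_mT):
        if not self.closed and field_mT >= self.close_mT:
            self.closed = True
        elif self.closed and field_mT <= self.open_mT:
            self.closed = False
        return self.closed

lid = LidSensor()
print([lid.update(f) for f in (1.0, 6.0, 3.0, 1.5)])
# [False, True, True, False] - the 3.0 reading does not re-open the lid
```

Real Hall switch ICs bake this hysteresis into silicon; the software version just makes the behavior visible.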



Image credit: Author

Microphones: In all likelihood, you have one within arm's reach. Listening devices are everywhere. Just as a camera array captures 3D, an array of microphones can mimic human hearing by measuring the delay of a sound wave from one mic to another. The array can feature both omnidirectional and more focused directional mics. This allows the unit to home in on the source of the sound and dynamically emphasize the mic with the best signal-to-noise ratio. By definition, they work in the range of human hearing, but different transducers can be tuned to pick up higher or lower frequencies.
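The delay measurement turns into a bearing through simple geometry: for a far-away source, sin(θ) = c·Δt/d, where d is the mic spacing and c the speed of sound. A minimal sketch with an assumed 10 cm spacing:

```python
import math

def arrival_angle_deg(delay_s, spacing_m, c=343.0):
    """Far-field bearing of a sound source from the inter-mic delay:
    sin(theta) = c * dt / d. c is the speed of sound in air (m/s)."""
    s = c * delay_s / spacing_m
    s = max(-1.0, min(1.0, s))   # clamp rounding or overlong delays
    return math.degrees(math.asin(s))

# Two mics 10 cm apart; sound reaches one ~146 microseconds early.
print(round(arrival_angle_deg(146e-6, 0.10)))  # ~30 degrees off-axis
```

With more than two mics, the same idea is solved across every pair, which is how an array steers its emphasis toward the talker.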

Chemical sniffers have come a long way in recent times. They can detect and sort minute amounts of a specific chemical or compound in a mixture of other scents. An array of different sniffers can accomplish a wider task, but each one needs power, which can add up to some higher voltage requirements.

Progress and competition in this field are such that the inner workings of these devices are tightly held trade secrets. If you know them in detail, you are probably under a non-disclosure agreement. That makes sense when you consider that the driver for this technology is, of course, security, i.e., border control. The sniffers are also useful for health and safety coverage, such as detecting spoiled food deliveries. The highest compliment one of these things can get is to be called a dog ...one that doesn't need fire hydrant breaks.

Haptics sensors are interesting. The MacBook Pro that is helping me write this story has one, and it allows me to press on the trackpad for a click or to press harder for an alternative click reaction. I highlight a word and give the trackpad a "3D touch" to see a definition of the word. It probably does other things too. The convertible laptop/tablet devices rely on these sensors to establish how you are using them at any given moment.

Popular applications

Strain gauges, accelerometers, pressure sensors, even a solid-state compass can be packed into a VR/AR headset to help the machine interpret your movements and present an immersive experience. Swim with the whales, soar with the eagles or ride your favorite roller coaster from the same arcade bubble. Do all three at once. It’s your world.

We want our radar to be more bat-like. We have palm-sized quadcopters to avoid when our micro-drone takes to the air with a cloud of others. Sonar helps us park our cars and find schools of fish, both of which come in handy in the right circumstances.
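Parking-sensor sonar reduces to one line of arithmetic: the ping travels out and back, so the range is half the round-trip time multiplied by the speed of sound. A minimal sketch with an assumed 5 ms echo:

```python
def echo_distance_m(round_trip_s, c=343.0):
    """Range from an ultrasonic echo: the pulse covers the distance
    twice (out and back), so divide the path length by two."""
    return c * round_trip_s / 2.0

# A parking sensor hears its echo 5 ms later: about 0.86 m to the bumper.
print(echo_distance_m(0.005))
```

Fish finders work the same way, just with the slower-varying speed of sound in water (roughly 1500 m/s) in place of c.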



Image credit: Velodyne LiDAR

Of course, my favorite sensor has to be LiDAR. The point cloud that returns through the array of optical detectors paints an unblinking 360-degree view at 200 meters or more. From a board design standpoint, it may have something in common with graphics chips, with such a wide bus and tight timing budget. From a self-driving vehicle perspective, this is the game changer, the enabler that, along with a number of other pieces, will someday relieve me of the inefficiency of driving when I could be blogging.
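Each point in that cloud starts life as a range plus two beam angles and is mapped into Cartesian space; a full spin of such returns is the 360-degree view. A generic spherical-coordinate sketch (angle conventions vary by vendor, so treat this as illustrative):

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return (range, horizontal azimuth, vertical
    elevation) into an x, y, z point of the cloud."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# A return at 100 m, straight ahead and level, lands on the x axis.
print(tuple(round(v, 1) for v in lidar_point(100.0, 0.0, 0.0)))
```

Multiply that by dozens of laser channels and hundreds of thousands of returns per second and the graphics-chip comparison above starts to feel apt.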

The result of all of these sensory experiences for machines is data. Unimaginable reams of ones and zeros have to be converted into actionable information with little to no lag. That is where organic beings have the advantage. Millennium after millennium of survival of the fittest has equipped us with the tools we need to make sense of this world. Granting any of those senses to an inert collection of metals and insulators is a minor miracle. Putting all of them together in a package with a 70-year warranty might take us a while. Animatronic chat-bots aside, we the People have a lock on humanity... for now.


About the Author

John Burkhert Jr is a career PCB Designer experienced in Military, Telecom, Consumer Hardware and, lately, the Automotive industry. Originally an RF specialist, he is compelled to flip the bit now and then to fill the need for high-speed digital design. John enjoys playing bass and racing bikes when he's not writing about or performing PCB layout. You can find John on LinkedIn.
