Angle detection is difficult to achieve with modern sensors. What could this functionality offer? And what does it have to do with gecko ears?
Researchers at Stanford University have created an experimental setup that points the way toward future cameras and other light-detection systems that record both the intensity and the angle of incoming light.
The problem of angle detection
All consumer cameras on the market use image sensors (such as a CCD or CMOS) to capture still images or video. These sensors work by recording the intensity of the incoming photons.
The angle at which those photons enter the camera is not recorded. Such data, however, could be very useful for one application in particular: focus.
A camera that records both the intensity and the angle of incoming light could use that data to refocus an image in post (that is, after the image has been taken). Angular information could also help with focusing on the fly via triangulation: two angle detectors separated by a known distance can determine the distance to a light source using the sine and cosine rules from trigonometry.
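The triangulation idea can be sketched with a small worked example. This is a minimal illustration, not code from the research: it assumes two sensors at the ends of a known baseline, each reporting the angle between the baseline and its line of sight to the source, and applies the law of sines.

```python
import math

def triangulate(baseline_m, angle_a_deg, angle_b_deg):
    """Estimate the distance from sensor A to a light source.

    The two sensors sit at the ends of a known baseline; each
    reports the angle between the baseline and its line of sight
    to the source. The three angles of the triangle must sum to
    180 degrees, so the angle at the source is what remains, and
    the law of sines gives the range from sensor A.
    """
    a = math.radians(angle_a_deg)
    b = math.radians(angle_b_deg)
    source_angle = math.pi - a - b
    # Side from A to the source is opposite the angle at B.
    return baseline_m * math.sin(b) / math.sin(source_angle)

# Equilateral case: both sensors see the source at 60 degrees,
# so the source is exactly one baseline-length away.
print(triangulate(1.0, 60.0, 60.0))  # -> 1.0 (approximately)
```

The closer the two measured angles are to 90 degrees, the farther away the source is, which is why angular precision limits the usable range of this method.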
However, detecting the angle of incoming light is complex and normally requires equipment such as multiple lenses. While a nano-scale sensor would be useful (it could be fabricated directly onto the camera sensor), it runs into the problem of "sub-wavelength" detection. To see this problem in action, we can look at how the animal kingdom handles sound detection and localization.
The angle of light and the ears of the gecko
Animals whose ears are spaced farther apart than typical sound wavelengths (roughly 8 to 30 cm) can determine the direction of incoming sound from the time difference with which the sound waves reach each ear.
For example, a sound wave that reaches the right ear before the left ear must have originated from a direction closer to the right ear. This kind of localization is only possible because of the finite propagation speed of sound (roughly 343 m/s in air) relative to the speed of neural transmission: neurons can process the first ear's signal before the sound wave reaches the second ear. Animals much smaller than these common wavelengths are said to be "sub-wavelength" and cannot use this technique to locate a sound source. Many of these animals instead localize sound using an internal cavity that connects the two eardrums acoustically.
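A quick back-of-envelope calculation shows why head size matters here. This is an illustrative sketch (the example ear spacings are assumptions, not figures from the article): the largest possible arrival-time difference occurs when sound travels along the ear-to-ear axis.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def max_itd_seconds(ear_spacing_m):
    """Largest possible interaural time difference: the extra
    path length when the sound arrives along the ear-to-ear axis,
    divided by the speed of sound."""
    return ear_spacing_m / SPEED_OF_SOUND

# A human-scale head (~20 cm spacing) gives an ITD of roughly
# 0.6 ms, long enough for neurons to resolve.
print(max_itd_seconds(0.20))

# A gecko-scale head (~1 cm spacing) gives only ~30 microseconds,
# far too short to use directly -- hence the coupled eardrums.
print(max_itd_seconds(0.01))
```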
When a sound wave reaches one eardrum first, it alters the pressure in the cavity between the two eardrums, which reduces the sensitivity of the other eardrum. Although each eardrum receives a signal of essentially identical amplitude, the eardrum that detects it first affects the response of the other, and this difference is easily detected. One creature that uses this method is the gecko, whose acoustic cavity connecting the two eardrums allows it to determine the direction of a sound source.
So can this coupling technique be used to determine the angle of incoming light with sensors that are themselves "sub-wavelength"? Researchers at Stanford University have just answered this question!
Nanowires and angle detection
Researchers at Stanford University have created an experimental setup that can determine the angle of incoming light. The setup is based on coupling two silicon nanowires that interfere with each other when they receive incoming photons. The two wires, each 100 nm in both width and height, are much smaller than the wavelength of the incoming photons and are placed 100 nm apart.
When incoming photons reach one of the wires first, Mie scattering occurs, which effectively changes how much light the second wire can absorb. Because the two wires are optically coupled and the difference in their photocurrents varies with the angle of the incoming light, the angle can be easily determined.
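The readout principle can be sketched as a toy model. This is not the researchers' actual calibration: it simply assumes the normalized photocurrent difference between the two wires varies roughly linearly with the incidence angle over a limited range, with a hypothetical calibration constant `k`.

```python
def estimate_angle_deg(i_left, i_right, k=90.0):
    """Map the differential photocurrent of two coupled wires
    to an incidence angle.

    k is a hypothetical calibration constant (degrees per unit
    of normalized current difference); a real sensor would be
    calibrated against light sources at known angles.
    """
    # Normalizing by the total current makes the estimate
    # insensitive to the overall brightness of the source.
    contrast = (i_left - i_right) / (i_left + i_right)
    return k * contrast

print(estimate_angle_deg(1.0, 1.0))   # equal currents -> 0 degrees
print(estimate_angle_deg(1.5, 0.5))   # left wire dominates -> positive angle
```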
The same experiment was repeated with a 2 µm wire spacing; it showed no coupling, demonstrating that it is the wires' proximity that couples them.
Nanowires as shown in Stanford's 2012 announcement on welding nanowires with light. Image from Stanford University.
The researchers, however, took their experiment a step further and built two angle detectors. The detectors were separated by a known distance and, using the differential current readings from each sensor, the team was able to triangulate the light source and thus determine its distance. Using this triangulation, distances to a light source can be determined with a precision of one centimeter within a range of 10 meters. Interestingly, this rangefinding method is considerably less complex than using high-speed electronics that fire a laser beam and then time the round trip.
Potential applications: cameras, computer vision, augmented reality
Using nanowire sensors for angle detection could affect camera sensors in a number of scenarios that need to perform angular or remote sensing without the need for complex hardware.
For example, LiDAR systems use a rotating mirror and a laser in conjunction with high-speed electronics to time the round trip of a laser pulse. While this method is reliable and already in use, it generally requires bulky parts (such as motors and mirrors) and has a minimum sensing distance.
Nanowire sensors, however, may have no minimum measurement distance, because they rely on the physical behavior of photons rather than on a CPU and a counter. A LiDAR system built from nanowires would still need a rotating mirror and a laser, but there would be no need for timing hardware, and the results could be read by even the simplest microcontroller. A fixed laser could also be used, acting as a laser rangefinder, and the entire sensor-and-laser setup could easily fit into a single IC package.
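A quick calculation shows why time-of-flight LiDAR demands such fast electronics, and why sidestepping the timer is attractive. This is a back-of-envelope sketch, not a specification of any particular system: a range error of d corresponds to a round-trip timing error of 2d/c.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def timing_resolution_for(range_precision_m):
    """Round-trip timing resolution a time-of-flight rangefinder
    needs to achieve a given range precision. The pulse travels
    out and back, hence the factor of two."""
    return 2.0 * range_precision_m / C

# One centimeter of range precision demands timing on the order
# of tens of picoseconds -- which is exactly why conventional
# LiDAR needs high-speed electronics, and why an angle-based
# ranger that needs no timer at all is appealing.
print(timing_resolution_for(0.01))
```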
Angle detection, as noted above, could also be useful in photography. While professional photographers typically use manual focus, most novice users rely on autofocus, which can be achieved using several methods. One simple method is contrast (sharpness) detection: the object to be focused must show a sharp change in contrast against the background, and the lens adjusts until the largest change is detected, at which point the camera considers the object in focus.
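Contrast-detection autofocus can be sketched in a few lines. This is a simplified illustration with made-up pixel data, not real camera firmware: the camera sweeps focus positions and keeps the one whose image shows the strongest intensity changes between neighboring pixels.

```python
def sharpness(row):
    """Simple contrast metric: sum of squared differences between
    neighboring pixel intensities along a scan line. Blur smears
    edges, so blurred frames score lower."""
    return sum((b - a) ** 2 for a, b in zip(row, row[1:]))

def best_focus(frames):
    """Pick the focus step whose frame has the highest contrast.
    `frames` maps a focus position to a 1-D scan line of pixels."""
    return max(frames, key=lambda pos: sharpness(frames[pos]))

# Hypothetical scan lines of the same edge at three focus steps:
frames = {
    0: [10, 12, 14, 12, 10],   # heavily blurred
    1: [10, 14, 20, 14, 10],   # sharper
    2: [10, 20, 40, 20, 10],   # in focus
}
print(best_focus(frames))  # -> 2
```

Note the guesswork involved: the camera cannot know it has found the true peak until it has hunted past it, which is exactly the inefficiency that direct distance sensing would remove.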
Angle-sensing sensors, however, could provide both angular and distance information that tells the camera exactly how far away the subject is. Instead of guessing whether the image is in focus, the camera could set its focus directly (these settings are often displayed as a distance to the subject). This could even provide a path to lensless cameras.
This functionality also has ramifications for robotic vision, providing additional data for processors to use, for example in autonomous vehicle guidance. Augmented reality, which relies on sensor data to overlay graphics on the existing environment, could see a revolution, as more advanced focus and distance sensing enables more immersive augmented experiences.
You can read more about the research in the journal Nature Nanotechnology.
The featured image includes the nanowire image used courtesy of Stanford University.