Imagine taking a picture of a sporting event that is so detailed you can make out every face in the crowd. Or getting so precise an image of a surgery patient that you can see near-microscopic anatomy.
These are just two of the marketplace applications for the gigapixel camera, a joint project of scientists from the University of Arizona, Duke University, the University of California, San Diego and Distant Focus Corp.
The prototype synchronizes 98 tiny cameras so that they take a single picture together. The resulting image is sharper than anything the photographer's own eye could resolve at the scene.
A digital photo is assembled from units of information called pixels; the more pixels, the more detail. Most consumer cameras capture between eight and 40 megapixels per image.
At one gigapixel — 1,000 megapixels — of data, the new camera can capture as much information as 25 of the best consumer cameras on the market today.
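The "25 cameras" figure follows directly from the numbers above. A quick back-of-the-envelope check, using the article's own megapixel range:

```python
# Compare the gigapixel prototype with the consumer range quoted above.
GIGAPIXEL_IN_MP = 1_000  # one gigapixel, expressed in megapixels

consumer_low = 8    # megapixels, low end of the consumer range
consumer_high = 40  # megapixels, high end ("the best consumer cameras")

print(GIGAPIXEL_IN_MP / consumer_high)  # 25.0 -> the article's "25 cameras"
print(GIGAPIXEL_IN_MP / consumer_low)   # 125.0 -> vs. an entry-level camera
```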
Computers Strengthen Optics
The UA team, led by Michael Gehm, assistant professor in the UA department of electrical and computer engineering, is figuring out how to corral that much data and translate it into a single image.
In developing the camera’s processing algorithms and software, Gehm’s team, which includes researchers Dathon Golish and Esteban Vera Rojas, is focusing on what he calls “a massive parallel array of electronic elements.”
“A shared objective lens gathers light and routes it to the microcameras that surround it, just like a network computer hands out pieces to the individual workstations,” he explains. “Each gets a different view and works on its little piece of the problem. We arrange for some overlap, so we don’t miss anything.”
While the prototype is larger than a standard ottoman you’d find in your living room, researchers expect that within five years components will become sufficiently miniaturized for consumer applications.
Gehm foresees many uses besides medicine and sporting events. The camera could make high-resolution records of large cultural artifacts such as buildings and frescos. Law enforcement and security agencies could better identify situations or people.
“We believe that the approach is applicable to any situation where one desires high-resolution imagery over a significantly wide field,” he says.
This work furthers a larger UA mission to create advanced sensor systems combining optics with computers. That’s what’s happening at the Laboratory for Engineering Non-traditional Sensors (LENS), of which Gehm is director.
“We ask, ‘How can we take optical systems and completely rethink them?’” he says.