Lensless camera captures cellular-level 3D detail in living tissue
First, catch the tiger.
Next, attach Bio-FlatScope, the latest lensless microscope under development at Rice University.
This particular use is whimsical but not far-fetched, according to Jacob Robinson, an electrical and computer engineer at Rice’s George R. Brown School of Engineering, who has led recent efforts to test Bio-FlatScope on living creatures.
The research team’s earlier FlatCam, a lensless device that channels light through a mask and directly onto a camera sensor, was aimed primarily at the world at large. Its raw images looked like static, but a custom algorithm used the data they contained to reconstruct what the camera saw.
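The article doesn’t spell that algorithm out. In the published FlatCam work the imaging model is separable — the sensor measurement Y is approximately P X Qᵀ for calibrated matrices P and Q — and one standard way to invert such a model is Tikhonov regularization. A minimal sketch under that assumption (function name, matrix sizes and the regularizer are illustrative, not the team’s code):

```python
import numpy as np

def flatcam_reconstruct(Y, P, Q, lam=1e-3):
    """Tikhonov-regularized recovery for a separable lensless model Y ≈ P X Qᵀ.

    P and Q stand in for the calibrated row/column responses of the mask;
    lam trades noise suppression against fidelity.
    """
    Up, sp, Vpt = np.linalg.svd(P, full_matrices=False)
    Uq, sq, Vqt = np.linalg.svd(Q, full_matrices=False)
    B = Up.T @ Y @ Uq                      # rotate measurement into singular bases
    S = np.outer(sp, sq)                   # combined singular values
    X_rot = (S * B) / (S ** 2 + lam)       # per-entry regularized inversion
    return Vpt.T @ X_rot @ Vqt             # rotate back to the image domain
```

With a small regularizer and well-conditioned calibration matrices this recovers the scene almost exactly; in practice lam would be tuned to the sensor’s noise level.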
The new device looks inward to image micron-scale targets like cells and blood vessels inside the body, even through the skin. Bio-FlatScope captures images no lens-based camera can – showing, for example, dynamic changes in fluorescently labeled neurons in running mice.
An advantage over other microscopes is that light captured by Bio-FlatScope can be refocused after the fact to reveal 3D detail. And without a lens, the scope’s field of view is the size of the sensor (close to the target) or wider, without distortion.
A small, low-cost Bio-FlatScope could eventually look for signs of cancer or sepsis or become a valuable tool for endoscopy, said Robinson, who has partnered with colleagues from Rice’s Neuroengineering Initiative on the project.
The team’s proof-of-concept study also imaged plants, hydras and even, to a limited extent, a human. The results appear in Nature Biomedical Engineering.
The device combines a sophisticated phase mask with a camera chip: the mask generates patterns of light that fall directly on the chip, the researchers say. The original FlatCam’s mask looks like a barcode and limits the amount of light that passes through to the camera sensor, which makes it a poor fit for biological samples.
The Bio-FlatScope phase mask is more like a random map of a natural landscape, with no straight lines. “We had to start from scratch and think about how to make it work in a realistic biological setting,” Robinson said.
“Being random allows the mask to be diverse enough to collect light from all directions,” said postdoctoral researcher Vivek Boominathan, one of the study’s four lead authors. “And then we take the random input, called Perlin noise, and do some processing to get these high-contrast edges.”
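The paper’s exact mask-generation pipeline isn’t given here, but the idea Boominathan describes — start from smooth random noise, then process it into a pattern with high-contrast, curving edges rather than straight lines — can be sketched. This stand-in uses low-pass-filtered white noise in place of true Perlin noise, and thresholds it into a binary pattern; every parameter is illustrative:

```python
import numpy as np

def contour_mask(shape=(256, 256), cutoff=0.05, seed=0):
    """Sketch of a high-contrast random mask pattern (Perlin-noise stand-in).

    White noise is low-pass filtered in the Fourier domain to get smooth,
    organic variation, then thresholded at its median so the result is a
    binary pattern with sharp, curving edges and no straight lines.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    lowpass = (np.hypot(fy, fx) < cutoff).astype(float)   # keep only low frequencies
    smooth = np.real(np.fft.ifft2(np.fft.fft2(noise) * lowpass))
    return (smooth > np.median(smooth)).astype(np.uint8)  # high-contrast binarization
```

The median threshold splits the pattern roughly 50/50 between the two levels, which keeps light throughput high while preserving the randomness that lets the mask gather light from all directions.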
At the sensor, the light passing through the mask appears as a point spread function, a pair of fuzzy spots that look useless but are actually essential for acquiring detail about objects below the diffraction limit, too small for many microscopes to see. The size and shape of the spots and the distance between them encode how far the subject sits from the focal plane. Software then reinterprets the data into an image that can be refocused at will.
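Bio-FlatScope’s reconstruction code isn’t shown here, but a common baseline for lensless imaging is deconvolving the measurement with the point spread function; because the PSF changes with depth, deconvolving with the PSF calibrated for a different depth is what makes after-the-fact refocusing possible. A minimal Wiener-filter sketch (a generic technique, not necessarily the team’s method):

```python
import numpy as np

def wiener_deconvolve(measurement, psf, reg=1e-2):
    """Recover a scene from a lensless measurement by Wiener deconvolution.

    Assumes a shift-invariant model: the measurement is the scene circularly
    convolved with the PSF, plus noise. To "refocus", pass the PSF measured
    at the desired depth.
    """
    H = np.fft.fft2(psf, s=measurement.shape)    # transfer function of the mask
    Y = np.fft.fft2(measurement)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + reg)  # regularized inverse filter
    return np.real(np.fft.ifft2(X))
```

The regularizer reg keeps frequencies where the mask passes little light from blowing up noise; larger values smooth more aggressively.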
A tiger was a bit over their budget, so the researchers started small, first capturing cellular structures in a lily of the valley and then calcium activity in a tiny, jellyfish-like hydra. They moved on to a live rodent, attaching the Bio-FlatScope to the animal’s skull while it ran on a wheel. The data showed fluorescently labeled neurons in a region of the animal’s brain, linking activity in the motor cortex to movement, and resolved blood vessels as small as 10 microns in diameter.
Together with Rebecca Richards-Kortum and research scientist Jennifer Carns of Rice Bioengineering, the team identified vascular imaging as a potential clinical application of the Bio-FlatScope. Jimin Wu, a graduate student and co-lead author, offered his lower lip to see whether light captured by the camera could reveal structural details of the blood vessels inside.
“It was kind of a technical challenge, because it’s hard to get the Bio-FlatScope into the right position and hold it there,” Wu said. “But it showed us it could be a good tool to look for signs of sepsis, because pre-sepsis alters the density of the vasculature. Cancer also alters the morphology of the microvasculature.”
“We can imagine sticking a microscope in that position would be difficult, but maybe a little clip you put on your lip could look for things like sepsis or tumors in the oral lining,” Robinson said.
In the long term, the team sees potential for a camera that could curve around its subject, like brain tissue, “so it can match the morphology of what you’re looking at,” Robinson said. “Or maybe you could fold it, slip it into place and have it unfold.
“You can also do some really cool things by bending it for a fisheye effect, or you can bend it inward and have really high light-gathering efficiency,” he said.
The study’s co-lead authors include former Rice student Jesse Adams and applied physics graduate student Dong Yan. Co-authors are Rice graduate students Sibo Gao and Soonyoung Kim; former postdoctoral researcher Alex Rodriguez; Caleb Kemere, associate professor of electrical and computer engineering and bioengineering; and Ashok Veeraraghavan, professor of electrical and computer engineering.
Richards-Kortum is the Malcolm Gillis University Professor, a professor of bioengineering and of electrical and computer engineering, and director of the Rice360 Institute for Global Health Technologies. Robinson is an associate professor of electrical and computer engineering and bioengineering, and he, Kemere and Veeraraghavan are core members of the Rice Neuroengineering Initiative.
The Defense Advanced Research Projects Agency (N66001-17-C-4012), National Institutes of Health (RF1NS110501), and National Science Foundation (1730574) supported the research.