Science Interviews

Interview

Sun, 27th Nov 2011

Lensless Microscopes

Professor Changhuei Yang, California Institute of Technology

From the show Imaging the Invisible

Dave -   Over the last 400 years or so, microscopes have been developed so much they would often be unrecognisable to 17th century scientists.  They can use invisible wavelengths of light, electrons, or even atoms to see with, and some are even able to watch individual atoms moving around.  But almost all microscopes have one thing in common: the lens.  It might be magnetic or electric instead of being made of glass, but there is still an element which does the focusing.  However, Professor Changhuei Yang from the California Institute of Technology is attempting to do away with the lenses in a microscope altogether.

So Professor Yang, what are the problems with putting a lens in a microscope?

Changhuei -   Well, you run into cost issues.  Those sophisticated optical elements in a conventional microscope cost significant amounts of money to fabricate and implement well.  You also run into astigmatism and chromatic aberration: basically, distortion that is intrinsic to a lens, which you have to live with or contend with.

Dave -   So essentially, they're just a very complicated, difficult thing to make.  If you can get away without them, it just makes the whole thing easier and cheaper?

Changhuei -   Exactly and that's basically what the research going on in my group is trying to do, is to basically come up with other ways of doing microscopy, in which lenses are not needed at all.

Dave -   Where did you get the idea of doing this from?

Changhuei -   Well, the idea sort of came from the fact that a lot of us see floaters in our eyes.  I'll just describe it for your audience: floaters are basically debris that float around within your eyeball, and when they get close to the retina itself, and you look up into a clear blue sky, the shadows that those floaters cast onto your retina will be picked up as sharp images.

What is interesting about floaters is that if you have floaters in your eyes, I encourage you to try the following experiment.  Try removing or putting on your eyeglasses, or just try focusing or defocusing your eyes, and you'll notice that those floaters look equally clear no matter what you do.  And this says that the way you see floaters doesn't depend on the lenses in your eyes or eyeglasses for doing imaging.

Dave -   So these are the little patterns which you see, structures which float around, especially when you look at some bright light.

Ben -   I can vouch for this.  I have a couple of floaters that I've been aware of since I was a very small child and you're exactly right.  They always seem to be equally well focused regardless of whether I'm wearing glasses or not wearing glasses; even if I have contact lenses in, the floaters are still there and they're still the same.  For me, they look almost like bacteria under a microscope, appropriately.  They look like these sort of rod-shaped blobby things with no real structure.

Changhuei -   And the reason why you see them very clearly is because they are very close to your retinal layer itself.  So just as when you put your hand very close to a table, you can see a clear shadow image.  The same thing happens with these floaters.  And by the way, these floaters are fairly tiny objects.  They are typically on the order of a hundred microns upwards and yeah, when I see them, I see them in very good detail.  And this suggests that there are really other ways that you can do microscopy.  For example, if you were really interested, you could imagine taking objects that you want to see, injecting them directly into your eye, and then using this floater phenomenon to see things.
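The point about shadow sharpness can be put in rough numbers. Here is a sketch of the penumbra (shadow blur) geometry using similar triangles; all the numbers are illustrative assumptions, not figures from the interview:

```python
def penumbra_width_um(source_diameter_mm, source_distance_mm, gap_um):
    """Approximate shadow blur in microns for an object a small gap above a
    surface, lit by an extended source (similar triangles):
    blur ~= source_diameter * gap / source_distance."""
    return (source_diameter_mm * 1000.0) * (gap_um / 1000.0) / source_distance_mm

# Illustrative: a floater ~100 microns from the retina casts a nearly sharp
# shadow; the same floater 10 mm away would be blurred beyond recognition.
near = penumbra_width_um(source_diameter_mm=5.0, source_distance_mm=1000.0, gap_um=100.0)
far = penumbra_width_um(source_diameter_mm=5.0, source_distance_mm=1000.0, gap_um=10000.0)
print(near, far)  # 0.5 vs 50.0 microns of blur
```

The blur grows linearly with the gap, which is why both floaters near the retina and cells grown directly on a sensor chip give sharp shadow images.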

Dave -   I'm guessing you're not suggesting we do this!

Changhuei -   No, of course I'm not advocating that.  But the thing is this: thanks to the fact that cell phones now almost always contain a camera, we actually have technologies that can do very cheap imaging using this strategy.

Dave -   So you'd essentially just take whatever you're trying to look at and put it directly onto the sensor from a cell phone camera or I guess something which you can just buy off the shelf.

Changhuei -   That's right.  So they serve as the artificial retina, and if we actually take cells and put them on, or grow them on, those chips, we should be able to get some sort of moderate-resolution imaging performed with it.

Dave -   So how good a picture can you get by just taking a cell and putting it onto a camera chip?

Changhuei -   Okay, so the typical resolution you can get that way is about 4 microns.  To set that in context, a typical cell is about 10 microns in diameter.  So you can't really see features within the cells this way, but you can definitely tell the presence of the cells.  And what we did is further improve the resolution by coming up with an approach in which we take a bunch of snapshots of these cells as they're lying on top of this sensor chip, with the illumination light source being scanned around, and that actually gave us enough information that we can then do processing to get microscopy-level resolution.

Dave -   So this is based on the idea that if you sort of say, hold your hand a little bit away from the table, and you move a light around that's going to move the shadow around.

Changhuei -   Exactly, and notice that even if the pixel size on the sensor chip is fairly large, if you take enough of those images where the shadows shift incrementally, you will have collected enough information that you can then later process to get a high resolution image out of that.

Dave -   So, if your shadow is half on one pixel and half on the other pixel, if you move the light a bit, it's going to be slightly more than half on the first pixel, and less on the second pixel.

Changhuei -   That's right.

Dave -   And you can use that information to work out exactly where the shadows should be sitting?

Changhuei -   That's right.
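The half-on-one-pixel idea in the exchange above can be sketched in one dimension. This is an illustrative toy, not Professor Yang's actual reconstruction algorithm: a shadow edge straddles two large pixels, each pixel reports only its shadowed fraction, and known sub-pixel illumination shifts let us recover the edge position far more finely than one pixel width.

```python
PIXEL = 4.0  # pixel width in "fine" units (illustrative)

def pixel_readings(edge, shift):
    """Shadowed fraction of two adjacent pixels when the edge sits at
    edge + shift; everything left of the edge is dark."""
    e = edge + shift
    dark0 = min(max(e, 0.0), PIXEL) / PIXEL            # pixel covering [0, PIXEL)
    dark1 = min(max(e - PIXEL, 0.0), PIXEL) / PIXEL    # pixel covering [PIXEL, 2*PIXEL)
    return dark0, dark1

def estimate_edge(shifts, readings):
    """Each frame where the edge lies strictly inside pixel 0 gives
    edge = dark0 * PIXEL - shift; average those single-frame estimates."""
    estimates = [d0 * PIXEL - s
                 for s, (d0, _) in zip(shifts, readings)
                 if 0.0 < d0 < 1.0]
    return sum(estimates) / len(estimates)

true_edge = 2.3                   # sub-pixel position we want to recover
shifts = [0.0, 0.5, 1.0, 1.5]     # known illumination-induced shadow shifts
readings = [pixel_readings(true_edge, s) for s in shifts]
print(estimate_edge(shifts, readings))  # recovers 2.3, far finer than PIXEL
```

In a real sensor the averaging over many shifted frames is what beats measurement noise; with the noise-free numbers here, each frame alone already pins down the edge.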

Dave -   So you essentially got your microscope which is a camera with the light moving around over the top, so would you actually physically move the light around?

Changhuei -   The way we implement the light movement is simply to have a display showing a round blob of white light.  That blob of white light simply moves around on the display, and that creates the different angular shifts of the illumination that we require for doing imaging.
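The geometry of that scanning blob is easy to estimate. A back-of-envelope sketch with made-up but plausible numbers (the interview gives none):

```python
def shadow_shift_um(blob_shift_mm, display_distance_mm, object_height_um):
    """Moving the bright blob sideways by blob_shift on a display a distance
    display_distance above the chip tilts the illumination, shifting the
    shadow of an object at height object_height above the pixels by roughly
    object_height * blob_shift / display_distance (small-angle geometry)."""
    return object_height_um * blob_shift_mm / display_distance_mm

# Illustrative: a cell sitting ~5 microns above the pixels, display ~50 mm away.
shift = shadow_shift_um(blob_shift_mm=10.0, display_distance_mm=50.0, object_height_um=5.0)
print(shift)  # 1.0 micron: a convenient sub-pixel step for few-micron pixels
```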

Dave -   Brilliant!  So where would you actually see this being used?

Changhuei -   So we see this as being useful for both biology and biomedical applications.  So for example, in biology, one of our collaborators is a stem cell researcher, and when he grows stem cells and they start differentiating, some of those stem cells actually become highly motile.

They move everywhere on the chip and it makes it very difficult for him to actually track where his cells are going.  But if you actually grow those cells on this sensor chip and do microscopy-level imaging using our approach, you can then automatically track the cells no matter where they are on the chip itself.  So you won't have to actually go and find the cells; you can just take a bunch of snapshots and then later process them appropriately.

Dave -   So it's not just cheaper.  It's actually doing something you couldn't do with another kind of microscope.

Changhuei -   That's right.  So a conventional microscope typically has a very limited field of view.  We're talking typically about 100 microns by 100 microns.  This new technology allows us to see over the entire area of a sensor chip, which is typically on the order of 5 millimetres by 5 millimetres.
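The field-of-view figures quoted here translate into a large area advantage:

```python
# Areas from the interview: ~100 x 100 micron conventional field of view
# versus a ~5 x 5 mm sensor chip.
conventional_um2 = 100 * 100          # 10,000 square microns
chip_um2 = (5 * 1000) * (5 * 1000)    # 25,000,000 square microns
print(chip_um2 // conventional_um2)   # 2500x larger imaging area
```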

Dave -   I guess also, it's a lot cheaper, so you could use it in a third-world kind of situation.

Changhuei -   That's right.  One of the applications associated with this is that you can potentially use this to look at TB bacterial cultures.  The way it is done right now is you would have to stick the TB bacterial culture into an incubator and then remove it at regular intervals to actually examine it under a microscope system.

That whole process is really slow and labour intensive, and you also run a significant risk of having the samples contaminated due to this constant shuttling between the microscope and an incubator.  With the systems that we have, we can simply stick the entire imaging system into an incubator and have it send out the information, either through a wire outside the incubator or through Wi-Fi, and that allows you to image the cells in real time and track them without actually having to remove them from the incubator.

Comments

Interesting.
Interpolated resolution by moving the light source.

What about a time delay?  Lots of pixels would be an advantage, & great for looking at static stained slides.

However, if it takes... say 5 seconds to take an image, then it would be difficult to get a good image of living tissue like one gets when looking at a drop of pond water.

They also don't really list the interpolated resolution. 
The average eukaryote cell may be 10 microns, so 4 microns would allow it to be visualized, but with limited detail.
The average prokaryote cell is about 2 microns...  so that would give you about 4 cells per pixel.  Not too good.

Of course, technology will improve over time.

A traditional light microscope is limited to about 0.2 microns, or about 1/20th of the non-interpolated lensless microscope above, but some techniques push them down into the nanometre range.

Perhaps there is a way to merge some of the technologies. CliffordK, Thu, 1st Dec 2011

Any scanner that examines an image through a small aperture can be regarded as a lensless microscope; with a sufficiently bright light source and a small enough aperture, resolution of less than the wavelength of the light used can be achieved.
An early use was the Farnsworth image dissector used by J L Baird in his early 240 line TV transmissions from Alexandra Palace. syhprum, Thu, 1st Dec 2011
