New laser ultrasound technique can remotely image the inside of a person (news.mit.edu)
192 points by happy-go-lucky on Dec 22, 2019 | 42 comments



Even minutes later, my mind is still blown by the idea that there exists a proof-of-concept device combining remote sound generation, a laser microphone, and ultrasound imaging. Maybe we will actually live to see something like Star Trek tricorders.


You mean Star Trek smartphones? We're almost there already.


Tricorders would be amazing. Imagine not having to be crammed inside of a huge machine that irradiates you for 30 minutes.


It's unhealthy for a person's mind to chase dreams created by somebody else. "Star Trek tricorders"


The lasers aren't generating the sound waves per se; your body is. The laser heats up your body, causing things to expand; then the laser is turned off, causing things to contract. It's this expanding and contracting of your body that creates the sound waves. These sound waves are then detected by something stuck to your body that converts the wave into an electrical signal.
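
Rough back-of-the-envelope sketch of that mechanism in Python, using the standard textbook photoacoustic relation p0 = Gamma * mu_a * F (all numbers are illustrative assumptions, not values from the paper):

    # Absorbed pulse energy heats the tissue; thermoelastic expansion launches
    # a pressure wave with initial amplitude p0 = Gamma * mu_a * F.
    gamma_grueneisen = 0.2     # Grueneisen parameter of soft tissue (dimensionless), assumed
    mu_a = 100.0               # optical absorption coefficient [1/m], assumed
    fluence = 10.0             # laser fluence reaching the absorber [J/m^2], assumed

    p0 = gamma_grueneisen * mu_a * fluence    # initial acoustic pressure [Pa]
    print(f"initial pressure ~ {p0:.0f} Pa")  # ~200 Pa with these made-up numbers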

This is different from conventional ultrasound, which generates its own sound waves that are then reflected by the body back to the device.

It's novel, but it doesn't solve the problem of being "non-invasive"; you still need a sensor stuck to the body to pick up the sound waves.


It's using a laser to read the reflections contact-free.

>The researchers tested this idea with a laser setup, using one pulsed laser set at 1,550 nanometers to generate sound waves, and a second continuous laser, tuned to the same wavelength, to remotely detect reflected sound waves.


Oh, I must have misread; I got hung up on their description of the previous iteration of this technology, which needed transducers stuck to the body to detect the sound waves.


How does it help to have the sensing laser at the same wavelength as the pulsing laser, if the pulsing laser is triggering sound waves which have a different wavelength?


The source laser was 1540 nm, the sensing laser was 1550 nm.[1]

> For the LUS system reported here, the 1550 nm LDV was specifically selected to maximize permissible optical backscatter from the skin while maintaining safety.

> Thus, the 1540 nm optical source was selected to maximize the converted acoustic source amplitude while remaining within the safety limits.

They wanted lasers near 1500 nm, since wavelengths just above that have better safety limits. I am pretty certain those particular lasers were chosen simply because they were what was commercially available in the narrow wavelength range they wanted. The source laser is a Q-switched laser[2], which is needed to produce those short pulses.

It doesn't particularly matter what wavelength they used (~2000 nm also would have worked); the Doppler (aka self-mixing[3]) technique they used is highly tolerant to noise (toy sketch after the links).

[1]: https://www.nature.com/articles/s41377-019-0229-8

[2]: https://en.wikipedia.org/wiki/Q-switching

[3]: https://en.wikipedia.org/wiki/Self-mixing_laser_interferomet...
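
Toy Python sketch of why the interferometric/Doppler readout tolerates noise so well (made-up vibration and noise parameters, not the paper's actual LDV setup): a tiny surface vibration phase-modulates the probe beam, and the resulting tone still stands out clearly in the spectrum under heavy additive noise.

    import numpy as np

    lam   = 1550e-9                         # probe wavelength [m]
    f_vib = 50e3                            # surface vibration frequency [Hz], assumed
    amp   = 20e-9                           # vibration amplitude [m], assumed
    fs    = 1e6                             # sample rate [Hz]
    t     = np.arange(0, 20e-3, 1/fs)

    x     = amp * np.sin(2*np.pi*f_vib*t)                      # surface displacement
    phase = 4*np.pi*x/lam + np.pi/4                            # round-trip optical phase
    sig   = np.cos(phase) + np.random.normal(0, 0.5, t.size)   # noisy detector output

    spec  = np.abs(np.fft.rfft(sig - sig.mean()))
    freqs = np.fft.rfftfreq(t.size, 1/fs)
    print(f"strongest tone at ~{freqs[spec.argmax()]/1e3:.0f} kHz")  # ~50 kHz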


If they had two lasers of similar frequencies they would get interference beats/aliases in the data. If they used the exact same frequency (e.g. by splitting the source) it shouldn't show up. That should also allow for less stable/expensive laser sources.


> If they had two lasers of similar frequencies they would get interference beats/aliases in the data.

1. They did use different wavelengths: 1540 nm and 1550 nm.

2. The measurement was done with Doppler interferometry, so beats would not impact measurements. The sensor only responds to the difference between the two signals, not the intensity.

3. A 10 nm wavelength shift is a much larger difference than anything they'd be measuring, so the only thing that would actually "show up" (as noise) would be light at the same wavelength as the measuring laser. Any constant noise at that wavelength gets filtered out; only slight frequency deviations would come through, like the kind caused by vibrations shaking the laser.
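
Quick numbers behind point 3 (order of magnitude only; the surface velocity is an assumption): the 10 nm offset between the two lasers is roughly a terahertz of frequency difference, while the Doppler shifts from a vibrating skin surface are only in the kHz-MHz range.

    c = 3.0e8                              # speed of light [m/s]
    f_source = c / 1540e-9                 # ~194.8 THz
    f_probe  = c / 1550e-9                 # ~193.5 THz
    offset   = f_source - f_probe          # ~1.25 THz between the two lasers

    v_surface = 1e-3                       # assumed skin surface velocity [m/s]
    doppler   = 2 * v_surface / 1550e-9    # ~1.3 kHz Doppler shift on the probe

    print(f"laser-to-laser offset: {offset/1e12:.2f} THz")
    print(f"Doppler shift        : {doppler/1e3:.2f} kHz")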


I'm guessing in order to achieve the same body penetration characteristics on RX and TX.


Mary Lou Jepsen's 'Openwater' work is a similar project where light + computation is rapidly advancing the options for deep, real-time medical imaging:

https://www.youtube.com/watch?v=k879MFfB_3Q


The two are very different. Openwater, as per their patent filings, uses ultrasound to change the refractive index of tissue and/or phase-shift the light, and applies an inverse distortion matrix to undo scattering. I've yet to see a real result on dynamically changing tissue. By contrast, this uses light to generate ultrasound waves, which is basically photoacoustic tomography.


This sounds a lot like optical coherence tomography (OCT), which is essentially laser-based “ultrasound” used to image the inside of retinal tissue. (It’s not actually ultrasound, it’s entirely optical but uses similar principles.)

https://en.m.wikipedia.org/wiki/Optical_coherence_tomography

It’s already a common diagnostic tool used for patients who are at risk for glaucoma. By creating a 3D cross section of the eye’s retinal structures, it lets physicians detect early damage from the disorder, hopefully before any vision loss.


Thanks for linking to that. I had this done a while back but didn't know what it was called or how it worked. If only all imaging techniques were as fast and pleasant as this one; it was over in seconds. Although I suppose the small size of the retina likely plays a part in that as well.


How is this different from photoacoustic tomography?


I’m sure this is a serious question but I lol’d when I read it. I’m going to use that line at a party or a meeting some time to sound smart.


Or you could assume he’s more experienced than you in that field and not be offensive?


You could also notice that it was clearly a light-hearted joke and not be so quick to take offense.


Definitely no offense meant. It just sounded so smart that you could say it in context or out of context and most people wouldn’t know what you’re talking about but would probably assume you are experienced in whatever field.


Photoacoustic tomography, based just on the meaning of the words, sounds like an okay description of what they are doing.



Conventional ultrasound transducer arrays aren’t cheap. Could this solve the cost issue, e.g. for developing countries?


I believe so. I'm a bit surprised that microring resonator based optical ultrasound detector arrays haven't already been used.


> one laser remotely generates sound waves that bounce through the body...

This is oddly related to what I was just watching: a SciShow episode that discussed how phonons can be generated by (come from??) sound waves. The phonons may be affected by gravity in the opposite way from all other matter and may even repel matter.

Phonons, it seems, are not just sound. The MIT website explains that a "phonon is just a fancy word for a particle of heat."

Anyway, it seemed somewhat related, especially "The resulting mechanical vibrations generate sound waves that travel back up." But I guess the researchers in the article may not have meant that the sound waves literally go up.

http://news.mit.edu/2010/explained-phonons-0706


Would this work over larger distances - e.g. remote sensing from space?


I doubt it just because there would be so much noise.


I saw a thing about the Voyager 1 probe and how they communicate with it using two satellite dishes: one pointed at Voyager, and one pointed in the direction of, but very slightly off of, Voyager. Since both have the same background noise but only one dish picks up the transmissions from Voyager, you essentially subtract the non-Voyager dish's input from the other one and are left with the clean (or at least a lot cleaner) signal from Voyager.

Perhaps something similar could be done to reduce noise here, using a nearby empty patch of ground as the reference.
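
A minimal sketch of that subtraction idea (made-up numbers; assumes the background noise really is common to both channels):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    common_noise = rng.normal(0, 10.0, n)                       # background seen by both dishes
    signal       = 0.5 * np.sin(np.linspace(0, 200*np.pi, n))   # weak transmission

    on_target  = signal + common_noise + rng.normal(0, 0.1, n)  # dish pointed at the target
    off_target = common_noise + rng.normal(0, 0.1, n)           # dish pointed just off it

    recovered = on_target - off_target
    print(f"noise std before subtraction: {(on_target - signal).std():.2f}")  # ~10
    print(f"noise std after  subtraction: {(recovered - signal).std():.2f}")  # ~0.14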


I wouldn't be so sure. Run enough trials and you can reduce a lot of the noise.
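
Minimal sketch of that point (illustrative numbers only): averaging N repeated noisy measurements shrinks uncorrelated noise by roughly 1/sqrt(N).

    import numpy as np

    rng = np.random.default_rng(1)
    true_signal = np.sin(np.linspace(0, 4*np.pi, 1000))

    for n_trials in (1, 100, 10_000):
        trials = true_signal + rng.normal(0, 2.0, (n_trials, 1000))
        err = np.abs(trials.mean(axis=0) - true_signal).mean()
        print(f"{n_trials:>6} trials -> mean abs error ~ {err:.3f}")
    # roughly 1.6, 0.16, 0.016: a 100x increase in trials buys ~10x less noise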


Enough computation can find the signal in any amount of noise!


Cool! Sounds really impressive!


I truly wish Gene Roddenberry could be alive today to see how far we have come.


But what's the use case?


Is there a single true technological innovation (not some weekend or IoT gadget) that doesn’t end up with a use case, sooner or later?


Technology is the application of scientific knowledge, so you've got a tautology there. That said: Capacitance Electronic Disc.


You walk into your doctor's office and, before you even sit down, they have a trove of health data without you having told them anything. If this were readily available, just like blood pressure testing at every pharmacy, we could perhaps catch certain health conditions early on.


I’d expect this in my home, constantly monitoring my vitals, without the need to wear a device. Like a smart light bulb or thermostat.


Detect drug mules while they are still in the security line. Ex-post-facto parallel construction via "behavioral profiling" or a dog trained to alert on command keeps the technique secret. Collection of medical data is a lucky bonus.


I thought of another one: you sell health insurance and want to assess a person's health, and your risk, without asking them. Heart disease? Got it. Blood flow problems? Got it. Liver issues? Yup. Digestive issues? Yes. Ultrasound is used in many diagnostic tests; this would be an insurance company's dream technology.


That's an interesting take; I didn't even consider the privacy issues of remote medical sensing.


Tricorder.



