
Denoising Microwave Imaging Data Using Deep Learning

I worked with Dr. Chao Liu from the University of Colorado Denver to develop a deep learning model to denoise microwave imaging data. 

A team at Michigan State University was researching non-destructive evaluation techniques. Specifically, they were interested in developing methods for imaging plant roots underneath the soil without disturbing them. The team generated microwaves and concentrated them on a carrot, placing several antennas around the carrot in the soil to pick up the reflected microwaves. They then used a method known as time-reversal to reconstruct 2D and 3D images from the way the microwaves were reflected. This is similar to how ultrasound imaging works, but uses microwaves instead of sound waves. The problem was that the soil introduced a lot of noise into the reconstructed images.
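The MSU team's exact reconstruction pipeline is their own, but in its simplest frequency-domain form, time-reversal imaging amounts to phase-conjugating each antenna's recorded signal and back-propagating it to every pixel in the scene, then summing coherently. Here is a rough NumPy sketch under those assumptions; the ring geometry, free-space propagation, and omitted amplitude terms are all illustrative simplifications, not the team's actual setup:

import numpy as np

# Illustrative frequency-domain time-reversal imaging: conjugate each
# antenna's recording (time reversal) and re-apply the propagation
# phase back toward each pixel. Geometry and constants are assumed.
c = 3e8                                   # free-space propagation speed
freqs = np.linspace(1e9, 10e9, 1601)      # 1 GHz to 10 GHz sweep
angles = np.deg2rad(np.linspace(50, 310, 27))
radius = 0.15                             # antenna ring radius (m), assumed
ant_xy = radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)

def time_reversal_image(signals, grid_x, grid_y):
    """signals: (27, 1601) complex recordings, one row per antenna."""
    image = np.zeros((len(grid_y), len(grid_x)), dtype=complex)
    k = 2 * np.pi * freqs / c             # wavenumber at each frequency
    for (ax, ay), s in zip(ant_xy, signals):
        # distance from this antenna to every pixel in the grid
        r = np.hypot(grid_x[None, :] - ax, grid_y[:, None] - ay)
        # conjugated signal times the outgoing phase exp(ikr);
        # 1/r spreading loss is omitted for brevity
        image += np.einsum('f,fij->ij', np.conj(s),
                           np.exp(1j * k[:, None, None] * r[None]))
    return np.abs(image)

# e.g.: img = time_reversal_image(recorded,
#                                 np.linspace(-0.1, 0.1, 64),
#                                 np.linspace(-0.1, 0.1, 64))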

Dr. Liu and I developed a deep learning model to denoise the data collected from the sensors. Each feature sample consisted of data from 27 antennas at angles from 50° to 310°, 1601 frequencies (1 GHz to 10 GHz) per antenna, and 4 scattering parameters per frequency, each split into real and imaginary parts for 8 channels in total. The input to the model was therefore a tensor of shape 27x1601x8. Our model was based on an autoencoder architecture and aimed to remove the irrelevant signals coming from the soil. Below is a diagram of our base model.

model.png
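For readers who prefer code to diagrams, here is a minimal PyTorch sketch of a convolutional autoencoder operating on tensors of this shape. The layer sizes and depths are illustrative placeholders, not the exact architecture in the diagram above:

import torch
import torch.nn as nn

# Minimal sketch of a denoising autoencoder for the 27x1601x8 sensor
# tensors described above. The 8 real/imaginary channels are treated
# as input channels, so a batch has shape (batch, 8, 27, 1601).
class DenoisingAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(8, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(64, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 8, kernel_size=3, padding=1),  # back to 8 channels
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = DenoisingAutoencoder()
noisy = torch.randn(1, 8, 27, 1601)       # one noisy sample
denoised = model(noisy)                   # same shape as the input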

We used an HFSS simulation to generate our training samples. We modeled the noise contributed by the soil and added it to the clean simulated samples. The noisy samples then served as the features and the clean simulated samples as the targets.
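A hypothetical sketch of that training setup, reusing the DenoisingAutoencoder from the previous sketch; the random stand-in data and the additive Gaussian noise are placeholders for the actual HFSS outputs and soil noise model:

import torch
import torch.nn as nn

model = DenoisingAutoencoder()            # from the sketch above

# Stand-in for HFSS-simulated clean samples; the real noise model came
# from measured soil, not this simple additive Gaussian placeholder.
clean_samples = torch.randn(32, 8, 27, 1601)
noisy_samples = clean_samples + 0.1 * torch.randn_like(clean_samples)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                    # reconstruct clean from noisy

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(noisy_samples), clean_samples)
    loss.backward()
    optimizer.step()

# Once trained, denoising experimental data is a single forward pass:
# denoised = model(experimental_tensor)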

Once our model was trained, we used it to denoise experimental data collected at Michigan State University. The results can be seen below.

Reconstruction of Raw Experimental Data

experimental1.gif

Field Intensity Magnitude

experimental2.gif

Windowed Temporal Signals

Reconstruction of Denoised Experimental Data

modelpred1.gif

Field Intensity Magnitude

modelpred2.gif

Windowed Temporal Signals

When we compare the reconstruction results above for the raw data and for the same data passed through our model, the field intensity magnitude of the denoised data shows far less noise. The windowed temporal signals also reveal the root much more clearly. Our model successfully removed a significant amount of the noise caused by the soil. However, it tends to produce overly idealized results and can smooth away some detail of the roots being imaged. Overall, we were successful in developing a deep learning model to denoise microwave imaging data.
