Raman Imaging Time Calculation
Raman spectroscopic imaging is a point scanning technique, whereby a Raman spectrum is acquired for each pixel of a defined image area. As such, the time required to acquire a Raman image depends on the size of the imaging area (its x, y, and z extents in um), the desired spatial sampling resolution (which sets the number of pixels, i.e. Raman acquisitions, along each dimension), and the Raman spectral acquisition time per pixel.
The total imaging time can therefore be approximately calculated as:
2D Imaging Time = (X / Resolution) * (Y / Resolution) * Acquisition Time
3D Imaging Time = (X / XY_Resolution) * (Y / XY_Resolution) * (Z / Z_Resolution) * Acquisition Time
For example, to image a 50 um x 60 um area with a 0.5 um spatial sampling resolution and a 0.8 second acquisition time:
Imaging Time = (50 / 0.5) * (60 / 0.5) * 0.8 = 100 * 120 * 0.8 = 9600 seconds = 160 minutes
As such, while decreasing the acquisition time by a factor of i will decrease imaging time by the same factor, coarsening the spatial sampling resolution by i (e.g. from 0.5 um to 1 um) will decrease imaging time by i^2, as the number of pixels in both x and y is decreased by i.
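The calculation above can be sketched as a short Python helper; the function names and the ceiling-rounding of pixel counts are illustrative assumptions, not part of any instrument API:

```python
import math

def raman_2d_time(x_um, y_um, res_um, acq_s):
    """Approximate 2D Raman imaging time in seconds.

    Pixel count per axis = spatial span / sampling resolution,
    rounded up so the full area is covered.
    """
    nx = math.ceil(x_um / res_um)   # pixels along x
    ny = math.ceil(y_um / res_um)   # pixels along y
    return nx * ny * acq_s

def raman_3d_time(x_um, y_um, z_um, xy_res_um, z_res_um, acq_s):
    """Approximate 3D imaging time: one 2D plane per z step."""
    nz = math.ceil(z_um / z_res_um)  # number of z planes
    return raman_2d_time(x_um, y_um, xy_res_um, acq_s) * nz

# Worked example from the text: 50 um x 60 um area,
# 0.5 um sampling resolution, 0.8 s acquisition time.
t = raman_2d_time(50, 60, 0.5, 0.8)
print(t, "seconds =", t / 60, "minutes")  # 9600 seconds = 160 minutes
```

Note that these estimates ignore stage-movement and readout overheads, which add a small per-pixel cost on real instruments.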