

Software Differences Can Skew Medical Scan Results


Differences in software can significantly skew the results of medical scans commonly used in clinical care and research, according to new findings.

The lead author was Larry Alan Pierce, a University of Washington research scientist in the Department of Radiology's Imaging Research Lab. Paul E. Kinahan, UW professor of radiology, was the principal investigator.

Pierce, Kinahan and their colleagues looked at software for viewing images generated by positron emission tomography (PET) scanners. These devices create three-dimensional images of processes within the body. PET scans are most often used to detect cancers and cancer metastases and to follow tumor response to treatment. PET scans can also help assess the function of the brain, heart and other tissues.


PET scans can both image tumors and show their metabolic activity. Courtesy of UW Radiology.

A patient undergoing a PET scan receives a small amount of radioactive material. This isotope is bound to a molecule that is taken up and concentrated in the part of the body the clinician or researcher wants to examine, such as tumor tissue. The PET scanner detects the radiation given off by the isotope, and software then processes that data into a three-dimensional image revealing where in the body the isotope has concentrated.

In the recent study, the researchers wanted to see whether these processed images appeared the same when they were viewed with different software packages.

To do this, they created a standardized image file, called a digital reference object, in which all the data values were known. They sent the file to 16 collaborating sites, where it was viewed with 21 different software packages. The collaborators were asked to report back certain measurements from the images their viewing software generated.
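To illustrate the approach, the sketch below builds a toy digital reference object in Python: a synthetic volume whose voxel values are all known in advance, so any measurement a viewer reports can be checked exactly. This is a simplified stand-in, not the actual reference object distributed in the study.

```python
# A minimal sketch of the idea behind a digital reference object (DRO):
# a synthetic volume in which every voxel value is known in advance,
# so a viewer's reported measurements can be checked against ground truth.
import numpy as np

# 64x64x64 volume: uniform background with a "hot sphere" of known intensity.
shape = (64, 64, 64)
background, sphere_value, radius = 1.0, 4.0, 8

volume = np.full(shape, background, dtype=np.float32)
z, y, x = np.indices(shape)
center = np.array(shape) // 2
inside = (z - center[0])**2 + (y - center[1])**2 + (x - center[2])**2 <= radius**2
volume[inside] = sphere_value

# Ground-truth measurements that any faithful viewer should reproduce.
print("known max voxel value:", volume.max())           # 4.0
print("known mean inside sphere:", volume[inside].mean())  # 4.0
```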

The results varied greatly depending on the software package. For one measurement, the maximum voxel standardized uptake value (SUVmax), some software packages reported uptake as much as 38 percent lower than the known value. By another measure, readings were off by as much as 100 percent, twice as high as they should have been.
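For context, the standardized uptake value is a simple normalization: measured activity concentration divided by injected dose per unit body weight. A minimal definition in Python, with illustrative variable names and example numbers:

```python
def suv(activity_kbq_per_ml: float, injected_dose_kbq: float,
        body_weight_g: float) -> float:
    """Body-weight SUV: tissue activity concentration divided by
    injected dose per gram of body weight."""
    return activity_kbq_per_ml / (injected_dose_kbq / body_weight_g)

# Example: 5 kBq/mL in a lesion, 370,000 kBq (10 mCi) injected, 70 kg patient.
print(suv(5.0, 370_000.0, 70_000.0))  # ~0.95
```

SUVmax is then just the largest such value over a region of interest, which is why any processing that alters individual voxel values changes it directly.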

“We were surprised that these software packages, most of which are FDA-approved and some of which can cost as much as $100,000, gave such different results,” Kinahan said.

It is not clear why the different software packages yielded different results, but many include algorithms to smooth or enhance images. This may produce images that do not accurately represent the data in the original PET scan file.
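The sketch below shows how even a modest smoothing step can pull down a maximum-voxel reading. Here scipy's gaussian_filter stands in for whatever proprietary filtering a given viewer might apply; the numbers are illustrative only.

```python
# How smoothing can lower a maximum-voxel measurement.
import numpy as np
from scipy.ndimage import gaussian_filter

volume = np.ones((32, 32, 32), dtype=np.float32)
volume[16, 16, 16] = 10.0   # a single bright voxel with a known value

smoothed = gaussian_filter(volume, sigma=1.5)
drop = 100 * (volume.max() - smoothed.max()) / volume.max()
print(f"max before: {volume.max():.1f}, after smoothing: {smoothed.max():.2f} "
      f"({drop:.0f}% lower)")
```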

“With these variations, it is as if you had a digital picture of a person’s face and when you opened the file with one software package, say Photoshop, you see freckles on the face, but when you open it with another software package, the freckles disappear, and with yet another package the freckles are there but they’re darker,” Pierce said.

One remedy, Kinahan and Pierce suggest, would be for centers to adopt a standard reference file to check against. In this way, radiology staff could see whether they are getting different values from different software and make adjustments as necessary.
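In practice, such a check could be as simple as opening the reference file in the local software and comparing the reported measurements against the known ground truth. A minimal sketch, with made-up values and an assumed tolerance:

```python
# Check a viewer's reported measurements against the reference file's
# known values. Measurement names and the tolerance are assumptions.
KNOWN = {"suv_max": 4.0, "suv_mean": 2.5}   # ground truth shipped with the file
TOLERANCE = 0.02                             # e.g. accept a 2% deviation

def check_viewer(reported: dict[str, float]) -> list[str]:
    """Return the measurements that deviate beyond the tolerance."""
    failures = []
    for name, truth in KNOWN.items():
        if abs(reported[name] - truth) / truth > TOLERANCE:
            failures.append(f"{name}: reported {reported[name]}, expected {truth}")
    return failures

# A viewer reporting SUVmax 38% low (as in the study) gets flagged;
# its mean value, within tolerance, passes.
print(check_viewer({"suv_max": 2.48, "suv_mean": 2.51}))
```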

“Ultimately, we hope that the manufacturers of medical imaging software will work together to standardize their products and reduce this variability,” Pierce said.