SNUBH develops deep-learning tech to detect lesion change in chest X-ray images
  • By Lee Han-soo
  • Published 2020.03.12 15:25
  • Updated 2020.03.12 17:26

Researchers at Seoul National University Bundang Hospital (SNUBH) have developed a deep-learning technique that compares past and present chest X-ray images to detect lesion changes.

SNUBH Professors Lee Kyung-joon (left) and Kim Ji-hang

Hospitals use chest X-ray images to diagnose lung diseases such as pneumonia and lung cancer because results are available quickly and the test is relatively inexpensive. Studies are also actively underway on using artificial intelligence diagnostic systems to help medical workers read X-ray results.

Existing chest X-ray imaging studies share a common limitation: when building a diagnostic algorithm, only single-time-point images could be analyzed independently. In actual clinical practice, however, it is crucial to compare past and present images to detect how lesions have changed over time and reflect those changes in the diagnosis.

To overcome this limitation, the team, led by Professors Lee Kyung-joon and Kim Ji-hang at the hospital, divided 5,472 pairs of chest X-ray images obtained from SNUBH into training, validation, and test datasets.
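The article gives only the total of 5,472 pairs and, below, 4,370 training pairs; the even split of the remainder between validation and test in the sketch that follows is an assumption, not a detail from the study.

```python
import random

def split_pairs(pairs, n_train=4370, seed=0):
    """Split image pairs into training/validation/test sets.

    The 4,370 training pairs match the article; splitting the
    remaining pairs evenly between validation and test is an
    assumption made for illustration.
    """
    rng = random.Random(seed)
    shuffled = pairs[:]
    rng.shuffle(shuffled)
    train = shuffled[:n_train]
    rest = shuffled[n_train:]
    half = len(rest) // 2
    return train, rest[:half], rest[half:]

pairs = list(range(5472))  # stand-ins for (past, present) image pairs
train, val, test = split_pairs(pairs)
print(len(train), len(val), len(test))  # 4370 551 551
```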

To establish the criteria for lesion change in the 4,370 pairs of training data, they extracted the imaging records of patients who had undergone at least two X-rays, along with the doctors' written readings. The team then used natural language processing algorithms to sub-classify those readings into change and no-change groups according to the pattern of lesion changes.
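The article does not describe the team's actual NLP pipeline; a toy keyword-rule classifier over hypothetical English report phrases can stand in for the idea of sorting readings into change and no-change groups:

```python
import re

# Toy keyword rules standing in for the team's NLP pipeline;
# the phrases and the rule-based approach are illustrative
# assumptions, not details from the study.
CHANGE_PATTERNS = [
    r"\bincreased\b", r"\bdecreased\b", r"\bnew(ly)?\b",
    r"\bimproved\b", r"\bworsened\b",
]
NO_CHANGE_PATTERNS = [
    r"\bno (significant |interval )?change\b",
    r"\bstable\b", r"\bunchanged\b",
]

def label_report(text):
    """Classify a radiology reading as change / no_change / uncertain."""
    t = text.lower()
    if any(re.search(p, t) for p in NO_CHANGE_PATTERNS):
        return "no_change"
    if any(re.search(p, t) for p in CHANGE_PATTERNS):
        return "change"
    return "uncertain"

print(label_report("No interval change in both lungs."))  # no_change
print(label_report("Newly developed consolidation, RLL."))  # change
```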

The team then trained a machine-learning algorithm on the collected data to detect changes between past and present images. Specifically, the method extracts features of lesion changes using deep learning models, creates correlation maps between the features of the two images, and determines whether a change has occurred by analyzing the distribution of matches in the calculated correlation maps.
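The team's feature extractor and classifier are not public; the core correlation-map step, comparing every spatial position of one image's feature map with every position of the other's, can be sketched with random numpy arrays as stand-ins for CNN features:

```python
import numpy as np

def correlation_map(feat_a, feat_b):
    """Cosine-similarity map between every spatial position of two
    feature maps of shape (C, H, W), returning (H*W, H*W).

    Only illustrates the correlation step; the study's actual
    feature extractor and change classifier are not described.
    """
    a = feat_a.reshape(feat_a.shape[0], -1)  # (C, H*W)
    b = feat_b.reshape(feat_b.shape[0], -1)
    a = a / (np.linalg.norm(a, axis=0, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=0, keepdims=True) + 1e-8)
    return a.T @ b  # entry (i, j) = similarity of position i to j

rng = np.random.default_rng(0)
feat = rng.standard_normal((16, 8, 8))  # stand-in CNN features
same = correlation_map(feat, feat)
diff = correlation_map(feat, rng.standard_normal((16, 8, 8)))
# Identical inputs correlate strongly along the diagonal; a simple
# change score could threshold the mean diagonal correlation.
print(np.diag(same).mean() > np.diag(diff).mean())  # True
```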

Afterward, they verified the algorithm's ability to detect change by comparing it with previous related studies, including those that analyzed cross-sectional areas, and measured its statistical accuracy by computing the detection performance for each change pattern as the area under the curve (AUC). The correlation-map algorithm achieved an accuracy of 0.89, higher than the existing algorithms' 0.77 to 0.82.
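AUC, the metric reported above, can be computed from per-pair change scores via the rank (Mann-Whitney) formulation; the labels and scores below are hypothetical, not the study's data:

```python
def auc_score(labels, scores):
    """Area under the ROC curve via the Mann-Whitney formulation:
    the probability that a random positive outranks a random negative,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical change-detection scores for six image pairs
# (1 = change, 0 = no change); not the study's data.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
print(auc_score(labels, scores))  # 8/9, about 0.889
```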

"The new deep learning technique can be applied to screening for emergencies, including the detection of acute changes," Professor Lee said. "It can also be used as a primary diagnostic tool and can lead to advanced research on automatic chest radiograph reading technology."

The team expects the new method to stimulate convergence research by serving as an example of successfully integrating the latest IT technology into the medical field, Lee added.

The team presented the study results at the Medical Image Computing and Computer-Assisted Intervention (MICCAI) conference, a leading international gathering of researchers in the field.

corea022@docdocdoc.co.kr

<© Korea Biomedical Review, All rights reserved.>
