UCSF, GE Healthcare join to create advanced algorithms
The initiative aims to develop a next generation of clinical decision support systems that can assist clinicians with diagnoses.
The University of California, San Francisco and GE Healthcare are teaming up to develop advanced analytics to support a next generation of clinical decision support systems hosted on a cloud platform.
The project includes the development of a library of “deep learning algorithms” that can be embedded in decision support tools and aid in quicker diagnoses in acute situations such as trauma.
An algorithm is a set of steps or rules for solving a problem; in this case, the algorithms will run on computers and analyze large data sets to expedite diagnoses from medical imaging, starting with pneumothorax, or collapsed lung.
Deep learning is a newer technology in the family of artificial intelligence, explains Michael Blum, MD, associate vice chancellor for informatics and director of the UC San Francisco Center for Digital Health. Researchers will write programs that train the algorithms, then run the trained algorithms against clinical data to generate treatment recommendations.
IBM’s Watson Health computer is another example of early deep learning technology.
The deep learning algorithms being developed will “teach” imaging systems how to distinguish between normal and abnormal scans so clinicians can quickly prioritize and treat patients.
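As a rough illustration of that triage idea, the hypothetical Python sketch below shows how an abnormality score produced by a trained classifier could be used to push likely pneumothorax cases to the top of a reading worklist. The Study class, threshold and scores are invented for the example and do not reflect UCSF or GE software.

```python
# Illustrative sketch only: once a model can score a scan's probability of
# being abnormal, that score can front-load likely pneumothorax cases on the
# radiologist's worklist. The scoring values here are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Study:
    patient_id: str
    abnormality_score: float  # assumed output of a trained classifier, 0..1

def prioritize(worklist: list[Study], threshold: float = 0.8) -> list[Study]:
    """Flag and front-load studies the model considers likely abnormal."""
    flagged = [s for s in worklist if s.abnormality_score >= threshold]
    routine = [s for s in worklist if s.abnormality_score < threshold]
    return sorted(flagged, key=lambda s: s.abnormality_score, reverse=True) + routine

print(prioritize([Study("A", 0.12), Study("B", 0.94), Study("C", 0.71)]))
```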
“What is so powerful about combining analytics, deep learning and cloud technology is that the solutions will only get smarter and more scalable over time,” says Charles Koontz, chief digital officer at GE Healthcare.
While the initiative will start with imaging technologies, it will expand over time to clinical data sets within electronic medical records, Blum says. “At a high level, it is reasonable to say the next step will be to create end-to-end workflows,” he adds.
Also See: Mount Sinai plans advanced research into robotic surgery
UC San Francisco brings to the project a large delivery system to serve as a test bed, along with life sciences expertise; GE brings provider clients that use its suite of imaging technologies in cardiology, cath labs and intensive care units, among other settings.
Researchers will run the algorithms on the cloud platform rather than downloading them to run locally, a workflow improvement, Blum explains.
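A minimal sketch of what such a cloud workflow might look like from a developer's perspective, assuming a hypothetical hosted scoring endpoint rather than any actual GE Health Cloud API:

```python
# Rough sketch of the cloud workflow Blum describes: instead of downloading an
# algorithm locally, an application posts an image to a hosted scoring service
# and reads back the result. The endpoint URL and response field are
# hypothetical placeholders.
import requests

ENDPOINT = "https://analytics.example.com/v1/pneumothorax/score"  # placeholder

def score_image(path: str) -> float:
    with open(path, "rb") as f:
        resp = requests.post(ENDPOINT, files={"image": f}, timeout=30)
    resp.raise_for_status()
    return resp.json()["abnormality_score"]   # assumed response field
```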
But first, researchers need to train the algorithms to recognize a variety of medical situations and assess their severity. For instance, computers will be shown many images of normal and collapsed lungs, and over time they will learn the abnormal features of a collapsed lung and the various ways that pneumothorax appears on X-rays, until the computers are trained well enough to accurately alert clinicians to the presence or absence of pneumothorax or another condition.
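A minimal sketch of that training step, using PyTorch with placeholder data rather than real patient images; the tiny network and short loop are stand-ins for illustration and do not represent the actual UCSF/GE code.

```python
# Hypothetical sketch of the training described above: a convolutional network
# is shown labeled chest X-rays (0 = normal, 1 = pneumothorax) and gradually
# learns to separate the two classes. Random tensors stand in for real,
# de-identified images.
import torch
import torch.nn as nn

model = nn.Sequential(                 # deliberately small stand-in network
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),                   # two outputs: normal vs. pneumothorax
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):                 # a real project would train far longer
    images = torch.randn(16, 1, 128, 128)   # placeholder X-ray batch
    labels = torch.randint(0, 2, (16,))     # placeholder labels
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```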
UC San Francisco and GE Healthcare also want to design algorithms that can speed diagnoses, Blum says. For instance, a feeding tube placed in a patient might not go all the way down or might curl up, so an X-ray is required, and the tube cannot be taken out, cleaned and reused until a radiologist eventually reads the X-ray, perhaps two hours later. An algorithm with immediate access to the images could speed that diagnosis, increasing treatment efficiency.