I tried and tried to come up with a story to
I made the rounds, attending various lectures, mainly PET/CT and SPECT/CT talks, which were quite informative, and collecting an adequate number of CMEs to justify the stratospheric cost of a last-minute plane ticket and a corner room at the McCormick Hyatt. Perhaps the most important lesson was that if you want a diagnostic CT image from your SPECT/CT, it should probably have a diagnostic CT component attached, as the machine that produced this image doesn't:
I spent a good bit of time talking with and about Rad-Aid, and even spent an hour or so behind the desk at their booth:
Everyone who stopped by got the official Doctor Dalai Business Card, and a very enthusiastic retelling of my adventures in Ghana. In all seriousness, there seems to be a LOT of interest in giving back in this manner, and I could not be more thrilled and honored to be a part of it.
I spent only a few moments with my friends at Merge discussing PACS. Having had to learn version 7.x on my own whilst in Ghana, I can tell you it has some new features, such as worklists built with block structures, a novel approach. It took me some time to get used to the new back end, which now divides properties between two different management areas. With greater power comes greater complexity...
Of course, the BIG DISCUSSION all over RSNA was Artificial Intelligence, and in particular, AI as applied to Radiology. Well, let's be even more specific. There was a cloud (pun intended) hanging over McCormick, the specter of RSNA Yet To Come, which I quite presciently predicted in my 2011 RSNA Christmas Carol:
I sat down on a PET/CT gantry and bowed my head. The room spun, and when I looked up again, we were seated on a bench beside Lake Michigan. It was a blustery day, with winds one only sees in Chicago in the winter. Strangely, I felt no chill, as I watched leaves blowing through the PACSman's shadowy figure.
I looked behind me and gasped. The once-stately Lakeside Center was in ruins, shattered black pillars and glass everywhere.
"PACSman! What happened here?"
"Oy, Dalai, you need to lay off the Kung Pao, OK? Welcome to RSNA 2045," he said. "Or, well, it would have been if there still was an RSNA. Which there isn't."
"But why?"
"What did you expect?" he said. "Between the UnAffordable Care Act, the doctors' 'fix' that fixed you guys good, and all of your good friends, the clinicians, you radiologists didn't stand a chance."
"But who reads imaging studies now?" I asked.
"Geez, Dalai, why do you even care? OK, OK," he said. "You've come this far. Look, imaging reached the point where it didn't pay squat, right? So no one wanted to do it anymore. Even physicians' assistants and nurse practitioners wouldn't touch it. Imaging got so cheap that people got their scans at Walmart and everybody's data were stored in the cloud or on some vulture -- I mean, vendor-neutral -- archive. Got that? So many images were crammed into all these interconnecting networks that ... badda bing, badda boom, they grew self-aware. So, the damn computers are doing the diagnosing themselves. Whaddya think of that? End of the line for radiology."
"No, PACSman!" I exclaimed. "It cannot be! This is an honorable profession, and it cannot end this way!"
I would love to take credit for the current hysteria, which would mean that vast numbers of you out there actually read my stuff, which we all know is not the case. No, my colleagues have manifested this paranoia without my help. The demise of Radiology has been predicted for years, in various forms, from numerous causes, and with timelines anywhere from yesterday to 100 years from now. The latest incarnation of this soothsaying comes from none other than Ezekiel Emanuel, the physician brother of Hizzoner Rahm Emanuel:
(M)achine learning will displace much of the work of radiologists and anatomical pathologists. These physicians focus largely on interpreting digitized images, which can easily be fed directly to algorithms instead. Massive imaging data sets, combined with recent advances in computer vision, will drive rapid improvements in performance, and machine accuracy will soon exceed that of humans. Indeed, radiology is already part-way there: algorithms can replace a second radiologist reading mammograms and will soon exceed human accuracy. The patient-safety movement will increasingly advocate use of algorithms over humans — after all, algorithms need no sleep, and their vigilance is the same at 2 a.m. as at 9 a.m. Algorithms will also monitor and interpret streaming physiological data, replacing aspects of anesthesiology and critical care. The timescale for these disruptions is years, not decades.
I will reserve my opinion of this for a few moments, but suffice it to say, it rhymes with "Wool Schmidt".
Artificial Intelligence and Machine Learning as applied to Radiology were the subjects of quite a few talks and sessions.
One of the best was a mock debate between Dr. Eliot Siegel, who took the side of the humans, and Dr. Bradley Erickson, who insisted machine domination is relatively imminent. Of course, all physicians were supposed to be replaced in 1910 by the Vibratory Doctor...
I can't begin to do justice to these topics, and a quick Google search will give you more information than you could possibly assimilate in a lifetime. But the "debate" made us understand that it will take the epitome of AI, Artificial General Intelligence, to begin to replace us. And THAT probably won't arrive for a long time. In fact, people who are strong believers in such things were surveyed about when they thought AGI would actually arrive. They responded:
- By 2030: 42%
- By 2050: 25%
- By 2100: 20%
- After 2100: 10%
- NEVER: 2%
I'm with the 2030 crowd.
I don't want to get into the mechanics of Machine Learning and image recognition and such. But some of the hype has been driven by advances in Machine Vision... Some have said that because Google can recognize a photo of a dog, it's ready to read complex medical imaging. Not quite:
The dog is a big visual signal, if you will, but a subtle little fracture on a great big bone is only a couple of pixels out of thousands. Reading these exams is not as easy as it looks!
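To put rough numbers on that point, here is a back-of-the-envelope sketch. The pixel counts below are hypothetical, chosen only to illustrate the scale difference between a photo-sized object and a subtle radiographic finding; they come from no real dataset.

```python
# Illustrative arithmetic only: a dog in a snapshot versus a subtle
# fracture line on a radiograph. All numbers are invented for scale.

def signal_fraction(object_pixels: int, width: int, height: int) -> float:
    """Fraction of the image occupied by the object of interest."""
    return object_pixels / (width * height)

# A dog might fill a quarter of a 1000 x 1000 photo...
dog = signal_fraction(object_pixels=250_000, width=1000, height=1000)

# ...while a hairline fracture may be a few dozen pixels out of millions.
fracture = signal_fraction(object_pixels=50, width=2000, height=2500)

print(f"dog:      {dog:.0%} of the image")       # 25% of the image
print(f"fracture: {fracture:.6%} of the image")  # 0.001000% of the image
print(f"the dog is ~{dog / fracture:,.0f}x larger a target")
```

Four or five orders of magnitude separate the two targets, which is one reason "it can find the dog" does not translate directly into "it can find the fracture."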
Not to belabor this, but in another talk, Dr. Igor Barani, founder of Enlitic, a company leveraging Deep Learning for triage, clinical support, and other non-threatening medical applications, presented some of his work; in this video of lung nodule evaluation you can get some idea of how the machine "thinks":
So where are we going with this? You may remember my post from last year about IBM's Watson:
Now you might say that Computer Aided Diagnosis is already here. You would be missing the point. CAD doesn't learn. Watson, being a cognitive computer, learns. It learns the way I learned to read CT's. Hopefully it will read them better than I do. Think of it this way... I went to college to learn the chemistry and physics (and for me, engineering and computer science) needed to understand higher concepts. I went on to medical school to learn how the body is put together with all that chemistry and physiology and stuff. I learned where the pulmonary arteries were, and what happens if a clot gets lodged in one. In radiology residency, I learned how it looks on a scan if that happens. (Well, to be fair, the scanners weren't fast enough for CTPA grams back then, and so we learned the concept with conventional arteriography, but you get the idea.)
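The CAD-versus-cognitive distinction can be shown in miniature. This is a deliberately toy sketch of my own, not anything resembling Watson's actual internals: a classic CAD rule is hand-coded once and never changes, while a learning system adjusts its own parameters when it is told the right answer.

```python
# Toy contrast between a fixed CAD-style rule and a system that learns
# from feedback. Entirely illustrative; not how any real product works.

def cad_rule(density: float) -> bool:
    """Fixed rule, hand-coded once: flag anything above 0.5."""
    return density > 0.5

class LearningReader:
    """Adjusts its own threshold each time it is told the truth."""
    def __init__(self, threshold: float = 0.5, rate: float = 0.1):
        self.threshold = threshold
        self.rate = rate

    def predict(self, density: float) -> bool:
        return density > self.threshold

    def learn(self, density: float, truth: bool) -> None:
        if self.predict(density) != truth:
            # Nudge the threshold toward making this case correct:
            # lower it for a missed positive, raise it for a false alarm.
            self.threshold += -self.rate if truth else self.rate

reader = LearningReader()
for density, truth in [(0.45, True), (0.48, True), (0.30, False)]:
    reader.learn(density, truth)

# cad_rule still misses the 0.45 case forever; reader no longer does.
print(cad_rule(0.45), reader.predict(0.45))
```

The fixed rule will call the 0.45-density case negative every time until someone rewrites it; the learner corrected itself after one piece of feedback. That, in caricature, is the difference between CAD and a system that "learns the way I learned to read CTs."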
One physician was overheard saying something like, "Bah. My first-year residents could get that one." Yes...A COMPUTER can match the achievement of a human who has gone through college and medical school. Let this sink in. Code Name: Avicenna shows us THAT A COMPUTER IN THE EARLIEST STAGES OF LEARNING HOW TO READ COMPLEX IMAGING STUDIES CAN MATCH A FIRST-YEAR RADIOLOGY RESIDENT.
This, people, is the epitome of disruptive technology. This is a sea-change in how radiology will manifest in the future. The implications here are staggering. To me, this is MUCH more important and noteworthy than an extra Tesla on a magnet (although a Tesla in my garage would be most appreciated) or an extra hundred slices on a CT. Code Name: Avicenna represents the most important development in our field in a very, very long time. This is a fundamental change in the way we do things. It assists the radiologist, allowing him/her to perform at the highest possible level, but does not replace us. Not for the foreseeable future, anyway.
I was right on that one, at least.
I have seen the future, and its Code Name is Avicenna. Seriously. Trust me, I'm a doctor! But if you don't believe me, just ask Watson.
So where are we now?
I spoke with several reps from IBM, and I am further reassured that HAL, I mean Watson, bears no ill-will toward us lowly humans, particularly radiologists. IBM has no plans to replace us. They said so and I tend to believe it.
Watson himself will manifest in a few different guises, which will be deployed in the coming years. There is sort of a tentative timetable, but I was asked not to reveal that on the off chance that something comes in later than expected. Software, even intelligent software, can be cantankerous, you know. And the FDA can be even more vexing.
You've already met Code Name: Avicenna. IBMerge today categorizes him as part of the "Watson Health Imaging Cognitive Solutions", and deems him "A cognitive physician support tool that suggests differential diagnoses options to help inform the physician’s decisions for the patient." This is the module that impressed me last year with its (OK, his) ability to call a pulmonary embolus on a CT arteriogram. Once released to the public, well, radiologists anyway, Avicenna will concentrate on heart, breast, lung, brain, and eye problems. He will, eventually, launch from PACS as a radiologist assistant. Note I didn't say replacement. At RSNA, Avicenna was put to work in the "Eyes of Watson" display over at the Lakeside Building, chugging away at a (relatively) small palette of test cases. I didn't want to be too obvious about videoing the display, but here are a couple of screen shots showing Avicenna's on-the-fly "thinking" process:
Avicenna has a few new peers, also named for famous old Physicians. (No, there is no Code Name: Dalai; I'm old but not famous, nor is there a Maimonides as yet.)
Code Name: Iaso is named not for a physician per se, but for the daughter of Asclepius, the Greek g-ddess of recuperation from illness. You'll find a lot of tea-based products out there also bearing her name. She is, according to IBMerge, "(a) cognitive 'peer review' tool used to detect and reconcile differences between clinical evidence and the patient’s EMR problem list and billing records with the ability to be used prospectively as well." I was told that Iaso will be looking in particular at aortic stenosis and echocardiogram results. It seems that 23% of the time, aortic stenosis is reported in the echo, but somehow doesn't make it to the EMR. Iaso will help "bridge the gaps" in information such as this.
Code Name: Gaborone seems to be named after a town in Botswana rather than a physician (IBMerge, let me know if I'm wrong about that...) Gaborone will be "(a) cognitive data summarization tool that looks expansively at available patient data sources, filters and presents the contextually relevant information within a single view." He (I assume he...pardon my gender insensitivity) will be a stand-alone product.
Watson for Oncology is making its mark outside of imaging. This product "(i)mproves clinical decision making by integrating disparate patient data and images in one workflow to drive evidence-based treatment recommendations." You might have seen the recent news about this Watson module saving a patient:
University of Tokyo doctors report that the artificial intelligence diagnosed a 60-year-old woman's rare form of leukemia that had been incorrectly identified months earlier. The analytical machine took just 10 minutes to compare the patient's genetic changes with a database of 20 million cancer research papers, delivering an accurate diagnosis and leading to proper treatment that had proven elusive. Watson has also identified another rare form of leukemia in another patient, the university says.
Not bad for a kid who never went to medical school.
The technically-named Marktation Medical Interpretation Process may "free the radiologist to operate at the top of his/her license." From IBMerge, "Marktation is a process for interpreting medical images. When a physician labels findings on an image using text or speech recognition, the text label is simultaneously stored on the image and pushed into the clinical report. Additionally, Watson anatomical image analytics enables the text label to be posted into the right position of the clinical report and automatically adds a description of the anatomical location to the physician's label. Marktation is a reading paradigm shift aiming to improve reading speed and accuracy." In other words, this module assists us rads in marking lesions. It may sound trivial, but when you're putting little cursors on little tiny lesions and reporting them all, it gets tedious and painful. This could help. A lot.
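The workflow described above can be sketched in a few lines. Everything here, the class names, the fields, the way anatomy is attached, is my own invention for illustration, not IBM's actual design: the point is simply that one labeling action lands in two places, on the image and in the right section of the report.

```python
# Toy sketch of a Marktation-style flow: a label placed on an image is
# stored with its coordinates AND simultaneously routed into the report.
# All names and structures are hypothetical, invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Annotation:
    x: int
    y: int
    label: str
    anatomy: str  # in the real product, derived by anatomical analytics

@dataclass
class Report:
    sections: dict = field(default_factory=dict)

    def add_finding(self, ann: Annotation) -> None:
        # The label goes into the section for its anatomical location,
        # with the location description appended automatically.
        line = f"{ann.label} ({ann.anatomy}, image position {ann.x},{ann.y})"
        self.sections.setdefault(ann.anatomy, []).append(line)

report = Report()
marks = []  # annotations "stored on the image"

for ann in [Annotation(512, 300, "6 mm nodule", "right upper lobe"),
            Annotation(140, 880, "sclerotic lesion", "T9 vertebral body")]:
    marks.append(ann)        # stays on the image...
    report.add_finding(ann)  # ...and is pushed into the report at once

print(report.sections["right upper lobe"])
```

One dictation, two destinations: that is the tedium the module promises to remove.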
Finally, Watson has two other pals (siblings? cousins?) for us to play with. The Watson Clinical Integration Module "...aims to present intelligently compiled clinical information based on the indications for an exam as well as Watson's understanding of clinical relevance. This module aims at increasing reader efficiency and helping counteract some of the most common causes of errors in medical imaging, such as base rate neglect, anchoring, bias, framing bias, and premature closure." The Lesion Segmentation and Tracking Module "...aims to automatically segment (outline and measure) physician-marked lesions, pre-mark new exams with the index lesions from prior exams, and produce tracking tables. The module aims to speed the interpretation and reporting of comparison exams in cancer patients and other patients whose findings require longitudinal tracking."
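A "tracking table" of the kind just described is easy to picture. This is a minimal sketch of my own simplified version of what such a module might produce; the lesion IDs, sites, and measurements are invented for illustration.

```python
# Hypothetical longitudinal lesion tracking: index lesions from a prior
# exam are matched to the new exam and the percent change is tabulated.
# All data are invented for illustration.

prior = {"L1": {"site": "liver segment VI", "long_axis_mm": 22.0},
         "L2": {"site": "left adrenal",     "long_axis_mm": 14.0}}

current = {"L1": 17.5, "L2": 16.0}  # new measurements, mm

def tracking_table(prior, current):
    rows = []
    for lesion_id, info in prior.items():
        old, new = info["long_axis_mm"], current[lesion_id]
        pct_change = round(100.0 * (new - old) / old, 1)
        rows.append((lesion_id, info["site"], old, new, pct_change))
    return rows

for row in tracking_table(prior, current):
    print(row)
```

Pre-marking the new exam with last time's index lesions and emitting a table like this is exactly the comparison drudgery the module aims to speed up.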
The details of all these many faces of Watson will come with time. I predict you'll see at least some of the modules on a PACS near you sooner than you think. I could say more, but a promise is a promise.
Nancy Koenig, General Manager of Merge (Previous CEO Justin Dearborn now runs Tribune Publishing, another Michael Ferro/Merrick Ventures acquisition, and I guess the CEO title isn't appropriate with IBM owning Merge) had this to say about our electronic friend: "Watson cognitive computing is ideally suited to support radiologists on their journey 'beyond imaging' to practices that address the needs of patient populations, deliver improved patient outcomes, and demonstrate real-world value." And that is the antidote to the current hysteria.
Watson, Enlitic, and all the other AI's out there, are NOT out to replace us radiologists. They are tools for us to use in our quest for ever-better patient care. Nothing more, nothing less. To fear them makes no more sense than fearing radiation, electricity, hammers, guns, or tactical nuclear weapons. Used properly, they can serve man (the last on the list works as a deterrent to other, hopefully sane folks with similar toys).
Dr. Ezekiel and a few rather rabid AI sycophants on Aunt Minnie notwithstanding, word of the demise of our profession is a bit premature. No one, and I do indeed mean NO ONE at RSNA, save perhaps for some star-struck journalists and a few companies with nothing real to show (like Deep-Something), claims we will be replaced by machines within any of our lifetimes. That is the bottom line. Watson and his cousins aren't out to get us after all.
However...
This situation is a wake-up call, like quite a few others we have endured or ignored over the years. Think self-referral and AMIC. AI is powerful technology, and it has great potential to help us. Could computers someday "grow self-aware and do the diagnosis themselves"? Sure, if someday has no endpoint.
So here's my Dalai-ism on the topic, simple-minded as you might expect, but still profound, if I do say so myself:
What's our greatest irrational fear of AI? That it will take our jobs away. That the insurance companies or the government will latch onto Watson as a replacement for us cranky, expensive flesh-and-blood radiologists, and leave us shivering out in the cold, holding signs saying "Will Read CT For Food" and "Buddy, Can You Spare A Cup Of Barium?"
We need to be in control of this technology.
So it occurs to me that we aren't asking the right questions. Ignore the What and When, and ask, "HOW do we keep control of this?" I posed that very question to Dr. Siegel after one of the sessions. His answer was clear: "If we are in on the development of the technology, we will have a far greater say in how it is used. And besides, can you imagine how long it will take for the FDA to approve machine reads?" And I'm sure he's right about that. And keep in mind, there are so very many other bunches of low-hanging fruit for AI to conquer. Why should radiology be at the head of the line for obsolescence? Because Dr. Emanuel hates us, apparently. Fortunately, he has no pull with Big Blue, or Deep Anything.
So, for those fearing Big Electronic Brother, here's my advice: take a deeeeeep breath, and then take a big gulp of Scotch, or a Valium, or whatever you require to climb off the ceiling. And relax. The computers are here To Serve Man.
I'm sorry, Ezekiel. I'm afraid HAL can't do that.