An AI Told Me I Had Cancer


I realized that I had imagined the AI would take in my entire chart and make a diagnosis, possibly with some dramatic gradually-appearing images like the scenes on Grey’s Anatomy where they discover a large tumor that creates a narrative complication and is solved by the end of the episode. I’ve written before about this phenomenon, where unrealistic Hollywood conceptions of AI can cloud our collective understanding of how AI really works. The reality of AI in medicine is far more mundane than one might imagine, and AI does not “diagnose” cancer the way a human doctor does. A radiologist looks at multiple pictures of the affected area, reads a patient’s history, and may watch multiple videos taken from different perspectives. An AI takes in a static image, evaluates it relative to mathematical patterns found in the AI’s training data, and generates a prediction that parts of the image are mathematically similar to areas labeled (by humans) in the training data. A doctor looks at evidence and draws a conclusion. A computer generates a prediction—which is different from a diagnosis.

Humans use a series of standard tests to generate a diagnosis, and AI is built on top of this diagnostic process. These tests include self-exams, mammography, ultrasound, needle biopsies, genetic testing, and surgical biopsies. Then, you have options for cancer treatments: surgery, radiation, chemotherapy, maintenance drugs. Everyone gets some combination of tests and treatments. I got mammography, ultrasound, needle biopsy, genetic testing, and surgery. My friend, diagnosed around the same time, detected a mass in a self-exam. She got mammography, ultrasound, needle biopsy, genetic testing, surgical biopsy, chemotherapy, surgery, radiation, a second round of chemo, and maintenance drugs. The treatment depends on the kind of cancer, where it is, and what stage it is, from 0 to 4. The tests, treatments, and drugs we have today at US hospitals are the best they have ever been in the history of the world. Thankfully, a cancer diagnosis no longer has to be a death sentence.

Because Geras and his collaborators pre-trained the model and put it online, all Robinson and I had to do was connect our code to the pre-trained model and run my scans through it. We teed it up, and … nothing. No significant cancer results, nada. Which was strange because I knew there was breast cancer. The doctors had just cut off my entire breast so the cancer wouldn’t kill me.

We investigated. We found a clue in the paper, where the authors write, “We have shown experimentally that it is essential to keep the images at high-resolution.” I realized my image, a screenshot of my mammogram, was low-resolution, and the model called for a high-resolution image.
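One way a mismatch like this might be caught before inference is a quick check of the file’s pixel dimensions. This is a sketch, not the code Robinson and I actually ran: it parses a PNG header with only the standard library, and the minimum-size threshold in the usage comment is hypothetical, not a number from the paper.

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def png_dimensions(data: bytes) -> tuple[int, int]:
    """Pull width and height out of a PNG file's IHDR header.

    IHDR is always the first chunk after the 8-byte signature, so the
    width and height sit at fixed offsets as big-endian 32-bit ints.
    (Illustrative only; a real pipeline would use an imaging library.)
    """
    if data[:8] != PNG_SIGNATURE:
        raise ValueError("not a PNG file")
    # bytes 8-15 are the IHDR chunk length and type; 16-23 are width, height
    return struct.unpack(">II", data[16:24])

# Hypothetical usage: refuse to run inference on an image that is
# clearly too small for a model that needs high-resolution input.
# width, height = png_dimensions(open("scan.png", "rb").read())
# if min(width, height) < 2000:
#     raise ValueError("screenshot-sized image; export the full-resolution scan")
```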

Robinson discovered an additional problem hidden deep in the image file. My screenshot appeared black and white to us, like all X-ray images. However, the computer had represented it as a full-color image, also known as an RGB image. Each pixel in a color image has three values: red, green, and blue. Mixing the values together gets you a color, just as with paint. If you make a pixel with 100 units of blue and 100 units of red, you’ll get a purple pixel. The purple pixel’s value might look like this: R:100, G:0, B:100. A color digital photo is actually a grid of such pixels, each with an RGB color value. When you put all the pixels next to each other, the human brain forms the collection into an image.
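The difference between a color image and a grayscale one can be sketched in plain Python. The helper below and the luminance weights it uses (the common ITU-R BT.601 convention) are my additions for illustration; the article doesn’t say how the screenshot was actually converted for the model.

```python
# Each pixel in an RGB image is a triple of (red, green, blue) values.
purple = (100, 0, 100)  # R:100, G:0, B:100, as in the example above

def to_grayscale(pixel):
    """Collapse an RGB triple into one intensity value using the
    standard ITU-R BT.601 luminance weights. (A common convention;
    the mammography model may expect a different conversion.)"""
    r, g, b = pixel
    return round(0.299 * r + 0.587 * g + 0.114 * b)

# A tiny 2x2 "color photo": a grid of pixels, each an RGB triple.
image = [
    [(100, 0, 100), (255, 255, 255)],
    [(0, 0, 0),     (100, 0, 100)],
]

# A grayscale version stores one number per pixel instead of three,
# which is what a model trained on X-ray images would expect.
gray = [[to_grayscale(p) for p in row] for row in image]
print(gray)  # [[41, 255], [0, 41]]
```

A screenshot that merely *looks* black and white still carries three values per pixel, which is why the mismatch was invisible to the eye but not to the model.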


