Friday, November 28, 2025

So I said - something about AI in healthcare

Another in an occasional series in which I try to provide some more content here by reposting worthwhile comments I’ve made elsewhere.

In this case, I took a YouGov survey about public perceptions of the use of AI in healthcare. Three of the questions asked for free-form responses rather than a choice among multiple options.

-

November 26, 2025

What ethical considerations are most important to think about when adding AI tools to healthcare?
I was told by my surgeon some years ago “You treat the patient, not the X-ray.” The more we use AI, the more that adage is reversed.

During my recent hospitalization my PCP came by on their rounds, displaying not through words but through tone and demeanor a genuine personal concern for my health, something AI is incapable of expressing or feeling. At best it offers merely an algorithmically driven facade of concern, a programmed pretense that could well be likened to the comforting reassurances of a scammer.
  
What is your overall impression of AI in healthcare?
Not ready for prime time. For now, it’s a bandwagon promising what it can’t (and perhaps never will) deliver, driven less by public health than by the profit-driven preferences of the corporate side of health care (i.e., hospitals and the insurance industry), which pursues a goal of “efficiency” (read: “fewer employees”) and would, as I suggested earlier, “treat the X-ray, not the patient,” with us coming to exist less as patients than as datasets.

Is there anything else about AI in healthcare that you would like to share with us?
AI is good for, indeed excellent at, analyzing large amounts of data, producing results that can be viewed and considered mathematically because that’s what they are - mathematical derivations from mathematical data.

But healthcare in general, and medicine within it, involves more than mere data: it also includes personalities and foibles and trust and other human interactions, along with unavoidable judgment calls driven by such non-mathematical considerations, all of which are beyond AI’s capabilities.

Which, by the way, makes consumers’ use of chatbots for health information and advice fraught with risk and worse, as shown by recent suits against various companies whose chatbots are accused of having encouraged teenage users to commit suicide. AI simply is not up to the task to which the health care industry is trying to set it in pursuit of profit.
