IIRC the reason it still isn't used is that, even though it was trained by highly skilled professionals, it had some pretty bad racial and gender biases and was only as accurate as advertised with white male patients.
Plus the publicly released results were fairly cherry-picked to look good.
That reminds me: pretty sure in at least one of these AI medical trials, the model was reading metadata on the input image that included the diagnosis, so it was effectively peeking at the answer.