Artificial intelligence (AI) tools used by SA medical schemes to detect fraud are perpetuating racial bias against black healthcare providers, according to the Council for Medical Schemes' (CMS) report on racial profiling. – Business Day (16 July 2025)
The report found that black practitioners were disproportionately investigated and penalised, challenging the long-held belief that data-driven algorithms are inherently neutral.
Despite arguments from medical schemes and administrators that their fraud, waste, and abuse (FWA) investigations were initiated by "neutral" software and objective algorithms, the panel concluded that “procedural fairness had been compromised, and the results suggested discriminatory profiling”.
Meanwhile, experts warned against “mindlessly importing AI models trained in different socio-economic contexts”, such as the US-developed tools used by two of the schemes investigated by the CMS, as they “may carry embedded assumptions unsuited to SA’s diverse population”.