
New HHS Rule Addresses AI Discrimination in the Healthcare Sector

In this post:

  • The new HHS rule requires health service providers to mitigate the risk of bias in AI patient care tools. 
  • Covered entities must make reasonable efforts to identify and reduce discrimination in AI-supported services. 
  • The rule expands anti-discrimination obligations for healthcare entities and others receiving federal funds.

The Office for Civil Rights (OCR) in the Department of Health and Human Services (HHS) recently issued a final rule on the applicability of Section 1557 of the Affordable Care Act, which prohibits any health program funded by federal dollars from discriminating based on race, color, national origin, sex, age, or disability.

The rule aligns with the Supreme Court’s 2020 decision in Bostock v. Clayton County, under which discrimination “based on sex” in Section 1557 is understood to include both sexual orientation and gender identity. The final rule further clarified that when providers use artificial intelligence technology powered by machine learning, that use remains subject to Section 1557. Providers are therefore required to examine and address any potential risks of discrimination.

Assessing AI bias risks

Under the rule, a “patient care decision support tool” is any standalone or combined configuration of technology or method that a covered entity uses to aid clinical decision-making. Examples include predictive algorithms that assess patient risk and the severity of future health events, as well as analysis engines used to approve or deny medical claims based on the necessity of treatment. 

OCR stated that as such tools are adopted and AI-based decision-making is favored, each patient’s situation must still be considered individually. Several studies have demonstrated that reliance on certain algorithms has led to racial and ethnic disparities in care. The definition above covers both automated tools and non-automated aids such as flowcharts. Health technology could thus be unfair if, for example, people with disabilities are prevented from receiving equal care because providers do not take the individual characteristics of the patient into account.


Reasonable bias prevention efforts

OCR indicated that what counts as a reasonable effort depends on several factors, including the covered entity’s size and financial resources, and whether the tool is used for the purpose intended by its developers and approved by regulators. Regardless of the underlying data, developers should endeavor to educate the covered entity about the tool’s risk of discrimination.

Another factor is whether the covered entity has already developed a process for evaluating the patient care decision support tools it uses.

The rule also departs from prior policy by classifying Medicare Part B funding as federal financial assistance when provided to providers and suppliers. Providers and suppliers who receive Part B funds and were previously exempt must now comply with Section 1557 and other civil rights regulations implemented by OCR.
