
Demand for Transparency in AI Grows as Howso Tackles “Black Box” AI


TL;DR

  • Howso provides transparent AI, while others use opaque black-box AI.
  • Transparent AI benefits retail, healthcare, and education, with clients like Mastercard.
  • Mike Capps stresses AI transparency for fairness and accountability.

In a world increasingly driven by artificial intelligence (AI), transparency in decision-making processes has become a pressing concern for many. Mike Capps, co-founder of Howso, a Raleigh-based company specializing in explainable AI, asserts that just as people scrutinize the ingredients in their breakfast food, they should demand transparency in AI systems that influence critical aspects of their lives, such as healthcare and education.

The rise of black box AI

AI’s pervasive presence in our lives has led to its utilization in pivotal decision-making processes, ranging from medical procedures and credit approvals to parole determinations. However, Capps argues that a significant issue with many existing AI systems is their opaqueness, often referred to as “black box AI.” 

These systems make final judgments without providing clear insights into how those conclusions are reached, leaving users and stakeholders in the dark about the decision-making criteria.

Howso, formerly known as Diveplane, was founded by Mike Capps in 2018 with a mission to challenge the prevalence of black box AI. The company’s unique approach to AI, known as “attributable AI,” sets it apart. 

Attributable AI allows users to trace a decision back to specific data points, making the decision-making process transparent and understandable. For instance, if a medical surgery recommendation is made, Howso’s system can pinpoint the 17 most crucial data points that influenced that decision, offering clarity and accountability.
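The idea of tracing a prediction back to specific data points can be illustrated with a minimal, hypothetical sketch (this is an illustration of instance-based attribution in general, not Howso's actual engine or API): a k-nearest-neighbors classifier that returns, alongside its prediction, exactly which training records drove the decision and how close each one was.

```python
# Hypothetical sketch of attributable, instance-based prediction:
# the output can be audited back to the training records that produced it.
from collections import Counter

def predict_with_attribution(train_X, train_y, query, k=3):
    """Return (prediction, influences): the majority label among the k
    nearest training points, plus the index, distance, and label of each
    of those points so the decision can be inspected."""
    # Squared Euclidean distance from the query to every training record.
    dists = [
        (sum((a - b) ** 2 for a, b in zip(x, query)), i)
        for i, x in enumerate(train_X)
    ]
    nearest = sorted(dists)[:k]
    votes = Counter(train_y[i] for _, i in nearest)
    prediction = votes.most_common(1)[0][0]
    # Each influence record says which training case contributed and how strongly.
    influences = [
        {"index": i, "distance": d, "label": train_y[i]} for d, i in nearest
    ]
    return prediction, influences

# Toy data: two features per record, binary outcome.
X = [(1.0, 1.0), (1.2, 0.9), (8.0, 8.0), (7.5, 8.2)]
y = ["approve", "approve", "deny", "deny"]

pred, why = predict_with_attribution(X, y, query=(1.1, 1.0), k=3)
```

Here `why` lists the concrete cases behind `pred`, which is the essence of the transparency Capps describes: a stakeholder can ask not just "what was decided?" but "based on which records, exactly?"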

Howso’s AI engine has found applications across various domains. One of its clients, Scanbuy, collaborates with major retailers to leverage Howso’s tool for customer intelligence. This enables retailers to predict customer preferences in a manner that is both accurate and explainable. 

Notably, educational institutions like N.C. State and UNC have also embraced Howso’s technology for specific projects, emphasizing the growing demand for transparent AI in academia.

The decision to open-source Howso’s AI engine in September underscores the company’s commitment to fostering transparency. This move empowers users to design their own explainable AI-driven platforms, further expanding the reach of transparent AI technology.

Noteworthy clients and partnerships

Howso’s impressive list of clients includes industry giants like Mastercard and Mutua Madrileña, a Spanish insurance company. Additionally, the Virginia Department of Behavioral Health and Developmental Services has harnessed Howso’s technology for enhanced decision-making processes. These partnerships demonstrate the broad applicability and demand for AI systems that prioritize transparency and accountability.

Capps underscores the critical importance of transparency in AI, drawing a parallel with food labels. Just as consumers rely on nutrition labels to make informed choices about their food, individuals should demand similar transparency regarding AI-driven decisions that impact their lives. It is not merely a matter of trust but also a fundamental requirement for responsible software development.

The pitfalls of black box AI

Black box AI, as Capps highlights, poses several inherent problems. First, it undermines the reliability and accountability of AI systems: if the inner workings of a system are hidden, bugs and errors become difficult to identify and rectify. Unintended consequences or biases can therefore go unaddressed, and often the only remedy is replacing the system outright, at significant cost.

One particularly crucial application of AI where transparency is vital is in parole decisions. These determinations often rely on historical data, which may contain biases. These biases can be perpetuated when scaled up for efficiency, potentially leading to unfair and discriminatory outcomes. Capps emphasizes that while there is a desire to streamline and expedite court processes, this should not come at the expense of perpetuating racial biases.



Benson Mawira

Benson is a blockchain reporter who has delved into industry news, on-chain analysis, non-fungible tokens (NFTs), Artificial Intelligence (AI), and more. His area of expertise is the cryptocurrency markets, including fundamental and technical analysis. With his insightful coverage of everything in Financial Technologies, Benson has garnered a global readership.
