
Certification for Ethical Data Use in Generative AI by Fairly Trained

In this post:

  • Fairly Trained certifies transparent, ethical data use in generative AI
  • Training data must be obtained consensually: under contract, openly licensed, in the public domain, or owned by the model developer
  • Certification involves a rigorous application process and an annual fee; violations can lead to withdrawal

In response to ongoing disputes between creators, intellectual property holders, and generative AI companies regarding the ethical use of data, a new non-profit organization, Fairly Trained, has emerged as a standard-bearer for transparent and consensual data practices in the development of generative AI models.

Fairly Trained, spearheaded by CEO Ed Newton-Rex, aims to bridge the divide between generative AI companies that prioritize obtaining consent from data providers and those that argue they are under no legal obligation to do so. The organization believes consumers deserve transparency about how companies handle copyrighted material and, in response, offers a certification process for companies committed to ethical data use.

The L Certification: A mark of consent-based training

Fairly Trained currently offers a single certification, known as the Licensed Model Certification (L Certification). This certification is attainable for any generative AI system provider that has trained its models using data obtained through consensual agreements.

To qualify for the L Certification, companies must ensure that their training data meets specific prerequisites. First, the data must be provided to the model developer under a contractual agreement with a party possessing the necessary rights. Second, the data must be available under an open license, in the public domain, or fully owned by the model developer. Obtaining a license from an organization that licenses creators' works, such as a record label or stock image library, counts as consent for certification purposes.

Stringent application process and ongoing compliance

Companies seeking certification must undergo a meticulous application process. This involves demonstrating a robust data due diligence process and maintaining comprehensive records of the training data used for each model. The application process commences with an online form, followed by Fairly Trained reviewing the submission and potentially requesting additional information.

Once certified, companies are required to pay an annual certification fee ranging from $500 to $6,000, depending on their revenue. This fee supports Fairly Trained's ongoing efforts to uphold ethical standards within the generative AI industry.

Warning against violations and certification withdrawal

Fairly Trained emphasizes its commitment to upholding ethical standards and warns that any company found to be in violation of its rules or categories will have its certification rescinded. The organization reserves the right to withdraw certification without reimbursement if new information comes to light that could impact the outcome of the certification.

Fairly Trained’s introduction of the L Certification addresses the growing concerns surrounding the ethical use of data in the development of generative AI systems. By providing a clear framework for consent-based practices, the organization aims to empower consumers with information about companies that prioritize ethical considerations in their AI training processes. As the generative AI landscape continues to evolve, Fairly Trained’s certification could become a benchmark for companies seeking to demonstrate their commitment to responsible and transparent data practices in the industry.
