A Microsoft engineer has sounded the alarm over the company’s AI image generator, warning that it can produce violent and sexual content and reproduce copyrighted material. Shane Jones, a principal software engineering manager at Microsoft, has been testing the product, known as Copilot Designer, in his free time. His findings, which include disturbing imagery such as demons alongside themes of abortion rights, underage drinking, and drug use, prompted him to escalate the matter to regulators and the company’s board.
Jones’ concerns and actions taken
Jones, who has worked at Microsoft for six years, began testing Copilot Designer as a red teamer, volunteering to probe the AI technology for vulnerabilities. Disturbed by the images the tool generated, which he deemed unsafe and potentially harmful, he reported his findings internally but found the company reluctant to take decisive action. Despite his efforts to raise concerns, including posting an open letter on LinkedIn and contacting U.S. senators, Jones felt compelled to escalate the issue further.
Escalation to FTC Chair and Microsoft’s Board
In a bid to address the ongoing issues with Copilot Designer, Jones sent letters to Federal Trade Commission Chair Lina Khan and Microsoft’s board of directors. He urged the FTC to investigate the matter and called on Microsoft to implement changes such as adding disclosures to the product and adjusting its app rating to reflect mature content. Jones also requested that Microsoft’s board initiate an independent review of the company’s responsible AI incident reporting processes.
Concerns over generative AI and lack of oversight
Jones’ actions highlight broader concerns about the lack of oversight surrounding generative AI technology. Given the potential for widespread dissemination of harmful content, particularly in the context of upcoming elections, Jones emphasizes the urgent need for stricter safeguards. He notes that the Copilot team receives a large volume of product feedback messages every day, making it difficult to address every reported issue promptly.
Copyright infringement and content issues
Beyond violent and sexual imagery, Copilot Designer also raises concerns about copyright infringement. The tool has been observed generating images featuring copyrighted characters such as Disney’s Elsa and Star Wars figures, potentially violating both copyright law and Microsoft’s own policies. Jones stresses that the problem extends beyond copyright, pointing to the broader implications of harmful and disturbing imagery being disseminated globally.
Shane Jones’ efforts to address the concerns surrounding Microsoft’s AI image generator underscore the growing need for robust oversight and regulation in the field of generative AI. His actions serve as a call to action for both regulatory authorities and technology companies to prioritize the safety and ethical use of AI technologies.