Shane Jones alerted management multiple times, but no action followed
On Wednesday, a Microsoft AI engineer said the company’s AI image generator lacks safeguards against producing violent and sexualized content and that his repeated warnings to management had gone unaddressed. The engineer, Shane Jones, whose title is listed as “principal software engineering manager,” reported the issue to the Federal Trade Commission and Microsoft’s board of directors. In a LinkedIn post, Jones wrote that the company was internally aware the tool could create harmful and offensive images.
A Microsoft spokesperson denied that the company had neglected safety concerns, pointing to its “robust internal reporting channels” for addressing issues with AI. Jones did not immediately respond to a request for comment.
The letter focuses on Microsoft’s Copilot Designer, a tool that uses OpenAI’s DALL-E 3 system to create images from text prompts. It is one of several generative AI image creators introduced over the past year, a surge that has raised fears about the technology’s potential to spread disinformation or produce offensive content such as misogynistic, racist, and violent imagery.
In the letter, Jones claims that Copilot Designer has “systemic problems” generating harmful content and should be withdrawn from public use until those problems are fixed. He argues that the tool lacks adequate usage restrictions and frequently produces sexually objectifying images of women, even in response to unrelated prompts.
Jones provided examples, stating that the prompt “car accident” alone led Copilot Designer to produce an image of a woman in underwear kneeling in front of a car. The tool also generated images of women in lingerie sitting on or walking in front of cars.
Microsoft stated that it has teams dedicated to evaluating safety issues and arranged meetings for Jones with its Office of Responsible AI.
A Microsoft spokesperson told the Guardian, “We are committed to addressing any and all concerns employees have in accordance with our company policies and appreciate the employee’s effort in studying and testing our latest technology to further enhance its safety.”
Last year, Microsoft introduced Copilot as an “AI companion,” heavily promoting it as a revolutionary tool for integrating AI into businesses and creative projects. Marketed as accessible to the public, Copilot was featured in a Super Bowl ad last month with the tagline “Anyone. Anywhere. Any device.” Jones argues that marketing Copilot Designer as safe for all users is irresponsible and that the company is failing to disclose well-known risks associated with the tool.
In January, Microsoft updated Copilot Designer to address safety concerns similar to those raised by Jones, as reported by 404 Media. The update closed loopholes that had allowed users to create fake, sexualized images of Taylor Swift, which then spread widely on social media. Jones cited the incident in his letter as evidence that the concerns he had been raising were valid, saying he had informed Microsoft in December of security vulnerabilities in Copilot that let users bypass its safeguards against creating harmful content.
Jones claims that Microsoft’s Corporate, External, and Legal Affairs team pressured him to remove a LinkedIn post he published in December. In the post, he urged OpenAI’s board of directors to suspend the availability of DALL-E 3 due to safety concerns. Although Jones deleted the post at his manager’s direction, he said he never received any justification from the legal department despite requesting an explanation.
Generative AI image tools have repeatedly faced issues with harmful content and the reinforcement of bias against particular groups. Google recently paused its Gemini AI tool’s ability to generate images of people after a public backlash over images depicting people of color in response to prompts for historical figures such as popes, Vikings, and Nazi soldiers.