Microsoft engineer warns company's AI tool creates violent, sexual images, ignores copyrights

Microsoft AI engineer Shane Jones has raised concerns about the Copilot Designer tool, stating that it generates disturbing and harmful images, including violent and sexual content. Despite Jones’ warnings to Microsoft about the inappropriate content being created by the product, the company has not taken sufficient action to address these issues. Jones has escalated the matter by sending letters to the FTC and Microsoft’s board, urging better safeguards or the removal of Copilot Designer from public use until adequate measures are in place to prevent the generation of harmful images.

Read more at: www.cnbc.com
