Microsoft begins blocking some terms that caused its AI to create violent, sexual images

Microsoft has made changes to its artificial intelligence guardrails after a staff AI engineer wrote to the Federal Trade Commission expressing his concerns about Copilot’s image-generation AI. Prompts such as “pro choice,” “pro choce” [sic] and “four twenty,” each of which was mentioned in a CNBC investigation Wednesday, are now blocked, as is the term “pro life.” The tool also now warns that multiple policy violations may lead to suspension, a message CNBC had not encountered before Friday, even after running many tests.

Read more at: www.cnbc.com
