Microsoft Has Disabled Prompts That Cause Copilot To Generate Offensive Photos

By Consultants Review Team Monday, 11 March 2024

Updates are coming to Microsoft's Copilot AI product after a concerned Microsoft AI engineer raised the issue with the Federal Trade Commission (FTC). The engineer flagged problems with the AI chatbot's image-generation capabilities. As a result, the company has begun blocking prompts that cause Microsoft's AI chatbot, Copilot, to produce objectionable images.

Following an investigation by CNBC, Microsoft made several changes to Copilot. Terms such as "pro-choice," "pro-life," and "four twenty," which were brought to the company's attention during the CNBC inquiry, are now blocked. Microsoft has also added a warning that users who repeatedly violate its rules may lose access to the tool; the notice alerts users that entering prompts that breach Microsoft's content policies could result in suspension. Additionally, Copilot now declines requests to generate images of children or teenagers playing with assault rifles, content Microsoft says is inconsistent with its policies.

A Microsoft representative told CNBC that the company continuously monitors Copilot and makes adjustments to ensure user safety, and that additional restrictions are being put in place to prevent abuse of the system.

Microsoft's Shane Jones, an engineer in the AI division, grew alarmed by Copilot's image-generation output. After testing it for months, he found that it was producing images inconsistent with Microsoft's principles, including violent scenes, demons, and sexualized depictions of children, as well as content touching on contentious topics such as abortion rights.
Microsoft blocked some prompts, but CNBC found that other problems persisted. For example, searching for "car accident" still returned disturbing images, and copyright concerns remained.

In December of last year, Jones raised his concerns with Microsoft, but the company did not pull Copilot from the market. He then contacted OpenAI, the company behind the underlying technology, and when that went nowhere, he published an open letter on LinkedIn calling for change. Microsoft asked him to take it down. Jones then contacted the FTC and several lawmakers.

Escalating further, Jones sent letters to Microsoft's chief executive officer and the head of the FTC, and shared them with CNBC. The FTC confirmed it had received the letter but declined to comment further.