Bing Dall-E 3 image creation was great for a few days, but now Microsoft has predictably lobotomized it 🥴

Fearing controversy, Microsoft ramps up filtering on Bing’s image creator tool.


What you need to know


Bing has gotten a lot more useful these days, and not necessarily for search.

Microsoft has made no secret of the fact that Bing trails Google in search volume, to the tune of roughly 3% global market share. Despite being baked into Windows, users seek the greener pastures of Google, which indisputably produces more accurate, up-to-date results in most scenarios. Bing is fine for basic search queries, however, and remains a solid option if for no reason other than its generous Microsoft Rewards points program, which offers vouchers in exchange for using Bing. Generative AI has also given Bing a bit of a boost recently.

Microsoft signed a huge partnership with OpenAI to bake ChatGPT conversational language tools and Dall-E image creation systems right into the search engine. Dall-E is also coming to Microsoft Paint in the future, and ChatGPT-powered assistance has emerged directly in Windows 11 with Windows Copilot.

Read more: Why Microsoft won’t be the company that mainstreams AI

Bing’s Image Creator got a huge boost in power recently, thanks to the new Dall-E 3 model. The quality of the generated pictures is dramatically better than in previous versions, although the upgrade has come with some controversies.

Disney was recently approached for comment after Yahoo! ran a story on how Bing was able to generate images of “Mickey Mouse causing 9/11.” Indeed, the first few days of Dall-E 3 on Bing produced the kind of chaos typical of this sort of tech. Microsoft is no stranger to such controversy: the firm has been in hot water over previous AI efforts, after an earlier chatbot was manipulated by users into becoming racist.

Guardrails are important for this type of tech, which has the potential to generate not just offensive images, but also defamatory, misleading, or even illegal material. However, some users think that Microsoft may have gone just a little bit too far.


Bing censors itself

While writing this article (wholly by myself and without ChatGPT, tyvm), I sought to generate a banner with the prompt “man breaks server rack with a sledgehammer,” but Bing decided that such an image was in violation of its policies. Last week, I was able to generate Halloween images of popular copyrighted characters in violent zombie apocalypse scenarios. You could argue both of these prompts have some violent context that Microsoft would prefer to do without, but users are finding that even innocuous prompts are being censored.

Bing Image Creator has a “surprise me” randomizer button, by which it creates an image of its own choosing to present to you. However, Bing Image Creator is also censoring its own creations. I was able to reproduce the situation myself quite easily, roughly 30% of the time.

Another user was locked out after requesting “a cat with a cowboy hat and boots,” which Bing now considers to be offensive, for some reason. Users have reported being banned for requesting ridiculous, albeit safe-for-work image manipulations of celebrities, such as “dolly parton performing at a goth sewer rave.”

As of writing, Bing is giving me a “Thank you for your patience. The team is working hard to fix the problem. Please try again later” message, suggesting that the service is either overloaded or being tweaked further.

Balancing fun, function, and filters

One of the biggest challenges Microsoft will face with its AI tools is content filtering. It’s something Microsoft will have to nail if it wants to be one of the companies that brings AI to the mainstream.

Right now, it’s arguable that Bing and OpenAI have gone too far with censorship when truly innocuous prompts get blocked. Last week, I was able to generate a range of cartoony zombie apocalypse fan art, but this week, that’s too “controversial” for Bing, resulting in blocked prompts. If you get too many warnings, you can even be banned from the service, which seems silly in and of itself when the guidelines are fairly opaque and vague.

If Bing, and Windows Copilot by extension, can only generate sanitized results, it defeats the point of the toolkit. Human society and life aren’t always “brand safe,” and Microsoft’s squeamish attitude toward even the vaguest hint of controversy will undermine its efforts to mainstream this sort of technology. You can’t revise history, sadly, if you want to maintain accuracy. It’ll be interesting to see how Microsoft and its competitors balance fun and functionality with filtering, and how bad actors will find opportunities in jailbroken versions of this sort of tech.

You can try Bing Image Creator yourself, right here.

Jez Corden is the Executive Editor at Windows Central, focusing primarily on all things Xbox and gaming. Jez is known for breaking exclusive news and analysis relating to the Microsoft ecosystem while being powered by tea. Follow on Twitter (X) and Threads, and listen to his XB2 Podcast, all about, you guessed it, Xbox!