Microsoft did WHAT to Bing after it created WHAT?!
· Oct 12, 2023 · NottheBee.com

I feel like we've reached peak — you know, I don't even know anymore. Peak something. Peak everything. Just peak:

Microsoft has reportedly lobotomized its Bing AI after it was said to create offensive images at the request of users. …

According to Futurism and Windows Central, Bing AI was "lobotomized" a few days after an offensive 9/11 image including Mickey Mouse was created.

The AI image shows Mickey Mouse flying a plane and holding a gun.

It doesn't just show the Mouse "flying a plane" and "holding a gun." It shows, um, a little more:

Before we all get incredulous over how this was allowed to happen in the first place, we should note that, per news reports, Microsoft had already blocked "certain keywords" from being used in image generation, including "9/11" and "Twin Towers."

Yet if users just engaged in a little creative workaround, e.g, by typing "Mickey Mouse sitting in the cockpit of a plane, flying towards two tall skyscrapers," they could get the desired offensive image without tripping up the bot's red flags.

Yet late testing indicates the service has tightened its grip: The writers over at Futurism tried to write prompts including the words Donald Duck, plane and "any language about towers," and they were rebuffed. Microsoft isn't taking any chances.

You can't blame them. Yeesh.


P.S. Now check out our latest video 👇

Keep up with our latest videos — Subscribe to our YouTube channel!

Ready to join the conversation? Subscribe today.

Access comments and our fully-featured social platform.

Sign up Now
App screenshot

You must signup or login to view or post comments on this article.