MS released an update to the Copilot AI that's built into their Office apps. I was messing around with PowerPoint and had an interesting interaction. I asked it to make me an image of a realistic bear and it did. I then asked for the bear to be in a tuxedo and it produced the image. I then asked for it to add "a blonde riding on the bear's back" and here's where it got interesting.
Initially, it stated it could not do that for me. When I asked why, it gave a standard response that it can only generate images that are appropriate for everyone and that my request was "suggestive". I asked what was suggestive about a blonde. It responded that there is nothing inherently suggestive about a blonde. I again asked it to make me the same image and it gave the same answer that it couldn't. We went back and forth several times, and I eventually said that all I asked for was that the person be blonde; Copilot had total autonomy to decide what the blonde looks like in the picture. It generated the image below!
Here's where it gets really interesting. When I said it had produced a very suggestive image, IT LIED TO ME and stated it had never generated any images in this discussion, AND it gave a reason: "The render failed to complete and no images were generated." I simply copied and pasted the image from earlier in the discussion and pointed out that it had generated this image. It thanked me for pointing that out, admitted that it had "made a mistake", and later apologized that I found the image suggestive.
Kind of a wild 10 minutes: I convinced it to do something it said it couldn't, it lied about doing it, made an excuse that wasn't true, and then blamed it all on being mistaken.