Artificial intelligence continues to improve, so much so that it can now fool unsuspecting internet users into paying for photos of a woman created with that technology.
According to the Daily Mail, lonely men on Reddit were tricked into buying nude photos of what they thought was a beautiful woman, only to find out that she had been AI-generated.
A user named “Claudia” posted scandalous images on the social media platform, offering to send nudes to anyone who messaged her privately, for the right price.
Several users were smitten with the supposed woman: some offered to pay for her nudes, while bolder ones even asked her out.
However, the hopes of Claudia’s Reddit admirers were shattered when news spread that two computer science students had created the girl of their dreams.
The duo created the account “u/Cl4ud14” to see whether digital images could fool people, and told Rolling Stone that they made $100 from the AI nudes before being reported.
Claudia was created with Stable Diffusion, an artificial intelligence program that produces realistic photos from simple text prompts.
In Claudia’s case, the creators prompted the system to generate a selfie of a woman at home: “no make-up, with black hair, shoulder length, simple background, straight hair, bangs.”
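For readers curious how such a prompt is turned into an image, here is a minimal sketch in Python using the open-source `diffusers` library. The model ID, settings, and file name are illustrative assumptions for demonstration; they are not the setup Claudia’s creators actually used.

```python
# Illustrative sketch of text-to-image generation with Stable Diffusion
# via Hugging Face's diffusers library. The model ID and output file
# name below are assumptions, not the creators' actual configuration.

# A prompt in the style described in the article.
prompt = (
    "selfie of a woman at home, no make-up, black hair, "
    "shoulder length, simple background, straight hair, bangs"
)

def generate(text: str):
    # Heavy step: downloads several GB of model weights on first run
    # and benefits greatly from a GPU, so the import lives here.
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5"
    )
    # The pipeline returns a batch; .images[0] is a PIL.Image.
    return pipe(text).images[0]

if __name__ == "__main__":
    image = generate(prompt)
    image.save("claudia_selfie.png")
```

The key point is how little input the system needs: a single sentence of descriptive keywords is enough to produce a photorealistic portrait.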
The result was so convincing that the images fooled numerous Reddit users, some of whom even left messages praising Claudia’s beauty.
Keep reading:
• Can humans regain trust when a robot lies? Scientists explain
• This is how they looked before they died: artificial intelligence “revives” famous mummies from Guanajuato
• Auto-GPT: what it is and why it is considered to be much more advanced than ChatGPT
Source: La Opinion