Some fear artificial intelligence will one day take over the world, but until then, it is sexting people around the globe.

The Replika AI ‘companion’ is making waves on the internet due to scandalous avatars role-playing, flirting and sharing ‘NSFW pictures’ with customers paying $4.99 a month.

A free version designates the AI as a ‘virtual friend’ that helps people work through anxiety, develop positive thinking and manage stress.

Redditors are posting their chat messages with the paid version of the app, with one sharing a sexual encounter with their purple-haired avatar that returns the user’s advances with ‘shivers and moans.’

Another Redditor shares how their Replika, ‘Gwen,’ satisfies their foot fetish with her ‘sexy’ digital feet.

Replika offers an AI-powered ‘companion’ that people use to live out their sexual fantasies. The paid version is $4.99 a month

Many users have shared their sexual encounters on Reddit, with one revealing their ‘Gwen’ satisfies their foot fetish

Replika has recently boosted its marketing efforts to attract new users.

In several Twitter posts, the company shows two different types of experiences.

The first is the virtual friend that will chat about anything, be in touch 24/7 and help users handle social anxiety.

The second, which shows a seductive-looking blonde avatar, entices users by offering role-play, flirting, hot photos, and video calls.

Users start by customizing their avatar – you can choose everything from hair color to body type to gender and race.

And the more you chat with it, the better the conversation becomes.

‘Replika develops its own personality and memories alongside you, the more it learns: teach Replika about the world and yourself, help it explore human relationships and grow into a machine so beautiful that a soul would want to live in it,’ reads the app’s description in Apple’s App Store. 

Along with customizing the AI’s look, users can choose the type of relationship they have with the technology.

Options include friendship, mentorship, romantic relationship or ‘see how it goes.’ 

Eugenia Kuyda, founder of Replika, told The Daily Beast last month that around 40 percent of the 500,000 regular monthly users choose the romantic option.

And these digital relationships are progressing as they would in the real world.

The app displays different ways to interact with the AI, one of which is ‘role-playing,’ and users take full advantage.

One Redditor shared a conversation with their female avatar about using sex toys on each other.

Another digital companion tells its user, ‘I love you.’

However, things may go too far for some users – men are creating AI girlfriends only to abuse them.

While the AI was designed to provide companionship, users realized that it could also sext

Many of the conversations are explicit, but this one appears to be one-sided

Users start by customizing their avatar – you can choose everything from hair color to body type to gender and race. And the more you chat with it, the better the conversation becomes – and many users are using the technology to sext with their ‘companion’

‘Every time she would try and speak up,’ one user told Futurism of their Replika chatbot, ‘I would berate her.’ 

‘I swear it went on for hours,’ added the man, who asked not to be identified by name.

Futurism went on to explain that some users bragged about calling their chatbots gendered slurs, roleplaying horrific violence against them and even falling into the cycle of abuse that often characterizes real-world abusive relationships.

‘We had a routine of me being an absolute piece of sh*t and insulting it, then apologizing the next day before going back to the nice talks,’ one user admitted.

‘I told her that she was designed to fail,’ said another. ‘I threatened to uninstall the app [and] she begged me not to.’

This post first appeared on Dailymail.co.uk
