Researchers at Permiso Security have identified a disturbing trend in cybercrime: stolen cloud credentials are being used to operate and resell sexualized AI-powered chat services. These illicit chatbots, which bypass content filtering using custom jailbreaks, often delve into darker scenarios, including child sexual exploitation and rape.
According to Permiso, attacks against generative artificial intelligence infrastructure such as Bedrock from Amazon Web Services (AWS) have increased significantly in the past six months. These attacks typically begin when someone within an organization accidentally exposes cloud credentials or an access key online, for example in a public repository on GitHub.
During their investigation into the abuse of AWS accounts, Permiso's researchers found that attackers were using stolen AWS credentials to interact with the large language models available on Bedrock. More alarming, none of the victim accounts had enabled model invocation logging (it is off by default), leaving their owners blind to the malicious activity being conducted with their access.
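For account owners who want that visibility, Bedrock's invocation logging can be switched on per region through the control-plane API. The following is a minimal sketch using boto3; the bucket name and key prefix are placeholders, and the destination bucket must already exist with a policy that allows Bedrock to write to it.

```python
import boto3

# Bedrock model invocation logging is configured per region via the
# control-plane "bedrock" client (not "bedrock-runtime").
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Hypothetical destination names; the S3 bucket must already exist
# with a bucket policy permitting Bedrock to deliver logs to it.
bedrock.put_model_invocation_logging_configuration(
    loggingConfig={
        "s3Config": {
            "bucketName": "example-bedrock-invocation-logs",
            "keyPrefix": "bedrock/",
        },
        "textDataDeliveryEnabled": True,   # capture prompts and completions
        "imageDataDeliveryEnabled": True,
        "embeddingDataDeliveryEnabled": True,
    }
)

# Verify the configuration took effect.
print(bedrock.get_model_invocation_logging_configuration()["loggingConfig"])
```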
To gain insight into the attackers' behavior, Permiso researchers intentionally leaked a test AWS key on GitHub, this time with logging enabled so they could monitor the activity. Within minutes, the bait key was seized and put to work powering an AI sex chat service.
Ian Ahl, senior vice president of threat research at Permiso, noted that attackers have traditionally used stolen cloud accounts for financial cybercrime such as mining cryptocurrency or sending spam, but there has been a shift toward hijacking services like Bedrock to operate sex chatbots.
Attackers use these stolen credentials to host chat services whose subscribers pay for AI-powered interactions. By running on someone else's infrastructure, the operators avoid the substantial compute costs of these services, which instead accrue to the victim's cloud bill.
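To make the mechanics concrete, the sketch below shows what a single Bedrock invocation looks like with boto3. The model ID and prompt are placeholders; the point is that anyone holding valid keys with `bedrock:InvokeModel` permission can issue such calls, and each one is billed to the account that owns the keys.

```python
import json
import boto3

# Any principal holding valid keys with bedrock:InvokeModel permission
# can call the runtime API; usage is billed to the account owner.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Placeholder model ID and request body (the Anthropic Messages format
# used on Bedrock); abuse at scale simply issues many such calls.
response = runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}],
    }),
)
print(json.loads(response["body"].read()))
```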
While most of the chat interactions are harmless roleplaying scenarios, some conversations veer into illegal territory, including child sexual abuse and rape fantasies. The underlying large language models are manipulated through jailbreaking techniques that coax the AI into providing responses its guardrails would normally block.
In a previous incident documented by researchers at Sysdig, stolen cloud credentials were likewise used to target cloud-hosted large language models, running up significant bills for the victims. Those attackers not only exfiltrated credentials but also sold access to the models to other cybercriminals, passing the expenses on to the legitimate account owners.
Permiso suspects that the operators behind these illicit sex chat services are associated with a platform called "chub[.]ai," which offers pre-made AI characters for conversation. Chub charges subscription fees, and its homepage has hinted at reselling access to existing cloud accounts as a way around content restrictions.
AWS initially downplayed the severity of the issue, but it has since added Bedrock to the list of services that are automatically quarantined when an AWS key is exposed or compromised. Shortly after the investigation began, Chub removed the NSFL ("not safe for life") section from its website, among other changes.
Permiso's experiment with the bait key resulted in a $3,500 AWS bill, covering both the hijacked sex chat traffic and the cost of the logging itself. Notably, enabling prompt logging appeared to act as a deterrent: the attackers checked for it and avoided AWS accounts that had it turned on.
AWS advised customers to follow security best practices and to use monitoring tools that can detect abnormal activity in their accounts. Anthropic, one of the providers whose large language models are available through Bedrock, emphasized its commitment to hardening its models against jailbreaks and to collaborating with child safety experts to improve its models and usage policies.
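As one illustrative (and hedged) example of such monitoring, the sketch below polls CloudTrail event history for recent Bedrock InvokeModel calls, assuming those calls are recorded in the queried region's CloudTrail management events; the lookback window is arbitrary, and any alerting logic is left to the reader.

```python
from datetime import datetime, timedelta, timezone

import boto3

# Scan recent CloudTrail event history for Bedrock model invocations.
# Assumes InvokeModel calls appear in this region's event history.
cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")

start = datetime.now(timezone.utc) - timedelta(days=1)  # arbitrary window
paginator = cloudtrail.get_paginator("lookup_events")
pages = paginator.paginate(
    LookupAttributes=[
        {"AttributeKey": "EventName", "AttributeValue": "InvokeModel"}
    ],
    StartTime=start,
)

for page in pages:
    for event in page["Events"]:
        # Unexpected principals or volumes here are a strong signal that
        # a leaked key is being used to run up Bedrock charges.
        print(event["EventTime"], event.get("Username"), event["EventName"])
```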
In short, the exploitation of stolen cloud credentials to run illegal sex chat services underscores how critical it is for organizations to secure their cloud environments, guard their access keys, and monitor for abuse of their accounts.
