With the current boom in AI-generated content, the risk management world faces unprecedented challenges in keeping e-commerce safe and usable. Through machine learning, users can create digitally manipulated images and videos of human subjects that still look authentic to the untrained eye. These forgeries are called deepfakes.
The unsavory side of this technology, namely sexually explicit deepfakes, has caught some platforms flat-footed. This January, content moderators across social media scrambled to remove pornographic deepfake material depicting Taylor Swift—and the targets aren’t just celebrities. Some deepfake sites advertise services that can digitally undress people in ordinary photos.
Over the past few years, G2 Risk Solutions (G2RS) has been monitoring this industry and protecting our clients from its reach. Most deepfake porn sellers stay underground, laundering payments through peer-to-peer payment apps, cryptocurrencies, and seemingly benign fronts. For example, G2RS investigators found that one prominent deepfake porn site funneled its transactions through a merchant selling jumbo bean bag chairs.
G2RS has discovered that bad actors may cycle through multiple fronts within a network to avoid raising red flags with suspicious transactions. Cases like these are unlikely to be caught by a fully automated system; they require a human touch to investigate, and that’s where we excel. G2RS analysts are knowledgeable and adaptable enough to tackle any new trend in the industry, including AI-generated content.
If you have questions or would like to learn more about deepfakes, contact us.