On Tuesday, actress Scarlett Johansson issued a statement addressing OpenAI’s use of an AI voice that closely resembles hers. Last September, Johansson declined an offer from OpenAI to provide her voice for its system, whose voice features are now available primarily to users of the latest GPT-4o model.
In the statement, Johansson said she was shocked to hear the resemblance in the company’s latest demo of the GPT-4o model. Soon after, OpenAI announced it would pause the use of this particular voice, known as “Sky.” The company claimed the voice belonged to a different professional actress but declined to disclose who.
Although the fashion industry has yet to see misuse of AI voices become a problem at scale, it is already a problem for TikTok and Instagram creators. The issue has been growing in recent months, alongside bot activity. Social media bots perform automated interactions like posting, liking and following, which can be used for marketing or manipulating public opinion.
For example, TikTok videos and comment sections promoting Shein show that bots and AI voices are catching on. Over the last six months, whenever Shein has been mentioned negatively on TikTok, a surge of bot accounts has flooded the comments with positive messages that are quickly liked to the top, creating a misleading narrative about the brand’s popularity and customer satisfaction. Shein was not able to provide a comment in time for publication.
If left unregulated, these practices could skew public perception toward a more favorable view of Shein and boost its products.
Clicking through reveals that these bot accounts are often empty profiles with suspicious usernames. The accounts are usually new and link out to non-existent or scam Instagram profiles. They promote Shein in positive posts, too. For example, a now-deleted viral TikTok video from last year suggesting shoppers buy their summer apparel from Shein drew an influx of pro-Shein comments from accounts that, on closer inspection, appeared to be bots.
Bots and AI voice simulation are affecting creator content directly, too. Shein, for one, is being accused of stealing content from popular influencers without their knowledge. It allegedly downloads videos, removes watermarks, overlays its own promotional text and uses the altered content in ads on the platform.
Fashion technology expert Danielle Vermeer said this tactic is being used on TikTok, where stolen videos are repurposed to drive traffic to scam surveys promising $750 in Shein credits. These likely fraudulent surveys exploit the influencers’ reach by getting their viewers to engage with scam links, eroding the creators’ credibility in the process.
“These creators often don’t even know their content has been altered and repurposed until their followers start tagging them and asking if it’s actually them,” said Vermeer. “One creator I follow said she had to hound TikTok and TikTok Shop for a week to get her AI-altered videos taken down, despite being in the Creator Program and expecting them to protect her from situations like this.”
The misuse extends to altering influencers’ voices with AI to promote products they have never endorsed. In February, influencer Ida Giancola (@dionysiangirl) had her videos stolen and her voice altered to promote fast-fashion products. Despite her reporting this to TikTok and TikTok Shop, it took several days for the content to be taken down.
Sara Walker, the creator behind @styledsara on TikTok, had a similar experience in February. The fake brand behind the posts blocked her on the platform and did not reply to her requests to take down the AI voice video.
“It’s not surprising to see bad actors use AI voice manipulation to try to sell unknowing consumers more junk, but it is discouraging to see platforms not protect creators from this new form of identity theft,” said Vermeer.
However, no regulations specifically addressing content scamming and AI voice misuse currently exist. Instead, creators like Giancola are left to resort to a “whack-a-mole” strategy to get the videos taken down.
For Shein, these reported misuses come at a critical time, as the company is reportedly preparing for an initial public offering in London. Reports like these could damage its brand value and credibility as a prospective public company.