WeTransfer, the service that allows users to send large files to others, is explaining itself to clients and updating its terms of service after a backlash related to training AI models.
The company published a blog post, “WeTransfer Terms of Service — What’s Really Changing,” detailing further updates to its policies after users noticed that recent changes seemed to suggest WeTransfer was training AI models on the files users transfer.
In the blog post, the company says: “First things first. Your content is always your content.”
The post goes on to say, “We don’t use machine learning or any form of AI to process content shared via WeTransfer.” WeTransfer explains that its use of AI would be to improve content moderation and enhance its ability to prevent the distribution of harmful content across its platform.
The company adds that those AI tools aren’t being used and haven’t been built yet. “To avoid confusion,” it says, “we’ve removed this reference.”
A representative for WeTransfer did not immediately return an email seeking further comment.
The backlash over the terms prompted users such as political correspondent Ava Santina to write on X, “Time to stop using WeTransfer who from 8th August have decided they’ll own anything you transfer to power AI.”
What this means for users
Anxieties are high about whether the information users share or store in services such as social media accounts is being accessed by companies to train AI models. WeTransfer may be used for highly sensitive file transfers, raising fears that private information could be exposed to AI systems. According to the company, this isn’t the case.
To further explain, the company said in its post:
- “YES — Your content is always your content. In fact, section 6.2 of our Terms of Service clearly states that you ‘own and retain all right, title, and interest, including all intellectual property rights, in and to the Content’.”
- “YES — You’re granting us permission to ensure we can run and improve the WeTransfer service properly.”
- “YES — Our terms are compliant with applicable privacy laws, including the GDPR.”
- “NO — We are not using your content to train AI models.”
- “NO — We do not sell your content to third parties.”
When the Terms of Service change
While eagle-eyed experts understood what WeTransfer’s new terms could mean for people using the service, it’s unlikely that most users would spot such changes.
“Expecting users to fully understand Terms of Service is unrealistic. These documents are often too complex to navigate,” says Haibing Lu, associate professor at the Leavey School of Business at Santa Clara University.
Lu told CNET that companies would do well to clearly highlight any changes they make to AI-related terms and explain them plainly to give people a real choice. “That’s what true transparency looks like,” Lu says. “Companies are increasingly risking backlash when they update Terms of Service to include AI, especially when users’ data is involved.”
Companies including Adobe, Slack and Zoom have had similar issues with terms changes related to AI, but it’s not just AI that’s the problem, Lu says — rather, it’s the lack of transparent communication.
In the case of WeTransfer, Lu says the company’s response, including revising the terms and blogging about them, “was a smart move and helped rebuild trust. It showed they were listening and willing to act fast.”
WeTransfer could include more understandable language in its terms, or communicate the changes better and sooner, Lu says, adding: “Transparency shouldn’t start after a backlash.”