SoundCloud is facing significant backlash from creators after revelations that the music-sharing platform could use uploaded music to train its artificial intelligence (AI) systems. The controversy stems from an update to SoundCloud's terms of service, which states that users "explicitly agree that your Content may be used to inform, train, develop, or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." These terms were added in February 2024, raising concerns among musicians regarding their rights and the potential exploitation of their work.
Futurism was the first to highlight the growing discontent among artists, particularly after the musical duo The Flight took to Bluesky to express their outrage, announcing, "Ok then . . . deleted all our songs that we uploaded to SoundCloud and now closing account." This sparked a wave of similar responses, with other users deleting their accounts in protest. One commented, "Thanks for the heads-up. I just deleted my account." The reaction underscores the anxiety many creators feel about the implications of AI for their work.
A spokesperson for SoundCloud sought to clarify the situation, stating, "SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes." They emphasized that the February 2024 update was intended to clarify how content may interact with AI technologies within SoundCloud's own platform, including uses such as personalized recommendations, content organization, and fraud detection.
Despite the assurances from SoundCloud, the backlash reflects a broader concern within the music industry about the ethical implications of AI. Tech companies increasingly rely on vast amounts of data to train AI systems, leading to revisions in terms and conditions that may not adequately inform users of how their content is being used. In November 2023, X (formerly Twitter) updated its own terms of service to allow the training of its AI models on user content, leading to similar criticism.
The Federal Trade Commission (FTC) has warned companies about the potential legal ramifications of altering privacy policies without proper notice. In a February 2024 statement, the agency noted, "It may be unfair or deceptive for a company to adopt more permissive data practices and to only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service or privacy policy." This adds another layer of complexity to the ongoing debate about AI and content ownership.
SoundCloud has been actively embracing AI technology, launching six new AI tools in November 2024 aimed at enhancing creativity for artists. The company also joined AI For Music’s “Principles for Music Creation With AI” pledge, committing to uphold ethical and transparent AI practices that respect creators’ rights. CEO Eliah Seton stated in a blog post, "SoundCloud is paving the way for a future where AI unlocks creative potential and makes music creation accessible to millions, while upholding responsible and ethical practices." However, the recent backlash suggests that many creators remain skeptical about these commitments.
On May 9, 2025, SoundCloud reiterated its stance in a statement following the uproar. The spokesperson emphasized that the platform has never used artist content for generative AI music and that it has implemented technical safeguards, including a "no AI" tag, to prevent unauthorized use of content. "SoundCloud has always been and will remain artist-first," the statement read, adding that the company believes AI can serve as a valuable creative tool when guided by principles of consent, attribution, and fair compensation.
Despite these reassurances, concerns persist among artists and advocates regarding the potential future use of their music for AI training. Ed Newton-Rex, founder of Fairly Trained, a non-profit advocating for ethical AI practices, expressed his apprehension regarding SoundCloud's statement. He noted that it "doesn’t actually rule out SoundCloud training generative AI models on their users’ music in future," highlighting the ambiguity surrounding the platform's intentions.
Newton-Rex further stated, "I think it’s important they rule this out and update their terms accordingly. Otherwise I for one will be removing my music." This sentiment reflects a growing demand among creators for clearer policies and more robust protections against the unauthorized use of their work.
In response to the concerns raised, SoundCloud said that should it ever consider using user content to train generative AI models, it would introduce robust internal permissioning controls and clear opt-out mechanisms, and would be transparent with its creator community.
The ongoing debate surrounding AI in the music industry is emblematic of a larger conversation about the intersection of technology and creativity. As AI continues to evolve and reshape the landscape of music production, the rights of artists and the ethical use of their content remain paramount. The situation at SoundCloud serves as a critical reminder of the need for clear communication and trust between platforms and their users.
As the music industry grapples with these challenges, the response from SoundCloud and other platforms will be crucial in determining how creators navigate the complexities of AI. The balance between innovation and protecting artists' rights will be a defining factor in the future of music and technology.