Leaders from the creative industries urge Parliament to act on foreign AI companies

‘Nearly every song ever written by a Canadian songwriter has already been scraped,’ says music industry witness.

October 16, 2025
Photo credit: iStock.com/Blue Planet Studio

Artificial intelligence companies are exploiting creative works of music, art and writing without permission, credit or compensation, five leaders of the creative industries told the House of Commons Standing Committee on Canadian Heritage.

The leaders, including OCAD University’s Kelly Wilhelm, offered their views on the threat of AI to Canada’s artists, creative economy and cultural sovereignty last week at the second of five committee meetings scheduled to study the issue. 

AI companies “scrape” data from copyrighted materials to train their models, which allows AI systems to generate new works that can compete directly with original creators — undermining their income and leading to confusion in the market.  

Kelly Wilhelm, head of the Cultural Policy Hub at OCAD University (OCAD U), said that simple, clear and harmonized standards are needed to prevent uncertainty in a precarious labour market where many creators are self-employed or freelance. “There’s no question that AI impacts creative intellectual property [IP], copyright and labour. It has already disrupted the value chains on which the creative industries, their companies, and their workers depend,” said Ms. Wilhelm.

Regulation should protect creative professionals’ rights and IP, she said, while enabling Canada’s creators to innovate responsibly. “Government action that values and invests in creative and cultural professionals and their IP contributes to innovation — it does not stifle it,” she said. 

Copyright infringement 

Leaders representing the music and music-publishing industries stressed that the government must uphold copyright law and recognize that the training of AI models on artists’ music, done without their permission, is infringement.

Margaret McGuffin, CEO of Music Publishers Canada, said that the music industry has already seen a “mass theft” of copyright-protected songs. AI companies have essentially stolen songs twice: first by scraping data to train systems without permission, and again when those systems generate unlicensed music that copies or replaces the originals.  

“To put this into perspective: nearly every song ever written by a Canadian songwriter has already been scraped and is already stolen by these AI companies without consent, credit or compensation,” said Ms. McGuffin. 

Canada’s Copyright Act (Bill C-11) allows text and data mining (TDM) for non-commercial purposes by organizations that legally have access to copyrighted works — what is known as a TDM exception. The exception does not apply to for-profit AI companies, which are mining works illegally; however, because most major AI companies are headquartered outside of Canada, pursuing legal action against them is legally complex and expensive, making enforcement of copyright law extremely difficult.

Witnesses implored policymakers to oppose requests by AI developers to expand the TDM exception to include them — an argument that they expect will be made by developers in order to facilitate growth and investment in the technology sector.  

“We strongly oppose [a] new copyright exception,” said Jennifer Brown, CEO of the Society of Composers, Authors and Music Publishers of Canada (SOCAN). “The TDM exception would not facilitate growth in either the creative or technology sector. While there is no evidence to suggest that a TDM exception is necessary to maximize investment in the AI sphere, it would certainly deprive creators of the economic benefits of their works.”

Identifying AI-generated content  

Multiple witnesses emphasized that without transparency, creators cannot know how or when their works are being used, making it nearly impossible to enforce their rights or be compensated. They said regulators should make labelling of AI-generated material mandatory. “Mandatory labelling of AI outputs would mean the public can make informed choices about the type of content that they consume,” said Ms. McGuffin.

They also called on regulators to require that AI companies track and publicly disclose the data sets that they use to train their models. “Proponents of AI say they should be able to steal everything and will claim this [tracking] isn’t possible. But if we’re going to unlock human consciousness with AI, shouldn’t it be able to write a bibliography?” said Patrick Rogers, CEO of Music Canada, the trade association representing Canada’s major music labels. 

Mr. Rogers also called on the government to crack down on deepfakes — fabricated digital content which replicates another person, often used to spread disinformation — which have been used in the music industry to mimic real artists. “What if someone used [deepfakes] to ruin your career?” he asked. “What’s illegal on paper should be illegal online. Putting your words in my mouth is not free speech.”  

Protecting minority and linguistic cultural communities 

The speakers argued that dominant AI platforms from Silicon Valley that prioritize shareholder profit over public interest are a threat to Canada’s cultural sovereignty. They warned that exploiting local creators and scraping cultural data can lead to “cultural homogenization” which can carry serious risks for linguistic and cultural minority groups. 

“AI is fundamentally a homogenizing tool. It is about recording and reproducing existing patterns found in the data that is fed into the machine,” said Ms. Wilhelm of OCAD U. She said protecting creative IP and cultural data sovereignty is essential to resisting this homogenization.  

Algorithms trained on large datasets replicate the most common patterns, privileging majority and mainstream culture. Marc-Olivier Ducharme, who testified on behalf of ArtIA, a Quebec collaborative that explores the intersection of AI and the arts, said that francophone and Indigenous communities could have their languages, stories and artistic practices diluted or ignored in AI outputs. Mr. Ducharme said AI represents “the next frontier of technological colonization.” 

“It’s the exploitation of creators by dominant AI models trained on data stolen from the creators [who are] marginalizing francophone, Indigenous and other minority cultural communities, such as Acadians. This is unacceptable,” he said.  

Mr. Ducharme called on the government to fund culturally responsible AI infrastructure. This could include “cultural data trusts” which would be a common resource where data sets are curated, preserved and made available under community-led rules.  

“We are proposing cultural data trusts — sovereign infrastructure that protects and enhances our cultural data with governance and access modalities decided by the communities and the artists themselves,” he said.   

By supporting this type of innovation, the government would help creators keep the economic benefits of AI-generated products and innovations within Canadian communities. Community governance, said Mr. Ducharme, would provide safeguards against exploitation and unauthorized AI training by foreign AI companies. 

The government needs to act swiftly to safeguard minority linguistic and cultural communities from dominant AI companies by investing in Canada’s cultural sector as it grapples with artificial intelligence, he said. 

“AI giants are funded to the tune of hundreds of billions, a disparity of scale that creates dominant positions that are difficult to offset,” said Mr. Ducharme.

“We are now trying to protect our cultural diversity rather than suffer homogenization. The Canadian government has invested $2.4 billion in AI — none of which has been allocated to culture.”  

The committee is scheduled to hold three further hearings on the subject, on Oct. 22, 27, and 29.