The UK government's plans to allow AI companies to scrape content from publishers and artists are facing anger. Under the proposed policy, tech companies could train their AI models on online content unless the content creators opt out.
The proposed AI policy has led to a series of meetings and roundtables being planned to address concerns.
Publishers fear an opt-out approach would be impractical, as they may not know when their material is being scraped. Smaller publishers would be particularly at risk of having their work used to train AI models without their knowledge.
The BBC said in a statement that its content should not be used to train AI models without authorisation.
An opt-in system would give publishers more leverage to agree licensing terms; an opt-out system would give control to AI developers.
Chris Dicker of the Independent Publishers Alliance said a system that scrapes anything posted online without explicit consent is a direct threat to privacy.
A government spokesperson said this is an area that requires thoughtful engagement; they will set out the next steps at the appropriate time.
Google has warned that the UK risks being left behind unless it builds more data centres and lets tech firms use copyrighted work in AI models.
A statement signed by over 10,000 people from the creative industries warns against unlicensed use of their work by AI companies.
The row illustrates the fundamental changes taking place since the arrival of AI chatbots: users can receive information without ever seeing the original publisher's work.