
Keeping the web alive, despite AI

Bat signals, but for web crawlers

Hi there! My name is Janko Roettgers, and this is Lowpass. This week: Creative Commons’ AI Signals, and new data on the growth of Korean content on streaming.

How Creative Commons wants to keep sharing alive in the age of AI 

A little over two decades ago, a group of academics and activists formulated an ambitious idea: What if there were a way for content owners to encourage the sharing, and even remixing, of their works without giving up control altogether?

To facilitate this, they formed the Creative Commons non-profit, and released a set of more permissive content usage licenses. Creative Commons estimates that its licenses have since been applied to more than 2.5 billion works, including photos, books and even feature films.

(Full disclosure slash humble brag: A book I wrote in 2003 about the demise of the traditional music industry became one of the first books to be re-released under the terms of a Creative Commons license when the organization expanded to Germany in 2004.)

But with the emergence of generative AI, Creative Commons finds itself increasingly on the defensive. A growing number of creators and content owners are looking for ways to keep web crawlers out, and often do so by limiting sharing of their works more broadly. 

“Machine use of web content is not new,” said Creative Commons director of strategic communications Rebecca Ross during a virtual town hall this week. “But it feels different this time [...] because of the pace and scale. AI without guard rails is causing backlash.”

Publishers are resorting to blocking all kinds of crawling with paywalls and other technical tools, and some are pursuing copyright changes that would inadvertently also limit other types of reuse currently permissible under law.

“One of the things that we are quite concerned about is that eventually, people may not share their works at all,” Ross said. “This is bad for everybody. We've all worked over the last two decades to open up content online. Removal of that content not only harms AI developers, but it harms humans.”

To prevent that kind of backsliding, Creative Commons is now working on a new set of licenses, if you will: machine-readable signals that would tell AI crawlers under what terms they can access online content.
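Creative Commons hasn't spelled out the format here, but the closest thing publishers already have is robots.txt, a plain-text file that tells crawlers which paths they may fetch. The sketch below, using only Python's standard library, shows how a well-behaved crawler checks those rules for a few real AI user agents (GPTBot is OpenAI's crawler, CCBot is Common Crawl's); the site URL and article path are placeholders. The signals CC is describing would go a step further than this simple yes/no gate by expressing the terms of use themselves.

    import urllib.robotparser

    # robots.txt is today's de facto machine-readable crawler signal:
    # site owners list which user agents may fetch which paths.
    # The URL below is a placeholder, not a real publisher's file.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # GPTBot and CCBot are real AI crawler user-agent tokens that many
    # publishers now single out alongside traditional search crawlers.
    for bot in ("GPTBot", "CCBot", "Googlebot"):
        allowed = rp.can_fetch(bot, "https://example.com/some-article")
        print(f"{bot}: {'may crawl' if allowed else 'blocked'}")

The limitation this illustrates is the one CC is trying to move past: robots.txt can only say "allowed" or "blocked" per crawler, not "allowed, provided you credit the source" or any other condition of use.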

(…)

Subscribe to Premium to read the rest.

Become a paying subscriber of Premium to get access to this post and other subscriber-only content.


A subscription gets you:

  • A full-length newsletter every week
  • No ads or sponsorship messages
  • Access to every story on Lowpass.cc
  • Access to a subscriber-only Slack space and subscriber-only events
