The position of the Danish association of newspapers suing Newsbooster is ridiculous for so many reasons, and I just realised a new one. According to "Newsbooster - the case seen by an eye witness", the Newsbooster people are even nice enough to respect the Robots Exclusion Protocol. That is, if the newspapers had bothered to ask - by implementing a robots.txt file - they would not have been picked up by Newsbooster at all. It is extremely sad that the courts failed to take this into account.
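For the record, opting out takes all of two lines. A minimal sketch - the "Newsbooster" user-agent string is my assumption about what their robot calls itself, not something I have verified:

    # robots.txt at the root of the newspaper's site
    User-agent: Newsbooster
    Disallow: /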
That means that the case is not only about fair use, but also about whether or not the newspapers have an obligation to make a clear statement of what use they *consider* fair.
The robots exclusion standard, while developed as a purely technical means of limiting the resources consumed by robots, really covers some of the same ground as e.g. Creative Commons licenses.
By publishing simple, understandable, universal terms of use for websites, and by making those terms machine-readable, it becomes possible to venture into public space and still discriminate between uses. What the courts should worry about is compliance with well-defined, published, per-resource terms of use, instead of effectively outlawing hypertext.
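This is in fact how Creative Commons already works: a human-readable deed backed by a machine-readable pointer that any robot can discover. A minimal sketch of what a newspaper page could carry in its head element - the particular license URL is just an example:

    <!-- machine-readable terms of use, discoverable by any robot -->
    <link rel="license"
          href="http://creativecommons.org/licenses/by-nc/1.0/" />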
Eventually it should be possible for you to have a 'license' toolbar in your browser indicating the terms of use of the resource you are currently browsing.
Wouldn't everyone then start to publish very restrictive licenses? Maybe - but the commons would have the ability to fight back by acting on the published, machine-readable license (e.g. dropping the content from search engines), so a restrictive license would tend to get in the way of the purpose of publishing in the first place.
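A rough sketch of that feedback loop, in Python. The rel="license" discovery follows the Creative Commons convention shown above; treating pages with no open terms as non-indexable is my assumed policy, not anything a real search engine does:

    # A toy indexing policy: only index pages whose machine-readable
    # license marks them as open (here: any Creative Commons license).
    from html.parser import HTMLParser
    from urllib.request import urlopen


    class LicenseFinder(HTMLParser):
        """Collects href values of <link rel="license"> elements."""

        def __init__(self):
            super().__init__()
            self.licenses = []

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "link" and a.get("rel", "").lower() == "license":
                self.licenses.append(a.get("href", ""))


    def should_index(url):
        """Index a page only if it declares terms we consider open."""
        html = urlopen(url).read().decode("utf-8", errors="replace")
        finder = LicenseFinder()
        finder.feed(html)
        # Assumed policy: no published terms, or restrictive terms,
        # means the page is dropped from the index.
        return any("creativecommons.org" in lic for lic in finder.licenses)

A publisher who chooses restrictive terms then gets exactly what was asked for: less exposure.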
Posted by Claus at March 08, 2003 01:34 PM