Friday, October 21, 2005

Does Google Print violate copyright?

Peter Suber has taken a detailed look at the Google Print issues and whether the company is infringing on copyright.

"First we have to get clear on two terms. "Google Publisher" is the name of the project to scan books that publishers voluntarily offer Google to scan. "Google Library" is the name of the project to scan books in certain libraries, with the consent of the libraries but not necessarily the consent of the publishers. Both are parts of the larger "Google Print" project, which is to scan full-text print materials for adding to the Google search index. For publishers, Google Publisher is opt-in, while Google Library is opt-out. Google Library covers books under copyright as well as books in the public domain. The Authors Guild lawsuit is about the copyrighted portion of Google Library...

I've seen a plausible case for copyright infringement by Google Library --which, by the way, is opposed by a plausible case against copyright infringement (more below). But I haven't yet seen a plausible case that the authors or publishers will be injured by Google Library. On the contrary, publishing groups who address this, like the ALPSP, tend to concede that the program will benefit them but insist that the rules be followed anyway. The AG press release doesn't mention harm or injury. Deep within the complaint that it filed in court, however, it alleges the members of the class-action suit have suffered "depreciation in the value and ability to license and sell their Works [and] lost profits and/or opportunities" (p. 10). The complaint seeks to collect damages for these losses.

If the AG mentions losses merely as a formality --which is how it looks-- and covertly concedes that Google Library will increase book sales, then the suit might still make sense. But instead of trying to stop a continuing injury, the AG would be trying to extract a settlement. It may want a cut of Google's ad revenue on top of increased sales. If so, then (as I said last month) we're watching a shakedown...

If the case goes to trial, some of these arguments are likely to be more central than others. If I had to boil down this summary to the most essential elements, I'd put it this way. The authors --and the publishers who share the same grievance-- are getting far too much mileage from the claim that Google's opt-out policy turns the usual copyright rule on its head. This claim has a deceptive strength. It's strong because it would be valid for most full-text copying. It's deceptive because it assumes without proof that the Google copying is not fair use. Hence it begs the question at the heart of the lawsuit. If the Google copying is fair use, then no prior permission is needed and the opt-out policy is justified. Moreover, Google has several good arguments that its copying really is fair use, most notably its argument that its indexing will enhance rather than diminish book sales and its analogy to long-accepted opt-out policies for search-engine indexing of other copyrighted content.

On the merits, it's an important question to settle. But I admit that I'm not very comfortable having any important copyright question settled in today's legal climate of piracy hysteria and maximalist protection --a new world order getting old fast."


Walter S. Mossberg at the Wall Street Journal is not overly enamoured with the use of DRM.

"For some activists, the very idea of Digital Rights Management is anathema. They believe that once a consumer legally buys a song or a video clip, the companies that sold them have no right to limit how the consumer uses them, any more than a car company should be able to limit what you can do with a car you've bought.

But DRM is seen as a lifesaver by the music, television and movie industries. The companies believe they need DRM technology to block the possibility that a song or video can be copied in large quantities and distributed over the Internet, thus robbing them of legitimate sales.

In my view, both sides have a point, but the real issue isn't DRM itself -- it's the manner in which DRM is used by copyright holders. Companies have a right to protect their property, and DRM is one means to do so. But treating all consumers as potential criminals by using DRM to overly limit their activities is just plain wrong."

Update: Ed Felten has a useful follow up to Mossberg's column.

Blackboard and WebCT to merge

The two biggest electronic learning platform vendors for higher education have agreed on a merger/takeover deal. Blackboard are to acquire WebCT by early 2006 in a deal that will give the new company about two thirds of the world market in learning management systems (LMSs), or virtual learning environments (VLEs) as we have come to call them in the university sector.

This gives me some concern, not because my own university uses either of these platforms (we don't yet), but because Blackboard have demonstrated a willingness to censor inconvenient research. In the spring of 2003 they rolled out the lawyers to prevent two students from presenting a research paper on security flaws in the Blackboard campus ID card system.

The two, Billy Hoffman of the Georgia Institute of Technology and Virgil Griffith of the University of Alabama, had decided to present a paper on the problem at a security conference in Georgia, but Blackboard’s lawyers stepped in and got a court to issue an injunction preventing the disclosure of the details of the problem.

The students eventually reached an out-of-court settlement with Blackboard, apologising to the company for their actions and agreeing to "refrain from any further unauthorized access to or use of the System," including "any transaction designed to better understand or determine how the System works." They also agreed to do 40 hours of community service.

Now, Blackboard did get their injunction preventing these students from presenting their research, so they had an arguable legal case. However, the builders of one of the most widely deployed platforms for the creation and delivery of higher education digital content were prepared to go to court to block the publication of inconvenient research and to extract a settlement whereby a couple of techie students agreed to refrain from "any transaction designed to better understand or determine how the System works."

Blackboard is a business and has a duty to protect the interests of the company’s shareholders. These interests, however, may not necessarily coincide with those of their education institution customers.

So universities should think carefully before locking themselves into proprietary VLEs. It follows that the architecture of higher education VLEs should be:

• based on open standards
• modular
• flexible
• extensible
• interoperable with other systems

If a small number of players with closed systems come to dominate the HE market for VLEs in the long term, we could face restrictions on the development and distribution of information that we take for granted today: restrictions built into the architecture of the systems we use, backed up by force of law.

Latest EDRI-gram

The latest EDRi-gram has been published with, as usual, many notable stories. The one that caught my eye was the European Ombudsman Mr Diamandouros' call for the EU Council of Ministers to legislate in a more transparent manner.

He basically said the Council have failed to provide valid reasons for meeting behind closed doors and recommended:

"The Council of the European Union should review its refusal to decide to meet publicly whenever it is acting in its legislative capacity."

Lessig on DRM, CC and interoperability

Larry Lessig's latest newsletter is on DRM and interoperability.

"Interoperability. Perhaps the most important thing that the Internet has given us is a platform upon which experience is interoperable. At first, the aim of the computer and network geniuses was simply to find a way to make computers talk to each other. Then application geniuses found ways to make the content that runs on these different devices interoperate on a single digital platform. We are close to a world where any format of sound can be mixed with any format of video, and then supplemented with any format of text or images. There are exceptions; there are some who don't play in this interoperability game. But the push of the network has been to produce a world where anyone can clip and combine just about anything to make something new...

The Internet was not built with permissions in mind. Free access was the rule.

That free access creates many problems for many – indeed in principle, it might create problems for us all. Record and film companies are notorious for complaining about this Internet feature. Given its original design, you could "share" your complete record and film collection with your 100,000 "best friends." It's not surprising that they view this feature rather as a bug. But I suggest any of us might regret this feature of the net in certain circumstances. Do you really like it that someone can take a personal email you sent him and forward it to his 100 best friends?

The point is that however good free access is, sometimes, at least some think it's not very good, at least for them. And the most powerful of these some have therefore pushed for technologies that would be layered onto the Internet and enable them, or content owners generally, to control how digital content gets used."

Adelphi Charter

John had a nice piece on the RSA Adelphi Charter on his blog last week.

Since then Google have been sued again over their Google Print project, which aims to make all the world's books searchable online. ZDNet had a report earlier in the week suggesting the Association of American Publishers was behind the lawsuit. The names on the actual complaint, though, are those of some of the major publishers: McGraw-Hill, Pearson, Penguin Group, Simon & Schuster, and John Wiley & Sons.

The AAP president, Patricia Schroeder, said

"The publishing industry is united behind this lawsuit against Google and united in the fight to defend their rights. While authors and publishers know how useful Google's search engine can be and think the Print Library could be an excellent resource, the bottom line is that under its current plan, Google is seeking to make millions of dollars by freeloading on the talent and property of authors and publishers."

Which in translation reads - publishers want a bigger slice of the earnings.

Google spokesman, David Drummond said

"Google Print is a historic effort to make millions of books easier for people to find and buy. Creating an easy-to-use index of books is fair use under copyright law and supports the purpose of copyright: to increase the awareness and sales of books directly benefiting copyright holders. This short-sighted attempt to block Google Print works counter to the interests of not just the world's readers, but also the world's authors and publishers."

Which in translation reads - don't be stupid, we're doing you a favour sending loads more customers your way with no effort on your part; and you want to cash in on our business too?!

Wednesday, October 19, 2005

Google Print moves on Europe

From the NYT, "Google said Monday that it had begun operating local-language sites in eight European countries for its Google Print program, its closely watched effort to make all of the world's books searchable online, expanding into territories where it has drawn fierce criticism."

Tuesday, October 18, 2005

Universities and the new terrorism act

I happened across an interesting discussion on this afternoon's edition of BBC Radio 4's "The Learning Curve", in which Libby Purves interviewed Dr Steve Wharton, President of the Association of University Teachers, and Professor Les Ebdon, Vice Chancellor of Luton University, who is updating guidance on terror and intolerance for Universities UK. (The relevant interview starts about 16 minutes into the programme.)

Both Wharton and Ebdon raised serious and legitimate concerns about the potential impact of the "glorifying terrorism" provisions of the new anti-terrorism bill, about Education Secretary Ruth Kelly's recent call for academics to play an active part in "cracking down on campus extremists", and about the government's use of a report (the Gleeson report) published by a think tank known as the Social Affairs Unit, which, if it is to be believed, Purves says, "paints a disturbing picture" of extremism in universities.

Wharton runs courses on political discourse, persuasion and propaganda, and could theoretically be directly targeted by law enforcement authorities if the bill were to become law in its present form. He expects his students to engage with difficult subjects and to acquire the tools to critically analyse propaganda, and he had been planning to introduce a course specifically on the discourse of terrorism.

Ebdon dismissed as "widely discredited" the Gleeson report, which claims UK universities are riddled with terrorists. He was particularly keen to point out that claims in the report about the so-called "4th bomber" being associated with Luton were based purely on a story in the Daily Mail that the paper had since withdrawn and apologised for.

Wharton also rejected Ruth Kelly's call for vigilance amongst academics in the cause of anti-terrorism as an effort at turning academics into "a kind of 21st century Stasi by the back door." The notion that academics should be "obliged to hunt out people in the classroom" was, he said, ridiculous.

I doubt the interview will get much exposure but at least it gets its 15 minutes here on b2f. :-)

Microsoft CTO condemns UK ID card plan

Microsoft's UK technology chief, Jerry Fishenden, has condemned the UK government's ID card plans.

"Mr Fishenden, national technology officer for the computing giant, said it could lead to "massive identity fraud".

"Putting a comprehensive set of personal data in one place produces a honeypot effect - a highly attractive and richly rewarding target for criminals," he told The Scotsman.

The system was "something that no technologist would ever recommend", he said."

Good for him.

Monday, October 17, 2005

Adviser to Canadian Prime Minister on Open Access

Arthur Carty, national science adviser to the Canadian Prime Minister has called for a culture of sharing to facilitate a global information system.

"With the world looking to science to find solutions to global problems, the need to safeguard, evaluate and exchange information and knowledge has never been more pressing...

Above all, our goal must be to maximize the impact of research for societies everywhere, not just the developed world. People in developing nations must be able to access and contribute to the vitality of the global research information and communications system. An open-access philosophy is critical to the system’s success: if research findings and knowledge are to be built upon and used by other scientists, then this knowledge must be widely available on the web, not just stored in published journals that are often expensive and not universally available...

Creating a system with these attributes is no longer just a question of developing appropriate technologies; for the most part these already exist. Rather, it’s a matter of building, integrating and improving the technical infrastructure, operational standards, research support systems, regulations and institutional roles and responsibilities. It’s also a matter of nurturing a culture of open access and sharing, beyond what researchers have ever embraced...

a culture of open access and sharing. This is harder to build than the nuts and bolts of the system because it requires a new mindset among researchers, administrators, governments and in some cases companies – everyone involved in the creation and dissemination of knowledge.

This past year has seen a major change in the way two important funders of health research do business. The National Institutes of Health in the United States and the Wellcome Trust in the United Kingdom both announced that all research that they fund must be archived in a free repository that’s accessible by other scientists and the public.

However, filling archives, though necessary, will not be able to change the mindset of people in the research enterprise. We have to find ways to motivate researchers in all countries to preserve and exchange their research data, to publish their findings in open access journals and to deposit their published articles in institutional repositories... Institutions, too, need to know that their investments in expanding and improving the quality of their data archives and open-access repositories are recognized as measurable scientific outputs."

This latter point - what's in it for us? - is one that many institutions, even my own university, which exists purely to facilitate supported open access to higher education, are struggling with when they consider creating their own repositories. The good news on the Open University front is that we are looking at getting our own open access project underway in 2006.

Crypto-gram, October 15, 2005

Bruce Schneier's latest Crypto-Gram Newsletter has been published. Extract:

"DUI Cases Thrown Out Due to Closed-Source Breathalyzer

According to the article: "Hundreds of cases involving breath-alcohol tests have been thrown out by Seminole County judges in the past five months because the test's manufacturer will not disclose how the machines work."

This is the right decision. Throughout history, the government has had to make the choice: prosecute, or keep your investigative methods secret. They couldn't have both. If they wanted to keep their methods secret, they had to give up on prosecution.

People have the right to confront their accuser. People have a right to examine the evidence against them, and to contest the validity of that evidence. As more and more evidence is collected by software, this means open-source equipment.

We are all safer because of this decision. (And its implications are huge. Think of voting systems, for one.)

"

and from one of his recent essays:

"The effects of wholesale surveillance on privacy and civil liberties is profound; but unfortunately, the debate often gets mischaracterized as a question about how much privacy we need to give up in order to be secure. This is wrong. It's obvious that we are all safer when the police can use all techniques at their disposal. What we need are corresponding mechanisms to prevent abuse, and that don't place an unreasonable burden on the innocent."

Expensive, pointless, dangerous - ID cards

AC Grayling, author of In Freedom’s Name: the Case Against Identity Cards (published by Liberty), has an article in The Times this morning spelling out some of the problems with ID cards. Extract:

"Forged cards might not fool forensic experts, but in most cases they will not be recognised by bank clerks and shop assistants, so their effect on fraud will be minor.

ID cards might help to catch illegal immigrants if it becomes necessary to produce a card in order to get, say, hospital treatment (an idea strongly opposed by the BMA), but they will not deter illegals from anything other than seeking services for which a card is needed. To put the entire population to the necessity of paying for an ID card in order to catch a small number of illegal immigrants is like killing flies with bombs."

If you think Mr Grayling is just one of those extremists from the "civil rights industry," as I heard it described on the radio over the weekend, how about this:

"Instead of wasting hundreds of millions of pounds on compulsory ID cards as the Tory Right demand, let that money provide thousands more police officers on the beat in our local communities."

Another civil rights industrialist? No. That was Tony Blair speaking to a Labour party conference ten years ago, before he became prime minister.

For me it simply makes no sense to spend billions of pounds of public money on technology which will not work.

ID card scanning system riddled with errors

The Independent on Sunday have discovered something that has been widely known for some time - that ID card scanning systems based on biometric technology are flawed.

"The Government is planning to use face, iris and fingerprint scans to identify people on ID cards. But studies have found that being scanned in the wrong type of light or in shadow could lead to an inaccurate ID, because biometric technology is flawed.

Internal reports for the Government warned that manual labourers whose fingertips are worn or nicked, could find their fingerprints are not recognised. Men who go bald risk being identified as someone else, experts say. Pianists, guitarists and typists - whose fingerprints can be worn down - could also face inaccurate readings.

Government trials have found that the biometrics of black, elderly and disabled people have a higher chance of being incorrectly matched against their true ID. People with eye problems also have a relatively high chance of inaccurate identification."

Which is why the CTO, Ian Watmore, has been quietly working behind the scenes to ensure that most ID card authentication checks won't be based on scanning biometrics but on a PIN. Ministers, however, are not really interested in minor details like that, so expect all the usual overblown rhetoric in the debates this week.
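The trials quoted above reflect an inherent trade-off in any threshold-based biometric matcher: tighten the match threshold and more legitimate users are falsely rejected; loosen it and more impostors are falsely accepted. A toy simulation (the Gaussian score distributions here are invented for illustration, not real trial data) shows why the error rate can never be pushed to zero when the two score distributions overlap:

```python
import random

def match_rates(threshold, n=10_000, seed=42):
    """Simulate a biometric matcher: genuine comparisons tend to score
    high, impostor comparisons low, but the distributions overlap."""
    rng = random.Random(seed)
    genuine = [rng.gauss(0.7, 0.12) for _ in range(n)]   # same person
    impostor = [rng.gauss(0.4, 0.12) for _ in range(n)]  # different people
    frr = sum(s < threshold for s in genuine) / n    # false reject rate
    far = sum(s >= threshold for s in impostor) / n  # false accept rate
    return far, frr

# Sweeping the threshold: FAR falls as FRR rises, and vice versa.
for t in (0.45, 0.55, 0.65):
    far, frr = match_rates(t)
    print(f"threshold={t:.2f}  FAR={far:.3f}  FRR={frr:.3f}")
```

The demographic effects the Independent describes (worn fingertips, eye problems) amount to some groups having score distributions that overlap more, so at any fixed threshold their error rates are higher, which is presumably part of why a PIN fallback is attractive.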