Tuesday, 18 March 2008 12:14

ACAP content protection protocol "doesn't work" says Google CEO

A group of content owners and publishers has spent almost two years developing the Automated Content Access Protocol (ACAP), which lets providers of all types of content communicate permissions information to search engines and similar services. But the CEO of Google claims it is incompatible with Google's search engine technology.

Asked why Google was not supporting ACAP, Eric Schmidt told iTWire: "ACAP is a standard proposed by a set of people who are trying to solve the problem [of communicating content access permissions]. We have some people working with them to see if the proposal can be modified to work in the way our search engines work. At present it does not fit with the way our systems operate."

He denied that Google's reluctance to embrace the system was because it wanted as few barriers as possible between information and its search engines. "It is not that we don't want them to be able to control their information."

However, according to ACAP, Google is trying to tell publishers they should be content with the current technology, known as robots.txt, despite having been closely involved in the development of ACAP to date. "From a practical point of view, ACAP has been the huge beneficiary of input, technical know-how and quiet wisdom of all of the major search engines, albeit in an 'informal' way," ACAP said.

ACAP recently questioned Google on its stance at the Changing Media Summit in the UK, and claims that Google spokesman Rob Jonas responded by saying that "the general view within the company is that the robots.txt provides everything most publishers need to do."

This is not good enough for ACAP. Chairman Gavin O'Reilly - also Chairman of the World Association of Newspapers - said in response: "It's rather strange for Google to be telling publishers what they should think about robots.txt, when publishers worldwide - across all sectors - have already and clearly told Google that they fundamentally disagree. If Google's reason for not (apparently) supporting ACAP is built on its own commercial self-interest, then it should say so, and not glibly throw mistruths about.

"Google should reflect on the fact that after 12 months of intensive cross industry consideration and active development – which Google has been party to – publishers have identified not only the patent inadequacies of robots.txt, but more progressively have come up with a practical, open and workable solution for publishers and content aggregators. So, we – once again - call upon Google to embrace ACAP and to readily acknowledge the right of content owners to determine how their content is used."

According to ACAP "robots.txt is a well established method for communication between content owners and crawler operators. However, robots.txt is not sophisticated enough for today's content and publishing models. Robots.txt, in its current form as implemented by most search engine operators, provides only a simple choice between allowing and disallowing access. These simple choices are inconsistently interpreted. A number of proprietary extensions have been implemented by several of the major search engines, but not all search engines recognise all or even any of these extensions." ACAP, in contrast, "provides a standard mechanism for expressing conditional access which is what is now required."
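The binary allow/disallow model that ACAP objects to can be seen directly with Python's standard-library robots.txt parser. The sketch below uses a hypothetical rules file for illustration: each query yields only a yes or no, with no way to express conditions such as "index for seven days only" or "show headlines but not snippets".

```python
from urllib.robotparser import RobotFileParser

# A minimal, hypothetical robots.txt. The entire vocabulary is
# allow/disallow per path prefix -- nothing conditional.
rules = """\
User-agent: *
Allow: /archive/public/
Disallow: /archive/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Each answer is a plain boolean: access or no access.
print(parser.can_fetch("AnyBot", "/archive/private/story.html"))  # False
print(parser.can_fetch("AnyBot", "/archive/public/story.html"))   # True
```

Note that Python's parser applies rules in file order (first match wins), which is one example of the inconsistent interpretation the article mentions: other crawlers resolve Allow/Disallow conflicts by longest matching prefix instead.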

ACAP claims that the various court cases that have arisen between publishers and Google are a symptom of the problem that ACAP seeks to solve, not the problem itself.

Google aims to get around the problem of access to proprietary content by striking commercial deals with some publishers, but ACAP says this will not solve the problem. "Business relationships on the Internet should not simply be about deals done between very large corporations. It will not be possible to manage the very large number of business relationships in the absence of much greater automation. ACAP aims to enable the majority of smaller publishers, smaller search engines and other innovative intermediaries to enter the growing market for online content with confidence."

