Online publishers propose new content-usage protocol ACAP

Last week I ranted about how French and German newspapers banned Google from aggregating their content on its news service.

Now publishers are getting together to back a new software-based protocol called ACAP (Automated Content Access Protocol), which will tell search engine spiders, and other services, what may be done with the content they crawl. The project is due to start later this year and run for 12 months.
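
ACAP's actual syntax hadn't been published at the time of writing, but the idea is that a publisher would declare, per resource, what an aggregator may do with crawled content. Here's a purely hypothetical sketch in Python, with invented paths and permission names just to illustrate the concept:

```python
# Hypothetical sketch only: ACAP's real directive names and file format
# had not been published when this was written. The paths and permission
# names below are invented purely to illustrate per-resource usage policies.
USAGE_POLICY = {
    "/news/":    {"index": True, "show-snippet": True,  "cache-copy": False},
    "/premium/": {"index": True, "show-snippet": False, "cache-copy": False},
}

def allowed_use(path, use):
    """Return True if the (invented) policy permits `use` for `path`."""
    for prefix, policy in USAGE_POLICY.items():
        if path.startswith(prefix):
            return policy.get(use, False)
    return True  # no matching rule: fall back to today's permissive default

print(allowed_use("/premium/analysis.html", "show-snippet"))  # prints False
```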

There are already systems in place that well-behaved automated spiders respect, most notably the robots.txt exclusion standard, which controls whether certain directories, images or entire sites can be indexed at all. This new proposal goes further, dealing with how the crawled content can be used and republished.
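
For comparison, here's roughly how a well-behaved crawler consults the existing robots.txt mechanism today; Python ships a parser for it in the standard library (the site URL and bot name below are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# Placeholder site and user-agent; any site serving a robots.txt works the same way.
rp = RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

if rp.can_fetch("ExampleBot", "http://example.com/news/story.html"):
    print("Allowed to crawl this page")
else:
    print("Disallowed by robots.txt")
```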

It sounds as if plenty of publishing associations are on board already, including the International Publishers Association, the European Publishers Council, and the World Association of Newspapers. There's no word yet, though, on whether the search engines will accept and adhere to the standard; after all, they would have to re-programme their systems to accommodate these new tags and usage policies.

The alternative? Well, either a string of lawsuits if publishers decide they don't want their content published in certain ways on other sites, or an outright ban on news aggregation. One problem I see is that the legitimate search sites (Google, MSN, Yahoo, etc.) will probably adhere to good practice, whilst the spam sites will do whatever the hell they like with content, regardless of any protocols that get put in place.

I personally think it will be a loss to Internet users and original content publishers alike if there’s a clampdown on news aggregation services. What do you think?

Andy Merrett
