Web 2.0 and APIs: the case for Unlimited queries
Every self-respecting search engine has one, loads of other sites have one, and lots of people are using them to make great new stuff: Application Programming Interfaces, or APIs. There’s a big ‘but’ on some of them though…
Wikipedia describes an API as: ‘An application programming interface (API) is the interface that a computer system, library or application provides in order to allow requests for services to be made of it by other computer programs, and/or to allow data to be exchanged between them.’
Now these APIs are super useful; I use them in several of my scripts and have great fun with them. I used the Technorati API, for instance, in the first version of the Technorati rank and link count Greasemonkey script I wrote. It was fast and worked quite well, so I was very happy. After a few hours of browsing though, it stopped working. The error I got was ‘You have used up your daily allotment of Technorati API queries.’
Now I can’t understand why they’d want to limit use of the API like that. What does it buy them? Now that I’ve decided not to use the API (after all, a daily limit of 500 queries amounts to fewer than 5 per user once you distribute the script to 100+ people), I’m left with no other choice but to scrape the content of their normal site. That costs them far more bandwidth, and it’s significantly slower. That’s not much of a service, is it? Remember: I do want to use their services! I want to know the Technorati rank for each page I’m visiting; that’s a good sign for them, isn’t it?
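To make the irony concrete, here’s a minimal sketch of the logic a script like this ends up with once the quota kicks in. Everything here is hypothetical: `api_query` and `scrape_page` are stand-ins, not the real Technorati API or site, and the quota check is simplified to matching the error string.

```python
# Hypothetical sketch: api_query and scrape_page are invented stand-ins,
# not the real Technorati API.
QUOTA_ERROR = "You have used up your daily allotment of Technorati API queries."

def fetch_rank(url, api_query, scrape_page):
    """Prefer the lightweight API; fall back to scraping on a quota error."""
    result = api_query(url)
    if result == QUOTA_ERROR:
        # Fallback: fetch and parse the full HTML page instead.
        # This costs the provider far more bandwidth and is slower for the user.
        return scrape_page(url)
    return result

# Toy demonstration: the first 5 API calls "succeed", then the quota runs out.
calls = {"n": 0}

def fake_api(url):
    calls["n"] += 1
    return 42 if calls["n"] <= 5 else QUOTA_ERROR

def fake_scrape(url):
    return 42  # same answer, obtained the expensive way

ranks = [fetch_rank("http://example.com", fake_api, fake_scrape) for _ in range(7)]
```

The point of the sketch: the user gets the same number either way, so the limit doesn’t stop the queries; it just reroutes them through the provider’s full HTML pages.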