Machine-friendly wiki interface
For a client-side reader/editor and for legitimate bots, it would be useful to be able to bypass some of the variability of the human-oriented web interface.
- Retrieve the raw wikitext source of a page without having to scrape the edit page
- e.g. http://www.wikipedia.org/wiki/Foobar?action=raw
- Should we be able to get some metadata along with that -- revision date, author name, etc.? Or should all of that be fetched separately?
- How best to deal with old revisions? Keep the 'oldid' parameter as at present, or use something potentially more robust? A revision timestamp ought to be unique, but may not always be: resolution is only one second, and a bug in February '02 wiped out some old timestamps, leaving multiple revisions with the same time.
- At some future point, preferred URLs may change and UTF-8 may be used more widely, so a client should handle 301 and 302 redirects and honor the charset given in the Content-Type header. If your bot won't handle UTF-8, it should say so explicitly in an Accept-Charset header, so the server can treat it like a broken web browser and work around it.
- A fuller RDF-based Recentchanges feed
- Also page history and incoming/outgoing link lists? Watchlists?
- A cleaner save interface and login?
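To make the raw-source idea concrete, here is a minimal sketch of how a well-behaved bot might fetch a page via the proposed ?action=raw URL, following redirects, declaring its charset support via Accept-Charset, and decoding according to the Content-Type header. The helper names and the error-free happy path are assumptions, not a fixed client design:

```python
# Sketch of a bot client for the proposed ?action=raw interface.
# fetch_raw() and parse_charset() are hypothetical names for illustration.
from urllib.request import Request, urlopen

def parse_charset(content_type, default="utf-8"):
    """Pull the charset out of a Content-Type header value,
    e.g. 'text/plain; charset=ISO-8859-1' -> 'ISO-8859-1'."""
    for part in content_type.split(";")[1:]:
        key, _, value = part.strip().partition("=")
        if key.lower() == "charset":
            return value.strip('"') or default
    return default

def fetch_raw(title, base="http://www.wikipedia.org/wiki"):
    req = Request(
        "%s/%s?action=raw" % (base, title),
        headers={
            # Declare what we can handle, per the proposal above,
            # so the server needn't guess.
            "Accept-Charset": "utf-8",
            "User-Agent": "example-bot/0.1",
        },
    )
    # urlopen() follows 301/302 redirects automatically.
    with urlopen(req) as resp:
        charset = parse_charset(resp.headers.get("Content-Type", ""))
        return resp.read().decode(charset)
```

A bot that cannot decode UTF-8 would instead send something like Accept-Charset: iso-8859-1, and the server could then transliterate or reject as appropriate.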
Comments, suggestions?