WikiGraph As Delicious Data

I decided to upload my WikiLog graph to my Del.icio.us account. Because Everything Is A Graph...

I sucked down a couple lists of the pages on my site - 2005-11-21-CurrentWikilogCount

  • one just giving the last-mod-date

  • another listing each WikiWord contained in each page
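Those two lists can be parsed into dicts keyed by page name. The tab-separated line formats below are an assumption - the post doesn't show the actual dump format:

```python
def parse_lastmod(lines):
    """Parse 'PageName<TAB>YYYY-MM-DD' lines into {page: date}.

    The tab-separated format is a guess at what the dump looks like.
    """
    dates = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        page, date = line.split("\t")
        dates[page] = date
    return dates

def parse_links(lines):
    """Parse 'PageName<TAB>WikiWord WikiWord ...' lines into
    {page: [wikiword, ...]}."""
    links = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        page, words = line.split("\t", 1)
        links[page] = words.split()
    return links
```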

I grabbed Matt Biddulph's Python code to wrap the API, though it really wasn't necessary.

  • I took the same Expanding Wiki Words code I use in my WikiLog to define the Description for each item

  • each WikiWord becomes a tag - WikiWordAsTag

  • I used the last-mod-date as the posting-date, but that doesn't seem to have mattered
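The mapping above - expanded WikiWord as Description, each contained WikiWord as a tag, last-mod-date as posting date - could look roughly like this. The expansion regex and the `base_url` page scheme are my guesses, not the actual Expanding Wiki Words code:

```python
import re

def expand_wiki_word(word):
    """'ExpandingWikiWords' -> 'Expanding Wiki Words' - a rough sketch
    of the expansion used for item descriptions; the real code may differ."""
    return re.sub(r"(?<!^)(?=[A-Z])", " ", word)

def build_post(page, wiki_words, lastmod, base_url="http://example.com/wiki/"):
    """Map one wiki page to del.icio.us posts/add parameters.

    base_url is a placeholder for the site's page-URL scheme.
    """
    return {
        "url": base_url + page,
        "description": expand_wiki_word(page),  # expanded WikiWord as Description
        "tags": " ".join(wiki_words),           # each WikiWord becomes a tag
        "dt": lastmod + "T00:00:00Z",           # last-mod-date as posting date
    }
```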

I wrote some code to combine my 2 input lists and start uploading, with a built-in throttle to avoid getting blocked.
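A sketch of that combine-and-upload loop, assuming the two lists have been parsed into dicts; `post_fn` stands in for the actual del.icio.us posts/add call (e.g. via the API wrapper), and the one-second delay is the throttle that keeps the account from getting blocked:

```python
import time

def upload_all(dates, links, post_fn, delay=1.0):
    """Combine the last-mod-date list and the WikiWord list, then
    upload one post per page, sleeping between requests as a throttle.

    post_fn(page, tags, lastmod) is a stand-in for the real API call.
    """
    posted = []
    for page, lastmod in sorted(dates.items()):
        tags = links.get(page, [])  # pages with no WikiWords get no tags
        post_fn(page, tags, lastmod)
        posted.append(page)
        time.sleep(delay)
    return posted
```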

I'm up to roughly 3000 pages now. Boy is that a long tag list.

Some possible things to do

  • count the unique tags

  • tell other Wiki hosts to try the same thing (Blue Oxen, Community Wiki?)

  • maybe auto-post all those sites? I don't have the link info; actually, they probably don't either... but they could write something that generates the data on the back end much more efficiently than crawling the site via HTTP...

  • do a separate upload of remote links tied to my local tags

    • for a BlogBits page that's pretty straightforward - use all the WikiWord-s contained in the page as tags, for each of the 1-2 remote links

    • for more straight-Wiki pages, maybe only use the page WikiWord as a tag?
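The first item on the list - counting the unique tags - falls straight out of the page-to-WikiWords mapping:

```python
def count_unique_tags(links):
    """Count distinct tags across all pages, given the
    {page: [WikiWord, ...]} mapping from the link list."""
    tags = set()
    for words in links.values():
        tags.update(words)
    return len(tags)
```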

Thoughts after uploading:

