Gerv just posted about the number of random Wikipedia pages in his browser history. Hmm, sounds like a meme in the making!

Back in November, Mardak posted some code to pull data out of Places… I’ve modified it to create a list of 10 random Wikipedia links from your history (along with the total). Looks like you’ll need a Firefox 3.1 beta (or newer) for this to work. The count will be a bit different from what you see with Gerv’s method, as I try to be more accurate by filtering out non-article Wikipedia pages.

1. Open this link, select all, and copy it.
2. Open up the Error Console.
3. Paste into the Code field at the top, and click Evaluate.

(Note that this runs code without any security restrictions, so don’t get in the habit of doing this for people you don’t trust!)
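In spirit, the script queries the Places database for history URLs under Wikipedia, strips any #fragment, decodes the article title, skips Special: and Talk: pages along with duplicates, and then picks 10 at random. Here’s a rough sketch of the filtering half in plain JavaScript — the actual Places query needs privileged chrome access, so `historyUrls` below is just a stand-in for the query results, and the 29-character prefix assumes English Wikipedia (`http://en.wikipedia.org/wiki/`):

```javascript
// Sketch of the script's filtering logic, minus the privileged Places query.
// `historyUrls` stands in for the rows that
// SELECT url FROM moz_places WHERE url LIKE '...' would return.
function randomWikiPages(historyUrls, num) {
  var ignore = /^(Special|Talk):/;
  var seen = {};
  var pages = [];
  for (var i = 0; i < historyUrls.length; i++) {
    var url = historyUrls[i];
    var name = unescape(url);            // as in the original script
    // Drop any #fragment so "Foo#History" and "Foo" count as one page.
    var hash = name.indexOf('#');
    if (hash === -1) hash = name.length;
    // 29 = length of "http://en.wikipedia.org/wiki/"
    name = name.substring(29, hash).replace(/_/g, ' ');
    if (!ignore.test(name) && !(name in seen)) {
      pages.push({ url: url, name: name });
      seen[name] = true;
    }
  }
  // Shuffle, then take up to `num` distinct pages.
  for (var j = pages.length - 1; j > 0; j--) {
    var k = Math.floor(Math.random() * (j + 1));
    var tmp = pages[j]; pages[j] = pages[k]; pages[k] = tmp;
  }
  return { total: pages.length, picks: pages.slice(0, num) };
}
```

The shuffle-and-slice at the end is my own substitution for the original’s independent random picks; more on that below in the comments.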

10 random Wikipedia URLs out of 1414…

  1. Scaled Composites SpaceShipThree
  2. Hanny’s Voorwerp
  3. Jerky
  4. Rabbit of Caerbannog
  5. Venice
  6. Chief Justice of the United States
  7. Rat Pack
  8. Turbofan
  9. Telnet 3270
  10. AIM-120 AMRAAM

5 thoughts on “Infoholicismeme”

  1. Justin, your script seems to fail with FF3.0.x and works only with nightlies:

    Error: uncaught exception: [Exception… "Component returned failure code: 0x80570018 (NS_ERROR_XPC_BAD_IID) [nsIJSCID.getService]" nsresult: "0x80570018 (NS_ERROR_XPC_BAD_IID)" location: "JS frame :: javascript:%20num=10;C=Components;d=C.classes[';1'].getService(C.interfaces.nsPIPlacesDatabase).DBConnection;q=d.createStatement("SELECT%20url%20FROM%20moz_places%20WHERE%20url%20LIKE%20''");ignore=/^(Special|Talk):/;seen={};w=[];while(q.step())%20{u=q.row.url;n=unescape(u);x=n.indexOf('#');%20if(x==-1)x=n.length;%20n=n.substring(29,x).replace(/_/g,'%20');%20if(!ignore.test(n)%20&&%20!(n%20in%20seen))%20{%20w.push({url:q.row.url,%20name:n});seen[n]=true}}%20h="10%20random%20Wikipedia%20URLs%20out%20of%20"+w.length+"…n";%20while(num--)%20{%20r=Math.floor(Math.random()*(w.length+1));%20h+=""+w[r].name+"n";}h+="";%20open('data:text/html,'+escape(h)) :: :: line 1" data: no]

  2. It’s a great script, but unfortunately the output is in ISO-8859-5 and at least a couple of my results were mangled UTF-8 URLs (from Russian Wikipedia).

  3. I only have 37 Wikipedia URLs in my history, and your script selected one of them twice and one of them three times… the second time I evaluated it, it succeeded in picking 10 distinct URLs.
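    For what it’s worth, the duplicates are expected: the quoted script draws each of the 10 entries independently with `Math.floor(Math.random()*(w.length+1))` (the `+1` is also an off-by-one that can occasionally index past the end of the array). Sampling without replacement avoids both problems; one possible sketch, assuming `w` is the collected array of `{url, name}` entries:

    ```javascript
    // Pick up to `num` distinct entries from `w` by removing each pick from
    // a working copy of the pool (sampling without replacement).
    function pickDistinct(w, num) {
      var pool = w.slice();    // don't clobber the caller's array
      var picks = [];
      while (num-- > 0 && pool.length > 0) {
        // Math.random() < 1, so pool.length (not length + 1) never overruns.
        var r = Math.floor(Math.random() * pool.length);
        picks.push(pool.splice(r, 1)[0]);
      }
      return picks;
    }
    ```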

Comments are closed.