Is the tree green?

The internets are serious business. When you’ve got questions, they’ve got answers… Is Lost on TV? Is it a rerun? What year is it? 2009? 2010? Has the LHC destroyed the world yet? Is Abe Vigoda still alive? Is it MFBT?

Which brings us to a question frequently asked by Mozilla developers: is the tree green? Ta-da:

Some random factoids about it:

  • It’s using a cross-site XMLHttpRequest to fetch the tree status from Tinderbox. This is a neat way to generate the answer in the browser (instead of scraping the status from a cron script on my webserver), but it means you’ll need a recent browser (like Firefox 3.1!) which supports this.
  • It’s surprisingly snappy. The HTML is about 7K, and the XHR status is about 2K. Compare that to the normal Tinderbox page, which weighs in around 220K.
  • The page polls the Tinderbox status every 2 minutes and updates the displayed status. The first time the state transitions from not-green to green, it will trigger an alert(). I think this will be rather useful if you’re waiting for the tree to turn green before checking in a patch.
  • It doesn’t check if the tree is open or closed. (Yet?)
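The polling behavior described above can be sketched roughly like this. Everything here is a hypothetical reconstruction, not the page's actual code: the status URL, the `parseStatus()` heuristic, and the `"answer"` element id are placeholder assumptions.

```javascript
// Minimal sketch of the poll-and-alert loop (assumptions noted above).

// Alert only on a transition from a known not-green state to green;
// seeing green on the very first poll is not a transition.
function shouldAlert(prevStatus, currStatus) {
  return prevStatus !== null && prevStatus !== "green" && currStatus === "green";
}

// Hypothetical helper: boil the Tinderbox response down to a status string.
function parseStatus(text) {
  return /green/i.test(text) ? "green" : "not-green";
}

function startPolling(statusUrl) {
  let prevStatus = null;
  setInterval(function () {
    const xhr = new XMLHttpRequest();   // cross-site XHR: the server must
    xhr.open("GET", statusUrl);         // opt in via Access-Control headers
    xhr.onload = function () {
      const currStatus = parseStatus(xhr.responseText);
      if (shouldAlert(prevStatus, currStatus)) {
        alert("The tree is green!");
      }
      prevStatus = currStatus;
      // Update the displayed status (hypothetical element id).
      document.getElementById("answer").textContent = currStatus;
    };
    xhr.send();
  }, 2 * 60 * 1000);                    // poll every 2 minutes
}
```

Keeping the transition check in its own little function means the "only alert once, and only on not-green → green" rule is easy to reason about separately from the XHR plumbing.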

Thanks to Jesse for the original idea, and to Reed for helping to get the Tinderbox server configured to allow XS-XHR.

Ignore all rules

Recently, I noticed that one of Wikipedia’s policies is rather interesting: Ignore All Rules.

If a rule prevents you from improving or maintaining Wikipedia, ignore it.

Seems like a rather interesting way to encapsulate exceptions and common sense as part of an official policy, as well as to introduce a touch of rambunctious energy and freedom into what could otherwise become a stodgy bureaucracy.

I like it.


Gerv just posted about the number of random Wikipedia pages in his browser history. Hmm, sounds like a meme in the making!

Back in November, Mardak posted some code to pull data out of Places… I’ve modified it to create a list of 10 random Wikipedia links from your history (along with the total). Looks like you’ll need a Firefox 3.1 beta (or newer) for this to work. The count will be a bit different than what you see with Gerv’s method, as I try to be more accurate with filtering meaningful Wiki pages.

1. Open this link, select-all and copy it.
2. Open up the Error Console.
3. Paste into the Code field at the top, and click Evaluate.

(Note that this runs code without any security restrictions, so don’t get in the habit of doing this for people you don’t trust!)
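The filtering-and-sampling part of that snippet can be sketched as plain functions. To be clear, this is my own guess at the logic, not Mardak's code or the pasted snippet: the namespace blacklist and the random-sampling helper are assumptions about what "meaningful Wiki pages" means.

```javascript
// Hypothetical sketch of the filtering step (assumptions noted above).

// Keep only "meaningful" Wikipedia article URLs: /wiki/ pages that are
// not in an administrative namespace like Special:, Talk:, or Category:.
function isWikipediaArticle(url) {
  const m = url.match(/^https?:\/\/[a-z]+\.wikipedia\.org\/wiki\/([^#?]+)/);
  if (!m) return false;
  return !/^(Special|Talk|User|Wikipedia|File|Image|Help|Category|Portal|Template)(_talk)?:/
    .test(m[1]);
}

// Pick n random entries via a partial Fisher-Yates shuffle: shuffle only
// the last n slots, then return them.
function sampleRandom(urls, n) {
  const a = urls.slice();
  for (let i = a.length - 1; i > 0 && a.length - i <= n; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    const tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
  }
  return a.slice(Math.max(0, a.length - n));
}
```

In the real version, the URL list would come from a Places query over your history; these helpers then produce the "10 random out of N" output shown below.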

10 random Wikipedia URLs out of 1414…

  1. Scaled Composites SpaceShipThree
  2. Hanny’s Voorwerp
  3. Jerky
  4. Rabbit of Caerbannog
  5. Venice
  6. Chief Justice of the United States
  7. Rat Pack
  8. Turbofan
  9. Telnet 3270
  10. AIM-120 AMRAAM