Us Techies

Note: This entry has been restored from old archives.

Us techies (in my generation: those who grew up in the 80s/90s knowing what an IRC was, how to push the power button on the “computer”, and wrapping weird batch scripts around multi-disc ARJ decompression to install pirated games for our friends, for example) were a fairly unique bunch back in the day. But for those entering Uni now (10 years later), the tech has been there from the day they were born, for all of them.

Amongst our own generation we’re still respected (though not necessarily liked) for our ability to tweak the Excel spreadsheet or fix the Internet.

Amongst the youngsters, will we be no better than the dude who fixes the shower? Just another annoying job that somebody has to do.

Nothing against plumbers; here in the UK the plumbers make the $$$ and can actually afford things like houses. I still think I’d rather be a landscape gardener.

My thoughts while reading Meet Your Future Employee.

I do worry when they start talking about “HTML programming” though, and “advances like blogging and social networking”. I suspect the reporter may be showing her own generational gap… nobody can hope to keep up with the pace of change.

I can’t help but observe that statements like “[lack of] face-to-face communications skills, a critical asset for a modern IT career” are coming from the old IT career professionals. Predicting their own obsolescence?

[Actually, I think my IT-generation sits somewhere in between that which is the subject of the article and that which is the observer, um, or am I generation Y? I’m certainly not X. The article mainly leaves me just all colours of confused. Gah, bloody compartmentalisation.]

Collateral Damage: An Unintentional Storm Worm DOS

Note: This entry has been restored from old archives.

Anyone else get the feeling that the Storm Worm proves that the entire ‘net security industry is useless? We already know that most security is ineffective against targeted attacks, and now Storm makes it clear that the state of security in general is ineffective against widespread attacks. Sure, your AV product will almost certainly protect you from Storm, but it won’t protect you from Storm breaking the ‘net in general. The problem is that having an AV product installed and up to date places you in the minority.

OK, implying that we’re all stuffed is rather over the top … but sometimes I really feel rather perturbed by the whole situation.

Anyway, the latest fun fact I’ve noticed regarding the Storm worm is that some security-sensitive sites have started using blacklists to block HTTP clients. At this moment there are several security sites that give me messages like “ACCESS DENIED” or “File Not Found or your IP is blocked. Sorry.” but they work perfectly well if I bounce through a remote proxy. Why? Well, according to some lists, such as cbl.abuseat.org, I have a Storm Worm infection. It happens that my ADSL picked up a new dynamic IP this morning that someone with an infection must have had last week. I understand why the websites are doing this, though I’m skeptical of the effectiveness of it as a countermeasure. Being the victim of a DDoS is pretty much the worst-case scenario for a popular site; anything that might reduce your vulnerability is going to look good.
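(If you’re curious whether your own address has made the list, the CBL can be queried like any DNSBL: reverse the IP’s octets and look the result up under cbl.abuseat.org. A sketch, using a made-up address 81.2.3.4:)

    # an answer of 127.0.0.2 means "listed"; NXDOMAIN means "clean"
    host 4.3.2.81.cbl.abuseat.org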

What is the solution? Certainly not this sort of blacklisting. We probably need to see a shift in the responsibility. The dumb end users can’t be held responsible: would it be a car owner’s fault if his car was stolen and the thief subsequently ran down a child with it? What if the car owner left the car with the engine running while popping into the newsagent to pick up a paper? The child’s death is still not the car owner’s fault, I’d say, even if said owner is somewhat foolish. But we don’t know how to hold the thief responsible in the botnet case. The analogy works to describe my case for absolving the user, but breaks down when you look at it for assigning blame to the driver. Are the cars computers, IP addresses, or packets? Who’s the driver? What we do know is that 100% of car thieves are homicidal maniacs! Iieee!

Now, given that there are cars speeding around everywhere being driven by child-killers, roadblocks have been set up all over the place to keep the killer-cars out. Each roadblock has a long list of number-plates to check approaching cars against; the problem is that the list is very large and always out of date. Some killers will get through (but you may be saved from the DDoS), and you’ll possibly just end up with a huge line of cars at your roadblock (DDoSing your roadblock!). Also keep in mind that the killers who aren’t on the list know that they aren’t, and are capable of organising themselves to show up at a given location instantly.

How do we reliably know a bad packet from a good one? Who should be responsible? (Infrastructure providers need to foot some of this, I think.) What’s the solution? Buggered if I know 🙂 and if I did I wouldn’t be telling, would I? Let’s hope that some of the large number of smart cookies out there thinking about this come up with something that doesn’t suck! However, I fear that all solutions involve a giant and expensive leap: a new Internet. (Or, at least, a major overhaul of the one we have.) Is that even possible?

Fungal Positioning System

Note: This entry has been restored from old archives.

Walking, rambling, trekking… call it what you will, we do like a good long trundle. Alas, we don’t always have the time and energy left on the weekend for gallivanting. We didn’t make the best use of the, rather wet, summer here in the UK, but now that the occasional crisp sunny days of the colder months have arrived we’re getting out more.

A good while back, inspired by Antonio Carluccio’s Neal Street Restaurant and the subsequent addition of his Complete Mushroom Book to our library, we became interested in the pursuit of fungi. This, combined with our fondness for wandering, has since inspired the collection of a few more books[1] and a serf-ish habit of walking with eyes downcast.

Amethysts In Hand

Now Autumn is well upon us and legions of fungi abound! However, we’re not yet so confident as to go merrily munching away at the bounty of the woods. That said, last weekend (Oct 7th) we saw some interesting specimens in woods south of Rickmansworth and we did net ourselves a good collection of Laccaria amethystea (the common name is Amethyst Deceiver; one of my photos is to the right, but this is a far better photo). These made a pleasant addition to the evening’s pasta. Yes, we picked bright purple toadstools and then ate them!

Over the weekend just passed we became more serious in our fungal pursuit. But now I shall significantly digress to the other subject of this post: GPS. Last week I was doing a little web-shopping, thinking to get a funky LED torch[2] and/or a couple of foldable knives (for fungus gathering). In the end I came away with neither item, having been lured off target by the glingy goodness of a fancy electronic gadget.

There are many GPS units around these days, with Garmin and Magellan seeming to have the best ranges for the off-road trekker. In the end I picked a Garmin eTrex Vista HCx, the top of the line for the eTrex range, complete with the iffy features of an electronic compass and barometric altimeter (but hey, when buying a new toy you may as well get all the geekbling! gling?). The cost/benefit analysis of the purchase decision basically came down to 50 quid extra for the altimeter, compass, and high-speed GPS hardware (with the additional cost of battery life being 25 hours rather than 32). In the end I decided that for the cost of a reasonable dinner for two… why not?

Along with the Vista I have the official Garmin TOPO Great Britain map, a bloody expensive heap of bytes. At 100 quid from many UK sellers it seems very expensive, until you stop to think that it includes topographical and road data for the entire UK. Reflect on Encyclopaedia Britannica for a moment though: remember when they produced a CDROM version and tried to flog it for a four-digit price? The digression digresses… It’s the great divide between what I think of as “the past” and the new “digital product generation”. Shelves of encyclopaedias that you pay thousands of units of currency for have become an anachronism, and I expect many parts of that industry were laid to rest by the “digital generation”. At a time when it seems even the empires of the media distribution companies may crumble, vendor lock-in can’t keep the likes of Garmin going for long. Tomorrow the capabilities of their eTrex will be in my phone[3] and Google Earth will be the only software I need, as roving communities of GPS geeks build up their own databases of topographical data. Gah! Enough idle speculation, back to my digression.

Garmin Vista HCx

In the short time I have had to play with my geek bounty I’ve been pretty impressed. The Garmin gets a lock damn fast and the GPS tracking against their map is impressively spot on; doubly impressive to see it map into Google Maps with high accuracy as well! (More on that in a moment.) At first the screen seemed rather small (3.3×4.3cm), but it does not inhibit use of the device as much as I expected (it is also surprisingly readable in daylight). The input interface is simple, using 5 buttons and a mini-joystick; it took a little learning, but after a day in the field I didn’t have to think to operate it.

So, the downsides? Well, as per the earlier rant, map data is very expensive. I bought this for UK trekking so the UK map was essential (and realise that, if you prefer to pay for such things, this will add 50% more to the price of a good unit). While, considering the content, I think the price isn’t unjustified, I also think that it is a significant “hidden cost” that really should be better disclosed in the product description and specifications. Time for some more subdigression. A system where you could license, say, 100 square miles of map would be great for the trekker. By this I mean you’d have such a license and at any one time be able to load on at most 100 square miles from an online Garmin world-map database. For something like a 20 quid yearly subscription this would seem pretty attractive. It is probably prone to having the data ripped though, but that’s nothing new — as far as I can see you can already download unlocked versions of the majority of Garmin map products from various file-sharing systems.

The second point about the maps is: don’t get your hopes up. They’re nowhere near as good as the Ordnance Survey (OS) Landranger maps. Consider it this way: a Garmin GPS unit with GB TOPO maps is a near-perfect navigational aid, but keep your trusty OS handy for the fine details. The up-side is that the topographic data on the GB TOPO maps is from the OS, so it matches perfectly and it’s easy both to home in on your on-paper location and to map a waypoint into the GPS based on OS map features. As far as I can work out the TOPO maps are the best you’ll get for the Garmin; I think trying to display all the OS data would be a UI nightmare anyway.

What else is wrong with the device? Well, I find the electronic compass to be too unstable, but I might just need to get more used to it. So far I’m not convinced that I’d want to use it to take a bearing. Now to my main gripe: the little research I have done indicates that Linux basically doesn’t exist in the world of Garmin. (Shock! Horror! Oh, poor me, the big bad company doesn’t care that I’m a technodeviant!) The Win32 MapSource tool that comes with the device is a bit clunky but actually does its job pretty well, letting you plot out courses to upload to the GPS device, and download then edit tracks and waypoints saved on your trekking. (With the insane limitation that it cuts off waypoint names at something like ten characters. What decade is this?!)

What can Linux deviants turn to? Well, some dude has done a great job on a tool called gpsbabel, which does the very important task of sucking data from the unit, or from files saved in MapSource format, and converting it into a variety of other formats. I have found that the process that works best for me is to download data from the unit in Windows/MapSource to tidy up the tracks and waypoints as necessary, then use gpsbabel to convert the data into the format I ultimately desire: Google KML. (gpsbabel works under both Windows and Linux.) The KML needs to be hand-cleansed, otherwise Google Maps barfs on some parts of it; I haven’t had time to take a closer look at this.
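For the record, the conversion itself is a one-liner. A sketch of the two variants I mean (the format names are gpsbabel’s; the file names are illustrative):

    # straight off the unit over USB: tracks (-t) and waypoints (-w) to KML
    gpsbabel -t -w -i garmin -f usb: -o kml -F walk.kml
    # or from a file saved out of MapSource
    gpsbabel -t -w -i gdb -f walk.gdb -o kml -F walk.kml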

I’ve had the eTrex Vista HCx for only 4 days, so it is still “early days”. I’m hoping to work out an acceptable all-Linux solution. This might be using gpsbabel to suck from (and load to) the device and “Google Earth” to edit and create tracks and routes. The $US20 per year version of Google Earth appears to support Garmin devices; that is certainly worth exploring. Unfortunately Google Earth stopped working for me when I upgraded my Ubuntu to gutsy (I’ll echo other people in the opinion that upgrading to gutsy was mostly a PITA; the last thing I wanted was bloody geek wank like compiz), so I’ll wait for the free version to work again before trying the Plus version. You can also edit tracks and points with the Google Maps web-application, but I find it too laggy. (Is it just me, or has Firefox become a slow piece of crud these days? I find myself using Opera more and more often now.)

As is the way of these things I have now written a lot more about the negative than the positive. Don’t be fooled! So far I’m very happy and impressed with the new toy, it was really very pleasant company on a couple of longish walks we did this past weekend.

So, fungus I said. Gus? Who’s Gus? (Gus is the name identifier I’ve loaded onto my GPS!)

Fungal Specimens from Whippendell Woods

On Saturday October 20th, GPS in hand, we reprised our Whippendell Woods Walk — hunting fungi. Mapping the track from the Garmin into Google Maps left me rather impressed by both the accuracy of the GPS and the translation between the GPS and Google Maps. The trail comes up with enough accuracy to even be mostly on the correct side of the canal we followed (though often in the canal).

We gathered 10 samples for later identification, which has proven to be a fun exercise. It’ll be interesting to see how long our little amateur-mycology hobby lasts. (Historically, I’m very bad at hobbies, the pattern tending to be an intense burst of focused interest shortly followed by complete and utter neglect.) The hardest part of fungus hunting is that we have an interest in finding stuff that is good to eat; gastronomic exploration is very much a part of who I am. But fungi are a bit of a dangerous minefield of creatures with names including words like “death”, “sickening”, and “poison”, and on top of that we noticed this weekend that people had been through and really not treated the fungi very well. (There’s quite a bit of money in commercial harvesting of wild fungi these days; sometimes I curse the recent gourmet revolution driving up the scarcity and prices of things that used to be little-known delicacies.)

On Sunday we did a quick south-of-Ricky pub-ramble, taking in Ye Olde Greene Manne (nothing special, a chain pub) and the Rose and Crown (a pretty good pub).

I intend to write more about both walks… though, as ever, such intentions go onto the pile with the likes of writing about some call graph visualisation I explored recently, several noteworthy places I’ve eaten at, some good coffee houses, some interesting books… the list goes on.


[1] The Encyclopedia of Fungi of Britain and Europe by Michael Jordan (excellent but rather large for trekking); Field Guide to Edible Mushrooms of Britain and Europe by Peter Jordan (not related); Collins Gem – Mushrooms by Patrick Harding (ultra mobile).

[2] The LED Lenser V2 Professional seems rather nice, though I have read some less than positive comments about the LED Lenser products.

[3] You should see the technogeek lolly goodness available (or soon to be) in Japan, the likes of: OLED display watches with 4GB storage for audio and video; normal sized mobiles with wifi and GPS; self-milking genetically engineered digital cows that you can keep in the fridge and that live on old food that otherwise might evolve.

Django on Debian Sarge

Note: This entry has been restored from old archives.

The main reason I’m posting this is so that other people can avoid trying to play “chase the dependency trail” in back-porting the Debian etch Django source package to sarge. If you want to do that I suggest working on modifications to the source package rather than following the trail.

[Note: If you want Django on Debian etch then you can simply apt-get install python-django (not the latest and greatest though, 0.95.1 at this time; your best bet in all cases is probably a local user install from SVN).]
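(For the record, the SVN route is roughly the following; the checkout URL is the Django project’s trunk of the time, the target directory is up to you:)

:; svn co http://code.djangoproject.com/svn/django/trunk/ $HOME/django-trunk
:; # the trunk directory contains the django package itself
:; export PYTHONPATH=$HOME/django-trunk:$PYTHONPATH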

Here’s how I successfully installed Django as a Debian package, followed by the way I failed to do so.

Creating a Django Debian package (successfully)

After my trail-following led me to a cliff I gave up on the 0.95.1 Debian source and moved on to this hackish method of getting a package. There are plenty of ways to get packages built on Debian. Let me scare you with this:

:; su -c 'apt-get install rpm alien'

(I’ll use Sam’s neat prompt to differentiate between commands and other crud.)

Then:

:; wget http://www.djangoproject.com/download/0.96/tarball/
:; tar xzf Django-0.96.tar.gz
:; cd Django-0.96
:; python setup.py bdist_rpm --packager="Yvan Seth" --vendor "django" \
            --release 1 --distribution-name Debian

However, the last step fails with “Illegal char '-' in version: Version: 0.96-None”, so I edit build/bdist.linux-i686/rpm/SPECS/Django.spec and remove the -None from version (can’t see a way to do this with the bdist_rpm options):

:; vim build/bdist.linux-i686/rpm/SPECS/Django.spec
:; cp ../Django-0.96.tar.gz build/bdist.linux-i686/rpm/SOURCES/
:; rpmbuild -ba \
        --define "_topdir $HOME/source/Django-0.96/build/bdist.linux-i686/rpm" \
        --clean build/bdist.linux-i686/rpm/SPECS/Django.spec
:; fakeroot alien -k \
        build/bdist.linux-i686/rpm/RPMS/noarch/Django-0.96-1.noarch.rpm
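(An aside: the vim step above could be scripted instead. A sketch, assuming the spec’s Version line literally ends in -None:)

:; sed -i 's/-None$//' build/bdist.linux-i686/rpm/SPECS/Django.spec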

Joy! Now I have a django_0.96-1_all.deb; do I dare to install this critter?

:; su -c 'dpkg -i django_0.96-1_all.deb'
Password: 
Selecting previously deselected package django.
(Reading database ... 41757 files and directories currently installed.)
Unpacking django (from django_0.96-1_all.deb) ...
Setting up django (0.96-1) ...

This sucks like a 747 engine (or blows, all a matter of perspective), but the deed is done. Probably better than “su -c 'python setup.py install'”, but in the end it might have been best to just do a local --prefix=$HOME/blah type of install.
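For reference, that local install would be something like the following (the python2.3 path matches sarge; adjust to suit):

:; python setup.py install --prefix=$HOME/local
:; export PYTHONPATH=$HOME/local/lib/python2.3/site-packages:$PYTHONPATH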

Setting it up

The Django site documents installation at http://www.djangoproject.com/documentation/0.96/install/.

The Django documentation is rather good; once I was through with my packaging débâcle the doco got me up and running in next to no time. My notes here are specific to my system and probably not useful to anyone.

Database

I’m already running both PostgreSQL and MySQL. I chose to use PostgreSQL because the sarge package for python-psycopg fits the specification given by the Django instructions while the python-mysqldb version is a little older than the specified minimum version. I’m also more familiar with postgres. So:

:; su -c 'apt-get install python-psycopg'

You’ll want to set up a database, so get an appropriately privileged psql shell and, for example, do:

CREATE DATABASE django_mysite;
CREATE USER django_mysite WITH PASSWORD 'dumbpassword';
GRANT ALL PRIVILEGES ON DATABASE django_mysite TO django_mysite;
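A quick way to confirm the new credentials work is to connect as that user, forcing TCP so password auth applies (subject to your pg_hba.conf):

:; psql -h localhost -U django_mysite django_mysite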

mod_python

The mod_python setup is documented here: http://www.djangoproject.com/documentation/0.96/modpython/

I have mod_python installed and it makes sense to use it. I’m using a pretty funky Apache vhost setup though and I’m not going to detail it here. In essence you want to find the appropriate VirtualHost section and insert something like this:

    <Location "/djangotest/">
        SetHandler python-program
        PythonHandler django.core.handlers.modpython
        SetEnv DJANGO_SETTINGS_MODULE mysite.settings
        PythonPath "['/home/vhost/malignity.net/data/django/'] + sys.path"
        PythonDebug On
    </Location>

With that done you should see a basic status page at the configured URL (i.e. http://malignity.net/djangotest/).
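One thing the snippet above assumes is a mysite/settings.py somewhere under that PythonPath, with database details matching the PostgreSQL setup from earlier. A minimal sketch using the 0.96-era setting names (values illustrative):

    # mysite/settings.py (fragment)
    DATABASE_ENGINE = 'postgresql'     # the psycopg 1 backend installed earlier
    DATABASE_NAME = 'django_mysite'
    DATABASE_USER = 'django_mysite'
    DATABASE_PASSWORD = 'dumbpassword'
    DATABASE_HOST = ''                 # empty means the local Unix socket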

All Good!

Now to follow the great tutorial that starts here: http://www.djangoproject.com/documentation/0.96/tutorial01/. I’m mildly impressed already, and web stuff usually pisses me off before I even touch it. We’ll see how it fares after I try and actually do something with it though.

How Not To Do It

Backporting python-django from Debian etch to sarge

This was supposed to be a command log for a successfully back-ported Django package build. But it didn’t turn out so well.

We start with a Debian sarge server box. I really don’t want to risk dist-upgrading a box that’s in another country, a 2-hour flight away. The box runs with some backports.org packages for the likes of Apache2, SpamAssassin, and ClamAV updates. Consider it a fully up to date general server with this sources.list file:

:; cat /etc/apt/sources.list
deb ftp://ftp.de.debian.org/debian sarge main contrib non-free
deb http://security.debian.org/ sarge/updates main
deb http://www.backports.org/debian sarge-backports main

What I want on this box is this cool new Django thing I’ve heard so much about, but rather than download and work with the tarball I’d prefer a Debian package (call it a form of OCD if you like). A fairly up-to-date package is available in the currently stable etch, and we can trust Debian to track it for security fixes; if I track this package manually we ought to be able to have a relatively safe install. So! Time to try and build the etch sources for sarge.

:; su
:; cat << END >> /etc/apt/sources.list
deb-src ftp://ftp.de.debian.org/debian etch main contrib non-free
deb-src http://security.debian.org/ etch/updates main
deb-src http://www.backports.org/debian etch-backports main
END
:; apt-get update
:; exit

Create and install python-django?

:; mkdir -p pkgbuild
:; cd pkgbuild
:; apt-get source python-django
:; pushd python-django-0.95.1
:; fakeroot dpkg-buildpackage
...
dpkg-checkbuilddeps: Unmet build dependencies: debhelper (>= 5.0.37.2)
    python-dev cdbs (>= 0.4.42) python-setuptools (>= 0.6b3-1)
    python-support (>= 0.3)
...

In a perfect world this would build your package! But things are rarely perfect. The dpkg-buildpackage reports any missing build dependencies; I had to install the parts that I could:

:; su -c 'apt-get install debhelper python-dev cdbs python-setuptools'
...
Setting up cdbs (0.4.28-1) ...
Setting up python2.3-dev (2.3.5-3sarge2) ...
Setting up python-dev (2.3.5-2) ...
Setting up python2.3-setuptools (0.0.1.041214-1) ...
...

Notice the versions aren’t really what I’m after; I’ll be punished for that later.
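(As an aside, the usual one-liner for pulling in build dependencies is apt-get build-dep; on this sarge/etch mix it likely hits the same version mismatches, but for completeness:)

:; su -c 'apt-get build-dep python-django'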

Create and install python-support?

Alas it isn’t as simple as that: another build dependency was python-support, which wasn’t available for sarge, so…

:; popd
:; apt-get source python-support
:; pushd python-support-0.5.6
:; fakeroot dpkg-buildpackage
:; su -c 'dpkg -i ../python-support_0.5.6_all.deb'
...
 python-support conflicts with debhelper (<< 5.0.38)
...

Nope, no luck. Looks like the version of debhelper installed really needs an upgrade… I’d prefer to trust the package maintainers and not try to force an older debhelper version on things.

Create and install debhelper?

:; popd
:; apt-get source debhelper
:; pushd debhelper-5.0.42
:; fakeroot dpkg-buildpackage
...
dpkg-checkbuilddeps: Unmet build dependencies: po4a (>= 0.24)
...
:; su -c 'apt-get install po4a'
...
Setting up po4a (0.20-2) ...
...
:; popd

Create and install po4a?

It is getting a bit over the top now… another route would be advisable at this point. While there’s got to be a better way, I’m possessed of a zombie-like persistence, so I keep going…

:; apt-get source po4a
:; pushd po4a-0.29
:; fakeroot dpkg-buildpackage
:; # Joy! It builds!!
:; popd
:; su -c 'dpkg -i po4a_0.29-1_all.deb'
:; # Joyjoy! It installs!!!

Pass 2 of: create and install debhelper?

:; pushd debhelper-5.0.42
:; fakeroot dpkg-buildpackage
:; # Joy! It builds!!
:; popd
:; su -c 'dpkg -i debhelper_5.0.42_all.deb'
 debhelper depends on dpkg-dev (>= 1.13.13); however
  Version of dpkg-dev on system is 1.10.28.

Sigh. Is it really worth following this path any further?

Create and install dpkg-dev >= 1.13.13?

No, I really don’t want to do this. Time to stop! Additionally:

:; popd
:; apt-get source dpkg-dev
:; pushd dpkg-1.13.25
:; fakeroot dpkg-buildpackage
...
dpkg-checkbuilddeps: Unmet build dependencies: debhelper (>= 4.1.81)
  pkg-config libncurses5-dev | libncurses-dev zlib1g-dev (>= 1:1.1.3-19.1)
  libbz2-dev libselinux1-dev (>= 1.28-4)

A circular dependency on my debhelper/dpkg requirements. Version requirements for libselinux1-dev and zlib1g-dev that are too up to date for sarge and would require builds. No, it’s gone too far.

Another route would be to try modifying the package build information to get a Django package that doesn’t have any of these requirements. But that also tends to be a slippery slope.

Yeah… just don’t do this. OK?

Radio(head) Kills The Music Mafia

Note: This entry has been restored from old archives.

Check out the new Radiohead album. They’re self-distributing and seemingly not involving a big label. The deal is pretty sweet too: the “discbox” contains a nice swag of stuff (but I’m not into things, so I’m less likely to go for that) and the “download” comes with a checkout page where you choose the price.

It’ll be interesting to see how this experiment goes.

The “music industry” might have more to worry about than filesharing. In a couple of decades they might not have any new music to sell.