Irritating Mongrel install problem

Here I am again, reporting on an issue that I've spent hours and hours trying to fix. This time I was doing a round of gem update --system and gem update to get my servers and laptop up to date. I had some problems on my Ubuntu servers, but those seem to have been known/common issues, or just down to my bad temper.
But one problem persisted, this time on my laptop: a roughly one-year-old MacBook that came with Mac OS X 10.4 Tiger and that I upgraded to Mac OS X 10.5 Leopard. The problem was that the gem install command could not compile the native extensions for Mongrel. I kept getting this sort of error:
Computer:~ j$ sudo gem install mongrel
Bulk updating Gem source index for: http://gems.rubyforge.org/
Building native extensions. This could take a while...
ERROR: Error installing mongrel:
ERROR: Failed to build gem native extension.

/usr/local/bin/ruby extconf.rb install mongrel
checking for main() in -lc... no
creating Makefile

gcc -I. -I. -I/usr/local/lib/ruby/1.8/i686-darwin8.9.3 -I. -fno-common -g -O2 -pipe -fno-common -c http11.c
gcc -I. -I. -I/usr/local/lib/ruby/1.8/i686-darwin8.9.3 -I. -fno-common -g -O2 -pipe -fno-common -c http11_parser.c
cc -dynamic -bundle -undefined suppress -flat_namespace -L"/usr/local/lib" -o http11.bundle http11.o http11_parser.o -lpthread -ldl -lobjc
/usr/bin/ld: /usr/lib/gcc/i686-apple-darwin8/4.0.1/../../../libpthread.dylib unknown flags (type) of section 6 (__TEXT,__literal16) in load command 0
/usr/bin/ld: /usr/lib/gcc/i686-apple-darwin8/4.0.1/../../../libdl.dylib unknown flags (type) of section 6 (__TEXT,__literal16) in load command 0
/usr/bin/ld: /usr/lib/gcc/i686-apple-darwin8/4.0.1/../../../libobjc.dylib load command 9 unknown cmd field
/usr/bin/ld: /usr/lib/gcc/i686-apple-darwin8/4.0.1/../../../libSystem.dylib unknown flags (type) of section 6 (__TEXT,__literal16) in load command 0
collect2: ld returned 1 exit status
make: *** [http11.bundle] Error 1

Gem files will remain installed in /usr/local/lib/ruby/gems/1.8/gems/mongrel-1.1.5 for inspection.
Results logged to /usr/local/lib/ruby/gems/1.8/gems/mongrel-1.1.5/ext/http11/gem_make.out

I was totally puzzled and started googling my way through the problem. First I came across a known problem with install vs. ginstall. I thought I was onto a winner. But, a) I don't have the /opt directory that fix refers to, and b) after loads of poking around, especially in the Makefile, I found that it was indeed referencing install correctly (INSTALL = /usr/bin/install -c).
I then suspected http11 and friends; I knew they were compiling, but I thought something else about them was wrong.
By now I was hours into this. I had to figure out some other reason for the failure, and my attention swerved to /usr/bin/ld, which seemed to be the problem. Having read half of this page, I realised that I probably needed the Mac OS developer tools, called XCode 3.0 for Leopard. Remember earlier when I said the machine came with Tiger? Back then I installed XCode for Tiger, but I never got around to upgrading it (or maybe I thought it would have been upgraded autoMaccically). Said and done, I fetched the install DVD and popped it in. A good while later XCode 3.0 was installed and I tried the Mongrel install again:
Computer:~ j$ sudo gem install mongrel
Bulk updating Gem source index for: http://gems.rubyforge.org/
Building native extensions. This could take a while...
Successfully installed mongrel-1.1.5
1 gem installed
Installing ri documentation for mongrel-1.1.5...
Installing RDoc documentation for mongrel-1.1.5...
Computer:~ j$

Done! There it was! XCode was at an old version. So if you get some weirdness when installing/compiling on Mac OS X, make sure you've got the latest XCode installed.



Greetings from Cornwall. It's very lovely here; time doesn't seem to matter in the same way as it does close to London. :) We're on a little walk right now and this is the view we've got. Spectacular!


Novatech offers Ubuntu Laptops!

I have been annoyed for a long time that you can't buy a computer without an operating system (Windows tax, anyone?!), and even more annoyed that I've had to resort to being a geek to get a proper (Unix-based, most likely Linux) operating system onto my computers. This is especially hard when you've got a laptop with some strange hardware.
I was, in fact, so annoyed that on April 25th 2006 I wrote about Linux on New Computers - Why not? on my web site. I have since updated that article when there's been progress. The first major one to give you the option (albeit restricted) to buy a Linux computer "off the shelf" was Dell. Good onya, Dell!
I'm very glad to tell you that the Linux "epidemic" is spreading! This morning I had a newsletter in my inbox from Novatech offering laptops with either Ubuntu Linux or Microsoft Windows installed. The price difference is a quite staggering £50: you pay a whole 20% more for Windows (£249.99 vs. £299.99). Quite cool, for Linux.
Sadly, there's no mention of Ubuntu actually being installed on the Novatech web site. Instead the site mentions "no operating system installed" alongside several Windows versions. Let's hope they'll get that fixed soon.
On a side note, I've used Novatech's services several times in the past, and they've always been spot on. Recommended!

Hugh's Chicken Out campaign takes an interesting turn.

Having seen Hugh Fearnley-Whittingstall's TV shows about how badly chickens are reared, we've become supporters of the Chicken Out campaign. The reason for us is very simple: chickens should be able to walk and shouldn't have to get chemical burns on their feet/legs because they're sitting in their own feces. They should also get more than 30 minutes' rest from eating per 24 hours.
Anyway, I got a newsletter from Hugh Hairy this morning, and while he's happy with the progress (30-50% more "proper chicken", depending on whose figures you look at), he's also disappointed that Tesco hasn't done much about the whole thing. So Hugh has cunningly made sure he's a Tesco shareholder. This means he can put forward a resolution for the Annual General Meeting. Clever! The resolution simply asks that the animals be treated according to Tesco's animal welfare standard, which includes:
  • Freedom from Hunger and Thirst
  • Freedom from Discomfort
  • Freedom from Pain, Injury or Disease
  • Freedom to Express Normal Behaviour
  • Freedom from Fear and Distress.

You can read the full resolution on the site. Hugh is then asking, if you're a Tesco shareholder, to be allowed to vote on your behalf (proxy voting, that is). Hugh is also providing the means for people who aren't Tesco shareholders to easily become ones.
Very clever!
Go Hugh and the chickens!!


Git push deletes your files on the server?

I've been scratching my head again (feels quite good, actually). This time it was about setting up Git, the SCM. I've been lured into trying out Git hoping that it'll help me not only "branch often" but, even better, "merge often". I wanted a CVS/SVN-like set-up with a central-ish server repository, but with the freedom of Gitting around locally without worrying about breaking the server. Git seems to be very cool when it comes to doing a lot of branching and merging, and even stashing your temporary changes (very handy if you suddenly have to do some real work). Another cool thing is that a Git repository is self-contained, in the sense that it holds all the history. Yet it isn't isolated: you can simply copy the files if you want, or, even better, clone it locally or over the net (preferably via SSH).
Having experienced some major problems with merging in both CVS and Subversion (SVN) in the past I'm hoping that I can do better with Git.
Let me also explain why I wanted the central-ish server à la CVS/SVN. First, there's back-up: if I have one "official" location for the code, I can easily back it up (in my case, mirrored disks). Second, I'm using Capistrano to deploy my code onwards to the Interwebs, which means Capistrano needs one specific place from which to get the code. Third is obviously the hope that one day there will be more people than myself working on this code, and a central location for the "official product" is a Good Thing.
I've followed a few different ways of converting an SVN repository to Git, and this one describes it best: Cleanly Migrate Your Subversion Repository To a GIT Repository. I've also watched the Git Peepcode and Scott Chacon's Git With Rails Tutorial, not to mention the SVN to Git Crash Course.
There's however one thing that isn't clearly spelled out anywhere. It's mentioned here and there, but it's not really in your face. I finally found a good explanation back at the source, the PeepCode mailing list. The "problem" I'm talking about happens because of Git's incredible flexibility. Because every Git-enabled piece of code is a repository in its own right, you can use it as a source, and you can push your changes into it. In Geoffrey's Peepcode he simply clones a repository onto the server and uses that as the "remote server". This works well until you go in there with your curious nose to check for changes (as I did). The issue happens when you "git push" your local changes to the "server" (server used loosely). "git push" does not like it when you push to a branch that's checked out, because "git push" only updates the remote refs, not the working files on the file system. Therefore, if you're using the method mentioned in the Peepcode, you should consider the files on the server's file system (everything except the .git/ directory) obsolete and out of sync. The alternative is to use the --bare option when you set up the server repository. A bare repository is basically just the .git/ directory without the "files" we normally work with. This way the files can't get out of sync, and if you want to check what the files look like on the server, you can simply "git clone" it somewhere to get a "working copy".
Read up on what Geoffrey has to say about it in the One doubt about Git and pushing to the origin-thread in the PeepCode mailing list.
Summary: there are two ways to have your "server": just a normal Git repository, or one created with the --bare switch. The plain vanilla one will have a set of files lying around that'll be obsolete the moment someone does a "git push". The bare nekkid one won't have the files there, so you can't go in and sneak in a change or just look at the status.
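To make the bare set-up concrete, here's a minimal local sketch; the paths and identity are made up for illustration (on a real server you'd clone and push over SSH instead of /tmp):

```shell
# "Server" side: a bare repository - just the .git contents, no working files.
git init --bare /tmp/central.git

# Developer side: a normal clone whose origin points at the bare repo.
git clone /tmp/central.git /tmp/work
cd /tmp/work
git config user.name "Example"              # identity for the commit below
git config user.email "example@example.com"
echo "hello" > README
git add README
git commit -m "first commit"

# Safe: the bare repo has no checked-out files to go stale.
git push origin HEAD
```

If you do want to inspect the code on the server, clone the bare repository into a scratch directory rather than poking at a working copy.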
Hope this saves someone the confusion.

One sided bank-security..?

Andrew makes an interesting and valid point with regards to banks that simply ring you up and ask for your security details without providing any sort of security credentials for themselves. Go ahead and read Banks (still) don't get it....


Sky TV and Virgin Media Annoy me

I'm a geek and a gadgeteer; granted. I love all things gadget, I also love stuff like TV and movies, and I love the Internet (naturally!). But I absolutely loathe "bundles". When it comes to TV and Internet, I'm a pick-n-mix-type-a-guy. Let's start from the beginning. I've got ADSL broadband (with one fixed IP), and for that I require a BT landline (which I don't use for anything else). I've also got Freeview terrestrial digital TV, because it's "free" (in quotes because I have to pay the TV licence even if I never use the BBC's services, but I'm fairly OK with that as they provide an unbiased [yeah!] service for the nation). I also have a mobile phone on a separate pay-monthly contract (only because I like getting new handsets; otherwise I think it's a rip-off).
To the point: I constantly get these ridiculous offers from Sky TV and Virgin Media. They tell me that I have to get rid of my current ADSL and telephone to be able to view their TV programmes. Sky allows me to get TV only, but at a frankly ridiculous price. Virgin Media is the worst of them. I once went to the Virgin Media website hoping that I could simply subscribe to the "V+" box so I could get media on demand and tons of TV channels (I miss Discovery and such from "normal TV"). But I was unable to do so. To get the V+ box I have to cancel my fully working fixed-IP Internet connection and my BT landline (don't mind that), and on top of that I also have to subscribe to a Virgin telephone and Virgin "optical" broadband (sounds cool, but the speeds aren't near "optical"). All this just to get TV.
Here's some news for you, Sky and Virgin: my ADSL stays with an Internet Service Provider. That's just how it goes; I'm not going to buy telephone services from the garage that fixes my car, and I'm surely not going to ditch my Internet to get TV.
Same goes for Sky's TV packages: I want to choose the channels I want, not a "bundle". I have absolutely no interest in watching football on TV (nor in real life). I never have, and I probably never will. So don't offer me a bundle that includes stuff I don't want to pay for.
The bottom line is that Sky and Virgin are offering services that aren't to the satisfaction of the consumer (me, in this case). The moment a service provider offers the same services on the consumer's conditions, Sky and Virgin will have to change their manners or simply disappear. Or they could just do it now and offer me a TV-only deal. I'd happily pay up to £19.95 per month for that service, which is exactly what the Virgin XL package costs (after the "introductory offer") including telephone and Internet. If the Virgin (and Sky) marketing monkeys can't see that I'd pay the same for "lesser" services, they should simply quit their jobs, get an allotment, and start growing their own fruit n' veg.


Back home

We're all back home safe. No incidents, just a lot of fun. Hard-working fun, but fun nevertheless. Even the drive home was totally undramatic: the Espace didn't miss a single beat, not even grumpy starting, this after some quite dramatic incidents earlier (like stalling in the outer lane on the M40). Caz did most of the driving back home and I'm grateful for that; I felt quite shattered, so the help was fantastic.
On the bike, on the other hand, I'm not so very super-duper happy. The Evoluzione O2 sensor doesn't seem to work with my bike. It does a few things that make it worse: lower idling, stalling at idle, no improvement in driveability, and worst of all a flat spot around 6,000 rpm. The Wilbers shocks are OK, nothing dramatic, but that's probably because I simply don't have the skills to set up a pair of shocks with high- and low-speed compression damping plus rebound damping. To do the set-up by feel I'd need either about 100 laps of the Nordschleife or tons of time on a shorter piece of tarmac, preferably all datalogged. Probably not going to happen. On the totally positive side, the Earl's brake hoses were fantastic and probably increased my margins a few times when needed. The Pirelli Diablo Corsa IIIs were fantastic too, in the sense that they were round and black and kept sticking to the tarmac.
Lapping-wise I didn't do enough laps, for a few reasons: first, traffic and closures (busy weekend!); second, the driveability problems from the O2 sensor replacement; third, I was knackered (I'm soooo out of shape!).
Either way, it was a good trip, had lovely time, no damage. All good!


Aaaaahh! Pils!

It's been a long day so far. The Espace took us up here to Nurburg just perfectly. The track opened at 18:00, but when we arrived at the gates at 18:20 we were denied access. A drink later, we managed 1.5 laps before closure. Now beer!

To the Ring!

Yay! We're off to the Nurburgring again! Right now we're having a quick stop at Clacket Lane. Onwards!


What do I do when I get back home?

Well, naturally, one goes out to the garage to put a few more bling bits on the bike: the Evo O2 sensor box and Pazzo levers (thanks, Caz darling!!). Then out for a proper bike ride with my honey! Oh, how I have missed all this!