Something in your application worked a few days ago, and now it doesn’t. Aarrgh! We’ve all been there. Nobody has any idea which change broke it. You use git log -p to trawl through the recent changes, but nothing stands out. What to do, what to do? Use git bisect to minimize, or completely automate, finding the commit that broke it!
Run git bisect start to begin. Since the current version is bad, you run git bisect bad to let git know.
Now you need to tell it which commit/version is the latest “good” one - often that’d be the last released tag, so for example git bisect good v25.0.3.
git will now pick a commit you should test, check it out and tell you roughly how many steps are left.
If you are testing manually, you’ll now run your tests and tell git whether the revision is good or bad. In our web environment that’d often involve restarting the application (just to make sure) and then running the tests.
When you know whether this revision has the bug, you run git bisect bad if it’s bad and git bisect good if things are good. After a little while of that, git will tell you which commit introduced the bug!
If you can do the ‘is it good or bad’ test automatically, you can run ‘git bisect run command’ (where command is make, prove t/sometest.t or a custom shell script) and git will figure it out automatically while you get coffee or go to lunch!
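If you want to try the fully automated flow on something disposable first, here’s a sketch (the repo, app.txt and check.sh are all invented for the demo): it builds ten commits, “breaks” the app at commit 7, and lets git bisect run track the culprit down.

```shell
#!/bin/sh
# Throwaway demo of "git bisect run": build a history with a known bad
# commit, then let git find it automatically.
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com
git config user.name "Demo"

# Ten commits; from commit 7 on, app.txt contains the word "broken".
for i in 1 2 3 4 5 6 7 8 9 10; do
  printf 'version %s\n' "$i" > app.txt
  if [ "$i" -ge 7 ]; then echo broken >> app.txt; fi
  git add app.txt
  git commit -q -m "change $i"
done

# The stand-in test suite: exit 0 when things work, non-zero when not.
cat > check.sh <<'EOF'
#!/bin/sh
! grep -q broken app.txt
EOF
chmod +x check.sh

git bisect start
git bisect bad              # the current version is broken
git bisect good HEAD~9      # the oldest commit was fine
git bisect run ./check.sh   # go get coffee; git does the rest

first_bad=$(git log -1 --format=%s refs/bisect/bad)
echo "first bad commit: $first_bad"
git bisect reset
```

With ten commits git needs only three or four test runs to corner the bad one; on a real history that log2 scaling is what makes bisecting so much faster than reading diffs.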
In 2007 I used my old Treo 650 a few times while traveling outside the country (unlocked and better control of data charges), but by now I don’t think I will be able to figure out how to use it anymore, so I’d rather just use any old goofy “standard phone”.
Do you have a phone you use for traveling?
The other side of it is writing the instructions so future maintainers can work with the code without making too many mistakes. That’s much harder.
The future maintainer (often a slightly older version of yourself) needs to:
Mistakes (or bad planning or bad luck) that lead to problems changing the application often take a long time to really show up. You can always squeeze something in to change “just one more thing”.
Problems that make it hard to expand the code can be hidden the same way; most expansions are really just a bunch of small changes, and until some fundamental change (“now also do it in Chinese”, “do 10 times more requests”, “support multiple currencies”, “now with 2 terabytes of data, please”) the technical debt won’t completely stop you.
Mistakes that make the code hard to understand, on the other hand, can show up very quickly. All it takes is a break from the code, then coming back to it a few weeks or months later (or days, if your memory is more like mine).
One of the easiest mistakes to make is poor naming of a variable. The obvious mistake is to call some important variable x, obj or data. Which data is that now?
In the YellowBot code most of our data revolves around “locations”, so we have a lot of location this and location that. A common mistake then is to do something like
for my $loc ($foo->location_widgets) {
...
}
The mistake here is that $loc isn’t a location, but a location_widget. If it’s 3 lines it’s not so bad, but soon enough the loop is 10, 15 or 30 lines long. When you go to make a change it’s easy to think $loc is a location object and not a widget. Boom; game over!
Another similar mistake is to have multiple data structures for (too) similar kinds of data.
A month ago I redid our “process videos” infrastructure (mostly used by Weblocal, where they have a lot of videos). Today I noticed that we don’t properly process videos with more than 2 audio tracks to all the different output formats. Simple change! The “what codecs are in the file” code already extracts the number of audio tracks, so I can just add a few lines to force the number of output channels to 2 when needed.
Around where I need to add this code there are some lines with $meta->{bitrate}. Since I could see from our debug output that the key for the audio channel data is called audio_channels, I can just add $meta->{audio_channels}, right?
Wrong, of course.
In my infinite wisdom I had separated out the data from the “detect video meta data” code and the “other data we’re calculating in this code”, so the former is stored in $video_info and the latter in $meta. At the time I’m sure it seemed cleaner to keep the two data structures separate; but now, a month later, it makes no sense - how can I easily know whether the data I need is $video_info data or $meta data?
Take a month away from a bit of code and then come back and see if it still makes sense. It probably won’t.
Fortunately the wizards making hard drives have come up with magical 2TB disks recently. That’s about 2,000,000 megabytes.
(Update summer 2010: Last summer a 2TB disk cost more than $200; now it’s getting really close to $100!) I like Western Digital disks because their warranty process is really easy, at least in the United States. I don’t know if they break more or less than other disks; but all disks eventually break.
To replace the disk I followed the hardmac.com instructions with a few modifications.
Because our Time Capsule has been running for a year and a half, the glue on the “rubber protection” had mostly hardened, and it was hard getting the rubber off without tearing it completely to pieces. Keep a knife handy.
I didn’t want to start over on our backups, so I put the new disk into the TC, started it up and formatted the disk with the Airport Utility. I didn’t bother reassembling it for this stage, because after formatting I removed the disk again and put it (and the old disk) into external enclosures to copy the old disk to the new. You should use rsync for the copy (/usr/bin/rsync -avPE "/Volumes/Old TC" "/Volumes/New TC"), because Super Duper will miss some “hidden” files that Time Machine needs to operate.
It’s still copying (estimated time to completion: 16 hours, yikes!), so I haven’t put it back together yet; I’ll update the post when I have, and have checked that the backups resumed as expected. (Update: Yup, the rsync worked fine - after putting the drive back, everything was in place the same as on the old drive, except with twice the space.)
Two “AP*” partitions will show up as well for each disk when you plug them in; just ignore or unmount those.
Footnote
We use Time Machine Editor to configure Time Machine to only run every N hours instead of every hour which is a bit much.
Traveling, needing to get on our network, being on various goofy corporate and hotel networks (and wanting to watch Netflix outside the US) reminded me how excellent it is.
We’ve used Viscosity VPN since it was in beta and I can’t recommend it enough. It’s as reliable as can be, the support is excellent, it looks good and while it’s not free it’s super cheap; licenses start at $9 each. We bought more than twice as many licenses as we needed just because I felt bad not paying more!
Now if only the iPhone supported OpenVPN too, so we could put more services on the private network without having to set up another VPN service…
You get better commit messages.
There’s no difference in the interface to writing the commit messages, but they just tend to be more useful when using git (rather than subversion).
A subtle difference is that many of the git tools only show the first line of the commit message. You’d think that’d be worse, but it actually works out better because it encourages you to truly summarize the change in a way that’s useful for quick browsing. “Alright, you only have ~60 characters — what really changed?”
Git also encourages you to review and allows you to edit the commit message before you share it with others.
With subversion, browsing the history is at best clunky and a bit slow. With git it is super fast, and the tools are awesome (on OS X I recommend GitX). Other than being super fast, a big difference is that the client side tools let you see the actual diff instantaneously, so browsing a long list of patches is easy.
All these things add up to the commit history being genuinely useful in your day-to-day development work. With subversion I’d browse the log to make a change history and occasionally to assist tracking down a bug. With git I use the commit log ALL DAY. In turn this encourages you to write better commit messages, because more likely than not you are going to read them again.
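The first-line convention is easy to see for yourself; a throwaway sketch (the repo and messages are invented for the demo):

```shell
#!/bin/sh
# Demo: list views like "git log --oneline" show only the subject line,
# which is what rewards a good ~60-character summary.
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com
git config user.name "Demo"

echo hello > README
git add README
# The first -m is the subject; the second -m becomes the body.
git commit -q -m 'Add a README' \
  -m 'The longer explanation lives in the body, which one-line list
views never show.'

git log --oneline            # abbreviated sha plus subject only
subject=$(git log -1 --format=%s)
echo "subject: $subject"
```

The same %s/%b split is what GitX and friends use, which is why a sharp subject line pays off every time you scroll the history.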
Firstly, I would like to inform you that the domain name perl.org has been unsuspended.
We had received phishing complaints on several hundred domain names belonging to a particular network. Since we need to act on these complaints immediately, the domain name perl.org was accidentally suspended as well. We have now verified that this site does not contain any phishing material and have thus unsuspended the domain name.
I understand the consequences faced by you and you clients/people using the site due to this suspension. I sincerely apologize for this on behalf of ResellerClub. Be assured that this was a one-off case and we have made sure such a thing is not repeated.
Apologies once again.
This is even worse than if they had been overzealous with an abuse complaint actually on perl.org. Excuse me while I go look for my jaw on the floor.
This is the mail I got a few hours ago:
We received a complaint about your domain name perl.org being involved in phishing activities. Using domain names for any such activity, is strictly against Registrar PublicDomainRegistry.com’s AUP.
On account of the breach of the PublicDomainRegistry.com DomainRegistrant Agreement (available within your Control Panel at Help -> Legal Agreements) we have Suspended this domain name.
Very clever. In particular doing it on a Saturday morning! Also note the exquisite detail that allows us to respond (that’s sarcasm; there obviously was no detail). It’s particularly insane because if it actually was happening, we’d want to stop it, but they give us no help for that. Lots of DNS resolvers will have the domain cached for a day or two, so just turning off the domain wouldn’t protect people. Did I mention incompetence?
They actually did this stunt with the xrl.us domain (the short domain for the metamark service) some time ago. That time they also didn’t communicate anything or seem to care much about the disruption they caused. Foolishly, I thought they’d be able to manage the other domains.
I’ve recommended DirectI in the past, but obviously no more. They have very good pricing and a decent web interface, but clearly they are useless for anything more important than parked “to be used later” domains. If you want to turn off one of their customers, just open a free email account and send some abuse complaints. It sounds like you just need to include your target’s email address in the abuse complaint. Works out well if your competitors are using them for their business!
I’ve opened a ticket with them, which is the only sort of contact they allow. Since it’s the weekend now, I don’t know when they’ll respond, much less fix it. Anyone have a contact at DirectI / ResellerClub?
Also - anyone have tips for how to as automatically as possible transfer a bunch of domains from them to OpenSRS?
Update - it’s back now - they say they suspended it by mistake while suspending other domains. Unbelievable.
On recommendation from Duncan I tried to print some photos from Mpix.
The technical quality is great. I printed about a hundred 4x6es and some bigger ones and they all look really good (well; as good as the photos allow them to be).
By default they print matte, but you can add a "lustre" coating. It's not quite glossy, but it gives colorful photos a little extra boost. On a black and white photo where I tried the coating, I think it was a mistake. On most of the color photos where I tried it, the lustre version is better.
On a 16x20" print I tried their mounting (double weight matboard) and it's much more impressive than I expected.
Likewise I had some 8x12 photos framed, and the frames are beautiful and well done - much, much better than their website makes it look. The framing (with non-glare glass) was about $30, which seemed like a lot for a small frame, but it's high quality so the price is about right.
However, the neatest thing is that they package my $3 prints like they're expensive pieces of art. And don't even get me started on how well they packaged the framed photos. You know how it's neat to open an Apple product? Yeah - between the excitement of seeing the photos and how nicely they packaged them, that's about how it's been to open the boxes they sent. Shipping is $7 per order or $11 for overnight shipping. That's somewhere between very reasonable and insanely cheap all things considered.
So, Mpix: Highly recommended so far. I'm looking forward to trying them again.
“It would be hard to overstate how fervently vast stretches of the globe wanted the election to turn out as it did”
(New York Times on The Promise - For Many Abroad, an Ideal Renewed)
Yup, here too! TPM have a full transcript of Obama’s speech.
The new American president sure is setting himself up with some high expectations. Awesome! Enough with pandering to the regular guy. Let’s have smart people govern instead. What a concept!
(Update - If you have a BitTorrent client then I have a .torrent for a high resolution (1280x720) version of the victory speech. Right now there are about 60 seeders, so it should be pretty fast).
Immensely disappointing (and frustrating to many) on the other hand is that the voters seem to be passing prop 8. Yikes.
On one of the news programs earlier they had interviews with some supporters. The best argument they had was some hand-waving about their children. That’s even more offensive than their voting yes on the proposition! Don’t attribute political opinions or labels to your children - and much less sexuality. Look, lady, your kid is 5 years old. I promise that other than your indoctrination, he has absolutely no opinion or judgement on the matter.
In the mall the other day I walked by a stand with baby clothes. One had a text on it saying “Lifelong Democrat”. It’s the same thing: Not cool. While our children are overwhelmingly likely to grow up with the political leanings of their parents, it’s completely unfair to label them as such until they at least have had a chance of forming their own opinions.
Fivethirtyeight.com says there's a 98.1% chance Obama will win tonight. But at least in this household, we're nervously biting our nails all the same.
Here in California there's of course not any suspense over where the electoral votes will go, but there are a couple of loony, awful propositions on the ballot. In particular prop 4 and prop 8. Both are basically too close to call in the latest polls. Unbelievable, but true. Please vote no.
(Of course the better solution is to not have the government marry people, but rather just recognize unions and leave the marrying stuff to people's personal lives - but in the US we're pretty far from there).
Anyway - hopefully early tonight it'll be clear that the next president won't be the one whose energy/security policy includes the illusion that there are no global markets and oil isn't a fungible commodity. Hopefully early tonight it'll be clear that the next president won't be the one who spent the last few weeks of his campaign talking about taxes with a guy who doesn't understand the tax system (and didn't get it explained). Someone please bring Joe the "plumber" to an accountant who can explain to him how you don't pay income taxes on money that your business spends on salaries for employees. The number of weird and stupid things McCain and his campaign says is just amazing.
Or leave that alone and say McCain would make a decent president. I don't agree, but sure - whatever. Hello bad vice presidential choice. It's a joke! It's offensive. As Josh Marshall wrote today:
The woman is an ignoramus of almost unprecedented magnitude in the annals of national politics. It's not just that virtually every non-Republican has a negative view of her. I just don't see a national party getting behind someone like that. And before you snark, "What about George Bush?" Sorry but there's no comparison. Whatever else I think of him, he's not a moron. And while he appears to be astoundingly incurious, there's simply no comparison to Palin.
The number of sane conservative thinkers who've endorsed Obama in the last week is incredible. How come there's even a contest anymore?
I can't imagine I'll ever be a Republican, but I sure hope that after this the GOP will get it together and 1) kick the loony evangelical Christians out of "people we pay attention to", 2) quit with the Karl Rove-inspired hate and fear mongering and 3) pick the smartest and most mentally alert guy in the room as their candidate for once. What's with electing the folksy "regular" folks? It's not a regular job!
In late June, three weeks after my birthday, I got the most amazing birthday present you can imagine. My wonderful wife gave birth to our amazing now two month old daughter.
She sure keeps us busy. There are many awesome things to tell - about how cute, clever and strong she is already - but most of all I am so grateful to Vani, who by far is going through the most and doing the most. Whenever I play with Saffron, give her a bath or a bottle of breastmilk, it is just such a joy.
Right now she is sweetly asleep upstairs and I have to fight the urge to go check on her or just see and listen every five minutes.
If you have a flickr account (and are listed as friend & family) then we have a few photos from July there.
First of all - at only 170 pages, it is short. Even though some of the key points are repeated through the book, it's dense with information. You don't need any JavaScript experience, but it's not a "beginning programming" book, so if you haven't been programming before this is not the right book for you.
Reading this book a couple of times will give you an appreciation for the JavaScript language that you almost certainly didn't have before. It'll give you tools to write better programs that you and others will actually be able to maintain over time.
I've learned lots of little things that I maybe knew from experience, but now I know and I know why.
This book will help you battle with JavaScript rather than against it.
(this review was also posted on amazon.com)
After reading half the book I went and bought a bunch of extra copies and had them sent to people I work with who are working with JavaScript.
I’m considering putting up a version of the slides with sound. Would anyone want a 3 hour quicktime movie of that?
Well, maybe split up into smaller bits, but you get the idea. You wouldn’t get the hand waving, but you would get a bunch more detail, obviously.
I actually had audio recorded, but I haven’t checked how it came out, yet, and I’d have to make a synchronized movie version of the slides (and likely be tempted to just redo the audio anyway). Thoughts? Would it be a worthwhile effort?
A couple weeks ago I ordered a Dash Express. Let me tell you: that is the future. The essence of the Dash is that it tracks your speed as you drive and uploads it to the Dash servers (anonymously, supposedly - they don't explain what they do to keep it that way). Of course it also downloads traffic data from other drivers, and historical data when no Dash owner has driven on a particular street recently.
It is so cool to watch. Entirely accurate? Not yet. Pretty darn good already? Check. A glimmer of the future? Definitely.
The Dash has wifi, but most of the time it uses GPRS for communication via Jasper Wireless. As a user you don't know that or even care, but as a geek it was fun to find out that there's a wireless carrier that doesn't do anything other than GPRS for mobile gadgets.
Since it's always connected they have built in local search via Yahoo local (not as good as YellowBot, of course, but pretty neat all the same ;-) ). They're working on giving you more "online data". For example there are gas prices and movie showtimes available, but the UI for that isn't very useful.
Dash are suggesting that the Dash Express will be the first GPS for "daily use" rather than just to be used when you are going somewhere new. It suggests multiple routes with distance and time based on current traffic. Pretty darn neat; but as I mentioned earlier - at least for my short-ish, city-only stretches - it's still not entirely accurate. I'm sure they'll improve on it though. The wonder of automatic software updates and constantly improved traffic data. I've noticed a big improvement in available traffic information just over the last two weeks, since they started selling units to non-beta testers.
So to get better data it is of course in my interest to tell you to go buy one. In particular if you live in Los Angeles. :-)
If you are driving a lot and a lot to new places where you don't know the regular flow of traffic then I'm sure it's worth getting the first generation unit.
However, there's a lot to improve before I'm going to recommend it to "regular drivers" and non-early-adopter types. The physical form factor is, well, not really sleek. It's GIGANTIC compared to a modern Garmin Nuvi. They say it's to have room for the wifi and GPRS antennas (huh? The iPhone fits both in a fraction of the space) and for the battery (which still only lasts a couple of hours; keep that 12V outlet available). The screen is a well-working touch-screen, but the "hardware buttons" on top are crazy annoying when it's not mounted in the window or on the dashboard.
Speaking of the screen - My first unit had a slightly defective screen, but Dash were great about getting it fixed and I could see that they had already improved their process on the replacement unit so it won't be an issue again. My second unit has been flawless so far.
Other than the issues mentioned above, it's mostly software improvements that are missing, and those will hopefully be rolled out over time. It's done a bad job telling us a quick sequence of directions on freeways. It doesn't zoom in to show clearly how to navigate intersections or freeway interchanges. The UI is somewhat inconsistent (for example some buttons disappear when not usable, others stay on the screen and just "don't work"). It seems very slow at updating the screen at times. As I mentioned above, the UI needs some work for some of the features to work better. Etc etc.
All that being said though - if you are in the market for a new GPS, give it a serious look. It's lots of fun, and I trust the software will get better month by month. It will be interesting to see if or when Garmin catches up with the traffic data system, or if one of the big players just goes and buys Dash. A device built with the Dash traffic data and with Garmin's long UI, navigation and hardware expertise: drool. I guess if the Dash is the future, then that device is the future of the future.
Dash has a great help section that answers lots of questions in addition to their general feature overview.