Scotch

June 21st, 2014 1 comment

I was one of the rare holdouts in college who didn’t drink at all until he was 21.  When I started to drink, I decided to go top-shelf right from the start. The way I saw it, why should I slog through swill? I didn’t want to drink to get drunk; I wanted to drink to enjoy the taste of the drink.

I ran the numbers and found that if I drank only rarely, I could afford really good spirits. Thus, I had one of the fancier liquor collections in the fraternity house.

The liquor shelf in 2003 (partially Tyler’s). The three bottles in the center were about $100 each. Fancy for college — or now.

The problem with that approach wasn’t so much that I didn’t learn to appreciate the good stuff — I did — but rather that it was easy to skip over cheaper options that I might have liked even better.

Jump ahead 11 years to last night. My friend Andrew decided to hold a Scotch tasting. About ten of us showed up, each with a bottle or two — or three — in tow. We made an effort to obscure the labels of the bottles using bags and masking tape, numbered each bottle, and proceeded to try very (very!) wee drams of all 23 options.

First, the good news: nobody had snuck* in anything particularly foul. Yes, there was an Irish whiskey and a bourbon, but they were both of decent quality. It was really a matter of distinguishing between the inspired drams and the ones that could be dismissed with a shrug.

Here’s the complete list of what we tasted, in no particular order, but probably with a few spelling errors:

  1. Macallan 15
  2. Glenlivet 12
  3. Laphroaig 18
  4. Macallan 12
  5. Macallan Fine Oak 10
  6. Laphroaig 10
  7. Cutty Sark Blended
  8. Glenlivet 21
  9. Speyburn 10
  10. Auchentoshan Classic
  11. Glenfiddich Reserve Cask
  12. Balvenie 14
  13. Dalmore 15
  14. Breckenridge Bourbon
  15. Glenlivet 12 (again)
  16. Glenlivet 18
  17. Speyburn 10 (again)
  18. Silver Seal 16
  19. Cragganmore 12
  20. Dalwhinnie 15
  21. Finlaggan Old Reserve
  22. Lagavulin 16
  23. Tullamore Dew

There was lots of swirling and sniffing and sipping during the blind tasting. Our palates became increasingly fatigued. Nobody got drunk — the tasting pours were just a few milliliters each — but whether from the alcohol or the shared challenge, discussion flowed, punctuated by brief fits of laughter. The temperature rose around the dining room table thanks to the bodies of the tasters and the occasional curious wine drinkers. The room smelled vaguely of vanilla.

Finally, everybody had tried and rated every option. We unblinded the bottles one by one. We each had our favorite horses.

The Glenlivet Scotches on hand

As we reported our scores for each bottle, it quickly became clear that we had different tastes. One man’s 5 (out of 10) was sometimes another man’s 9. Most polarizing were the Islay (pronounced eye-luh) Scotches, of which there were four. Some loved the peatiness. Others hated it and spent the next three drams complaining about how they still could taste nothing but smoke.

There were accidental duplicates in the running, serving as unintentional consistency checks: both the Glenlivet 12 and the Speyburn 10 appeared twice. I scored the Speyburn 6 the first time and 7 on the second pass; the Glenlivet 12 got a 7 and then a 9. And that was the really big surprise: that second pass of the Glenlivet 12 was my highest-rated dram — but the bottle was one of the cheapest present. The 23 options ranged in price from $15 to $150, with a mean of $60, so I had expected the higher-priced bottles to do better, but that was not the case.

For me, the more expensive bottles often tasted unduly flat, almost too smooth. I seemed to prefer the options with a bit more fire and spice to them.

It’s not clear why that was the case. I suspect that I burned up my palate early on while trying to suss out how much water to add to each pour. That, in turn, might have kept me from tasting much at all in the “better” Scotches while muting the harshness in those of not quite so high stature. Or it could have been fatigue, or the temperature of the bottles, or maybe the types of glasses that we used. Hard to say.

Or maybe, just maybe, I actually prefer the cheaper stuff. My college self would have liked to know that.

* As an aside, “snuck” is an example of a recent evolution of the English language. The generally accepted past tense of “sneak” is “sneaked”, but that appears to be changing. Also, I’ve decided that I’m no longer a fan of the American convention of putting commas and periods inside of trailing quotation marks unless actually quoting somebody; doing so causes problems when using quotation marks to call out literals. Thus, I’m going to start following the British method there.

By the trail

June 1st, 2014 Comments off

I took my coffee and sat down at the picnic table a couple dozen steps from the trail. A 20-year-old birch tree shielded me from the morning sun. The pavement of the trail looked hot even at that early hour.

I appreciated what passes for silence in an urban park: only the distant din of traffic and the wind rustling through the leaves. A robin flew into my tree, sang a few times, and flew off. An old man on a rusty department-store cruiser rode by, squeaking a response to the bird’s song.

Most of the cyclists on the trail were alone. There were exceptions. Couples on a slow helmet-less roll. Clubs of half a dozen whizzing past. A mother with her two young children, not far removed from trailing wheels. A father with his teenage son in matching Lycra kit, both on slick road bikes.

Some of the cyclists rode new carbon steeds. Others tooled around on whatever had been in their garages for the past couple of decades. Skill did not always match tools. A guy on a beat-up mountain bike flew by with a high cadence and a steady hand. A white-haired man with a paunch on a $10,000 Orbea tri bike strained along in the big chainring and little cassette cog.

Most riders gave a respectful “On your left!” to the pedestrians. Some were more inclined to treat their unmounted brethren as inconvenient pylons meant for passing as close as possible. Few dogs were on the trail; they played in the large green field behind me.

My coffee gone, I too set out on foot again.

How I introduced a 27-year-old computer to the web

December 12th, 2013 165 comments

Reviving an old computer is like restoring a classic car: there’s a thrill from bringing the ancient into the modern world. So it was with my first “real” computer, my Mac Plus, when I decided to bring it forward three decades and introduce it to the modern web.

My Macintosh Plus. Spoiler alert: here it is surfing Wikipedia.

It’s a lowly machine, my Mac. The specs pale in comparison to even my Kindle: 8 MHz CPU, 4 MB RAM, 50 MB hard drive, and a 512 x 342 pixel black-and-white screen. My current desktop PC is on the order of 200,000 times faster – not even including the GPU. Still, that Mac Plus was where I cut my computing teeth as a child. It introduced me to C, hard drives, modems, and the internet.

Yes, in a certain sense, my Mac has already been on the internet, first via BBSes and later via Lynx through dial-up shell sessions. (There’s nothing quite like erotic literature at 2400 bps when you’re 13 years old.) What it never did was run a TCP/IP stack of its own. It was always just a dumb terminal on the ’net, never a full-fledged member.

How hard could it be to right that wrong?

Everything went smoothly at first. I had my mom ship the computer to me. It arrived in good condition, having been stored undisturbed in her basement since the mid-1990s. I plugged it and its external hard drive in, flipped the power switches, and watched the happy Mac glow to life on the tiny CRT. Sure, the hard drive gave a groan of protest when it first spun up, but that quieted down, and everything seemed stable with the data intact. At least for the first few minutes.

I was far down nostalgia lane playing a game of Glider when all of a sudden there was a loud *POP* and the smell of smoke. Panicked, I slammed the power switches off and pulled the plugs. It didn’t take much sniffing to find the source of the acrid odor: the external hard drive.  The stress of current after years of disuse had proved too much for one of the filter caps in the external drive’s power supply.

A cracked XY cap from the external hard drive’s power supply.

Fortunately, Digikey still sold those exact caps(!), and I’m handy with a soldering iron, so a few days later I was back in business. On to the networking!

To accomplish my goal, I needed a web browser, a TCP/IP stack, and some way to connect the Mac to my home network.

The web browser was relatively easy to find thanks to guys running long-forgotten FTP sites in the dusty corners of the internet. MacWeb 2.0 was both old enough to run on my Plus and new enough to render HTML and speak HTTP. Sort of. But we’ll get to that in a minute.

A whole 4 MB of RAM to play with! Good thing, because MacWeb required 2 MB.

Likewise, MacTCP existed in a version just barely able to run on System 7.0.  It didn’t support niceties like DHCP, but MacWeb was happy to use it, and it installed without problems, so there was the TCP/IP stack.

Getting the Mac physically hooked to the network was a bigger challenge. The Mac Plus didn’t have an Ethernet port, and things like WiFi were years from being invented when it was manufactured. A couple of companies made SCSI-to-Ethernet adapters about 15 years ago, but those were rare and expensive. I thought about the problem for a while, and it occurred to me that I could channel the early days again: I could use the serial port and PPP or SLIP to bridge to the outside world. Like dialup without the modem.

I set up my Raspberry Pi and ran some Cat-5 to it from the router.  Using a level shifter and a variety of old adapters, I managed to get a serial cable working between the Pi and the Mac. That took care of the hardware.

On the software side, I scrounged around and after several failed attempts found a PPP client that would run on the Plus and a super-simple PPP server called SLiRP for the Pi. Documentation for the combination of MacTCP, MacPPP, and SLiRP was, surprisingly, still available.  After a bit of tinkering with the configuration, I was able to get MacTCP to talk to MacPPP, MacPPP to talk to SLiRP, SLiRP to use the Ethernet connection, and so on through my router and out to the internet.  Since serial I/O on the Mac Plus was processor-intensive, throughput was limited to about 19 kbits/s, but 19 was a lot higher than 0.

A Raspberry Pi doing the heavy lifting for the computer that’s tens of thousands of times slower. The mess in the upper right is a level shifter, a null modem, a DB-9 to DB-25 adapter, and a serial cable.

Now, you might be wondering, “Wait, how did you get all of that abandonware on there in the first place?”  Good question! The Mac’s floppy drive was old enough to be fundamentally incompatible with PC drives, and I didn’t have any floppy drives in my modern computers anyway.

I tried going down the avenue of 100 MB ZIP disks, since ZIP drives were made in both USB and SCSI-1 versions. While I did manage to get the Mac to use the ZIP disks (and in fact switched to one for the primary boot drive), and even got my Windows PC reading the HFS-formatted disks using some nifty tools, every attempt to move data from the PC to the Plus resulted in nothing but corrupted files on the ZIP disk.

That left the serial port. I happened to have an old terminal emulator called Microphone already installed on the Mac. Microphone supported ZMODEM for file transfers, which you’re probably nodding your head about if you remember BBSes. Thus, to transfer files to the Mac, I SFTPed the questionably legal files I needed from my PC onto the Raspberry Pi, plugged the Pi into the serial port, fired up Microphone on the Mac as a terminal, and launched Minicom on the Pi from the Mac. I nervously struck the keys to initiate a ZMODEM transfer from Minicom, selected the files, and hit enter. Minicom obliged; there was a BEEP!, and a “Save incoming file?” dialog popped up on the Mac. Some un-binhexing later, I found myself running new software on my old Plus. Huzzah!

So, with the Raspberry Pi, MacTCP, and MacWeb all in place, it was time to surf the web! Right? Right?!

No. No surfing yet.

The MacWeb developers apparently took a look at the HTTP 1.0 spec, decided, “Who would ever need name-based virtual hosting?” and left out the feature that 99% of the sites on the modern web rely on. No support for virtual hostnames meant you got whatever the server returned for its bare IP address, and for most sites, that was jack squat. Oh, and HTTPS, cookies, and CSS hadn’t been invented yet when MacWeb was written.

AAARGH!!!

I vented about the problems to Tyler, mentioned that I was in for a long stretch of coding to solve it, and was surprised when he whipped up a filtering proxy solution in about 20 minutes using Python, Requests, Flask, and Beautiful Soup.  (Update 12/16: here’s the code from Tyler.) The key to it all was that MacWeb would include the full URL, with the hostname, when making a proxy request.  Requests fetched the URL, stripped SSL and managed any cookies. BeautifulSoup stripped out things that MacWeb couldn’t understand, like CSS, Javascript, images, and DIVs. Flask pulled proxy duties, reading the request and sending the filtered result back to the Mac.
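Tyler’s actual code is linked above; what follows is my own rough reconstruction of the idea, with all names and details guessed. The interesting part is the BeautifulSoup pass that strips markup MacWeb can’t digest:

```python
import requests
from bs4 import BeautifulSoup
from flask import Flask

app = Flask(__name__)

def simplify_html(html):
    """Reduce a modern page to something MacWeb-era HTML can handle."""
    soup = BeautifulSoup(html, "html.parser")
    # Remove tags MacWeb has no use for, contents and all.
    for tag in soup(["script", "style", "link", "meta", "img", "iframe"]):
        tag.decompose()
    # DIVs and SPANs postdate MacWeb; keep their children, drop the wrappers.
    for tag in soup.find_all(["div", "span"]):
        tag.unwrap()
    return str(soup)

@app.route("/<path:url>")
def proxy(url):
    # When MacWeb is pointed at a proxy, the absolute URL (host included)
    # arrives in the request line; exactly how it lands in `url` depends on
    # the WSGI server, so treat this routing as illustrative only.
    target = "http://" + url.split("://", 1)[-1]  # downgrade any https
    upstream = requests.get(target, timeout=30)
    return simplify_html(upstream.text)
```

Query strings and form POSTs are ignored in this sketch; the real code also managed cookies.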

And that, friends, was sufficient to surf the web. It even looked surprisingly decent, almost like a mobile browser:

The Mac Plus Wikipedia page, as viewed on my Mac Plus

 

Hacker News as viewed by the Mac. Surprisingly readable given that MacWeb doesn’t support CSS

Sure, it was slow as hell, but it worked! Data loaded, pages rendered, and links were clickable. Even forms sort of worked.

Did I mention it was slow? It was slow. Soooo sloooow. Slow slow slow.  Like, minutes to read and render a page slow. Here’s a video showing how slow:

Whatever. The goal was simply to introduce the Mac to the web.

The meet-and-greet was successful.

Epic Race

November 12th, 2013 1 comment

A few weeks ago, I was at Luke’s house when his roommate mentioned a contest: be one of the first to visit all of the ski resorts covered by the Epic Pass this season, and win an Epic Pass for life. I was intrigued.

I mentioned the idea to Tyler to see if he’d be interested in making it happen. We both are fortunate to have flexible schedules, so it seemed at least plausible that we could make a run for the prize. We’d already embarked on a race to be first in line to opening day 2012-13 (and lost), so how much harder could the Epic Race be?

Epic Race: Ski 26 resorts first, get a ski pass for life

We fired up some web browsers to work out the logistics.

The full Epic Pass covered 26 ski resorts spread among 4 countries, including 5 states in the US. We would need to visit each resort, but no earlier than November 22, with no more than one resort per day in the US and no more than two per day abroad. Since 12 of the resorts were in the US, with the remaining 14 in Europe, that meant an absolute minimum of 19 days of skiing.
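The minimum-days arithmetic works out like this:

```python
import math

us_resorts = 12      # one resort per day allowed in the US
europe_resorts = 14  # up to two resorts per day allowed abroad

min_days = us_resorts + math.ceil(europe_resorts / 2)
print(min_days)  # 19 days of skiing, before any travel days
```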

Skiing with friends at A-Basin, one of the Epic Race locations, in May 2013

Further complicating the matter was that not all of the resorts open at the same time. Some had been open since October, others wouldn’t open until mid-December. Not all of the resorts had published opening dates, but history, recent weather patterns, and the resorts’ web sites suggested that openings would be, and I quote, “SOON!”

Here’s how it broke down:

Resort                      Opening date   Location
Vail                        11/22/2013     Colorado
Beaver Creek                11/27/2013     Colorado
Breckenridge                Open           Colorado
Keystone                    Open           Colorado
A-Basin                     Open           Colorado
Eldora                      11/22/2013     Colorado
Canyons                     11/29/2013     Utah
Heavenly                    11/22/2013     California
Northstar                   11/22/2013     California
Kirkwood                    11/22/2013     California
Afton Alps                  Soon!          Minnesota
Mt. Brighton                Soon!          Michigan
Verbier                     11/30/2013     Switzerland
St Anton                    12/6/2013      Austria
Lech                        12/6/2013      Austria
Zurs                        12/6/2013      Austria
St. Christoph               12/6/2013      Austria
Stuben                      12/6/2013      Austria
Courchevel                  11/23/2013     France
La Tania                    11/23/2013     France
Meribel                     11/23/2013     France
Brides-les-Bains            11/23/2013     France
Les Menuires                11/23/2013     France
Saint Martin de Belleville  11/23/2013     France
Val Thorens                 11/23/2013     France
Orelle                      11/23/2013     France

Add in a few days for travel, and the minimum time quickly swelled to about three and a half weeks. In order to maximize the odds of winning, the final resort visits needed to be on the first day those resorts were open, which implied that the trek really needed to start as soon as possible. That meant starting on November 22 and doing nothing except skiing and traveling for the better part of a month.

That’s just the cost in time. The cost in terms of money was even more onerous. Since we lived in Denver and knew people around Minneapolis, Detroit, and Salt Lake City, we figured we could get away with paying for lodging and transportation only in the Tahoe region and Europe. Still, the cost for flights, hotels/hostels, and cars looked significant. And that didn’t even cover the opportunity cost.

We had jobs that could be done remotely from anywhere in the world. We were fortunate in that regard. Unfortunately, the burden of travel meant that only a portion of our usual work output would be likely to get done for the month on the road. Thus, the need to factor in some lost wages.

Here’s a quick accounting of the cost per person, assuming two people went and split relevant expenses:

Item                                                              Cost
Flight to/from Europe                                           $1,200
Flight to/from Reno (for Tahoe)                                   $200
Flight to/from Minneapolis                                        $200
Flight to/from Detroit                                            $200
Flight to/from Salt Lake City                                     $150
Hostel for 9 nights in Europe                                     $450
Hotel for 3 nights in Tahoe                                       $225
Car for Tahoe, inc. gas                                            $80
Car for Europe, inc. gas                                          $500
Ski rentals everywhere (cheaper than paying to bag-check skis)    $400
Lost income                                                    $10,000
Total                                                          $13,605

…or whatever my actual income was :)

And what would we win if we were to accomplish the task first? An Epic Pass every year for the remainder of our lives. With a nominal value of $729 per year, and perhaps 30 years of skiing left, that was a value in nominal dollars of about $22,000. But wait! We’d owe taxes on that, so a better approximation of the value was more like $13,000.

The expected value got even lower after factoring in the possibility of not being one of the first 10 finishers (thus winning nothing) or considering the potential future value of the money applied towards the expenses.
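The back-of-the-envelope math looks roughly like this, with the tax rate being my own assumption:

```python
pass_price = 729    # nominal Epic Pass price per year
seasons_left = 30   # optimistic count of skiing years remaining
tax_rate = 0.40     # assumed rough marginal rate on prize income

nominal_value = pass_price * seasons_left      # ~$22,000
after_tax = nominal_value * (1 - tax_rate)     # ~$13,000
trip_cost = 13605

# Negative even before discounting for the odds of finishing in the top 10.
print(round(after_tax - trip_cost))
```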

It would have been a fun trip, and no doubt it would have served as great fodder for stories, but the cost in time and money was too high for the potential payoff.

 

Are Yahoo Mail users better customers than Gmail users?

September 28th, 2013 Comments off

I admit it: I judge you by your email address. Whenever I see a Hotmail, AOL, or Yahoo address, I can’t help but think that somebody didn’t get the memo in the mid-aughts about moving to Gmail.

It’s ridiculous. Somebody is no better or worse of a person for using Hotmail. All a non-Gmail, non-custom address probably means is that the person is outside of my peer group. And yet the bias remains.

But what if that bias is justified? What if I can tell how good a customer will be based on the customer’s email address?

I run a µISV, developing and selling blur-reduction software called Blurity. As such, I have a large collection of purchase data, and that purchase data contains email addresses.

Let’s start with a look at the distribution of email address domain names. I queried the purchase logs for Blurity from October 2012 to find the percentage of purchases associated with each second-level domain name. Domain names that appeared in less than 1% of purchases were consolidated in an “other” group. From that, we can see that Gmail, Yahoo Mail, and Hotmail are the most popular email services for Blurity customers:

Proportions of email address domains for Blurity customers. Domain names representing less than 1% of customers are grouped in “Other.”
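The consolidation step is straightforward; here’s a sketch with made-up counts standing in for the actual purchase logs:

```python
from collections import Counter

# Made-up domain counts standing in for the real purchase data.
counts = Counter({"gmail.com": 300, "yahoo.com": 180, "hotmail.com": 150,
                  "aol.com": 40, "comcast.net": 30, "rr.com": 5,
                  "juno.com": 3})
total = sum(counts.values())

# Fold any domain with less than 1% of purchases into an "other" bucket.
grouped = Counter()
for domain, n in counts.items():
    grouped["other" if n / total < 0.01 else domain] += n
```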

Not surprisingly, people also liked holding on to email addresses more commonly associated with ISPs, such as AOL, Comcast, Verizon, BT, and Cox. (Time Warner fell just below the 1% threshold and is part of “other.”)

It’s tempting to say that Yahoo Mail is under-represented among customers, since Yahoo Mail’s share of the US email service market is roughly equal to Gmail’s. However, the share percentages are about equal for international users of the two services, and I couldn’t split my purchase data along international lines. Dead end there.

So what can we look at? Maybe something to do with money? Well, how about return rates?

Blurity has a somewhat high return rate. Embarrassingly high. I attribute that to a combination of my poor UI design skills and the general misalignment between public perception of blur removal and the reality of blur removal. That, in turn, is complicated by the fact that many people don’t read directions, which is further compounded by people believing that the purchased version will somehow do better blur removal than the free trial version. As a result, I occasionally get emails asking for refunds. It’s the cost of doing business. I’d rather have a happy former customer than an angry customer.

I’d always felt that refund requests were more likely to come from the group of people using email services associated with lower degrees of technical prowess, such as Hotmail and AOL, so I was a bit surprised when I pulled the actual data. Turned out that the refund rate of Gmail users was not significantly different from that of Hotmail users, nor for Gmail versus AOL users.

Yahoo was a different story. The refund rate for Yahoo Mail users was higher than that for Gmail users at a statistically significant level (chi-squared, p<0.05). Found one!
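For the curious, the comparison is a Pearson chi-squared test on a 2x2 table of refunded vs. kept purchases for a pair of domains. A minimal sketch with invented counts (not my real data):

```python
def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared statistic, no continuity correction,
    for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: Gmail, Yahoo; columns: refunded, kept. Counts are invented.
stat = chi_squared_2x2(30, 470, 40, 260)

# 3.841 is the chi-squared critical value for df=1 at p=0.05.
print(stat > 3.841)  # True: a difference this large would be significant
```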

That's enough data to recover the underlying numbers. Or is it?

Return rate by email domain name.

A higher return rate is nothing to celebrate, but it’s always nice to discover data vindicating one’s intuition.

Overall, Gmail was pretty good, or at least average. The only group that had a non-zero return rate significantly lower than Gmail’s was the “other” bin (chi-squared, p<0.01). The rest of the domains, including Hotmail, were all roughly the same, ignoring a few small-sample-size zero counts.

After seeing those results, I wondered what other predictors for refunds I could find in the data. I suspected that customers paying with PayPal might have higher return rates than those paying with Stripe. Maybe it was a manifestation of the general ill will towards PayPal in the tech community? Much to my surprise, the refund rate for PayPal was actually lower than for Stripe (6.4% vs. 9.5%), though not significantly so.

I’m not sure that this is very actionable. The vast majority of customers of all sorts, including those with Yahoo email addresses, have been fantastic. I might try an A/B test for users entering Yahoo email addresses, perhaps with a more explicit link to the user manual or a tutorial video. Of course, if I think that might work for Yahoo users, then I might as well try it with everybody. Yahoo might have more refund seekers than other services, but it by no means has a monopoly.

Update: Had another customer make a purchase today and then almost immediately (three minutes later) ask for a refund. The customer’s email service? You guessed it: Yahoo.

Update 2: My friend Luke pointed out that the Live.com, Hotmail, and MSN domains are really all just Microsoft services. I agree, they should probably be grouped. I ran the numbers again with that grouping and found that the refund rate for the Microsoft properties does not differ significantly from that of Gmail, and the confidence interval is very similar to that charted for Hotmail alone. Thus, the conclusions above remain unchanged. Regardless, a good catch!