February 2009 Archives

Nectar - Moscow, ID

Last Friday, Catherine and I went out to dinner at Nectar, in Moscow. Located on Sixth Ave between Jackson and Main, Nectar labels itself as a Restaurant and Wine Bar. Having dined there, I suspect I’d only be returning for the wine bar.

Now, this isn’t to say that the food wasn’t good. It was. It’s just that, as a restaurant, it didn’t hugely impress me.

We showed up for our reservation, only to discover that the reservation, which I’d made a week prior, wasn’t on the books. This was mildly annoying, but the restaurant did manage to get us seated within about twenty minutes, which was fine given that we were able to get drinks in the interim.

This does reveal one of the biggest problems I found with the restaurant: the layout is weird. The bar and the kitchen sit in the center, but the tables are spread out across a series of narrow hallways. This was a bit awkward before we had a table, since we had nowhere to sit. The greeter confided that the same thing had happened with several other tables, so there was a bit of a crowd in the waiting area.

By the time we got a table, we were ready to order, so we did. Our appetizers were a small bowl of their Macaroni and Cheese, and a plate with several different kinds of cured meats, cheeses, olives, and some homemade fennel crackers. The crackers had a bit more fennel than I might have liked, but the salamis and cheese more than made up for it. Catherine even loved the olives, and she generally dislikes them. Though I suppose these olives were far milder than most grocery store olives. The macaroni and cheese was excellent as well. The cheese sauce consisted of Cougar Gold, Gruyère, and Parmesan. The Gruyère did an excellent job of reducing the sharpness of the Cougar Gold, and the macaroni and cheese had an amazing flavor. While it was available in an entree size, it was so rich that I’m glad we only got it as an appetizer.

For our entrees, Catherine ordered a bacon-wrapped meatloaf with a chipotle glaze, served with baby carrots and mashed potatoes, while I had a white truffle pasta served with a poached egg on top. The meatloaf was excellent, aside from the fact that the bacon was completely unnecessary: it disappeared entirely behind the chipotle glaze, and was a bit too gristly anyway. The carrots were well cooked, and the potatoes had just the right amount of seasoning (including some garlic and horseradish). My pasta was perfectly prepared; the sauce, particularly after I broke apart the egg (whose yolk was beautifully orange), was deliciously creamy, again with a perfect amount of seasoning.

The wine, since such a thing must be discussed at a wine bar, was also an excellent experience. I had a glass of a Chilean red, and a flight of Alsatian white wines, including the best Riesling I’ve ever had (a 2007 Trimbach), which was dry instead of ridiculously sweet like many Rieslings. The only wine we tried that night that I didn’t like was the 2006 Hugel et Fils Gentil Hugel Blend, which had a distinct over-aged goat-cheese flavor right in the middle. I’m sure somebody will love that wine. I’m not that somebody.

So, given that we loved the food and wine so much, why am I not liable to return? Well, the price is one reason. After tip, we paid just shy of $100 for the meal, and for that money, I’ve had a far better dining experience. One that didn’t feel as cramped. One where I felt that my service was better than just passable. However, I would gladly go back for a few glasses of wine, and an appetizer. I just feel that experience would be more worthwhile than the full dining experience.

The Return of Star Conquest

Five years after it shut down, one of the greatest online games I’ve ever played, Star Conquest, will finally return. Sure, it’s a text-based MOO, but it’s got great depth, lots to do, and a stable advancement system that even allows new players a chance to catch up with long-time players.

While the game had shut down five years ago, I hadn’t played in at least a year prior to that due to a conflict with another player that had simply gone too far. I had always intended to return, but the situation was complicated, and I’m not going to go into it right now. Needless to say, the relaunch greatly excites me.

In the intervening years, the story has moved forward, with Humanity being subjugated by an alien species known only as the “Outsiders”. They’d been around in the old story, but had seemed to be background characters; in fact, the only aliens I was ever part of a conflict with were the IFS (which I believe stands for Intergalactic Federation of Species).

As part of the subjugation, Humanity had almost all of its ships destroyed and its will broken; Humanity now exists almost solely as production colonies for the Outsiders, with virtually no trade between worlds. This included almost completely destroying humanity’s capability for research. In the midst of the turmoil, a new Alliance formed, consisting of formerly Unregistered pilots. The Sovereign Mutuality of Disparate Freemen is an interesting new development, bringing new technology and places to the table.

Star Conquest isn’t likely to be for everyone. The Hosts have assured us that the game will be strictly roleplay-enforced (there is an OOC communication mode for certain uses, and two people can have a private OOC conversation as long as it’s in a private place), which should do an excellent job of keeping people focused and keep the game from falling apart, particularly since the game has fair rules for avoiding excessive bullshit. In keeping with the RP-heavy spirit, the game requires that you submit a written history for your character before you’re accepted as a Full pilot. Mine is probably excessive (it’s 4k), but the requirement should set a good base level for player involvement.

The game officially launches on Saturday, though you can already log in now, by pointing your telnet (or MUD/MOO client) to squidsoft.net port 7777. Create your character, fill out your profile, and wait for approval.
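
From a standard terminal, that’s just:

telnet squidsoft.net 7777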

CURRENT OPENING NIGHT ITINERARY:

6PM Game Time (7PM eastern, 4PM pacific, 12AM GMT): DOORS OPEN. Your character 
can buy a ship and explore what the beta characters have been seeing. IF YOUR 
PROFILE HAS NOT BEEN APPROVED BY THIS TIME, you may join as a cadet.

8PM Game Time (9PM eastern, 6PM pacific, 2AM GMT): OPENING EVENT WILL BEGIN. 
Your character should be in possession of a ship and ready for anything at 
this time if you wish to participate!

OPENING NIGHT IS SCHEDULED TO BE SATURDAY, FEBRUARY 28TH.

If you’re interested in a great role-playing experience, if you’re ready to be the Catalyst to shrug off the oppressors of Humanity, you should definitely try out Star Conquest. The opening event promises to be a good time. And I’ll be in as Caleb Tyrin, so don’t be afraid to wave if you see me.

Obnoxious Sounds As Deterrents to Young People

(Embedded here: a hearing-test widget from Train Horns, playing the high-frequency “mosquito” tone.)

Yeah, I passed the test, but it made me feel like my brain was trying to escape out of my ears.

I’d heard about this tone quite a while ago, first as a cellphone ringtone (kids loved that their cell phones could ring without their teachers realizing), but also as a deterrent used by shopkeepers. I know that I wouldn’t stick around if I could hear that tone. This reminds me, in a sense, of Jeff Atwood’s post on Monday about Rate Limiting, where he features a photograph of a door with a sign limiting the store to only three students at a time.

Personally, I’m not sure it’s ever worth alienating an entire class of customers, particularly young customers. Hell, even at times when having younger people around definitely can interfere with my shopping (comic book shops or game stores), I just can’t imagine begrudging them the right to be at the store. Sure, things occasionally need to be dealt with, but it seems to me those issues can be handled on a case-by-case basis. But, if that’s what the shop-owner wants…whatever.

Not that I’m against sound as a deterrent. LRAD is one of the coolest things ever, and it has been used successfully to fend off pirates on the high seas and stuff. However, LRAD is at least a directed sound beam, whereas using this mosquito sound as a deterrent is just painfully general.

Cleaning Hard Water Off Your Shower Head

Our current apartment has an awful hard water problem. It’s kind of strange, because we only moved across town, but the water here is enough harder that our soap usage has almost doubled. Even worse, as the hard water deposits started forming on our showerhead, the water pressure in our shower dropped dramatically, and the water came out in strange directions, which tended to make quite a mess.

Now, to deal with this we could have looked for a commercial product, like CLR Calcium, Lime and Rust Remover, but that product has two ingredients listed as OSHA hazards, not to mention that it’s listed as an irritant to virtually every soft tissue on the human body. Surely, there has to be a safer and easier way.

As with almost all cleaning issues, I turned to Jill Potvin Schoff’s book, Green Up Your Cleanup. The answer she provided was simple, and really quite effective: white distilled vinegar.

Now, she suggests putting the vinegar in a plastic bag and rubber-banding it around the head, keeping everything together. This wasn’t really going to work for us, since our shower head is wide enough to make the plastic bag trick impractical. However, simply removing the showerhead and setting it in a pan with a deep enough layer of vinegar accomplished the same effect.

This isn’t exactly a fast operation, mind you. But given that we don’t need to run our shower all day, taking the shower head off for a couple of hours was no trouble (and, given the amount of build-up, probably a bit of overkill). After we felt it had had plenty of time, I just took a terry washcloth and rubbed the vinegar off the showerhead, and with it came all the white mineral deposits. After putting the showerhead back on, the stream was stronger, and best of all, it was straight.

Given the low cost of vinegar, I know we’ll be putting this tip into practice more often. Hopefully, you can too.

Code Review Practices

A while back, I saw an ad on Stack Overflow for a free book from Smart Bear Software entitled “Best Kept Secrets of Peer Code Review”. While Smart Bear does sell their own Peer Code Review tool, CodeCollaborator, the majority of the book doesn’t stress how great their software is, but instead lays out a series of arguments on why you should consider doing Peer Code Review in the first place.

It’s generally well recognized that to become better programmers, we all need to be reading code as much as possible. Jeff Atwood, over at Coding Horror, talks about this as developing your nose for “Code Smells”, or learning to identify those pieces of code which simply need to be refactored or replaced. A lot of what makes code “good” can only come from experience. This isn’t to say that new programmers necessarily write bad code, but as we read more code, our ability to determine what is good versus what is bad generally improves.

Peer Code Review, which has been practiced informally in the Open Source world forever, also ties into Eric Raymond’s axiom in his work “The Cathedral and the Bazaar”: “Given enough eyeballs, all bugs are shallow.” Basically, with enough people looking at the code, someone is bound to be able to fix any problems that might exist there. In practice, it’s not that simple, and the informality of this kind of peer review means that code sometimes isn’t reviewed as strenuously as it could be. For one thing, a maintainer’s code is very rarely reviewed as exactingly as code being added by a previously unknown contributor.

I’m not really advocating formal peer code review in the open source world, but many organizations lack even the informal methods the Open Source world uses, and due to their nature, they might benefit more readily from a formal code review process. Part of this comes down to the economics of commercial software. It is well understood that detecting a defect early makes that defect far less expensive to correct. By catching a fault before it gets to the customer, a company can often save thousands of dollars per defect (taking into account the support personnel’s time, developers tracking an error message to the fault, etc.). Catching an error before the company’s QA team finds it can still result in impressive cost savings, as the developer doesn’t have to task-switch to solve the problem.

Peer code review is simply one tool to catch defects sooner rather than later. The case studies in this book do an excellent job of starting at the beginning of code review theory, namely that code review was best conducted in regular meetings, highly formalized, and run with strict rules and guidelines. These methods worked well for many companies, particularly those working on software for mission- or life-critical industries (you don’t want a bug in your X-Ray machine software that causes the system to inadvertently dump fatal levels of radiation into a patient 0.01% of the time). However, in many situations, it’s hard to convince people of the need, and such formal meetings don’t allow developers to schedule their time in the manner most productive for themselves.

Enter the tools. Beginning with e-mail, we gained the ability to perform review (formally or informally) while letting reviewers choose when they would review, picking times when they could review most effectively. While it may not seem as convenient as having all the reviewers in a room together for an hour a day, it has the potential to be much more effective, as developers aren’t forced to interrupt productive development time with a meeting; they can review when they need a break from writing code.

However, as the book stresses, e-mail doesn’t provide any method for collecting metrics. You don’t know how long a person has spent performing a review, how many defects they’re finding per hour, or how many per thousand lines of code (kLOC). No one likes having an enormous amount of pomp and circumstance around what should be a fairly simple process, particularly when the payoff, in the form of statistics about the process, isn’t obviously beneficial to the person who’s supposed to be collecting them. A fair portion of the book is devoted to telling you how great CodeCollaborator is at collecting these sorts of metrics for you.

I’m not planning on pushing code review at work just yet, though I’m giving a lot of thought to doing so. And while there was little in this book that I hadn’t heard before (having studied software engineering at a university not too long ago), it is probably the best overview of the problem of peer code review that I’ve been able to find. If, by the end of the book, you’re not at least thinking about how you could use peer code review, then I posit you’ll never be convinced. With that said, the latter part of the book is certainly trying to sell you on their specific product.

It certainly looks like a good product, but lately, I’ve begun looking at an open source alternative: Review Board. Review Board was originally developed by VMWare, but has since begun being used by some teams at Yahoo!, among many others, which certainly lends it some “if it’s good enough for them…” clout. It’s also built with Django, which makes it fairly easy to deploy on most any operating system you might have available.

It’s a similar project to CodeCollaborator. It integrates with your source control. It allows online browsing of, and commenting on, a patch. Unlike CodeCollaborator, Review Board doesn’t allow IM-style communication, doesn’t collect the same types of metrics, and doesn’t seem to allow line-level comments. However, it is free software, allowing you to extend it freely and not requiring much financial outlay at the beginning. As a developer, and given this is a tool for developers, I certainly prefer that. Others will disagree. Plus, since Review Board is MIT licensed, you’re not required to share your changes (though often, it will likely be better to do so).

I like what I’ve seen of Review Board, and I’m not convinced that the additional features that CodeCollaborator offers (today) are worth whatever they want for them (Smart Bear doesn’t list licensing rates on their website). If you’re not sure if you’re interested in Code Review, order this book. If you’re trying to convince management to use Code Review, order this book. If you’re looking for a code review product… I’d try Review Board first, myself.

Sometimes Case Insensitivity is a bit Overkill

I’m working on a set of code that requires that I synchronize some data fields between my modern Microsoft SQL Server 2005 instance and our ancient ADABAS-based mainframe database. There are two circumstances where I need to validate this information: the first is when I need to validate that data has been entered into the mainframe correctly (currently, this entry is being done by hand; we’re lobbying for a programmatic method), and the second is when I need to reimport the results of a particular task that modifies the mainframe, but not my own system.

However, I ran into a fairly bizarre issue related to how C#, LINQ, and SQL Server all operate. For one particular course, a text field defining the title of the course was set to ‘Ballet’. Since ADABAS doesn’t support lower-case characters, this got entered into the mainframe as ‘BALLET’. It’s minor, and forcing the word to uppercase in our system shouldn’t have been a problem, so during the reimport, I allow such things to be “corrected”.

However, during the import, this particular change resulted in a LINQ Exception. It didn’t think the row had changed, and therefore threw an exception when I tried to update it.

var masterSection = (from m in writer.MasterSections
                     where m.Id == secInfo.MasterSectionId
                     select m).Single();

masterSection.Title = ssData.SectionTitle;
writer.SubmitChanges();

ssData is the collection of data from the mainframe, which I’ve queried using other tools. As I said above, this would consistently throw an exception, claiming that there were no changes to commit. Eventually, I ended up with this:

var masterSection = (from m in writer.MasterSections
                     where m.Id == secInfo.MasterSectionId
                     select m).Single();

if (!masterSection.Title.Equals(ssData.SectionTitle, StringComparison.OrdinalIgnoreCase)) {
    masterSection.Title = ssData.SectionTitle;
    writer.SubmitChanges();
}

It’s really important to note that the Equals call requires an IgnoreCase flag. Without it, C# sees the two strings as different, and SubmitChanges will continue to throw. But, I thought to myself, aren’t ‘Ballet ’ and ‘BALLET ’ different? Apparently not, at least in Microsoft SQL Server.

SELECT CASE WHEN 'Test ' = 'TEST ' THEN 1 ELSE 0 END;

Execute that query in SQL Server, and you will get the value 1 in the result set. I’m going back and forth on this issue. It seems to me that this is a bug, but I’m not sure how many would agree. Any opinions?
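
For what it’s worth, this comparison behavior is governed by the server’s collation: SQL Server’s default collations are case-insensitive (the CI in names like SQL_Latin1_General_CP1_CI_AS), and ANSI padding rules mean trailing spaces are ignored in comparisons as well. You can force a case-sensitive comparison with an explicit COLLATE clause. A quick sketch (‘MyDatabase’ is a stand-in name):

-- Check which collation a given database is using
SELECT DATABASEPROPERTYEX('MyDatabase', 'Collation');

-- Force a case-sensitive collation; this returns 0 instead of 1
SELECT CASE WHEN 'Test ' = 'TEST ' COLLATE Latin1_General_CS_AS THEN 1 ELSE 0 END;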

Upcoming YUI 3 Features: Collections

A new YUI module is being introduced in the next version of YUI3. I don’t know for sure when it will be released, but the code is already up on GitHub, if you want to check it out now. The current Collection module consists of a set of Array operations that were deemed useful enough to be in the library, but not common enough to justify inclusion in the core YUI Array module. The extension consists of a few methods I wrote, and a set of array operations contributed by Gabe Moothart, which originated in Christian Romney’s yui-functional package.

The functions being added are as follows:

  1. some: This used to exist in the core Array library, but has been moved to this Collection library. It executes a given function on each element in an array until that function returns true. This method will use the native implementation, if available.
  2. lastIndexOf: Similar to the wrapper on indexOf, this returns the last index of a given element. It will use a native implementation, if available, and like the native implementation, it will not work when comparing objects. This was one of my contributions.
  3. unique: Returns a copy of an array with all the duplicate entries removed. Like indexOf and lastIndexOf, this does not work on objects. It takes an optional argument to determine whether it should sort the array; by default, it will. This was another of my contributions.
  4. filter: Executes a filtering function on each element of an array, returning a new array containing only those elements for which the filtering function returns true. Will use a native implementation, if available.
  5. reject: Same as filter, but selects those elements for which the function returns false.
  6. every: Like some, but executes on all members of an array. Uses a native implementation, if available.
  7. map: Returns a new array containing the result of executing a given function on each element of the argument array. Will use a native implementation, if available.
  8. reduce: Executes a function on each item in the array, ultimately folding the values into a single value. Will use a native implementation, if available.
  9. find: Like indexOf, but bases the return on the result of an argument function, allowing better control over what is returned.
  10. grep: Returns a new array of elements from the initial array which match a given regular expression.
  11. partition: Executes a function on each element of an array, returning a pair of arrays: one containing the matches, one containing the rejects.
  12. zip: Takes two arrays and returns a new array with members taken from each of the input arrays.

Aside from the fact that some of my code was accepted (which is exciting for me in its own right), there are some things in this which should be really useful. The ability to do Map-Reduce in JavaScript (regardless of native implementation) alone stands to see a lot of use. Aside from maybe the zip method, I can think of circumstances where I’d want to use just about every one of these methods. I’ll be talking about Map-Reduce in a later post, but the Wikipedia page (linked above) is a good place to start.
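
Just to give a flavor of it, here’s a minimal map/reduce sketch based on the descriptions above (I’m assuming reduce takes its initial value before the callback, so double-check the signatures once the module ships):

YUI().use("collection", function (Y) {
    var prices = [1.99, 4.50, 12.00];

    // map: build a new array with tax added to each price
    var withTax = Y.Array.map(prices, function (p) {
        return p * 1.08;
    });

    // reduce: fold the taxed prices down into a single total
    var total = Y.Array.reduce(withTax, 0, function (sum, p) {
        return sum + p;
    });

    Y.log("Total: " + total);
});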

Using the collection module will be really easy, once it’s released. If you’re doing something where you need to pull only the unique entries out of an array, you’ll just do something like this, which is an update I’ll be making to the Laconica widget on my sidebar.

_uriLink: function(msg) {
    var Y = YUI().use("collection");
    var le = /([A-Za-z]+\:\/\/[^"\s\<\>]*[^.,;'">\:\s\<\>\)\]\!])/g;
    // The replacement wraps each matched URI in an anchor tag ($& is the match)
    var r = msg.match(le), s = '<a href="$&">$&</a>', i;
    if (!r) {
        return msg;
    }
    r = Y.Array.unique(r);
    for (i = 0; i < r.length; i += 1) {
        le = new RegExp(r[i].replace('?', '\\?'), 'g');
        msg = msg.replace(le, s);
    }
    return msg;
}

This is necessary because, if I try to do a RegExp replace on a URI more than once, the output is completely hosed (as it should be). In the above code, I’m using the more traditional YUI usage, where my Y object is initialized with only the code I need. I could use the recommended syntax, where use() takes a function argument in which I use the YUI object, but that would be best done with a significant rewrite of the module, which is forthcoming.

I’m really happy to have YUI up on GitHub: not only can we now easily follow what changes are coming down the pipe from Yahoo!, it is now a lot easier to get changes integrated into the codebase. I suspect this will lead to more ancillary libraries like this one, which gather together useful functions that don’t necessarily belong in the core of YUI, but certainly can (and should) be easily added to any project which uses YUI. Now, I just need to get my tests checked in, and move on to other projects as I think of them.

Speaking of tests, if you use YUI, and don’t have any ideas for code to contribute, start looking at writing tests. The test coverage on YUI3 could be better than it is, and that’s an easy way to get started in contributing. Plus, it’s a great way to familiarize yourself with YUITest, which I believe to be the best JavaScript test framework available today.
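
If you’ve never seen it, a YUITest case is just an object of test methods plus assertions. A minimal sketch, using the unique method from above as the code under test:

YUI().use("test", "collection", function (Y) {
    Y.Test.Runner.add(new Y.Test.Case({
        name: "Array.unique",

        "unique should strip duplicate entries": function () {
            var result = Y.Array.unique(["a", "b", "a"]);
            Y.Assert.areEqual(2, result.length);
        }
    }));
    Y.Test.Runner.run();
});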

The Wrestler

Mickey Rourke makes an amazing return as a leading man in 2008’s The Wrestler, where he plays former professional wrestler Randy “The Ram” Robinson, twenty years after he was on top. His life is a mess. He works part-time at a grocery store stocking shelves while still wrestling in the local New Jersey amateur circuits. He barely knows his daughter. The only woman he’s interested in, a stripper played by Marisa Tomei, can’t reconcile the fact that he’s a customer with the fact that she likes him.

All that comes to a head when he suffers a heart attack after a brutal “extreme” match, and the doctors tell him that wrestling again would likely kill him. With that in mind, Randy tries to find a life outside of the wrestling world.

The film is shot in a grainy format, making it feel like a recording from the 1980s. I suspect this was done because, in many ways, Randy is still living in those days when he was on the national stage, and he’s just refused to move ahead, even as the world keeps moving around him. As he tries to get his life back together, he takes a full-time position at the grocery store in the deli (even though he has to work weekends), reaches out to his daughter, and starts trying to interface with Tomei’s character, Cassidy, on a more personal level.

I really enjoyed this film. Randy “The Ram” Robinson is a person who has ruined his own life in single-minded pursuit of his dream, and he’s never been able to deal with his life falling apart. By the time the film begins, Randy has been self-destructing for nearly twenty years, and it’s really touching to watch as he tries to actually live in the now, instead of focusing on the past.

I think that Mickey Rourke really deserves the nomination he got for Best Actor in a Leading Role this year, and I think he probably deserves to win. The character is believable, and everything about the way the film was structured made sense, at least in the context of the character. If you’ve ever followed professional wrestling, then this film is a reasonable introduction to the lives a lot of these men lead.

I will warn, however, that this film had no business being rated R. It was definitely an NC-17 film. Nearly a quarter of the film took place in a strip club, and one scene features Mickey Rourke with his pants around his ankles taking a girl from behind while her breasts flop around outside her shirt. But I guess if you don’t see genitals, it isn’t pornography (or something). I suppose I just think about what an R-rated movie looked like even ten years ago compared to today, and it’s ridiculous what slides. Same thing for the shift of PG-13 over the years.

But that is beside the point really. This movie is fantastic. It’s honest. It’s brutal, but in a kind of beautiful way. I’ve never been as interested in Professional Wrestling as many of the people around me, but even with my fairly trivial background, I was really impressed, and greatly enjoyed watching this movie.

All Aboard the IE8 Fail Bus

After installing the very first IE8 beta, and having it completely hose my Vista box at work, I was a bit wary about installing the latest releases of this as-yet-unofficial web browser. When IE8 was first announced, I thought there was a lot to be excited about. Microsoft appeared to finally be taking standards compliance seriously. They seemed to be a lot more open about the problems with their browser.

However, as things have progressed, I have less and less faith in Microsoft to finally put out a reasonable browser, let alone a good one.

IE8 does manage to fix a lot of layout issues. I have code that required some weird futzing with styles to work in IE6/7 that actually renders correctly in IE8. It’s still noticeably off from what either WebKit or Gecko would do, but generally it feels a lot closer than IE7 did. However, it’s still off, and if you’re really working on getting your layouts ‘correct’, then you’re still going to have to work around IE being non-standard.

Its JavaScript engine comes a lot closer to the W3C standards than IE7’s, but while it implements a lot of methods it didn’t used to implement, it doesn’t appear to implement a lot of them correctly. For this, I’m going to pick on a relatively new feature that is appearing in all the new browsers: querySelectorAll(). John Resig, of the Mozilla Foundation, has recently implemented a test suite for this feature. Most browsers (at least the nightlies) pass in the 99%+ range. IE8 fails 54% of the tests.
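
If you haven’t played with it, querySelectorAll() returns a list of DOM nodes matching a CSS selector, saving you from walking the tree by hand. A trivial example (the class names are just for illustration):

// Every paragraph directly inside a div with the class "post"
var nodes = document.querySelectorAll("div.post > p");
for (var i = 0; i < nodes.length; i += 1) {
    nodes[i].className += " highlighted";
}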

Now, IE8 might be performing better in the latest internal builds at Microsoft, but I find this to be another huge failure on Microsoft’s part. With WebKit and Firefox, I can grab a new build of those browsers every single day. I don’t, generally speaking, but I could. That is awesome. Microsoft has already acknowledged that IE isn’t a money-making proposition (at least not directly), in that they supply it free with the OS. And one consistent issue I’ve seen in the bug-reporting forums for IE8 is that Microsoft often comes back with a response as simple as “Oh, we’ve fixed that already in the latest build.” That statement isn’t always true, mind you, but how much time could Microsoft save themselves if they just released a new IE8 build every week?

It probably isn’t that simple, given how frighteningly tightly IE seems to be bound into Windows, but Microsoft needs to take things a step further in how closely they work with their users. Right now, I’m finding bugs in a build that’s at least a month old. I have no way of knowing whether those are fixed, aside from Microsoft’s word, which may not be completely accurate.

IE8 is shaping up to be an improvement, but an immensely disappointing improvement compared to what many of us were expecting. Maybe that was unfair on our part, but I think we’d all be a lot happier if Microsoft just adopted an existing Open Source rendering engine and built IE around that (they could probably even keep their crappy ActiveX, too). Oh well, maybe for IE9…

A Note About Perforce Triggers

At work we decided to go with Perforce for revision control. Now, I’m a well-admitted git lover, so the fact that I went along with this still kind of surprises me. Suffice it to say, we made the decision over six months ago, when I believed:

  1. Git didn’t work well on Windows
  2. Git didn’t have good GUI tools for Windows
  3. PlasticSCM was lacking features that Perforce had (like triggers)

In the intervening months since we made that decision, all of those things have turned out not to be true anymore. So, rather than having a flexible, easily-branched SCM, we’re using a very capable, but wholly boring, SCM system. Oh well, as I said, it works very well, but there are certainly things I prefer about git.

In an effort to better use the tools available to us, we’ve also gone one step further and implemented Continuous Integration using Hudson. I’ll talk more about Hudson at a later date, but I wanted to quickly cover an issue we’d had with the Perforce Integration with Hudson.

First, we’ve chosen to manage our own workspaces, simply because there are shared components between various projects. So, for a given project we may have:

//depot/ProjectRoot/... //workspace/ProjectRoot/...
//depot/SharedProjects/SubProject/... //workspace/SharedProjects/SubProject/...

This allows me to use the same Visual Studio SLN file on both our development workstations and the CI server, which is really convenient. Unfortunately, while the Hudson Perforce plugin should handle this sort of view cleanly, it didn’t seem to for me. I went through the code, and while I suppose it’s possible the Perforce plugin for Hudson I installed wasn’t the same as the code I read, we decided to go a different route.

We chose to go the route of a Perforce trigger, setting up a Python script to scan a changeset once it’s submitted, triggering a Hudson build if it determines one is necessary. I’ll share the details of the script later, once I’ve had a chance to clean up a few things; there are a few potential performance issues, and while it works fine for our small installation, I can see my current implementation having trouble on a larger one.
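
In the meantime, the general shape of such a script looks something like this. Consider it a sketch rather than our actual script: the depot-to-job mapping is hypothetical, and it assumes Hudson allows anonymous builds via its standard /job/<name>/build URL.

import sys
import urllib2
from subprocess import PIPE, Popen

# Hypothetical mapping of depot paths to Hudson job names
JOBS = {"//depot/ProjectRoot/": "ProjectRoot"}
HUDSON = "http://localhost:8080"

def main():
    changelist = sys.argv[1]
    # 'p4 describe -s' lists the files in a submitted changelist
    output = Popen(["p4", "describe", "-s", changelist],
                   stdout=PIPE).communicate()[0]
    to_build = set()
    for line in output.splitlines():
        if line.startswith("... //depot/"):
            path = line[4:].split("#")[0]
            for prefix, job in JOBS.items():
                if path.startswith(prefix):
                    to_build.add(job)
    for job in to_build:
        # A GET against /job/<name>/build kicks off that job
        urllib2.urlopen("%s/job/%s/build" % (HUDSON, job))

if __name__ == "__main__":
    main()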

However, I had a big problem when first setting up the trigger in Windows. I had set C:\Python25 in the System-level PATH. But when I set the following in my p4 triggers:

Triggers:
        startci change-commit //depot/... "d:\p4scripts\P4CIBuild\P4CIClient.py %changelist%"

I would get an error on every submit, stating that the process could not be started. This was confusing, because that very command would work perfectly from a command terminal. Try as I might, I couldn’t find any documentation from Perforce, as almost all the Trigger documentation was clearly targeted toward UNIX servers.

It turns out that, in order to get the trigger to work correctly when executing a script, you need to call the interpreter explicitly, and you need to give its full path. Replacing the above with this:

Triggers:
        startci change-commit //depot/... "c:\python25\python.exe d:\p4scripts\P4CIBuild\P4CIClient.py %changelist%"

Works like a dream. I’ve sent an e-mail to Perforce, which will hopefully result in their documentation getting updated, but if you’re having trouble getting Python-based triggers to work on Windows Server 2008, this is what it took for me.

The Key to Healthy Air? Plants

I doubt there were very many people surprised by this, but a New Delhi, India business complex, Paharpur Business Centre and Software Technology Incubator Park, published an interesting report at last week’s TED Conference.

We have tried and tested these plants for 15 years at Paharpur Business Centre and Software Technology Incubator Park (PBC™ - STIP) in New Delhi, India. It is a 20 year old, 50,000 ft² building, with over 1,200 plants for 300 building occupants.

PBC™ - STIP is rated the healthiest building in Delhi by the Government of India.* Their study found that there is a 42% probability of increasing blood oxygen by 1% if one is inside the building for 10 hours.

Also, compared to other buildings in Delhi, the incidence of eye irritation reduced by 52%, lower respiratory symptoms by 34%, headaches by 24%, upper respiratory symptoms by 20%, lung impairment by 10-12% and Asthma by 9%. As a result of fewer sick days — employee productivity also increased.

Certainly everyone knows that plants scrub CO2 from the air and replace it with fresh oxygen, but I was ecstatic to read that some plants, like the Money Plant, can scrub chemicals like formaldehyde, xylene, or benzene as well. Since my wife and I are already trying to stop using chemical cleaners, due in part to the EPA’s assertion that indoor air is often more polluted than outdoor air, this one little extra step is easy, and pretty amazing.

Of course, not everyone has a green thumb. I am a notorious plant killer, so my wife does almost all the plant care in our house. For a while, I actually had a plant at work which was, in theory, helping with this indoor air quality issue. Of course, I neglected watering it for a while (and it was in nutrient-poor soil, another thing I knew about and neglected), so it is now a pitiful brown twig. I should do something about that. The nice thing about the plants that PBC - STIP uses is that they really are low-maintenance; even with my poor plant skills, a few gentle reminders would probably let me keep them alive for the extent of their natural lifespans.

This study was really interesting, particularly in a day and age when so many people have forgotten how to take advantage of the natural world. A world where the natural is almost despised as somehow unsanitary, when really, the world we’ve created for ourselves has often done us far more harm than good.

I’m not trying to be a tree-hugger here. I’m not trying to say that we shouldn’t utilize technology, but I think too many of us have a blind faith in all sorts of technology, and never question the technology we’re presented with. From a business perspective, this is entirely foolish. As a society we always want the cheapest product, while business is trying to maximize profits, so a lot of products are developed for the market with the deepest concern being profit. In that pursuit, other issues, like health concerns, are often overlooked. In some ways, this is hard to blame on the companies; it is somewhat unreasonable for a company to test their product in conjunction with every other product and/or medication on the market. But since we don’t really need cleaning chemicals beyond baking soda, washing soda, oxygen bleach, water, and vinegar, it’s pretty easy to avoid those problems.

As an aside, this issue of the natural world affecting our health had an interesting development last week in the New York Times. Per Lenore Skenazy at Free Range Kids:

Otherwise, healthy children and germs and dirt have had a long and happy relationship since the beginning of time. Ms. Brody even says that that may be why babies put everything in their mouths. Not to feel it or taste it. To get a great big mouthful of germs. (And worms.)

Not only do we need the natural world to keep us healthy, but we need the natural world to make us sick as well (at least when we’re young). And I’ve experienced this one first-hand.

When I was younger, I never had a problem with pine pollen, but back then I spent a reasonable amount of time outside in the spring and summer, when the pollen tends to be thick. As I got older, I spent more time inside vegging out watching TV or playing on my computer (and my health during that time was indicative of that; it still is). In the summer of 2002, I got a job working for Spokane County Parks and Recreation (I was employed with them through a temp agency). I worked five days a week out at Liberty Lake County Park doing basic park maintenance. That summer, I started having severe problems with pollen, to the point that it would lead to sneezing fits that made my head explode in pain. Since then, I’ve had to use allergy pills during the summer.

Of course, I’d had issues before that, but it was then that it became clear how bad the problem had gotten. Had I not spent the majority of my teens indoors instead of out, I might not have that problem today. I can’t say for sure either way, but it seems like a reasonable argument considering the study detailed in the NY Times. It seems to me that we’ve clearly forgotten a huge amount of what it means to live well, not only for ourselves, but ultimately for the planet as well. Hopefully, all the doom and gloom surrounding the question of the environment today will convince more people to take this sort of thing seriously.

Gran Torino

Clint Eastwood’s latest film, Gran Torino, is a film that deserved some recognition at the upcoming 81st Academy Awards.

Instead, what ended up being one of the most emotionally charged and powerful dramas of 2008 got snubbed. That’s really, really too bad.

I’ll admit, I wasn’t expecting much from the Trailer for this film. It came across as a movie about a man with nothing to lose who decides to take on the gangs who plague his neighborhood.

There is some truth to that assessment. But to say that is to say that Cyrano de Bergerac is a play about a funny-looking man who helps a coworker woo a beautiful, intelligent woman.

It completely misses the point.

In Gran Torino, Eastwood plays an old man named Walter Kowalski, a man who served in Korea before returning to the Midwest to work in a Ford factory and care for his family in a suburb of Detroit. Fast-forward to the start of the film: Walter’s wife has passed, his sons are grown and resentful, and his community has been overrun by minorities, including a large Asian population. Walt is an old racist, and his distrust of his Asian neighbors has left him with strained relationships with everyone around him.

Shortly after the funeral, the young Hmong boy next door attempts to steal Walt’s 1972 Gran Torino as part of an induction into a local gang the boy is being forced into. Walt stops him, and the boy’s mother insists that the boy work for Walt to pay off his debt. In the process, Walter begins to grow fond of the boy and his family.

While Walter grows fond of them, his racist-old-man exterior doesn’t really change. He still constantly refers to his Hmong neighbors as “gooks” or “zipper heads”. What’s really fascinating is that while Walt’s incredibly grating personality never abates, we get to watch the Hmong neighbors accept Walt as he is, racial slurs and all, while he realizes that they are perfectly good people themselves. And while all this is happening, Walt’s own family pulls further and further away from him.

All the while, Walt is having conversations with the local Catholic priest, played by Christopher Carley, who had promised Walt’s wife that he’d get the old man to take confession. Father Janovich’s conversations with Walt delve deep into issues of morality and guilt, faith and redemption.

In the end, the film never apologizes for Walt. While Walt does grow as a person over the course of the film, his surface remains rough. It’s good to watch Walt’s relationship with the Hmong, whom he’d initially resented for taking over his neighborhood, as he realizes that they aren’t the kind of people he remembers from Korea (which is not to say that Koreans are somehow evil, only that a Korean War veteran would likely view them as such), and as he comes to grips with his own past.

The last ten minutes of the film were some of the most powerful minutes I’ve seen in a movie in years. I firmly believe that Eastwood should have received at least a nomination for this film, and I’m somewhat disappointed it was overlooked. Gran Torino was absolutely worth the price of admission, and I definitely suggest going to see it.

Installing the Silverlight SDK Minus Visual Studio

Microsoft currently distributes only a single installer package for the Silverlight tools. While the complete set of Visual Studio tools is great for my workstation, my goal is not to install Visual Studio on the build server I’m in the process of configuring. While Microsoft doesn’t explicitly distribute the SDK installer as its own package, it’s easy to extract from the Silverlight Tools installer.

After downloading the installer, open up a command line and navigate to the directory where the file exists. Simply execute the Silverlight_Tools.exe file with the /? flag, and take note of the dialog box which pops up:

[Screenshot: silverlight_extract.png, the extraction dialog]

The “To Directory” row will show you the directory the files are being extracted into. Open up Explorer and navigate to that directory. After a few moments, a “Silverlight Tools Installation Wizard” dialog will open, providing usage instructions for the command-line arguments valid for Silverlight_Tools.exe. Do not close the window, but move it out of your way.

Grab the silverlightsdk.cab and silverlightsdk.msi files and copy them out of this directory. Once that is done, you are free to close the dialog from the previous paragraph; it will automatically delete the temp directory it created.

[Screenshot: silverlight_files.png, the SDK files in the extraction directory]

Now, just copy those two files to whatever system you want the SDK (but not the Visual Studio tools) on, and double-click the MSI. It will install the SDK files, and your MSBuild scripts will now be able to build Silverlight projects. Hopefully Microsoft will soon release an SDK-only installer for just this situation. But then, hopefully Microsoft will also make MSTest installable without installing at least 1.7 GiB of Visual Studio.

Time To End The Billable Hour?

There was a story on the front page of the New York Times last Friday discussing how many law firms are starting to reduce the importance of the billable hour. For decades, law firms have billed their clients based on the number of hours it takes to accomplish a task, often to the tune of hundreds or thousands of dollars per hour worked. Firms judge the performance of their associates based on how many billable hours they put forward. It’s been the standard for decades, and it’s deeply entrenched in the culture of law.

But has its time come?

My older sister works for Garvey Schubert Barer in Seattle, WA. As a new associate, she’s required to put in 1750 billable hours per year, which was apparently one of the lower requirements she was presented with. However, her interests mean she’s likely to do a fair amount of pro bono work; for instance, right now she’s working on a case involving police mistreatment of the local homeless population. Clearly, the firm is not being financially compensated for this work, but my sister was able to get the partners to agree that her work on this case would count toward her billable hours quota.

Many firms would not have done that, which could be incredibly difficult for a lawyer with a deep background in human rights issues. But should most firms deny that sort of request? I would argue no. The press interest and goodwill generated by working this case will extend far beyond the homeless population of Seattle.

But it’s deeper than that. Most people, including a lot of smaller companies, don’t want to deal with the uncertainty that ‘billable hours’ creates. I had a conversation regarding this issue with Spokane lawyer and family friend John Clark, of Crary, Clark, and Domanico, yesterday before the Super Bowl kicked off, and he was ecstatic to have seen the NY Times article, though he was immensely disappointed that so few people seemed to have read it.

John’s practice, which involves a lot of DUI cases as well as legal malpractice and other personal injury cases, has long pushed the fixed-fee method of billing clients. In fact, if John quotes you a rate, he’ll take your case to court if necessary, even if he didn’t think he’d have to when he made the quote initially. I think this is probably the main reason John has proven to be such a popular lawyer in the Spokane area: customers know what he’s going to cost up front. This works out well for John, because while some cases take longer than he originally thought, most don’t, so he ends up doing well overall.

Now, I still do programming consulting, and I have several clients with whom I have made hourly arrangements. However, I try to make a flat-rate service available, only going hourly when my work is expected to be ongoing and the client wants some flexibility in changing their priorities. If I bid well, I can get the job done at a price that keeps the client happy, but results in good income for me. If I bid poorly, I’ve at least made the customer happy, and I’ve learned something that will make me bid more effectively next time.

Plus, by bidding a flat rate, I have no desire to drag out completion of the job. I’ll still want to deliver quality, since a happy customer is the best form of advertising in the world, but at a flat rate, the faster I can produce that quality, the better off I am. And my customer can benefit from my speed as well.

As in law, I think the time of hourly billing in my field is going to fade away as well. It will always be an option, but customers (business and otherwise) are likely going to want to know up front what a job will cost them. Sure, I’ll misjudge my proposals sometimes, but just because I might occasionally mis-estimate doesn’t mean my clients should suffer. The bid should only change if the job itself actually changes.