November 2008 Archives

Flash on 64-bit Linux

Flash has always been one of those necessary evils of the ‘net. Early on, it was far more evil, as we’d have tons of websites built entirely on Flash, which were enormous in the days of 56k. Not that you don’t still see a lot of Flash, and pretty shitty Flash, but these days, Flash is mostly used in places where it really adds something, like video playback, or certain types of applications.

When Flash finally came to Linux, it was a really big deal for those of us who were using the platform as our only OS. Finally, there was nary a site on the ‘net that was inaccessible on our platform of choice. However, for me (and some others), the joy was fairly short-lived. 64-bit processors were just around the corner, and with them 64-bit Linuxes.

Now, there is probably no good reason for me to want or use a 64-bit Linux, at least not until I get that 8 GiB of RAM I’ve been eyeing, but that’s starting to change (this seems to be driven largely by Microsoft finally starting to push 64-bit more seriously with Windows). Ubuntu has been really good to me, as far as running 64-bit, but Flash was always a problem.

See, Ubuntu insisted on including a 64-bit build of Firefox, but no 32-bit build. And you couldn’t load a 32-bit plugin into the 64-bit build. Enter nspluginwrapper, a project which allowed the loading of 32-bit code into that 64-bit image. Unfortunately, it was flaky, and didn’t work very well. YouTube would often crash my browser; sometimes the Flash simply wouldn’t appear.

So, you can imagine my surprise (and glee) when I read on Steven Harms’ blog that 64-bit builds of Flash for Linux were available. Not only that, but it’s Flash Player 10, which means the Linux version (finally) hit at almost the same time as the Mac/Windows versions.

Installation is simple: just make sure the 32-bit version isn’t installed, and drop the contents of the tarball from Adobe into /usr/lib/mozilla/plugins. You might have to restart Firefox, but I didn’t, and it just worked. Thank you, Adobe.

Finally, I can go back to watching videos of funny cats.

Sustainable Living: Turkey

The slaughter has already occurred, and most of you are preparing to take part in the yearly American Turkey Feast, Thanksgiving. In recent years, many people have considered paying more for ‘free-range’ or ‘organic’ birds in place of the cheaper ones, believing that they are treated more humanely, or that they simply taste better.

I’m not trying to denounce Thanksgiving. I love Thanksgiving, both for its culinary tradition and for the message of togetherness and sharing that it promotes among family and friends. This is, in my family, a particularly special Thanksgiving, as my mother has invited my in-laws over for dinner. It is, in many ways, one more symbolic step toward the joining of our two families.

On the subject of turkey, it’s common to ask why it has become the traditional food for Thanksgiving. A Computer Science Professor from the University of Waterloo in Canada, Daniel Berry, claims to have a Hebrew manuscript (titled Haggada Shel Hodaya, literally Telling of Thanksgiving) from a classified dig outside of Salem, MA which provides evidence of why this is. The guy raises some interesting points, but most of them are without any evidence (like Ben Franklin proposing Hebrew as the National Language), so my consideration of his scholarly paper is with the Bozo Bit prominently set, but it’s still an interesting read.

The entire paper is based around two things: first, this manuscript he “isn’t supposed to have”, and second, the fact that the Hebrew word “Hodu” means both ‘thank’ and ‘India’. This is relevant because much of the world calls the turkey something which translates to “India Fowl”, since Columbus believed strongly he’d landed in India. It’s an interesting supposition, but as I said, there is so little evidence presented by Dr. Berry that I can only really consider his paper as a curiosity.

I believe Turkey caught on because it’s a large, hearty bird which is indigenous to North America. They were plentiful, and a single bird could feed a sizable number of people. In short, it had a lot going for it as the centerpiece of a harvest feast. Whatever the reason, we’re stuck with it today, and I suspect we’ll be eating the bird for a long time to come.

However you prepare the bird: fried, roasted, smoked, or baked, it’s important to consider where your bird is coming from. Some people still hunt their own wild turkeys, which from what I understand can lead to an incredibly flavorful bird because it’s had a far more active life. And wild turkeys can get just as big as commercial turkeys, though generally they do so over a dozen years instead of a few dozen weeks.

Which is, in my opinion, the biggest problem with all commercial turkeys, be they Free Range, Organic, or not. The birds have been bred to grow to enormous size, very quickly. They tend to be unhealthy (evidently it’s not uncommon to lose 13% of a flock in the first eight weeks), but it’s not considered terribly important due to their short lifespans.

Some people are going to be concerned with the practice of beak and toe severing, which is done to prevent the birds from pecking and slashing each other to death (at least partially a sign of the incredibly tight quarters in which these birds live), but also to ensure that their instinctive tendency to toss their food around isn’t exercised. I will note that these practices are common at basically all levels of commercial turkey farming, though I’m sure there are some farms that don’t do it. It makes a lot of sense from a business perspective, and I don’t claim to know about the long-term effects on the birds’ comfort, but some people are simply not going to be able to accept the practice.

The problem is that free-range, organic turkeys are still nearly four times as expensive as turkeys that don’t make this claim once they reach the store shelves. In some cases, these birds no doubt have a better life than their less-expensive brethren, but the USDA regulations are so open that a free-range turkey may not be more humane than any other.

Birds are omnivores. We have this iconic image of a robin feeding its screaming chicks a worm from its beak, but for some reason we feed most of our poultry destined for human consumption a strict vegetarian diet. Since the birds don’t get all the nutrients their bodies are designed for, they can’t convey those nutrients to us, and their own health is often put in jeopardy.

If you’re concerned about buying a humane turkey, or a turkey that you feel is healthier for your own family (not to mention probably tastes better), do yourself a favor and buy from a local farmer. Consider the circumstances in which the bird was raised. Consider the age of the bird and what that tells of its breeding. Americans have developed this (frankly troubling) fear of their food. They don’t want to know what it looked like when it was alive. They don’t want to see it die (or kill it themselves). I suspect even many hunters would have trouble raising an animal with the intent of killing it.

A few months back, Catherine and I bought a copy of Meatpaper, a journal that considers the social aspects of meat consumption (and no, it’s not anti-meat-eating), rather than just the how. It’s an interesting read, and one article in particular stuck out to me. A woman who had for a long time been a vegetarian started eating meat again on a single condition: she only eats what she’s raised.

If she doesn’t know the animal, she won’t eat it. She even does a fair amount of her own butchery. I don’t expect most Americans to go this far, but I think the lesson is important. We, as a people, need to be more socially responsible with our food. We should have greater ownership of the food supply: where it’s coming from, how it’s raised, and what that means for us in the long term.

I think Thanksgiving is a great place to start. If you haven’t bought your bird yet, go find a local turkey farm that will sell to you directly. Check out the conditions in which your birds are raised and cared for. Maybe even pick out the bird you want from the flock, while it’s still standing.

I’m being a little facetious, as most of us would have no way to do this, even if we were inclined. The point, however, is that if you’re truly concerned with the health value and humane treatment of your turkey, don’t just pay three dollars a pound for that ‘organic’ label. Odds are good it means a lot less than you think.

But if you can hunt a wild turkey, or find a heritage bird raised on a good farm, it’s almost certainly worth the extra hassle: in taste, in health, and, if you’re concerned with such things, in conscience.

WCF with Silverlight


We’ve been using Microsoft’s Silverlight and the Windows Communication Foundation for the last several months as the User Interface and Web Service framework for our just-launched course schedule proofing project, ROOMS. People have been doing the scheduling on paper since long before Washington State University was known as such, so everyone has been really excited at the possibility of working with a newer technology.

I’m not going to be talking too much about Silverlight itself, as I’ve been the developer responsible for the back-end work since I was brought onto this project. Silverlight isn’t a bad framework by any means, though I personally favor interface technologies that don’t require plug-ins. That said, the interface Silverlight has allowed us to create would likely have been much harder to build without it.

For building Web Services, WCF has a lot going for it. Your interface for the service is easily provided in a single interface class, your Data Transfer Objects (DTOs) are decorated with a few attributes, and your transport protocol can be changed with a few simple configuration options. It’s really flexible. We’re using it not only for our web services, but also to communicate with a Windows Service which handles several long-running tasks, saving us from triggering those events with a timer.
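
To give a feel for how little ceremony is involved, here’s a minimal sketch of a WCF service definition; the names are hypothetical, not from ROOMS:

    using System;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // The entire service interface lives in one attributed interface class.
    [ServiceContract]
    public interface IScheduleService
    {
        [OperationContract]
        CourseDto GetCourse(int courseId);
    }

    // A DTO just needs DataContract/DataMember attributes so WCF
    // knows how to serialize it across whatever transport you choose.
    [DataContract]
    public class CourseDto
    {
        [DataMember]
        public string Prefix { get; set; }

        [DataMember]
        public int Number { get; set; }
    }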

However, there are some unfortunate side-effects of pairing a WCF web-service with Silverlight that aren’t necessarily obvious.

Security

Our application needs to run over a secure (SSL) connection, so that we can protect the user’s login state, and eventually any confidential data that becomes a part of this application. While Silverlight supports cross-domain web-service requests, it does not appear to support these over a secure link. The article linked does refer to Silverlight 2 Beta 1, but it appears to me that this was not changed for the RTW.

Because of this lack of secure cross-domain web services, you’re required to host your WCF services in IIS, which requires some modifications to your web.config file, which I have posted about here. More troublesome is that if you do need secure web services from a cross-domain source, this technology forces you to take the same steps we need to with modern AJAX applications, wrapping up a web service ‘locally’ so that it can be called the way we need it to be. Lots of unnecessary work there.

The other step that is easy to forget is that the security method needs to be set correctly on both sides of the connection. In your ServiceReferences.ClientConfig file, as well as your web.config, your security is likely set to ‘None’. This must be set to ‘Transport’ for SSL to work, but more importantly, if you set it to ‘Transport’ it will only work over SSL. You could set up a different, insecure endpoint in your web.config if you wanted to offer both, but this is extra work that seems, to me, somewhat more inflexible than necessary.
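
Since Silverlight talks to WCF over basicHttpBinding, the same setting can also be expressed in code rather than configuration. A minimal sketch (hypothetical names, not our actual setup):

    using System.ServiceModel;

    static class BindingFactory
    {
        // The security mode must match on both ends of the connection:
        // None works only over plain http, while Transport works only over SSL.
        public static BasicHttpBinding Create(bool useSsl)
        {
            return new BasicHttpBinding(useSsl
                ? BasicHttpSecurityMode.Transport
                : BasicHttpSecurityMode.None);
        }
    }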

Configuration

The biggest challenge we’ve had using WCF for our services has been managing the configuration steps. Any time we add a new method to our service, we’re forced to rebuild the Service Reference in the Silverlight config. There is no obvious way to automate this as a build step, partially because our web project depends on our Silverlight app, and the web server must be running in order to update the service reference. In an ideal world, I’d build my services, start up the web server, load the service references, and then finish building the Silverlight. I’ve looked into doing this with MSBuild and svcutil, but it does not appear to be that simple.

The tools Visual Studio 2008 provides are also insufficient. Generally, we are forced to delete the entire service reference and re-add it, and even then I’ll occasionally end up with strange errors whose source is non-obvious. The way the configuration management works to date feels like a house of cards, and I’ve periodically lost significant amounts of time to rebuilding service references.

To save time and configuration, we’ve opted to store our DTOs in an assembly shared between the Web project that hosts the services and the Silverlight project. To support this, we first attempted to create the DTO project as a Silverlight project, and merely add a reference to both projects. Regrettably, this does not work. Silverlight is not .NET 3.5 (which introduced WCF), and while there are a lot of similarities between the two technologies, the assemblies the Silverlight project links against to declare the necessary attributes are not the same version as those used by .NET 3.5 and WCF, so the linking fails.

We got lucky on this issue: we were able to use Visual Studio to create links from the Web project to the DTO files, so we can build them once for Silverlight and once for WCF without breaking any code. Cumbersome, yes, but relatively easy once you know what to do. In my opinion, it’s completely worth moving your DTOs into a separate assembly like this, if for no other reason than it will be more conducive to unit testing later.

This does leave us with one major problem, namely: how do we keep our service references up to date as part of our build process? I haven’t quite figured that out yet, but the procedure we are most likely going to use is as follows (a rough sketch of the idea follows the list):

  1. Examine WCF ServiceContract Interface.
  2. Find all OperationContracts under the Interface, and generate Callback Events and Async Call methods for all of them.
  3. Write out source to a reference file, to be included in the Silverlight project.
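
As a rough sketch of steps 1 and 2 (hypothetical names, nowhere near production-ready), reflection makes enumerating the OperationContracts straightforward:

    using System;
    using System.Linq;
    using System.Reflection;
    using System.ServiceModel;
    using System.Text;

    static class ReferenceGenerator
    {
        // Walks a [ServiceContract] interface and emits a stub of the
        // Completed event / Async method pair a Silverlight proxy
        // expects for each operation.
        public static string GenerateStubs(Type serviceContract)
        {
            var source = new StringBuilder();
            var operations = serviceContract.GetMethods()
                .Where(m => m.IsDefined(typeof(OperationContractAttribute), true));

            foreach (MethodInfo op in operations)
            {
                source.AppendLine("public event EventHandler " + op.Name + "Completed;");
                source.AppendLine("public void " + op.Name + "Async() { /* invoke channel */ }");
            }
            return source.ToString();
        }
    }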

We’re investigating a few methods to accomplish this. We met a guy at the Seattle Code Camp who had written XSLTs to convert a chunk of XML into the interface files, and that’s not necessarily a bad way to go. I could even use this to generate my DTO classes. I am investigating the possibility of doing a simple compile on the code itself, but I suspect the XML route would be easiest. Plus, that way our interface files in the WCF project will be auto-generated as well. Plenty of interesting possibilities there. Once I’ve got it written, I’ll likely be sharing the solution we arrived at.

Both Silverlight and WCF are interesting technologies. Personally, I’m likely to do more WCF, but that’s largely because I’m a back-end systems developer, and Silverlight is definitely aimed at the UI-set. I know my limitations.

As I often think when dealing with Microsoft technologies, I wish they integrated a little more cleanly. Still, Silverlight and WCF are very usable together, and if you’re still writing web services in ASMX files, you owe it to yourself to at least consider WCF. I suspect it may make your life a lot easier in the long term.

A Farewell to Jerry Yang

Today Jerry Yang announced that which was probably inevitable: his resignation as CEO of Yahoo!. To be honest, I can’t blame him. He received heavy fire for refusing Microsoft’s buyout offer a few months back, and now the company’s stock is worth half of what it was when the offer was still a possibility. He was forced to bring a pirate onto his board, in the form of [Carl Icahn](http://en.wikipedia.org/wiki/Carl_Icahn), and then was still at the helm as the economy began to collapse around him, a collapse that was, unfortunately, long overdue.

Personally, I don’t blame Jerry for any of Yahoo!’s woes, largely because I don’t think Yahoo! is that bad off, but also because I firmly believe that a Microsoft-Yahoo! merger would have been distinctly negative, at any price. Yahoo! shareholders will probably disagree with me on that one, and while I’m more concerned with the long-term health of the industry, if I did own Yahoo! stock, I suppose I might be annoyed that I’d missed my payday.

So, why do I think Yahoo! isn’t that bad off? According to Alexa’s Top 500 websites list as of right now, Yahoo! has more traffic than Google. More traffic than anyone else on the Web. It’s really easy for those of us who follow the tech news and try to keep up on what’s hot in web technologies to forget that Yahoo! is the oldest name on the Web, and for many, many people, they’re still the only name that matters. I even know people who are starting to leave Google for Yahoo! in the search arena because they’re finding that Yahoo!’s results are often more relevant. Admittedly, this is just because everyone is concerned today with gaming Google for high PageRank, but I challenge each and every one of you to use Yahoo! as your default search engine for a week, and see if you really depend on the Big G that much.

So what is Yahoo!’s problem? Traditionally they’ve had trouble monetizing their products. Yes, they sell ads. Yes, they are the second biggest ad company on the web today, but if you look at the Yahoo! Publisher Network, their ad offerings look like a poor attempt to beat Google at their own game. Yahoo! does innovate, but their innovations as of late have not been in big money-making areas. Honestly though, Google has traditionally had the same problem: the majority of their properties do not earn them money directly, and it’s questionable whether even the indirect money-making efforts (advertising and collecting data for more effective advertising) directly pay for a site. Admittedly, Google is profitable, so their entire ad network is clearly effective, but at the moment, I’m talking about individual applications.

This is not meant to be an indictment of Google. Google has done well for themselves, and while it seems to me that Yahoo! is merely trying to play catch-up, it’s hard for me to condemn them for doing so. Google has done very well, and tapping into even a small part of that success would be huge for Yahoo!. Jerry Yang knows this, I believe. While some people believe that the handing of the company over to a new leader will cause talks to reopen with Microsoft, I tend to believe that from Microsoft’s perspective that ship has truly sailed. The hope now is that new leadership will be able to make the decisions that Jerry Yang wouldn’t, perhaps couldn’t. That new leadership could help drive innovation in spaces that stand to actually make Yahoo! more profitable.

I say ‘more’ profitable, because Yahoo!’s P/E is almost as good as Google’s, even if their Market Cap is significantly lower. Admittedly, the Earnings Per Share is significantly lower, but this, this is what the new CEO may be able to correct. I believe that Yahoo! still has a place. I believe that Yahoo! can still be relevant, and while Yang is leaving as CEO and is returning only as the Chief Yahoo! (essentially the Chief Architect for the company), I hope that the company survives. Perhaps this will be like when Bill Gates handed the reins of Microsoft over to Steve Ballmer, and Yahoo! will come out stronger than before. Yang may “bleed purple”, but I’m definitely hoping that turning the company over to someone more business minded will spell a new future for Yahoo!

Sustainable Living: Rice

I am not descended from a people who traditionally ate rice. As an American of European descent (a hodgepodge of East and West), with probably a touch of Native American, my family has traditionally eaten wheat and barley. Good grains to be sure, but when it comes to importance to the human race, these staples have nothing on rice.

Rice has always been the traditional cereal grain of Asia, first cultivated by the Chinese around 11,500 BCE. Its position as the core staple in Asia basically means that it’s the core food for something like 1/3 of the global population. Beyond that, it’s used heavily in Middle Eastern and African cuisine, and has taken on increasing importance globally both as populations mingle and as its nutritional value is acknowledged.

Per 100 grams, rice has over 7 grams of protein and over 1 gram of dietary fiber. To be fair, this is significantly lower than barley, but rice is easy to grow and doesn’t deplete soil the way that barley will. It’s a highly sustainable crop, which is a large part of what makes it such an excellent staple crop globally.

Of all the cereal grains, I believe rice to be one of the most filling. All the cereal grains work well for making us ‘feel’ full, helping us be more comfortable with less food. Rice does an excellent job with this, and given that Japanese cooking in general has traditionally been filled with nutrient-rich seafood, the relatively nutrient-poor nature of rice has been less important.

So, given the importance of rice, how does one prepare it? Most Americans probably just do what I do: water, butter and the rice in a pan, brought to a boil then left to simmer until the rice is done. This works alright for me, though some care is required, as nothing is quite as bad as rice burnt to the bottom of a pan.

For those people who really eat a lot of rice, it may be worth investing in a rice cooker. A rice cooker is a small dedicated appliance that is able to ‘track’ the temperature and humidity in the pot, helping ensure the rice is always properly done and not burnt. Some models can double as a steamer as well. Even if you’re not willing or able to buy a dedicated appliance for this, learning to cook rice effectively will always pay off culinarily in the long run. Rice is a good, inexpensive staple, which can be dressed up easily however you’d like.

Off to Code Camp

This weekend marks the Seattle Code Camp, which I’m lucky enough to be sent to by my employer. I talked about Code Camp last year when we went to Boise, which was a fun trip, but this time we’re going to be at the DigiPen campus in Redmond, WA, which is practically in the heart of the Microsoft campus.

What I love about Code Camps is that they are designed to be completely vendor agnostic. Admittedly, every code camp I’ve been to so far (read: almost two) has been centered heavily around Microsoft technologies, and Microsoft tends to sponsor these events, but they are open to almost anything. There was a fair amount of Java stuff in Boise, and there is a presentation on git in Seattle. I would love to see more Open Source stuff, but I’m not presenting, so my complaints aren’t quite as loud as they could be. Hopefully next year, my work schedule will be more conducive to preparing something like this (i.e., not working 12+ hour days), but no luck this year.

Since I do work mostly with MS technologies, the way Code Camps are currently structured is much more relevant to my current line of work, which is nice, but I rarely derive a ton of enjoyment from using most Microsoft platforms. I’m growing to love ASP.NET MVC, but I tend to attribute that more to how unlike most Microsoft frameworks MVC is.

Still, there is much to learn, and hopefully much to share. We’ll be recording the sessions we attend, so we’ll have video, which I’ll try to get some of posted. And I’ll just continue reminding myself that .NET is a platform with a rich Open Source tradition these days; I’ll just have to take solace in that. ;)

Building Scalable Web Sites

I had never planned to be a web-developer. I just sort of fell into it, which has worked out well overall, but I’ve had a rough time learning what is generally considered best practices in the industry. The first company I worked for had been built on virtually no infrastructure, by people who I honestly feel knew a lot less than even I did. The application was poorly constructed, hard to reuse, and the server’s configuration left a lot to be desired when our webmaster left, leaving me the sole IT staff of a company that wanted to be an eCommerce powerhouse. It’s difficult to learn best practices when the code-base thrust upon you is in such a frightening state of disarray.

The code wasn’t in source-control; edits were made against the production site. There were multiple copies of each include file, typically only one of which was actually in use. The code was the best example of PHP Spaghetti Code I had ever seen. Whenever anyone says that PHP is a terrible language, this was exactly the application they were talking about (it was closed source, so I’m being a bit facetious, but it was awful). For a lot of reasons I wasn’t very disappointed when my life took me away from that place and into my current position. Sure, the technology was classic ASP and VBScript, neither of which I knew, but the software was at least well designed. Within two weeks, I was contributing code changes, fixing bugs, and implementing features. Still no source control, but we’re close to fixing that.

I knew I needed to know more. The problems that we typically face are rarely real scaling problems; we do things that work well for our relatively small data sets and relatively small user-bases that I know wouldn’t serve us forever if we expected real growth. But we’re a niche internal service, and we’ve got plenty of room to grow for the time being. Still, I needed to know more, so I found this book, Building Scalable Web Sites by Flickr engineer Cal Henderson.

For those unfamiliar with Flickr, it’s the photo sharing site that has over 4000 photos uploaded every minute as of this writing. It’s enormous. Clearly, Cal and his team know a thing or two about scaling web applications to massive scale. And it shows. This book is written very generally, talking more about things you should consider than actually walking through building an application, and it works its way through the stack from hardware to software to the NOC you host in. Its best piece of advice is simple: build the site you need right now, not the site you hope to be next year.

The book is a little older, having been published in 2006, and a lot has happened in the web space in the last two years. We’ve had the launch of Google’s App Engine, Amazon’s Elastic Compute Cloud, and Microsoft’s Windows Azure, which will have an enormous impact on the web applications space, because now you can host applications that scale nearly infinitely without ever investing in hardware. Still, I believe that applications can and will outgrow these services, and even the hardware considerations discussed by Henderson will be useful.

More than that, however, is the discussion of how to design the software. Henderson talks again and again about a layered approach to application design. Whether it’s the business logic supporting multiple front-ends, the database caching system, or the server farm in general, Henderson does an excellent job of walking you through all the considerations you might need in order to build the infrastructure for a successful web application which can scale with your company. A lot of what Henderson has to say seems like common sense once you’ve read it, but most really good ideas do, in my experience.

I now understand why Remember the Milk practically lists this book as required reading for their job applicants, and I suspect a lot of others feel the same way. Even if you’re planning on deploying to one of the newer cloud-services, you owe it to yourself to read this book. It won’t guarantee you success, nothing can, but it will at least make sure you’re as armed as you can be to enter the playing field on a fair level.

Book Meme

Picked up this Meme off Jono Bacon on Planet Ubuntu.

Instructions:

  • Grab the nearest book
  • Open it to page 56
  • Find the 5th sentence
  • Post the text of that sentence in your blog with these instructions.
  • Don’t go searching for a book, just grab the closest

Mine is from the Invertebrate Identification Manual by Richard A. Pimentel, first published in 1967 and now owned by the WSU Library.

Sea Lichens (Order Cyclostomata) - encrusting, fanlike to circular.

Celebrate Armistice Day

At this, the eleventh hour of the eleventh day of the eleventh month, we as a planet remember the end of hostilities in World War I. Ninety years have passed since the War to End All Wars, and we find humanity still caught up in many sectarian conflicts globally, with the United States caught in two wars in the Middle East.

Clearly, those men who fought in World War I were a little too ambitious in how they named that conflict, which I believe is why, in 1954, the United States extended this celebration to focus not solely on the Armistice of 1918, but to celebrate all those who have served in the military, particularly in war time.

I have several uncles, and some more distant cousins, who either are serving today, or have served in the last several decades (was Desert Storm really almost 20 years ago? Wow.) While I personally have never served, nor would I plan to enlist barring serious global conditions I don’t feel are very likely, I try to honor those who have, and those who still are. Ours is a nation with a history rife with wars, though not much worse than most countries, I suppose. The only thing that’s interesting on this note is that, in the last century, we’ve had virtually no combat experienced on American shores.

America as a nation has many times sent our military to support other nations, mostly in the name of ‘spreading the light of freedom’. It’s interesting, largely because the American public shows virtually no interest in the politics of other nations. How many people can name the current Prime Minister of Britain? The President of France? How many people can say why the last presidential election in Russia was so controversial on a global scale? Sure, some can, but I guarantee you that a lot more people globally were aware of and invested in the election in this country last week.

Americans need to be more invested, not only in our own politics, but in politics globally. Should we all have opinions on elections in other countries? No, but we should at least be aware of what is happening, and what that means for us and our nation. Our veterans, particularly our veterans in this conflict, have an interesting perspective on global politics. They see the results of these actions on a daily basis, either because of the situation wherever they serve, or the non-Americans with whom they often serve.

Their experiences are to be commended, as they provide our veterans with perspective that most of the rest of us could never hope to match. The experience is not always positive, certainly, but I hope that the good outweighs the bad. Certainly part of that could be accomplished by showing your veterans your appreciation. Whether you agree with any current deployments or not, these people have been (or are) in service of this nation. And I wish to thank them all for that service.

Sustainable Living: Milk

Dairy in America is in a really strange state. For my parents’ generation (people who are around 50), dairy meant cow’s milk for their entire childhood. That was the only option. Things have changed a bit, in that these days we can more easily get dairy derived from goats, or even fake dairy derived from soy. In more recent years, people have once again begun acknowledging that milk can (and should) be a part of a healthy diet.

According to some studies, however, lactose intolerance is an incredibly common problem, particularly among older people. However, I think that this has more to do with the way in which we consume milk than anything innate in milk. And I’m pretty sure I’m not just blowing smoke here. My wife had, for many years, complained of lactose intolerance. She suffered through it out of love for cheese and ice cream, as I likely would have.

After some badgering, I convinced her to start taking Lactase supplements, which helped quite a bit. The interesting thing is, since we switched to drinking whole milk, she hasn’t needed it. It turns out that lactose is more concentrated in the liquid portions of milk, and therefore dairy with a higher fat content is naturally going to have less lactose to be processed. But that’s far, far from the whole story.

Ultimately, the problem isn’t that most Americans tend to buy 2%, 1%, or even skim milk instead of whole milk (which usually has under 4% milk fat); the real problem is that we drink pasteurized milk.

That’s right, pasteurization is absolutely destroying our ability to process milk. My father has even commented on the fact that, when he was younger and they were drinking milk that had fat floating on its surface, he never had digestive issues with it. Now, one could argue that this is related to age, since most people develop difficulty with milk with age, but among European Americans, the incidence of lactose intolerance tends to be around 10-15%. There is a biologic predisposition in mammals toward the body stopping production of the lactase enzyme; however, it seems that among peoples who consistently have access to lactose, the gene is far less likely to shut off. Even the Japanese, who typically developed near-100% lactose intolerance after weaning, are starting to become happy milk drinkers, even into adulthood. I’m no geneticist, but it seems that the gene is generally turned off when the lactase is no longer needed. Makes sense, but I’d like to see more research on the subject, myself.

Still, even among people who consistently enjoy milk products, it seems that lactase production does slow with age. That genetic predisposition, poking its head up again. I believe this can be linked directly to the pasteurization of milk. But, let’s start at the beginning while we form this argument.

Pasteurization was ‘invented’ by Louis Pasteur in 1862. I put ‘invented’ in quotes, because people had been performing the basic process for literally thousands of years by this point, but Pasteur did codify the scientific basis for the practice, and that does deserve some respect. Pasteurization is basically cooking. It’s raising the temperature of food to the point where various microorganisms can no longer survive, or at least so that enough don’t survive that what’s left can be easily dealt with by the body’s natural immune system.

Now, heat does break apart the microorganisms, but it also breaks apart a lot of other chemicals in the food. This changes the way the food tastes (sometimes for the worse), and can break apart proteins and other structures. It changes the chemical structure of the food, and this is one potential problem with the process (but it’s fairly minor in my opinion). And with milk, where ‘ultra-pasteurization’ is often used, this process involves quickly taking the dairy from ambient temperature to 250 degrees Fahrenheit, and back down within a fraction of a second.

So, this is meant to kill disease right? What could possibly be wrong with that?

The stated purpose is honorable, I’ll agree. The problem is that pasteurization doesn’t just kill the bad microbes, it also kills a lot of the good ones. Including, in milk, some bacteria that naturally help the body break down the lactose. But, 1862 was at the height of the industrial revolution. More people were moving into the cities, and living denser than they ever had before. The lack of refrigeration required perishable food sources, including dairy, to be close to the cities. People didn’t truly understand at that time what was responsible for causing disease, and standards for food safety were non-existent.

Cow got sick, and its milk contained blood or pus? That didn’t matter, ship it to the waiting customers. Cows all covered in shit and it’s falling into the buckets as they get milked? No problem! If you’re lucky, the milker will pick it out quickly. Contamination was common, partially due to ignorance, partially because the customers didn’t bother to ask, and the dairies weren’t being held accountable.

This just isn’t true anymore. In this country we have the Food and Drug Administration, which has set out standards for food production because consumers weren’t able to make an educated choice for healthier products. However, even the FDA occasionally gets the science wrong. The FDA, as recently as 2004, has still been against Raw Milk. Now, I’m of the opinion that this is largely due to the Dairy Lobby in this country, which tends to work really hard to stop people who don’t want to play by their rules.

Luckily, the FDA hasn’t tried to completely stop the sale of Raw Milk. They’ve left that power up to the individual states. In Washington, it is possible to get licensed to produce Raw Milk commercially (interesting fact: Raw Milk in Washington must have a lower bacterial content than that which is allowed in already-pasteurized milk). Idaho technically could have a Raw Milk dairy, but the state hasn’t licensed anyone for decades, and that is seen as unlikely to change. However, the FDA has banned interstate commerce of raw milk, so a Washington dairy can’t transport its product to Idaho for sale. Which is too bad, because the local organic food Co-op here in my area is in Moscow. But, if you own your own cow, or participate in a Cow Share, you can still have access to Raw Milk pretty much anywhere in the US.

As I’m not a Food Scientist, I’m going to defer to some people who are. The Weston A. Price Foundation has begun the Campaign for Real Milk (a cute play on Britain’s Campaign for Real Ale from years ago), and that site has tons of information on why you should want raw milk, and how to get it in your state. They suggest reading ‘Medical Maverick’ Dr. William Campbell Douglass II’s book, The Milk Book: The Milk of Human Kindness Is Not Pasteurized. I suggest you do so as well.

There is big, big business in the milk market. The factories which pump out the pasteurized milk, are owned by a small group of companies, and they can sell their product hundreds of miles away from where they bottle it because of pasteurization and refrigeration. And the US Government has essentially forced this system on us since the 1930s. Raw Milk is more expensive, no doubt about that, but mostly that’s because so few people are producing it. If we move to a less centralized dairy system where Raw Milk is more economically feasible, the price will drop. Will it drop to what we’re paying for milk today? Maybe not, but since it won’t have to travel as far, or go through as much processing…it just might.

The End of One Thing, the Beginning of the Next

On Tuesday, November 4, 2008 at about 8pm Pacific Time, the major news networks officially called the election for President of the United States of America for Democratic candidate Barack Obama. But you all already knew that. A lot of people have been expressing excitement over the issue, but to be honest, I’m not one of them. I did not vote for Barack Obama, but neither was I able to bring myself to cast my vote for John McCain.

Don’t get me wrong, I respect John McCain as one of this country’s greatest public servants. I know some would disagree, but John McCain’s devotion to this country and its people is inspiring. So, why did I not vote for him? Because the last six months have made me wonder if John McCain was his own man anymore. He refused to bring the party away from its modern message of hate and fear. While he didn’t come out himself and try to mark Obama as a Muslim, he did virtually nothing to stop his people (particularly his Vice Presidential nominee) from making those suggestions. While McCain didn’t act poorly himself, he did not, in my opinion, do enough to put a stop to it. In essence, this honorable man, through his own inaction, let his own name get sullied, more so because it was in the name of an unfair attempt to sully the name of another.

Which brings me to Sarah Palin. When Palin was first announced as the VP nominee, I was interested. Here was a virtual unknown, from a state that is often poorly represented on the national stage, and for a moment it seemed like a decision worthy of the McCain I respected. It seemed that he did something in spite of the party, intended to shake things up. I was disappointed that the nomination wasn’t given to Joe Lieberman, or another more moderate candidate, but I knew that the party would never allow McCain to select a non-Republican as his running mate.

Then Palin started talking, really falling into her stride. It was clear that she appealed to a large number of people, and energized them for the campaign. Unfortunately, this is exactly why the race became such a blowout. Sarah Palin attracted exactly the kind of people the Republican Party shouldn’t be pandering to, but has been anyway. The willfully ignorant. People who revel in Barack Obama being referred to with his middle name, Hussein. People for whom McCain’s energy policy boiled down to three words: “Drill Baby, Drill.” People who honestly believed in, and looked forward to, McCain and Palin attempting to overturn Roe v. Wade.

In other words, people who were never going to vote for Barack Obama. However, this choice of Vice President, and the apparent recklessness with which it was made, drove a lot of middle-ground voters away from McCain. If you look at the national polls, such as the one below, McCain suffered a major hit in early September, right after announcing Palin as his running mate. His numbers continued to drop for well over a month. Yes, eventually there was a bit of a rebound, but Obama’s numbers were consistently rising over that same time period.

As much as I disagree with Obama’s policies, I did find it amazing how he ran his campaign. I had never, in my entire life, seen people as excited about a candidate as many were about Barack Obama. For once, this election felt to be more about the man than the party for a lot of people. I watched as expatriate citizens and resident aliens worked hard to volunteer on behalf of Obama. I watched hundreds of students here at Washington State University not only walk around wearing Obama merchandise, but actively engage people in discussion regarding the issues. I still fundamentally disagree with most of the policies they were arguing for, but I cannot argue with the passion embodied by many Obama supporters.

And so, early in the night, just after the polls closed here on the West Coast, the election was called. Barack Obama won, by a healthy margin. And John McCain spoke. His concession speech is exactly what I’ve been wanting to see out of John McCain for the past six months or so, but have been desperately missing. It was respectful, and clear. McCain still disagrees with Obama on many core issues, and no doubt this will show in McCain’s voting record for the remainder of his time as the Senator from Arizona. Had the McCain I saw in the following video been the McCain I had been watching for the last few months, I would have gladly cast my vote for him, a man I wanted desperately to believe in, but found myself unable to. Had McCain stayed true to himself, I firmly believe this election would have been far closer.

But, like McCain, I do not wish to dwell on the loss. I do not wish to enter the upcoming Obama presidency full of spite and anger. Obama ran a good campaign. One of the best campaigns I’ve ever watched. In his speech on Tuesday, accepting the role of President-Elect, Obama continued his message of change, passing his “Yes We Can” message on not only to his supporters, but to this entire nation. It’s like what Trent, over at The Simple Dollar, said in his morning post yesterday.

Today (Tuesday), America pretty clearly voted for change, for better or worse. But today is when the real change begins, and it begins with you.

Ultimately, if change is supposed to happen, we cannot depend on Barack Obama. We cannot depend on anyone, except for ourselves. This country is hurting, and the healing it requires is something deeper than any one man, however charismatic, can accomplish. It requires work from all of us.

I agree with some of Barack Obama’s platform. I believe in Open, Transparent Government. I believe in Network Neutrality. I believe that while Faith is important to life, it should not be a consideration in governing. I don’t believe that Obama is right when it comes to Health Care Reform. I don’t believe Obama is right when it comes to Tax Reform (note: I will benefit from Obama’s tax plan). I don’t believe that Obama’s Energy Policy is enough.

But, I’m willing to give him a chance. What choice do I have? I won’t leave this country for this reason alone, not yet. If Obama can bring about positive change in the way our government operates, but more importantly in the way people view their role in government, I will be content. If Obama makes it easier for me to track the government in a meaningful way, I’ll be content.

Obama’s acceptance speech didn’t feel terribly important to me. It was mostly reiterating what he’s been saying for months, which is reasonable. The only thing I feel is truly important in his speech is the set of comments beginning around the tenth minute, where he begins to call out to the people to be better, to work together, and to work harder. He invokes Abraham Lincoln (a Republican), and in so doing reminds America (albeit briefly) of the qualities on which that party was founded, and the principles on which I tend to agree with that party.

To those Americans whose support I have yet to earn, I may not have won your vote tonight. But I hear your voices, I need your help, and I will be your President too.

I do not believe that Barack Obama’s Presidency will be what many people expect. I believe that Barack Obama will work hard. I believe that he will, by his very nature, improve the way many people globally view this nation. But I don’t believe that many of his plans are ultimately good for this country. I could be wrong, but there are ways to improve health care that don’t involve government subsidies. I believe that while alternative energy is important enough to invest in, Nuclear Power is safe, reliable and effective now and needs to be a part of any major energy policy. I believe that America electing a Black Man as President does not make America not racist.

I wonder how many people voted for John McCain purely to avoid voting for a Black Man. I know that some Black People went out of their way to be remarkably racist, going so far as to keep white people from getting to the polls. That story is backed up by personal reports. Admittedly, America has come a long way at breaking down its racist roots, and there is a good chance Obama will take that even a little further, but the next step in breaking down racism in this country is to recognize that white people are not the only ones capable of being racist.

So, for the next four years, we as a country have a President whose campaign I respect, and with whom I agree on a few of the issues, but with whom I disagree on enough important issues that he could not earn my vote. Perhaps that will change by 2012; I know that I’ll be watching with interest. There is change in this country’s future, and I know that some of it is for the better. It’s not the change I chose, but often that’s not really an option.

LINQ and Premature Optimization

As a Man, the word ‘Premature’ sends shudders down my spine. As a Developer, it does as well, for additional reasons. Every student of computer programming has been taught about optimization at some point. Whether it’s learning assembler for the hand-tuning of certain methods, or moving variable declarations outside of loops to save cycles, we’ve all been taught it. And at least for the last several decades, we’ve had the words of Donald Knuth from the December 1974 issue of the ACM journal Computing Surveys thrown at us.

Premature optimization is the root of all evil.

And he’s right (maybe). And a lot of people have pointed that out. Luckily, we live in a day where compilers can often do a better job of optimizing our code than we can. These days, if we’re having a performance problem, we probably have a problem in our algorithm, something that requires changing (not tweaking) the algorithm. Note: I’m not including performance problems relating to bugs here; that’s a different problem.

Unfortunately, sometimes optimization done by the compiler can be premature and broken. I have encountered such cases in LINQ, and needless to say, I’m not terribly pleased by it.

In our database, we have our permissions split up among three different tables (representing the three different types of permissions). I’m not going to argue about whether or not this is the best way to have done this; I’m probably going to be changing this behavior, but it’s what I have right now, and that small refactor isn’t important enough to change right this instant. It did, however, lead to the following problem. But first, the schema:

DBSchema.png

As you can see, each set of permissions requires just a bit more data to qualify it. However, I found (at least) one case where I wanted to combine the records from each of these tables into a single return from a web service, using the following structure:

    public class PermissionRecord
    {
        public UserRecord User { get; set; }
        public PermissionType Type { get; set; }
        public string Campus { get; set; }
        public int Year { get; set; }
        public int Term { get; set; }
        public string Prefix { get; set; }
        public int? MinCourseNumber { get; set; }
        public int? MaxCourseNumber { get; set; }
    }

For those unfamiliar with newer versions of C#, the ‘?’ after a type means that the type is ‘nullable’. This only works on value types, which couldn’t be null before. Anyway, back to the code. I wanted to fill in a List of these with records from Permissions_Prefix and Permissions_Course. But, as is shown in the schema above, MinCourseNumber and MaxCourseNumber don’t exist in Permissions_Prefix, and should be set to null for those records. Sounds easy enough, right? That’s what I thought, which led to the writing of the following LINQ query.

(from t in Permissions_Prefixes
 join u in UserNames on t.UserWsuId equals u.WsuId
 where t.Campus == "Pullman" && t.Year == 2009 && t.Term == 1 && t.Prefix == "A S"
 select new PermissionRecord()
 {
     // This table has no course number range, so Min/Max are explicitly null.
     Campus = "Pullman",
     Year = 2009,
     Term = 1,
     Type = (PermissionType)4,
     Prefix = "A S",
     MaxCourseNumber = null,
     MinCourseNumber = null,
     User = new UserRecord
     {
         UserId = t.UserWsuId,
         FullName = u.FullName,
         EMail = u.EMail
     }
 }).Union(from t in Permissions_Courses
          join u in UserNames on t.UserWsuId equals u.WsuId
          where t.Campus == "Pullman" && t.Year == 2009 && t.Term == 1 && t.Prefix == "A S"
          select new PermissionRecord()
          {
              Campus = "Pullman",
              Year = 2009,
              Term = 1,
              Type = (PermissionType)8,
              Prefix = "A S",
              MaxCourseNumber = t.MaxNumber,
              MinCourseNumber = t.MinNumber,
              User = new UserRecord
              {
                  UserId = t.UserWsuId,
                  FullName = u.FullName,
                  EMail = u.EMail
              }
          })

Fairly straightforward union. Request all the relevant permissions out of Permissions_Prefix (setting the Min and Max course values to null), and union them with the relevant records from Permissions_Course. This looked to me like it should work, no problem. Unfortunately, it produces the following T-SQL statement:

SELECT [t4].[value] AS [Campus], [t4].[value2] AS [Year], [t4].[value3] AS [Term],
            [t4].[value4] AS [Type], [t4].[value5] AS [Prefix], 
            [t4].[value6] AS [MaxCourseNumber], [t4].[value6] AS [MinCourseNumber], 
            [t4].[UserWsuId] AS [UserId], [t4].[FullName], [t4].[EMail]
FROM (
    SELECT @p4 AS [value], @p5 AS [value2], @p6 AS [value3], @p7 AS [value4], 
                @p8 AS [value5], @p9 AS [value6], [t0].[UserWsuId], [t1].[FullName], 
                [t1].[EMail]
    FROM [Permissions_Prefix] AS [t0]
    INNER JOIN [UserNames] AS [t1] ON ([t0].[UserWsuId]) = [t1].[WsuId]
    WHERE ([t0].[Campus] = @p0) AND ([t0].[Year] = @p1) AND ([t0].[Term] = @p2) AND ([t0].[Prefix] = @p3)
    UNION
    SELECT @p14 AS [value], @p15 AS [value2], @p16 AS [value3], @p17 AS [value4], 
                @p18 AS [value5], [t2].[MaxNumber], [t2].[MinNumber], [t2].[UserWsuId], 
                [t3].[FullName], [t3].[EMail]
    FROM [Permissions_Course] AS [t2]
    INNER JOIN [UserNames] AS [t3] ON ([t2].[UserWsuId]) = [t3].[WsuId]
    WHERE ([t2].[Campus] = @p10) AND ([t2].[Year] = @p11) AND ([t2].[Term] = @p12) AND ([t2].[Prefix] = @p13)
    ) AS [t4]

Take a close look at that query. Notice how in the outer select both MaxCourseNumber and MinCourseNumber are being set to value6? And that the first union select has 9 items while the second has 10? Needless to say, this doesn’t work. Since both Min and Max Course Number are being set to the same value, LINQ decides to try to optimize the select by only selecting the value once and assigning it twice. Oh, easy fix, I hear you say: just switch the two queries, and put the request from Permissions_Course first!

SELECT [t4].[value] AS [Campus], [t4].[value2] AS [Year], [t4].[value3] AS [Term],
    [t4].[value4] AS [Type], [t4].[value5] AS [Prefix], 
    [t4].[MaxNumber] AS [MaxCourseNumber], [t4].[MinNumber] AS [MinCourseNumber], 
    [t4].[UserWsuId] AS [UserId], [t4].[FullName], [t4].[EMail]
FROM (
    SELECT @p4 AS [value], @p5 AS [value2], @p6 AS [value3], @p7 AS [value4], 
        @p8 AS [value5], [t0].[MaxNumber], [t0].[MinNumber], [t0].[UserWsuId], 
        [t1].[FullName], [t1].[EMail]
    FROM [Permissions_Course] AS [t0]
    INNER JOIN [UserNames] AS [t1] ON ([t0].[UserWsuId]) = [t1].[WsuId]
    WHERE ([t0].[Campus] = @p0) AND ([t0].[Year] = @p1) AND ([t0].[Term] = @p2) AND ([t0].[Prefix] = @p3)
    UNION
    SELECT @p13 AS [value], @p14 AS [value2], @p15 AS [value3], @p16 AS [value4], 
        @p17 AS [value5], @p18 AS [value6], [t2].[UserWsuId], [t3].[FullName], 
        [t3].[EMail]
    FROM [Permissions_Prefix] AS [t2]
    INNER JOIN [UserNames] AS [t3] ON ([t2].[UserWsuId]) = [t3].[WsuId]
    WHERE ([t2].[Campus] = @p9) AND ([t2].[Year] = @p10) AND ([t2].[Term] = @p11) AND ([t2].[Prefix] = @p12)
    ) AS [t4]

No dice. The outer select is correct now, but the second inner select (from Permissions_Prefix) still selects value6 only once, and names it value6, when clearly the outer select is expecting columns named MaxNumber and MinNumber. So, these unions are completely broken because LINQ is trying to optimize them independently of one another, which clearly doesn’t work. Not only that, but the optimizations are only half done. LINQ will gladly try to combine values in the select part of the statement (if I were to change the ‘null’ to ‘1’ in the query against Permissions_Prefix, it would combine it with the Term value), but it insists on sending in multiple parameters with the exact same values in other parts of the query. For instance, the value for Term is the same in both the where clause and the select clause, but it’s clear that the generated SQL is sending the parameter in with two different names. This happens within the same query, not just across the two queries.

Why is this such a big deal? Simply because I am now forced to separate this out into two distinct LINQ queries, each materialized with ToList() inside of .NET. Once I’ve done that, I have to append the results of the second query onto the first list, re-sort the list how I want it, and dump the data back across the wire. Essentially, I have to redo in .NET a lot of things that SQL is specially tuned to do. Now, in this case, the number of records we’re dealing with is trivially small, and as we know, everything is fast for small n.
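
A minimal sketch of that workaround, assuming the two halves of the union above have been split into queryables named prefixQuery and courseQuery (hypothetical names, and an arbitrary sort):

    using System.Collections.Generic;
    using System.Linq;

    static class PermissionQueries
    {
        public static List<PermissionRecord> Merge(
            IQueryable<PermissionRecord> prefixQuery,
            IQueryable<PermissionRecord> courseQuery)
        {
            // Materialize each query separately, since the generated UNION is broken.
            List<PermissionRecord> permissions = prefixQuery.ToList();
            permissions.AddRange(courseQuery.ToList());

            // Re-sort in memory: work SQL would normally have done for us.
            return permissions
                .OrderBy(p => p.Prefix)
                .ThenBy(p => p.MinCourseNumber)
                .ToList();
        }
    }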

I fail to understand why Microsoft has built some optimizations into the LINQ processor and not others. Admittedly there are times when optimization is difficult because it’s not always clear when you should optimize the SQL. Sometimes (as in the query above) you can optimize ahead of time. Other times, you’ll need to wait until runtime, but that can always be determined by code-analysis. And if you’re not sure, you should always err on the side of caution, and not combine my select elements simply because a cursory glance suggests it might work, because there is a very good chance it won’t.

Premature Optimization may be the root of all evil, as it can lead to a large amount of wasted developer time, but as a developer I should be able to trust my compiler to not change my code in a breaking fashion. LINQ is a great tool, but it still has some issues I really hope Microsoft corrects for .NET 4.0.

Sustainable Living: Transportation

Americans have a love affair with the open road. At least, that’s what we’ve been told for years. And really, even with high gas prices I don’t think that changed much. People still traveled, they still drove. Maybe not as far, but people still drove, complaining all the while. However, the recent incredible rise in oil costs, from which we’re only now starting to see a respite, has brought to many people’s attention the need to lessen our dependence on fossil fuels.

There are plenty of ways to accomplish this, even in today’s market. Carpooling, public transportation, bicycling, walking, etc. I’m lucky enough to now live where I can walk to work every day, even in the dead of winter. This is a huge improvement on my gasoline usage, even though I only used to have to drive across town, and I’m lucky enough to live in a town with a pretty decent public transportation system. Unfortunately, here in the Western United States, things tend to be spread out, many areas lack effective public transport, and personally I’ve never liked the idea of carpooling because I don’t like being beholden to other people’s schedules. My wife’s horse lives quite a ways out of town. We try to drive less, but we absolutely have to drive.

In part because our current car, a ‘95 Mazda B4000 pickup, gets fairly bad gas mileage (I think the mass airflow sensor is going out too, which isn’t helping), we decided to buy another car. We bought a 1971 Volkswagen Type 3 Squareback (which should get near 30 mpg). I’ll get some pictures up soon; the weather wasn’t that great this weekend. The car isn’t perfect. It lacks seatbelts, some of the vinyl needs to be replaced, it’s got a few rust spots, the high beams seem more controlled by the angle of rotation of the steering wheel than any other control, the tires are ancient, the transmission is loose, and the rear brake cylinders are leaking, but the car runs, and runs great.

And all the rest of that stuff, I can fix myself.

I do feel that part of sustainability on a global scale needs to involve doing things for yourself, and while I don’t necessarily want to put a mechanic out of business, I want to have enough knowledge that I don’t have to go to the mechanic, and that I can communicate with the mechanic in a meaningful way. If something is acting up, I want enough knowledge to diagnose the problem, if not correct it. Plus, with this car I can build up that knowledge base pretty easily.

We love the open road, we love to travel, but mostly, I think Americans just love to drive. Admittedly, our little Square isn’t a racecar; it isn’t fun to drive because it’s fast and feels out of control. It’s fun to drive because it’s smooth and simple, and it just looks nice. Of course, that’ll be a lot better once I get accustomed to it. I haven’t driven a manual transmission much in years, and it just takes time to learn the intricacies of any vehicle. Still, the car feels and looks good, it’s something my wife and I can work on ourselves, and it gets much, much better gas mileage, further reducing our ‘carbon footprint’.

The need to get from place to place is never going to change, and currently, gasoline is still the best way to drive an engine. Admittedly, I’m excited by the work of Tesla Motors. I’ve given a lot of thought to converting a car to electric. I think that’s where the future lies, but right now today, I need gas. And in getting a vehicle with better gas mileage, I’ve also got a vehicle I stand to learn a lot from. Sort of a best of both worlds for me.