November 2007 Archives

The Death of Copyright

Copyright will still be waved around for a good long while, but all that’s really being waved around is the desiccated corpse of an ideal which has long since lost the luster with which its creators bore it. In order to understand how far we’ve fallen, let’s begin with a discussion of classic copyright.

Per the Berne Convention of 1886, copyright is implicitly created the first time a work is made available in a physical form. The US, however, held out for more than a century: it wasn’t until the Berne Convention Implementation Act of 1988, which took effect in 1989, that noticeless copyright finally gained legal status for the American people. Copyright gives the creator of a work control over production of copies of the work, distribution of the work, creation of derivative works, public display of the work, transmission of the work, and transfer or sale of the above rights.

Importantly, copyright has limits. Copyright protects an expression of an idea, not the idea itself. This is what makes JK Rowling’s suit against the Harry Potter Lexicon potentially so ridiculous. The Lexicon presents information gleaned from Rowling’s novels, but its expression of those ideas is markedly different from Rowling’s own. If someone were to release a book of magic spells based on the Harry Potter world, they could do that, provided they don’t use the Harry Potter story in their work. The idea of the world is not covered by copyright, only Rowling’s expression of it. Further, fair use doctrine (formalized in the US in 1976) even allows a person to use portions of a work without licensing the copyright. The guidelines laid out in law require that a copyright holder prove damages, and the potential infringer cannot have used a significant amount of the work. It is this portion of the law that allows for parody, as well as allowing a home user to record from television or radio. If fair use dies, so does your TiVo.

Copyright typically lasts about 70 years after the death of the creator, depending on the jurisdiction under which the work was created, or, in the US, 95 years from publication if the work was created by a corporate entity. Unfortunately, this is where the first attacks on copyright occurred. In 1998, Sonny Bono and Disney lobbied hard for the Copyright Term Extension Act, which was seen by many as an attempt to protect Disney’s control over Mickey Mouse, created in 1928. In the end it worked, and Mickey is safe from the public domain through 2023.

Does the Disney Corporation really need to continue to own Mickey any longer? Mickey Mouse is one of the most beloved symbols of our global society, a fact that has helped Disney net billions of dollars. The purpose of copyright, to allow a creator control over an expression for a reasonable period of time, so that they can make a reasonable profit before the expression enters the public domain, has been more than satisfied. If anything, Disney’s desire for expiry-free copyright is hampering creativity, not fostering it.

So as the public-domain foundation of copyright is whittled away, what other attacks are flailing away at this once beautiful thing? Fair use is being attacked from every angle by the DMCA. The DMCA and the DRM bastard children that it has spawned have the media conglomerates telling me when and where and how I can view the media that I’ve rightfully purchased. I’m not talking about file-sharing, which falls clearly outside of fair use, but rather things like DVD and its Content Scrambling System. The irony of CSS was that, in treating customers like potential criminals, it forced many to behave as such, in order to view DVDs on Linux, or to import DVDs.

Still, DRM continues to flourish, through online music stores, video games, and high-definition DVD standards. If you don’t own an iPod or a Zune, the DRM-laden music schemes are useless to you, though there are alternatives. Amazon has a DRM-free music store, and Magnatune bills itself as “not-evil” (incidentally, Magnatune has been doing the name-your-own-price album sale for a while now; Radiohead’s recent release just made the model famous). Without the DRM layer, users are finding themselves able to use their music where they want to, and don’t have to fear losing the encryption keys that allow them to unlock the media they purchased.

Perhaps not all DRM is bad. The Nintendo Wii’s Virtual Console uses DRM to protect the downloaded games from copying, but I’d have no desire to play those games anywhere else anyway. Still, it limits my ability to use the media as I would want, and that is dangerous. That is where the problems lie in online music downloads, and that is where I think people are going to end up fighting DRM first. The second front I want to see taken down is the region-locking of DVDs, an attempt by the movie producers to prevent importing. If I want to pay a premium to import a European film before the production house releases it in this country, I should have that ability.

Canada is getting ready to pass its own DMCA-style law, one that goes far further toward gutting traditional copyright than the one here in the US. It’s too late for us, but you Canadians still have a chance. Don’t descend into copyright hell, as we have. We as a people are going to spend the next decade trying to dig our way out, and it might be futile. You don’t need to. Fight this today. Hopefully your victory will steel us, and help us find ours.

Financial Institutions and Weak Passwords

I fail to understand why ANY institution would limit any character from appearing in a password, but it happens all the time. When I first registered to do online banking at US Bank six years ago, they limited passwords to eight to twelve characters, and only letters and numbers. With 62 possible characters (upper- and lowercase letters plus digits), that allows only between 62^8 and 62^12 possible passwords. Yes, those numbers are big, but computers are getting faster, and such limits can be prohibitive, particularly if the passwords are stored unsalted, and thus vulnerable to a rainbow-table attack.

They’ve since changed this, allowing up to 24 characters, and symbols. Ignoring special ASCII characters, like linefeeds, this allows roughly 95 printable characters, opening the password space up to around 95^24, which is a lot harder to attack (this could still be susceptible to a rainbow-table attack though; I don’t know if US Bank salts their passwords before hashing them). Even if you only count the symbols people are actually likely to use, there are still ~80 characters that a password could consist of (i.e., the ones easily typed on a US English keyboard), and that exponent is a powerful thing.
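The arithmetic here is simple: the number of possible passwords is the character-set size raised to the password length, summed over the allowed lengths. A quick back-of-the-envelope sketch (the 62- and 80-character set sizes are my estimates of the two policies, not US Bank’s published figures):

```python
# Size of a password space: charset_size ** length, summed over allowed lengths.
def password_space(charset_size: int, min_len: int, max_len: int) -> int:
    return sum(charset_size ** n for n in range(min_len, max_len + 1))

# Old policy: letters and digits only (26 + 26 + 10 = 62 characters), 8-12 long.
old_space = password_space(62, 8, 12)

# New policy: ~80 easily typed characters, 8-24 long.
new_space = password_space(80, 8, 24)

print(f"old policy: about 10^{len(str(old_space)) - 1} passwords")
print(f"new policy: about 10^{len(str(new_space)) - 1} passwords")
```

That works out to roughly 10^21 passwords under the old policy versus roughly 10^45 under the new one; each extra character multiplies the attacker’s work by the whole character-set size.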

So, imagine my surprise when I go to TIAA-CREF’s website and find that my password must be 8-12 characters long, and can only consist of letters and numbers. Amazingly, I know I’ve seen this sort of requirement still in effect at other major financial institutions, and I’ve yet to be able to figure out where in the hell it’s coming from. Password schemes are easy to implement in a safe manner. They should use a cryptographically secure hashing algorithm (I’ve used MD5 and SHA-256, but lately I’ve migrated to bcrypt), users should be encouraged to use a long password with a variety of characters to prevent it from being cracked if an attacker gets the hashes, and the password should be properly salted before hashing to prevent rainbow-table attacks.
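bcrypt itself requires a third-party library, but the salt-then-hash idea is easy to sketch with nothing but the Python standard library, using PBKDF2 as a stand-in for bcrypt’s adaptive hashing. The function names and iteration count below are my own illustrative choices, not any bank’s actual scheme:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    # A fresh random salt per user defeats precomputed rainbow tables.
    if salt is None:
        salt = os.urandom(16)
    # The iteration count is the adaptive knob: raise it as hardware gets faster.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    _, candidate = hash_password(password, salt)
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("letmein", salt, digest))                       # False
```

Because each user gets a different random salt, identical passwords hash to different digests, which is exactly what breaks a precomputed rainbow table.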

It just doesn’t make sense. Computers are fast, and hashing algorithms are designed to be fast. To a certain degree, this speed is a problem, as a fast hash on the server means an equally fast hash for an attacker. As Ptacek explains in the post linked, an adaptive system like bcrypt is great, because its cost can be tuned to keep pace with advances in computing technology.

So, if the technology and procedures are out there to create good password schemes, why don’t more people use them? Plain and simple ignorance is the only explanation. Which is unfortunate, because ignorance costs a lot of money in this day and age. As I said, I’ve used MD5 and SHA-256 for passwords before, and in the places where I can, I’m trying to correct that error. Ultimately, though, as long as a password isn’t thousands of characters, it’s not going to create a bottleneck in transfer or processing. Minimum password lengths are far, far more important than maximum lengths, as short passwords will always be brute-forceable. Put a maximum length on a password, for sanity’s sake, but set it as high as you reasonably can given the storage requirements and constraints of your system. I’ve used a maximum length of 1024 characters before, which was far more space than anyone needed, and didn’t put any strain on the database.

In short, when designing new systems, use established mechanisms like bcrypt, or Kerberos if you’re in an environment where you can trust it. If you don’t, you’re likely to end up 30 years behind the times, and in a world of hurt because of your poor security. And if you’re a financial institution, fix your lousy schemes immediately. There is no excuse for poor security when you handle people’s money.

New Front on the War of the Unexpected: Burning Homes

Bruce Schneier posts another story in his series on the war on the unexpected. I’ve commented on this before, but I feel the need to once again raise the issue of how paranoid and stupid our world is becoming.

When going to private residences, for example, they are told to be alert for a person who is hostile, uncooperative or expressing hate or discontent with the United States; unusual chemicals or other materials that seem out of place; ammunition, firearms or weapons boxes; surveillance equipment; still and video cameras; night-vision goggles; maps, photos, blueprints; police manuals, training manuals, flight manuals; and little or no furniture other than a bed or mattress.

First off, most people are hostile and uncooperative when under duress; I doubt that firefighters encounter many people who aren’t while their houses are burning down. Second, what constitutes an ‘unusual chemical’? If I were into etching, I’d have a lot of acids lying around. As a brewer, I have a fair supply of chemicals sitting around the house for cleaning and sanitation purposes. As for the rest of that stuff, everyone has something off of that list. Some of us have quite a bit. I collect surveillance equipment because I find it interesting, and it could always come in useful some day. I knew plenty of people in college who had plenty of guns. ‘Training manuals’ is such a vague term, I don’t even know what to think about it.

Someone on Schneier’s blog left a comment about ‘thoughtcrime’, and while I hate such an Orwellian reference, it’s almost relevant. As our government, and other governments, step up the ‘war on terror’, they chip away at our ability to learn and to think. As this continues, we as a people will begin to lose our competitive edge, falling deeper and deeper into the quagmire of entitlement that we already live in. Ultimately, this is our fault.

The American Dream is dead. People want to be lifted up, not pull themselves up. This attitude has become so predominant within our culture that most people are gladly trading in freedom, and the ability to become more than they are today. My great fear is how much further we will slide. And will we, as a country, as a people, choose freedom soon enough that we can regain it?

Home Networking Needs

My old Netgear router is finally showing itself to be almost completely inadequate for my home needs. It doesn’t support WPA, and its WEP support doesn’t work on my Nintendo DS. I’ve been running the network unencrypted, but MAC-limited, for months. While still living with my parents, this wasn’t a big deal, because the only traffic running unencrypted was my DS and my Wii. Now, however, all the traffic from my Mac Mini, desktop, Nintendo products, and Catherine’s laptop runs over unencrypted links. I practice good password rules, and I try to ensure that as much of my traffic is encrypted via SSL as I can, but the situation needs to be rectified, and sooner would be better. Unfortunately, it’s not the poor wireless security that’s forcing my hand; it’s that Catherine’s Dell Inspiron 1420N notebook barely works when connected to the router. I’m not sure if it’s a hardware issue, since I don’t recall having this problem with my Ubuntu box when it was connected directly to this router (which it was for years), but the problem persists on wired and wireless connections from the Dell.

I’m stuck, because I don’t want one new wireless network, I want two. One for the Nintendo products, which will be WEP-protected, MAC-address locked, not allowed to access the systems on the rest of the network, and can be a relatively slow link (11 Mbps). One for everything else, which will need to use WPA and should be a much faster link (Wireless-N would be nice). The router should have mature firewall capabilities, and if it can serve as a VPN gateway, that would be nice too. The problem is, no one sells wireless routers with two wireless interfaces (I can get by with a WAN and a single LAN port). I need to support WEP encryption on one interface, because the Nintendo DS can’t do WPA, and I like the idea of having my Nintendo products in their own DMZ. Ideally, whatever product I get will provide Web and SSH interfaces for configuration.

Due to my requirements, I am left considering a Soekris net4521 or net4826, probably running m0n0wall. The net4826 sports a faster processor, more RAM, and a PATA interface so it can take a laptop hard drive, which is ideal if the device is going to be doing any of its own logging. Unfortunately, its flash card is soldered on, which is not ideal. This leaves me more attracted to the net4521, which has an easily replaceable CompactFlash interface and a pair of Ethernet ports, so I could have a WAN and LAN without having to do a wireless bridge. Regretfully, the net4521 appears to be out of stock.

Ultimately, I face another problem with the Soekris route: it’s expensive. Certainly, the hardware looks good, and the power requirements are low, but I’d probably be spending at least $300 on the board, case, power supply, and two PC Card NICs. Given the prevalence of products that almost fit my needs, but cost far less, I’m unsure about taking this route. Would anyone else be interested in buying a VPN/firewall/wireless router that could manage or connect up to two wireless networks?

We Ate, We Drank, We Were Merry

Catherine and I just got back last night from a long weekend over in Seattle visiting my mother’s family for Thanksgiving. I hadn’t been to a gathering of that side of the family in nearly eight years, and Catherine had never met most of them. Still, it was a lot of fun: there was a ton of food, lots of beer, wine, and liquor. Overall, we spent the latter half of the week having a really good time with everyone.

The drive over last Wednesday was fantastic; we took Highway 28 from Colfax to Vantage, and then I-90 into Seattle to meet a friend in the U-District for dinner at a restaurant named Costas (University and 27th). The food was Mediterranean (Greek and Italian) and very good, but frankly, Niko’s in Spokane is far superior. Back to the drive: we made it from Pullman to Seattle in just under five hours, which included some of the most amazing fog I’d ever had to drive through. From about 55 miles out from Vantage, for 20 miles or so on Highway 28, we were ensconced in fog that didn’t allow us to see more than a hundred yards or so in any direction. We hit the same kind of fog around Ellensburg, which was a lot more nerve-wracking because traffic dictated that we continue driving at 70 miles per hour with limited visibility. It was kind of strange driving along the highway like that, seeing the rest of the traffic only by their lights in the distance. It was the kind of fog you tell horror stories about, the kind you half expect some strange monster to come flying out of, aside from the raptors that live and hunt throughout Eastern Washington.

This was my first trip to Seattle that required me to drive those streets myself, and I’m exceedingly glad that I don’t have to do it on a regular basis. The roads have more lanes of traffic than they’re wide enough to properly support, people are parked next to curbs within what appears to be a travel lane based on the lines painted on the road, and people are consistently speeding and weaving in and out of traffic. Maybe it’s just because I’ve never lived in a truly urban environment (even in Spokane, I mostly stuck to the Valley), but I can’t even imagine driving through Seattle on a daily basis. If I were to live there, I’d have to live near where I worked.

Once we got up to my aunt and uncle’s in Mill Creek, we didn’t have to deal with the Seattle traffic directly. We stayed in Mill Creek for Thanksgiving dinner on Thursday, where we roasted an enormous turkey, probably almost 30 pounds, and had some amazing garlic mashed potatoes, sugared carrots, fried green beans, sweet potatoes, ham, pie, and so on. It was an enormous feast, and between the food and the wine, everyone was stuffed by the end of it, and there was still plenty left over.

Friday we sat out the Black Friday shopping, and instead met my great aunt for lunch at a restaurant called Bahama Breeze. The place had great atmosphere, but the food was somewhat disappointing. I had a pocket-bread chicken salad sandwich, which lacked much flavor in the chicken salad itself, instead depending on the Apple-Mango salsa which adorned it. This would have worked out better if the salsa had been mixed in with the chicken salad, instead of simply having a little bit set on top. Not a restaurant I’d hurry back to, despite the cool atmosphere.

Saturday, we made our way to the UW campus to watch the 100th Apple Cup between WSU and UW. This was fun because the family is split between Huskies and Cougars, so we had some playful ribbing to go along with the tailgating. Once again, we all ate until we could eat no more, and drank ourselves silly. The game was fun to watch because it spent a lot of time going back and forth, and it was defined by big plays. It was mildly depressing when, at the beginning of the game, the Cougars (my almost alma mater and current employer) gave up 10 points in the first seven minutes, but they pulled back and tied things up by the half, keeping the game neck and neck throughout. And as big plays defined the game, big plays ended it, with the Cougars scoring on a 33-yard touchdown pass with thirty seconds on the clock, and an interception in the Huskies’ end zone in the final seconds of the game.

Of course, not being much of a football fan, I thought the most entertaining part of the game was the incredibly poor reception Washington’s Governor, Christine Gregoire, received as she prepared to award the Apple Cup. I was a tiny bit surprised, as politically, I expected her to be more popular in Seattle, but apparently, she’s done about as much to please people over there as she has for us Eastern Washington residents.

After the game, my father and one of my uncles had to settle a bet by buying dinner for the entire group, so we made our way to a fantastic restaurant not too far from the U. La Piazza, located at 55th St and 35th Ave, is a traditional rustic Italian restaurant, offering plenty of pasta, fish, meat, and even some traditional Italian pizza (there wasn’t any pepperoni in the place). As excellently prepared as Catherine’s veal was, and as much as my father raved over his spaghetti and salmon, everyone seemed to feel that the sauce on the Tortellini con Gorgonzola was the best food in the house. It was a nice, thick, creamy cheese sauce with tasty tortellini and plenty of basil on top. All in all, everything you could get there was fantastic, and I’m almost thinking it’s worth going to Seattle solely to visit this restaurant. Plus, my sister’s boyfriend was our waiter, which was kind of fun for everyone.

Sunday we got out of town by noon, after breakfast with my parents, so that we could drive through Spokane and Catherine could see her folks. It was great to see everyone, and the trip was a lot of fun, even if we didn’t really get any work done while we were out. The driving wasn’t bad, and for what we got out of the trip, we really didn’t spend very much. I’m really looking forward to seeing everyone again in June.

On Source Control

A frighteningly large number of developers don’t use any sort of source control for their projects. I know I didn’t throughout most of my collegiate career. This is a shame, really, because oftentimes we end up spending hours trying to ‘undevelop’ a path that didn’t work out, when with good source control we’d never have to waste that sort of time again. I recently sat through an interview with a gentleman who had used source control at only a single company he’d worked with as a contract programmer in 20+ years in the software game. And this was with big companies, like Disney and Lockheed Martin. To be fair to those companies, he was working on older mainframe applications, but even he acknowledged that, based on his brief experience with source control, things would have been easier had it existed at those companies.

It doesn’t matter how large a team you have, from a single developer to a team of hundreds: source control will simply make you more efficient. It provides a full history of the application from the time of the import, it provides the ability to back out ‘bad’ changes from source files, and it protects users from overwriting another developer’s changes. At my current job, we lack source control, and it has happened many times that two people begin working on the same file, only noticing when our editor starts complaining about the file being modified outside of itself. To a degree, good communication can fix this problem, but it’s just not cost-effective to have to verify that a file isn’t in use each time you go to use it.

When it comes to source control, I view there as being two primary ways of thinking. The first is the file-centric method. This was the original method, and it is still embodied in tools like Subversion and Perforce. These systems concern themselves primarily with tracking the history of each file in the repository as it has changed over time. They can be configured to either let a user lock a file, so no one else can modify it, or simply allow multiple users to edit a file and race for the check-in. Depending on the software, there will be different levels of maturity in the tools used to resolve conflicts when a file has been modified locally between check-outs. These systems tend not to have very mature branching models, as any branch created is more a copy of the trunk than an extension of it; even when the changes are merged from the branch to the trunk, you lose much of the history of the branch. In many respects, this style represents the old guard, the old way of thinking, though these tools have typically begun to integrate a great deal of the features and ideas from the next methodology.

Tree-centric programs don’t treat the files as any more or less important than the other aspect of the tree, the directory structure. These systems typically have a much easier time moving files within the tree. PlasticSCM and git are the two systems matching this way of thinking that I’m most familiar with. In addition to handling file movement better, these systems tend to have much better branching mechanisms, and better support the “branch-per-task” methodology of development, where you create a new branch for every task (either a new feature or a bug) that you encounter. By keeping all the development in various branches, a developer can get full source control while they edit files, but ensure that trunk always remains buildable and runnable, which can be very, very important in production environments. By branching heavily, a developer can ensure that their modifications are working, without having to worry about bizarre interplays between the code they’re modifying and the code someone else is modifying, until both pieces of code are deemed ready and need to be integrated.

More branching does mean more merging, but git and Plastic both have excellent merge-tracking capabilities, largely because in their tree-centric model they track versions instead of files, and a version of a file or directory can have multiple parents, offering full history of a file back through every branch that has ever modified it up to the current version. That is a very powerful tool, one that makes it easier to track who has done what and when.

I think it’s obvious that I’m a big fan of Plastic and git. I have more experience with git, and will continue to love it for Open Source development on distributed projects. Plastic is my choice for team development, however. Sure, it’s expensive (but still cheaper than Perforce), but it’s a great program and it offers really solid tools to maintain code with. There are plenty of other options out there too, though, that I’m simply not as familiar with. My current team is looking at Microsoft’s Team Foundation Server as an alternative, which my manager likes because it offers a lot of built-in reporting support that doesn’t exist elsewhere. I’ll be posting more impressions of Perforce, Plastic, and TFS as I become more exposed to those options.

Every team needs to research and decide on the tool that will work best for them. I think Plastic is a great choice, and I’m pulling for it with my team, but in the end, implementing any source control mechanism will be an enormous benefit to our team. The question is “Which Source Control should I use?” Not “Should we use Source Control?”

Password Management

I think it’s worth talking about passwords. Most of them suck. Most people use the same password (or a small set of passwords) for everything from their banking to their membership in discussion forums, to signing up for free iPod giveaways. Most people’s passwords are based on dictionary words, personal information, and simple transformations of letters to numbers or symbols (a = 4 = @, etc.).

It’s hard to blame anyone for this; passwords are asked for everywhere on the internet, and we can’t possibly remember a unique password for everything. The trick is simple: password management. A good password manager only requires you to remember one password, and then you’ll be able to use unique passwords on every new site, and easily access them. While I have yet to find what I would consider to be a “perfect” password manager, there are some decent products out there.

There are a small handful of requirements I have for password management software. First, the software must have tools to generate strong passwords, with customizable rules. Second, the password files must be stored on disk in an encrypted fashion, with controls to prevent unencrypted versions of the content from ever being written to disk. Third, the software must be able to store a wide array of passwords. Finally, the software must be easy to use.
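As a sketch of that first requirement, modern Python’s `secrets` module makes a rule-driven generator nearly trivial. The default length and symbol set below are arbitrary choices of mine, standing in for whatever rules a real manager would expose:

```python
import secrets
import string

def generate_password(length: int = 16, symbols: str = "!@#$%^&*-_") -> str:
    """Pick characters uniformly at random from letters, digits, and an
    allowed symbol set, using a cryptographically secure source."""
    alphabet = string.ascii_letters + string.digits + symbols
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())          # a fresh 16-character password each run
print(generate_password(24, "-_"))  # longer, with a restricted symbol rule
```

The customizable-rules part matters because, as noted above, plenty of sites still reject symbols; a good generator lets you shrink the alphabet to match the site without resorting to a weak password.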

First, there are the password managers built into other pieces of software, Firefox and Internet Explorer being the popular two. While convenient, these typically have very poor security. In Firefox, the passwords are merely saved in Base64 encoding and can be easily recovered by anyone who can get access to your files. While a Master Password will make this harder, it too can be easily defeated if an attacker has physical access to your system. Even IE7 suffers from a similar plight, which makes the passwords trivial to recover. These solutions fail because the passwords are not stored using strong encryption, they can’t generate strong passwords on their own, and they can only store passwords for FTP and Web sites. Very convenient, but they could cover a wider range.
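To see why Base64 alone offers no protection, note that it is a reversible encoding, not encryption: anyone holding the stored string can get the password back in one call, no key required (the password here is made up):

```python
import base64

# Encoding a password the way such "protected" stores effectively do.
stored = base64.b64encode(b"hunter2").decode()
print(stored)  # 'aHVudGVyMg=='

# No key, no secret: decoding recovers the original immediately.
recovered = base64.b64decode(stored)
print(recovered)  # b'hunter2'
```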

Next, there are the standalone software password managers. On Windows, Password Safe is a popular choice. Originally written by Bruce Schneier, and since open-sourced, it is a solid bit of software that has a hierarchical method of organizing passwords, can generate passwords for you, and keeps your passwords secure using Twofish encryption. This is really handy, and the fact that Password Safe (or its Python cousin for GTK, Revelation) can run from a flash drive without full installation makes the software that much more useful: carry your encrypted passwords with you on a flash drive, along with the software to access them. These two programs, individually, fulfill the needs I list above, and do a fine job. Each of them can be set up to launch access to the accounts listed, even for a wide variety of different types of accounts. The problem I have is that the password manager has to do the unlocking, and it’s far, far from seamless. Also, the passwords aren’t always available: if you don’t have a computer handy, or at least one that can run your password manager, you’re unable to access them.

The next great platform is the portable password manager. In this arena, we have the Mandylion Password Manager, an excellent little keychain fob which will store and generate passwords for you, and is likely to always be on your person. Even if someone steals it, the passwords are protected by a configurable passphrase entered using the buttons on the front, and the device can be configured to ‘self-destruct’ and eliminate all of its data if someone appears to be trying to break into it. Its weaknesses are that it can only store a handful of passwords (50), and for most people it’s just one more unwelcome thing to carry around. The next generation of smartphones seems like a good candidate for password management software, as most people wouldn’t dream of going anywhere without their phones. Of course, this arena would require several different versions of the software, as there are just too many target platforms (Palm, WinCE, iPhone, Android, J2ME, etc.). As great as these tools would be, they suffer from the fact that good passwords don’t usually flow when typed, and these devices lack the ability to copy and paste. This seems like a minor tradeoff for the benefits of ubiquitous access, but these solutions do lose points for a lower level of user-friendliness.

Becoming more common, we have user keyrings built into computing environments. Mac OS X’s Keychain was the first strong example of this idea: the user has an encrypted password safe, which authorized applications can query to get passwords out of (and save passwords into), allowing the system to take over the role of managing passwords in a secure fashion. The Keychain is an excellent example of how this sort of technology is supposed to work, offering tight integration with Safari, Mail, Finder, and most other Apple-provided applications. GNOME is working on its own keyring, and Mozilla is extending Firefox and Thunderbird to support operating-environment keyrings like Apple’s and GNOME’s. While these solutions are great, and offer wonderful integration with a computing experience, neither can create its own passwords, and neither can be easily carried with you.

Currently, I tend to favor Password Safe and Revelation. They’re great programs that offer great security. Mac users will probably be happy with Keychain, and as GNOME Keyring improves, I suspect my reliance on Revelation will fade. Ultimately, though, I want to be able to carry my passwords with me. I like the Mandylion device, and I’ve planned to buy one for quite some time, but as smartphones become more cost-effective, I think they might be the answer, especially if they can be easily synced with a computer’s keyring.

The most important point, however, is that there are other options out there, and exercising them will help your own security tremendously. Stop using the same password for everything; stop using passwords based on the dictionary or personal information. Come up with one strong password or passphrase that you can remember, and use a good tool to manage the rest. It will take a tiny bit more effort up front, but the peace of mind and protection of your personal data should be worth it.

First Snow of Pullman


Well, it's snowed on us for the first time since we moved down here to Pullman, and I should have known this was going to stick when we made our way over.  The hills ought to make this fun, and I'm really hoping this clears up a bit before we have to travel.

Ubuntu on Dell Inspiron 1420N

Quite a while back, Dell decided to begin offering Ubuntu on some of their home-user systems. Catherine needed a new laptop that she could use for her research, so I talked her into the Dell Inspiron 1420N.

This worked out nicely because most of the software used in statistical phylogenetics research today was written for Unix-based systems. Because of this, we knew that Windows was going to be too much of a hassle, and the MacBook Pros were just too expensive. In short, we were able to buy an Inspiron 1420, configured similarly to the current low-end MacBook Pro, for about $700 less than we’d have paid to Apple.

Overall, we’ve been happy with the system. Ubuntu loaded right up, and lacked any “Dell-isms” that you have to deal with on their Windows-based laptops. It was a clean Ubuntu 7.04 install. When we got the system, I did attempt to upgrade it to Ubuntu 7.10, but that turned out to be a huge failure (to be fair, this risk was pointed out by Dell’s Linux Labs).

The only incompatibility that was really advertised was the lack of Compiz support, which wouldn’t have been that big of a deal if Ubuntu had failed over to Metacity without throwing a fit about the fact that Compiz wasn’t working. We could have told Ubuntu to stop complaining, but unfortunately, that would have prevented Compiz from automatically working in the event that a future update fixed the Intel 3D support.

Since 7.10 was causing so many problems, we decided we needed to downgrade back to 7.04. Dell packages their Ubuntu laptops with a small partition containing the base system, which can be easily reinstalled. Unfortunately, there isn’t a separate /home partition, so you have to be sure to back up your home directory before reinstalling. As a big fan of a separate /home partition, I found that inconvenient, but I do understand why Dell did it on a system intended for non-technical users.

We were having major, major issues with the laptop on our network at home. It would connect cleanly, but didn’t work very well. After some digging, I noticed that IP-based traffic was lightning fast, but DNS lookups were painful, even though the default gateway was the DNS server. This took me a few weeks to figure out. A bit of research suggested it might be related to IPv6 support. I hadn’t taken that into account because I’ve never had problems with it being enabled on any other system, but it appears that blacklisting the IPv6 modules has fixed the problems we were having. It’s working great right now, but we haven’t tested it on campus yet, and if the problem isn’t completely resolved at home, I now suspect it would be a router issue.
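For anyone fighting the same symptom, this is roughly what the fix looks like. This is only a sketch for Ubuntu 7.x, where module blacklists conventionally live in /etc/modprobe.d/blacklist; the exact file names vary by release, so check your system first:

```shell
# Prevent the ipv6 kernel module from loading at boot (Ubuntu 7.x convention):
echo "blacklist ipv6" | sudo tee -a /etc/modprobe.d/blacklist

# Some setups also disable the IPv6 protocol-family alias outright:
echo "alias net-pf-10 off" | sudo tee -a /etc/modprobe.d/aliases

# Module blacklists take effect on the next boot.
sudo reboot
```

After rebooting, `lsmod | grep ipv6` should come back empty, and DNS lookups should stop stalling while the resolver waits on IPv6 queries that will never be answered.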

All in all, we’ve been happy with the purchase. The built-in SD card reader is working fantastically, and we’ve already used it a few times to load pictures from digital cameras. We haven’t had the opportunity to try the PCMCIA or FireWire support, but everything else has been a fantastic user experience. Apparently, it’s attracted some attention from some of the other grad students in Biology, so it might be interesting to see if it catches on a bit more here at WSU.

Hushmail Turns Over Email to Government

According to Ryan Singel at Wired, Hushmail, a company specializing in offering encrypted webmail to their customers, has turned over users’ e-mail to the government. In my previous series on e-mail encryption, I mentioned how difficult a problem this was. My argument was two-fold. First, if you store the private key on the webmail server, the key can be compromised if the server is compromised; even your passphrase is up for grabs, because it has to travel across the network and be handled unencrypted on the server.

Doing the encryption/decryption on the client is difficult, because it’s not convenient. It’s impractical to do in Javascript, because Javascript isn’t designed for the kind of heavy math that encryption requires, and it would be painfully slow. Also, since Javascript can’t read data from the local disk, you’d still have to store the keys on the servers; at least this time, the passphrase to unlock the key wouldn’t have to travel across the network. Hushmail opted for a different route: a Java applet which handles the encryption/decryption process. This works because signed Java applets can be given access to the filesystem by the user. The unfortunate part of Java client security is that it’s an all-or-nothing deal: either you give the applet full permission, or no permission.

The benefit to this, of course, is that with Hushmail’s Java client you store your encryption keys on your local computer. Apparently, due to users complaining about the hassle of not being able to access their webmail from anywhere (because of the requirement for the keys, and Java), Hushmail began to offer a service where you can store your key on the server and access your mail over HTTPS. While they acknowledge that this requires a user to place a significant amount of trust in Hushmail that isn’t necessary with the applet-based solution, their marketing appears to play down the danger quite a bit.

That all changed, though, now that Hushmail has, under court order, captured a handful of users’ passphrases and turned over a dozen CDs’ worth of e-mails, along with the encryption keys and passphrases, to the government. Sure, in this case it was part of a legitimate criminal investigation, but it exemplifies the weaknesses and challenges inherent in solving the encrypted webmail problem. It also shows that Hushmail’s non-applet solution is not an acceptable channel for secure communications. The lesson is clear: protect your keys and your passphrases. Hushmail’s less secure option doesn’t allow you to fulfill either goal.

In the end, though, Hushmail is more interested in protecting Hushmail than their users, which is understandable from a business perspective. The protection they offer is great for keeping your data secret from undirected attacks, but Hushmail is clearly more than willing to turn over data in response to a directed government probe. So Hushmail is perfectly safe if you use the Java applet and don’t give the government any reason to ask questions, but if they do, and a judge signs off on it, Hushmail could even override the security of their Java applet, and odds are, you’d never know. At least their owner is honest about this fact. If anything, this goes to show that no webmail-based encryption system can probably ever hope to be considered secure.

I believe that as handheld computing becomes more common, protection will be easier, as we won’t have to use untrusted systems as often, and our keys can always be with us. With high-speed wireless communications, your handheld computer could even serve as your own personal encryption engine: data from the untrusted machine is sent to your handheld, encrypted on the trusted platform, and sent back. With TLS protecting that short link, the danger of interception is low. It’s something to think about for the future.

Google Releases Android SDK to the World

Google’s announcement last week about Android, and the future of their rumoured ‘gPhone’ project, was interesting, but didn’t really mean anything until yesterday, when the SDK was released. I’ve barely had a chance to read the documentation and play with the SDK, but overall, I really like what I see. Especially since the Mono guys are looking at setting up Mono to be able to output Dalvik bytecode.

The language of choice, at least for now, is Java, and the Android system doesn’t provide access to the underlying Linux system yet, though that is likely to change if Google follows through on opening the source. As much as I tend to dislike Java as a language, it’s a reasonable choice, since Java has been the language of choice for most mobile devices for quite a while. Due to this Java dependency, the only IDE integration currently provided is with Eclipse. I’m not a big fan of Eclipse, personally, but that’s largely because I haven’t done Java development in a while, and Eclipse isn’t overly useful outside of Java. I’m going to be using it more as I learn Android a bit.

If this succeeds in unifying the mobile market onto a steady platform (not identical, but steady), it will be the most revolutionary advance in mobile technology since the invention of the cell phone. Still, not everyone is excited about the Open Handset Alliance; Palm, notably, has declined to join. This disappoints me quite a bit, because I’ve been planning to buy a Palm Treo in December.

I’ve always been a fan of Palm. Their PDAs have always worked well for me, and always integrated well with Linux. They’ve always made developer tools freely available. Of course, developing for Palm OS was never the easiest thing to do, as the OS is a fairly thin layer over the hardware. While this has some benefits, as the hardware has gotten better, the option of running code in a VM, where it’s not likely to crash or hang the entire device, is really attractive.

Palm has even acknowledged this, and has been talking for years about making a Linux-based PalmOS available. Here was an opportunity to get the technology without doing that much work, and they turned it away. Their chief complaint, about being able to control the software-hardware relationship, is bullshit on two levels. First, they offer Windows Mobile-based smartphones. Second, with Android, they would have the ability to tweak it to their hardware, hopefully keeping it compatible with the basic Android distribution.

Instead, it’s business as usual for Palm. Some people have already begun the funeral march, and while I think it’s too early for that, I’m seriously reconsidering my choice of Smartphones here in a month or so. Sure, nothing actually runs Android yet, and might not for a year, but if I get a different phone, I would like to have an upgrade path available, something Palm definitely won’t offer.

Yes, Android isn’t the only Linux-based Mobile Platform out there. Trolltech had their Greenphone, which isn’t available anymore. OpenMoko and the Neo1973 are cool, but apparently not ready for prime-time. I’ve already mentioned Palm’s vaporware offering. Android looks like a potentially great hackable platform, and I’m hopeful that it’s going to be worth it. Too bad I can’t buy anything that uses it until late next year.

My question to you, then: is there a smartphone which integrates well with Linux and GNOME, and that should be able to run Android some day?

Open Document Foundation Closes Shop

In the battle for XML-based Document Formats, there were two: The Open Document Format (ODF), overseen by OASIS, heralded by Open Office, and Microsoft’s OOXML. I’ve always felt that ODF was a superior format, and that many of the decisions made with OOXML were suboptimal.

All this changed when the Director of Business Affairs for the Open Document Foundation, Sam Hiser, posted a scathing review of the direction ODF was going. While this was not necessarily a direct attack on ODF, but rather on the direction that Sun was taking Star Office, and thus Open Office, it led to the Foundation backing out of its support of ODF in favor of the w3c’s Compound Document Format (CDF), which appears to be a potential replacement for XHTML.

The impression I was left with was that the Foundation felt their efforts to make ODF and OOXML work together were being ignored. Personally, I think ignoring those efforts was reasonable, as we don’t need to build a lousy XML-based office format by combining the two. However, Open Office does need to be able to at least load OOXML files, so that was a reasonable use of Sun and the team’s time. Dropping ODF with the intent of extending CDF was a childish political move, and one I don’t think the Foundation stood any chance of succeeding with.

It didn’t help when the w3c argued that CDF was in no way, shape, or form a suitable replacement for ODF. Not only that, but the Foundation was trying to shape CDF without joining the CDF working group. Because the Foundation wasn’t the sole controller of the future of ODF, they chose to abandon it, and that has led to their closing, since there isn’t an acceptable alternative out there. Luckily, they didn’t try to fork the specification to take it in their own direction.

Goodbye, Open Document Foundation. The ODF will be fine without you, and I just want to thank you for not completely killing the format in your childish tantrum.

Nigerian Government does the Right Thing

Per Slashdot:

An anonymous reader writes “After trying to bribe a local supplier with a $400,000 marketing contract, Microsoft has still apparently lost out in trying to woo Nigeria’s government to use Windows over Linux. Microsoft threw the money at the supplier after it chose Mandriva Linux for 17,000 laptops for school children across Nigeria. The supplier took the bait and agreed to wipe Mandriva off the machines, but now Nigeria’s government has stepped in to stop the dirty deal.”

All I can say is, good for them. The decision to use Linux appeared to have originally been made on technical and value merits, and Microsoft’s attempt to buy out the deal was ridiculous. I can hardly blame Microsoft, though. A vendor offering incentives to try to increase their market share is nothing new, but that doesn’t make it any less anti-competitive. Nigeria’s government choosing to step in is a huge win for fair-market competition and Linux.

Between this and the One Laptop Per Child project, it should be interesting to see how Linux is going to shape the developing world.

Election Day in Washington

Yesterday was Election Day here in the United States, and Washington State had a few interesting issues on the ballot. Unfortunately, I had neglected to register to vote since I’d moved, and my parents neglected to mail me my absentee ballot for Spokane County. Such is life, at least this wasn’t a major election year. Still, I had opinions on the issues being voted on this year, and I’m disappointed that I didn’t voice my opinion, even if I agree with most of the decisions made.

Initiative Measure 960 - Concerning Tax and Fee increases by the State government. Passed: 52.4% - 47.5%

Requires tax and fee increases proposed by the government to have a two-thirds majority (either legislative or from the people) before being passed into law. I’m not surprised this passed, since the people of Washington have often tried to limit the government’s ability to tax us. It’ll be interesting to see if this is upheld by the courts, as so many other attempts to do the same have not been.

Referendum Measure 67 - Concerning triple damages for people illegally denied insurance claims Passed: 56.9% - 43.0%

The wording of this referendum was ridiculous. “It will be unlawful for insurance companies to ‘unreasonably’ deny claims.” Okay, so what is the definition of ‘unreasonable’ in that sentence? Oh, and apparently there is a clause that will give the attorneys more money too! My favorite part is that some insurance companies will be exempt, so it’s not even a consistent law.

This law needed to apply to all insurers, have no clause regarding attorney fees, and provide a strong definition of ‘unreasonable’ for it to have been reasonable. The pro groups claimed that insurance rates won’t go up because of this. That’s bullshit. The cost of doing business will rise because of this law, and that will cause insurance rates to go up. Period. Too bad for people trying to run small businesses (like myself) who need to get insurance without the benefit of a lot of members on the same plan. Good thing for me that I still have a day job, I guess.

Engrossed Substitute Senate Joint Resolution 8206 - Establishment of a budget stabilization account Passed: 68.0% - 31.9%

Requires that 1% of the state’s funds every year be put in a special fund with limited access. It’s an emergency fund, which is a reasonable thing for the government to have, especially given the volcanic activity we will experience again someday. I think the government should, in general, be more responsible in setting up these sorts of funds, as most government money today is just thrown into a general fund.

Senate Joint Resolution 8212 - Constitutional Amendment to allow Inmate Labor Passed: 60.1% - 39.8%

I’m glad this passed, though I was torn on the issue. The danger is that inmate labor could potentially outcompete other citizen-run firms, as its expenses are lower (since inmates don’t typically get paid much for their time). However, the use of inmate labor could serve as a good alternative to the employment of ‘migrant workers’. It’s not that I’m against migrant workers in theory; they’re clearly filling a need that no one else is.

However, they’re usually non-citizens and they’re causing a lot of problems. They take most of their earnings out of our country, they use SSNs belonging to citizens, they often take advantage of services they’re not paying to support. Hopefully, someday, we can get a decent law passed that gives migrant workers a legitimate legal status, whether or not it provides a path to citizenry (which I don’t think most of them want).

The use of inmate labor provides a service to this state that can fill the same need as migrant workers, while also standing a better chance at actual rehabilitation of the men and women sent to prisons. Arizona has had work camps and inmate labor for years, as a voluntary program prisoners can enroll in that serves to shorten their time. Most of the prisoners who do a tour in the work camps appear to come out more ready to integrate properly into society. Plus, a large amount of the prisons in Arizona today were constructed by inmates, no doubt at a much reduced cost to the state and its people.

Engrossed House Joint Resolution 4204 - Constitutional Amendment to revise school tax levies to only require a simple majority Failed: 48.1% - 51.8%

As strongly as I believe in Public Schools, I’m glad this wasn’t passed. The argument for the referendum was reasonable: If levies for prisons and stadiums and the like can be approved by a simple majority vote, why does it require a super-majority to pass school levies? Why are schools held to a higher standard?

I agree with the sentiment. I disagree with the execution. If we’re going to hold schools to a super-majority, we should hold other public works projects that require tax levies to the same standard. If someone were to write up a new constitutional amendment requiring a super-majority for all tax levies based on property ownership, I’d be all for it. The legislature needs to spend the money it has more responsibly, and I believe that requiring a super-majority on many more projects would help force that to happen.

Substitute House Joint Resolution 4215 - Constitutional Amendment to invest in Higher Education Funds Passed: 53.0% - 46.9%

Having been heavily affiliated with universities for the majority of the last decade, I definitely feel that higher education is worth investing in. I just don’t understand what the point of this law is. Is the money from the investments supposed to keep tuition rates lower? Is it supposed to sponsor research or other academic programs? Making this money work and be available is good, but I just don’t understand what they are trying to achieve, and that makes me doubt the necessity.

All in all, things landed about how I expected, and I’m about 50% in my agreement with the decisions made by the voters, which also wasn’t unexpected.

What WON'T Kids Snort?

Okay, I’ve never been into drugs. There are days I feel like I have enough trouble keeping track of what I’ve done and what I’ve only thought about doing that I never really thought something that could possibly make me more absent-minded would be a good thing. Still, I’ve known plenty of people who were into various drugs, and the ones who could keep it to a recreational activity were mostly good people. I didn’t tend to hang around when the activities were occurring, but I didn’t write them off as people either.

Then there’s this latest thing that kids have supposedly started doing (I pray this is a joke, I really, really do). Jenkem is probably the most vile way of getting high I’ve ever heard of. Breathing in the fumes of fermented fecal matter? There are two reasons I’m pretty sure this won’t work. First, Erowid says nothing about it, and the people over there are usually pretty hip to new drugs. Second, as nasty as the chemicals in sewage are, I don’t think they can cause a high, outside of oxygen deprivation. If they did, wouldn’t septic and sewer workers have to wear oxygen tanks?

Still, I understand why this story has gained some popularity. This is exactly the kind of thing that middle school kids would try because they heard it might get them high. I’m not saying that I think huffing fermented sewage isn’t bad for you, but I really doubt it causes an hour of visions (including dead relatives). I suppose it’s possible that the stories are true, and this does cause a problem, but I don’t think any drug is worth the taste of shit in my mouth.

Mass Storage Encryption

My previous posts on encryption have focused on public key encryption and e-mail. Users of PGP already know that they can extend the same practice to individual files. Mostly, this is used to provide a cryptographic signature on a file, to prove that it has come from a reliable source, but you can also encrypt a file such that it can be decrypted only by a specific user. This has its uses, but it quickly grows out of hand, and doesn’t really work well at the filesystem level. Plus, decrypting a target file places the decrypted version on the hard drive, where it could be recovered later by forensic analysis of the volume. A good mass storage encryption scheme must decrypt files in real time, storing them in memory while in use.
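As a refresher, per-file signing and encryption with GnuPG looks something like this (a sketch: the file name and recipient address are placeholders, and the recipient’s public key is assumed to already be on your keyring):

```shell
# Produce a detached, ASCII-armored signature others can verify:
gpg --armor --detach-sign report.pdf          # writes report.pdf.asc
gpg --verify report.pdf.asc report.pdf

# Encrypt the file so only one recipient's private key can decrypt it:
gpg --encrypt --recipient alice@example.com report.pdf   # writes report.pdf.gpg

# The recipient decrypts it -- note the plaintext lands back on disk,
# which is exactly the forensic-recovery problem described above:
gpg --output report.pdf --decrypt report.pdf.gpg
```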

Mass Storage Encryption is of great importance to most large enterprises. Particularly with the proliferation of laptops and removable mass storage. Filesystem Encryption seems to come in two flavors:

  1. Full disk encryption, which encrypts an entire hardware volume
  2. Filesystem encryption, which encrypts individual files and folders on a filesystem

Of the two, full disk encryption is vastly superior, as most filesystem encryption schemes don’t encrypt the metadata on files (name, size, owner, etc). This lack of metadata encryption leads to a number of interesting (both theoretically and practically) attacks on these schemes. By analyzing metadata, you could learn a lot about the target and their activities, but more insidious is the possibility that an attacker suspects a known file is among those encrypted on the filesystem. Knowing the plaintext version of a piece of ciphertext can give an attacker a powerful foothold for recovering the key, depending on the cipher in use. This wouldn’t be a completely trivial task, but people rolling out this sort of encryption must be very careful about which files they encrypt.

Full disk encryption methods are vastly superior because they create an encrypted container within which a filesystem is created, meaning that the metadata remains safely hidden inside the secure container. The name is only truly accurate when dealing with hardware-level encryption systems, which are not very common; the software encryption layers actually encrypt volumes, which are typically partitions but can span multiple hardware drives, which is far more flexible. I suspect we’ll start to see more RAID controllers with on-board hardware encryption.

There are a few problems with hardware-level encryption (HLE). First, most schemes on the market today use DES and TDES, neither of which is considered particularly secure anymore, due to small key sizes (easily brute-forced) and statistical weaknesses based on the birthday bound. Second, no current HLE systems have a good method of storing the keys separately from the system. The current front-runner technology is the Trusted Platform Module (TPM), a hardware microcontroller which can uniquely identify a computer. TPM suffers from two weaknesses. For internal storage, such as hard drives, the drive is only safe from an intruder if physically separated from the computer which initialized it. For external storage, such as USB flash drives, the drive is only usable on the computer it was initialized on. While TPM appears to have some other benefits, as a hardware identification layer for software encryption, I just don’t think it’s acceptable.

What is needed is a middle ground, but in order to achieve that middle ground, it is necessary to establish what data needs to be protected. In my opinion, any operating system or software runtime files should be considered expendable. On my Linux box, it isn’t important to encrypt the contents of /usr, /boot, /lib, /bin, /sbin, and so on. Those are all system files; if an attacker is able to take my hard drive, I don’t care about them getting their hands on those. /var and /etc fall into a bit more of a grey area. I would argue that /etc doesn’t need to be encrypted, since the majority of the information in /etc wouldn’t be of much use to an attacker trying to get at my private data. Even access to passwd and shadow wouldn’t be much of a danger if a reasonable password scheme was in place. /var, however, could hold data that needs to be kept private. I feel that the best way to handle this would be to create a new directory, say /secvar, to which /var would symlink for the files an organization wanted to keep encrypted (say, their webroot or employee e-mail).
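A rough sketch of that layout (the /secvar name is my own invention, and this assumes the encrypted volume has already been set up and mapped at /dev/mapper/secvar):

```shell
# Mount the already-initialized encrypted volume at /secvar:
sudo mkdir -p /secvar
sudo mount /dev/mapper/secvar /secvar

# Relocate the sensitive /var trees onto it, symlinking them back
# so services keep finding their data at the usual paths:
sudo mv /var/www /secvar/www
sudo ln -s /secvar/www /var/www
sudo mv /var/mail /secvar/mail
sudo ln -s /secvar/mail /var/mail
```

You’d want the relevant services stopped while moving their directories, and the mount added to the boot process before those services start.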

Such an encrypted partition would probably need to be available to all users while the system was running, protected by the operating system’s other security mechanisms. What this protects against is the loss or theft of the physical drive. But how should it be mounted? Good passphrases are nice, but in a large organization they can be unwieldy. Keeping keys on removable storage, which would only be required at boot, is a good solution if physical access to the hardware is a reality. This is a decision which will need to be made at the organizational level, depending on the organization’s needs. The important thing is to keep backups of the passphrases and keys, and to ensure that access to those pieces of data is very highly regulated. If the keys are lost, so is the data.

The exact system used to facilitate the encryption is going to differ from system to system. I feel that this part of the encryption process can be Operating System specific. On Windows Vista and 2008, Bitlocker appears to be a good solution, and I know that Washington State University is considering it for an institutional standard. Bitlocker addresses my primary concern with TPM, actually, by forcing a user to create a backup key and/or passphrase, that can be used to move a Bitlocker volume to a new hardware platform. On Linux, dm-crypt and LUKS can provide similar functionality.
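On the Linux side, the dm-crypt/LUKS workflow looks roughly like this (a sketch only: /dev/sda5 and the mount point are placeholders, and luksFormat destroys whatever is on the partition):

```shell
# Initialize a LUKS container on the partition (prompts for a passphrase):
sudo cryptsetup luksFormat /dev/sda5

# Unlock it, creating the decrypted mapping at /dev/mapper/secure:
sudo cryptsetup luksOpen /dev/sda5 secure

# Build a filesystem inside the container and mount it:
sudo mkfs.ext3 /dev/mapper/secure
sudo mkdir -p /mnt/secure
sudo mount /dev/mapper/secure /mnt/secure

# When finished, tear it back down:
sudo umount /mnt/secure
sudo cryptsetup luksClose secure
```

LUKS also supports multiple key slots (`cryptsetup luksAddKey`), so an organization can register a backup passphrase alongside each user’s, which addresses the same recovery concern Bitlocker’s backup key does.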

When it comes to encrypting user data, the situation gets a little messier. In a shared environment, where user data needs to be accessible by administration, the same methods discussed above should be used, as they provide access to the administration while still giving the benefits of an encrypted filesystem. In a situation where user data should be protected from everyone else, each user should have their own volume which can be encrypted (some systems support virtual volumes that can be stored as files on the underlying filesystem and mounted as loopback devices; more on that later). In this case, the login process needs to be extended to mount that user’s personal encrypted filesystem when they log in.
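A sketch of creating such a file-backed volume for a hypothetical user (names and sizes are illustrative; in practice something like pam_mount would handle the unlock-at-login step):

```shell
# Create a 512 MB file to serve as the volume's backing store:
dd if=/dev/urandom of=/home/alice/private.img bs=1M count=512

# Attach it to a loopback device and initialize LUKS inside it:
sudo losetup /dev/loop0 /home/alice/private.img
sudo cryptsetup luksFormat /dev/loop0
sudo cryptsetup luksOpen /dev/loop0 alice-private

# From here it behaves like any other block device:
sudo mkfs.ext3 /dev/mapper/alice-private
sudo mkdir -p /home/alice/private
sudo mount /dev/mapper/alice-private /home/alice/private
```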

The concept of an individual user data directory leads well into the next problem: removable storage, most frequently USB flash drives. There are security risks regarding the use of removable storage within your organization, but for the following discussion I’m going to assume it’s already been decided how the organization will handle those implications. Today, we’re going to discuss what can be done to ensure that if a USB drive is lost, the data on it is safe from the prying eyes of whoever might have found it.

While an organization could certainly use the same encryption methods they use for their hard disks, that solution is imperfect because it limits the ability to recover the data to a particular operating system. Ideally, the USB drive should be accessible from any OS that happens to have the requisite software available. For this, I’m fond of TrueCrypt, which has good support for Windows, Mac OS X, and Linux. The nice thing about TrueCrypt is that its concept of an encrypted filesystem is somewhat flexible. If you want to encrypt an actual hardware partition or drive, that is supported. Alternatively, you can create a file which represents the encrypted filesystem, and which can then be mounted. I suspect many users will create an encrypted file to store their encrypted filesystem, mainly because of Windows.

On Windows, a USB drive can not have multiple partitions, so it would be impossible to have an unencrypted partition to store, say, a version of TrueCrypt that doesn’t need to be installed, alongside a second, encrypted partition. By keeping a no-install version of TrueCrypt for each platform you’ll be using next to an encrypted container file, you can ensure you’ll always have access to your data. The biggest weakness of this approach is that the most portable filesystem, FAT, only supports files of up to 4 GiB in size. As drives get larger (8 GB drives are not terribly expensive these days), that could be an inconvenience.

TrueCrypt has some interesting features, such as the creation of “Hidden” volumes within volumes, which allow you to give an assailant a password to your filesystem which will have data that looks interesting, but really isn’t, while keeping the real data safe. While interesting, I’m not sure how useful it would be in practice, but then I’ve never had data so secret I thought someone might kill me for my key, which the TrueCrypt Docs seem to indicate they feel is a possibility. Still, might be a useful feature, even if you aren’t in the NSA.

At the end of the day, the biggest rule regarding filesystem encryption is that you should be doing it, at least for any sort of portable drive. You should do it in such a way that only certain people can access the data, be it through the use of strong passphrases, or encryption keys on USB devices (which Bitlocker requires if you don’t have TPM). If using encryption keys on a USB drive, the encrypted filesystem and the USB key should be together for as short a period of time as possible. As with encrypted e-mail, there is a level of education this will require of users, but ultimately, the increase in protection of your vital data is worth any minor inconvenience it might cause. In truth, it can bring users more convenience, as you can be more trusting of them taking data out of the organization to work on.

Another Reason to Encrypt your E-Mail

The Sixth Circuit Court of Appeals in Cincinnati has agreed to hear a case, United States v. Warshak, which focuses on the issue of privacy rights in electronic communication, specifically e-mail. Honestly, I’m not sure how this is going to turn out. The tack the government is taking has a lot of potential for success. The government is attacking the problem of following e-mail by targeting ISPs. In this case, ISPs appear to be defined largely as anyone running a mail server, since anyone running a mail server has the ability to snoop on any e-mail sent through it, including saving it if they so desire.

Personally, I’m a big fan of server-side storage of e-mail. When I was the sysadmin at CB Apparel, I worked to reimplement the company’s e-mail setup to use IMAP instead of POP3, with a webmail client, to ensure that our sales staff had full access to their e-mail wherever they might be. Had I been able to find a more mature open-source groupware suite, I’d have integrated that instead. I love the idea of e-mail available anywhere. But it raises an interesting issue. Most ISPs have rules in their Terms & Conditions that they can monitor e-mail, either to protect themselves or others, or to comply with the law. At CB Apparel, we were able to use stored e-mails to prove that a salesperson was using company resources for personal gain, trading products the company was paying for in exchange for goods they kept for themselves. That person was going to be caught anyway, but the stored e-mail served as a great level of proof.

The problem is that the monitoring of e-mail, either by a corporation or the government, has always depended on suspicion. Though I certainly had the ability, I had no reason to monitor my users’ e-mail unless we had a reasonable suspicion that we were going to find something. We respected the privacy of almost all our users, revoking that respect only in cases of reasonable suspicion. Same with investigating their web usage: in a few circumstances we had reason to believe they were misusing company resources, and we investigated that. However, I would never have dreamt of cooperating with law enforcement unless they could furnish a warrant or subpoena requiring it (though of course, that decision would not have been mine, but my manager’s). Still, the tack the Government is taking, that e-mail privacy is non-existent and that they should be able to monitor and read messages as they pass through ISP servers without a warrant, bothers me. A lot.

Some people are taking a 4th Amendment stance on the issue: that such a monitoring system would constitute an illegal search and seizure. I disagree with that argument, simply because most e-mail is sent in an insecure fashion. It would be like claiming that a conversation overheard by a law-enforcement official (or even a concerned citizen) in a shopping mall or a restaurant would not be permissible in court. The only angle that might succeed, and the defense really needs to push this side of it, is that the illegal search is occurring against the ISP: by being forced to turn over details of communications conducted through their servers, ISPs are stripped of their right to decide when it is proper to notify the authorities. And because they are being forced to give over this information, we are being told by the government that we have no expectation of privacy in our communications. If they start with e-mail, where will they stop? How long before they can justify warrantless phone taps to the courts? How long before they claim that no digital media is considered private?

There are a lot of conspiracy theories regarding the NSA and their ability to break public-key encryption. Certainly, if they can’t get their hands on the public key, they’ll have a nearly impossible task discovering the private key (another reason some people don’t use public key servers), but I’m of the belief that the NSA still doesn’t have enough computing power at their disposal to efficiently crack 4096-bit keys. I’m also not completely convinced that quantum computing will deliver the nearly instant factoring of large composites of two primes that cracking modern public-key encryption would require.

The thing about e-mail monitoring is that the signal-to-noise ratio is already insanely low. Our mail server at CB Apparel would drop thousands of e-mails a day as spam. Some days, we’d drop more e-mail than we’d deliver, and even then some spam would get through. Between that and all the legitimate e-mail traffic, we’re talking about a huge amount of data that would be of no interest whatsoever to law enforcement. Which means there would have to be a monitoring system in place that escalated e-mails it felt were of interest, presumably based on keywords. It’s been rumored that this sort of monitoring exists on the phone system already. My uncle would joke about starting conversations with his brother with a series of words guaranteed to get some attention, like “Bomb President Assassinate Plutonium”, and then proceed with a normal conversation that would have seemed completely inane to whoever might have been listening. Being text-based, e-mail would be a lot easier to monitor than voice, so an e-mail monitoring system seems more plausible than a voice one (I still don’t fully believe the telephone monitoring system actually existed).
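To see why keyword-based escalation is so easy to fool, here’s a minimal sketch of the kind of filter such a system would have to rely on. The watch list and messages are invented for illustration, but the weakness is real: my uncle’s inane conversation trips it, while an oblique message about anything at all sails right through.

```python
# A naive keyword-based flagging system, as a monitoring system might use.
# WATCHWORDS is a hypothetical watch list, purely for illustration.
WATCHWORDS = {"bomb", "assassinate", "plutonium"}

def flag(message: str) -> bool:
    """Escalate a message if any watchword appears as a bare word."""
    words = set(message.lower().split())
    return bool(words & WATCHWORDS)

# An inane conversation gets escalated...
assert flag("Bomb President Assassinate Plutonium. Anyway, how was lunch?")
# ...while an oblique message raises no alarm at all.
assert not flag("The package arrives Tuesday, same place as before.")
```

The filter has no notion of intent, so every escalation still needs a human to read the message, which is exactly where the volume problem bites.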

If everyone used encryption, then there wouldn’t be a signal, just noise. At most, the watchers would be left building communication networks, trying to extrapolate relationships based on communication frequency and message size. Not unlike what AT&T is doing as part of their datamining project. Is this sort of information useful? Of course it is. Otherwise, we wouldn’t have been doing it for the entire history of cryptography. A sudden surge in encrypted communications between two generals you’re at war with is sure to be a precursor to something.
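Even with every message encrypted, that kind of traffic analysis needs nothing but metadata. A toy sketch, with an invented message log, of how communication frequency alone surfaces a relationship:

```python
from collections import Counter

# Traffic analysis on metadata alone: sender/recipient pairs from an
# invented message log -- no message bodies needed.
log = [("alice", "bob"), ("alice", "bob"), ("alice", "carol"),
       ("bob", "carol"), ("alice", "bob")]

# Count each undirected pair of communicators.
pair_counts = Counter(frozenset(pair) for pair in log)
busiest = pair_counts.most_common(1)[0]
# alice<->bob dominates the traffic: a likely close relationship,
# inferred without decrypting a single message.
```

This is essentially the “communities of interest” idea: the edges of the graph talk even when the messages don’t.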

This issue boils down to whether or not you have an expectation of privacy in any of your digital communications. Sure, they’ve started with e-mail, but they can easily escalate further if this is approved. We need to prove that we will not stand for such an invasion. If the courts disagree with the plaintiff in this case, we need to lobby hard to show our congressmen that we value privacy in our digital communications enough that, even if this sort of monitoring is considered Constitutional, it should still be illegal. If our congressmen show that they don’t value the privacy of the American people, we need to replace them with men and women who will. Either way, I hope people will look more seriously at the use of encryption technologies. It’s been looking more and more like we can’t trust the people we’ve chosen to lead this nation to respect its citizenry and their right to communicate and live in privacy.

Finally, I encourage all of you to join the EFF today, and support those lawyers who fight for freedom and privacy in the digital age.

The Next Iron Chef Heats Up

Catherine and I have been watching The Next Iron Chef every Sunday night since it started. They began with a field of 8 chefs, and have now whittled it down to two. I agree with everyone who says that the American Iron Chef is inferior to the original Japanese one, but it’s still entertaining.

My only major complaint with the American Iron Chef is Bobby Flay. It’s not that Bobby isn’t a good chef, or that the food he cooks isn’t fantastic, but when Bobby Flay descends on Kitchen Stadium, you know what you’re going to see. He will use mango, he will use chili powder, and he will use some sort of ancho-chili sauce. I don’t think I could tolerate being an Iron Chef judge for more than a few Bobby Flay challenges, because it just looks like it would feel so repetitive.

This isn’t about Bobby Flay though, this is about the next Iron Chef that Food Network is looking for. It is now down to two: John Besh and Michael Symon. I knew John Besh from a previous appearance on Iron Chef, and he’s been a favorite of mine throughout the entire competition. Michael Symon was unfamiliar to me, but over the course of the show, he’s really shown himself to be a great, innovative chef. The lobster hot dogs he did last night were interesting, and really well received by the French. John Besh’s stuff looked fantastic, but I’m not sure he’s as innovative as Symon.

But next week is the final competition, presented the only way an Iron Chef competition can be: a head-to-head, winner-takes-all showdown in Kitchen Stadium. I still want Besh to win, but I know Symon is going to give him a run for his money. If you like watching fantastic chefs square off, the final episode of The Next Iron Chef is not to be missed.

Cryptographic Key Distribution

Returning to our discussion of encryption, we reach one of the most contested portions of modern public-key cryptography: key distribution. The advent of public-key cryptography went a long way toward making reliable, distributed cryptosystems a reality, because each member of the cryptosystem can now have their own secret, instead of the old model where each group of communicators maintained a shared secret, which could obviously leave one person managing a large number of shared secrets. Due to the circular nature of the RSA algorithm, a user can keep their secret secret, and make the public portion of their key available to as many people as they choose.

The question, however, is how public is too public? The ultra-paranoid would argue that public keys should only be given to people you intend to communicate securely with. However, this attitude has a serious flaw: if you insist on full control over your public key, you run into all the same key-distribution problems the old key-management systems had, namely the need to establish a secure method of trading keys before you can communicate securely!

The alternative, which is fairly popular among many users, is the public key server, which serves as a collection of any and all public keys users choose to upload to it, freely available to anyone who wants to download them. Some people feel that making your public key publicly available could allow your key to be broken, though a reasonably large key size and a periodically expiring key make this unlikely. The more pertinent issue, in my mind, is the social-networking attacks that are possible against such a public system. By analyzing who has signed a person’s key, you can build a pretty good picture of that person’s known associates, which a scammer or attacker could use to find ways to get at a mark. Of course, the popularity of social networking sites, even among people over 50, suggests that most people aren’t that concerned about Web of Trust-based attacks.

As a decent middle ground, I think large enterprises should consider running their own keyservers, open only to people authorized to access the enterprise network. This should make the paranoid more comfortable, as they won’t have to worry about just anyone getting their hands on their keys, and everyone in the enterprise knows exactly where to find verified and trusted keys for their co-workers. By limiting access to the keyserver based on network locality, the system is mostly free of the social-networking attacks referred to above, and more trust can be placed in the keys, due to the extra controls on who can put data on the server. With the success of server virtualization, the fiscal impact of running the server should be very low.
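For GnuPG users, pointing clients at such an internal keyserver is a one-line configuration change. A sketch, where the hostname and key ID are invented for illustration:

```shell
# Publish your key to the hypothetical internal keyserver, and fetch
# a co-worker's key from the same place.
gpg --keyserver hkp://keys.corp.example --send-keys DEADBEEF
gpg --keyserver hkp://keys.corp.example --search-keys alice@corp.example

# Or make it the default for everyone via ~/.gnupg/gpg.conf:
#   keyserver hkp://keys.corp.example
```

With the keyserver reachable only from the corporate network, the usual public-pool exposure never arises.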

In the end, I think we still need to do something to better integrate encryption technology into modern computing. The problem is that it will need to be a largely population-led movement, as there are forces at work that don’t want your communications to be private, and that want to stigmatize encryption as the realm of paranoids and terrorists. Maybe there is a touch of paranoia to it, but it’s more a matter of education. People often mistakenly believe that a communication they are having is secret, when really it’s potentially open to the world. Business and government often need to communicate confidential information to one another, but often fail to do so securely, simply because they fail to acknowledge the realities of secure communication until they have made a serious mistake and potentially compromised confidential information.

It is the responsibility of all organizations to protect their data, particularly when it comes to the personal information of their employees, customers, and beneficiaries. I am of the opinion that high-grade encryption is the best way to accomplish this task. I also feel that we can do better than the 128-bit keys which are standard in the world of secure web communications today. I prefer PGP or GPG keys to S/MIME certificates, simply because they can be more easily used for more than just e-mail. But the point of this, and the last several entries on encryption, is that if you aren’t using it, you need to evaluate why, and most likely, at least in business, you’ll find a reason to do so.

The Terrorists Have Won

I’ve blogged about valuing privacy before, but Bruce Schneier, a security technologist and author, blogs today about a War on the Unexpected. His post collects a number of vast overreactions to things that under normal circumstances would never have been considered a threat, and the state of panic most of the populace seems to live in is a strong indication of one thing: the terrorists have clearly won.

Bruce links to stories on a lot of the high-profile, out-of-control escalations which have occurred over the last few years, so I won’t bother rediscussing them here. What frightens me most is the people supporting knee-jerk escalations in the comments on Bruce’s blog. A cell phone left behind on a plane does not warrant a full evacuation and search of the plane; if airport security is doing its job, how would a bomb have gotten onto the seat? A lot of things are out of the ordinary, but out of the ordinary doesn’t mean it’s a threat. If we’re going to encourage the populace to report everything, we need to train people to identify a credible threat. We need to stop praising people for out-of-control escalation, and praise them instead for appropriate responses. I’m not suggesting that no response is the best solution, but responses must be considered and reasonable. While today most of the Western world lives in fear of imagined threats, how much longer before we simply become numb to the world around us?

This guise of protection has already taken its toll, and that cost is the discouragement of interest in science. This isn’t just the fault of al-Qaeda and 9/11; the dulling down of our children has been escalating for years. A co-worker of mine was recently approached by Washington’s Child Protective Services over a black eye received at home, because the teacher felt the need to escalate a single bruise witnessed on a young boy. I remember when I was nine years old, I constantly had bumps, bruises and scrapes from playing with other children and just generally doing foolish things. Hell, I’m in my mid-twenties, and I still get small injuries on a regular basis. I know kids haven’t changed that much as I’ve grown up.

Yes, we need to protect children, but we need to protect them from real threats: paedophiles, murderers, thieves. We need to teach them what to do when something genuinely frightening happens, and comfort them when something unnerving happens. But we need to keep them free. Free to play. Free to experiment. Free to explore. If we don’t, our society will crumble as our desire to promote safety raises everyone to such an intense level of fear that we cannot function. We cannot advance. We cannot succeed.

Did the world change on September 11, 2001? I don’t think it did. We’d been attacked before. We’ll be attacked again. What changed was us and our perception of the world. Where before we were mostly comfortable with our world, and usually able to judge genuinely suspicious behaviour, as a whole we’ve largely lost that ability, and our society is suffering for it. AT&T uses their Hancock language to identify communities of interest, establishing guilt by association even two or three layers away from the target of interest. St. Louis encourages a ‘telling culture’ in its schools, which, if implemented without a proper discussion of what a threat is, could easily serve as a precursor to the knee-jerk reactions and escalations we see today.

Americans have become afraid of the silliest things. We choose to fear that which we don’t understand, rather than seek to understand it. In our rush to fear, we choose to shelter people from things rather than educate them. The St. Louis story I link to above makes reference to escalated responses by children to bullying, presumably in the form of school shootings. This is an unfortunate problem, particularly given how young some of the shooters have been, but it is more a failure of education. If you’re going to keep guns in the house, they need to be kept safe, and everyone in the house needs to respect the weapons and what they can do. I remember a story of a father taking his young son, three or four years old, out to an outdoor range, setting up a glass jug of water, and shooting it with his son watching. It was to make his son understand what the gun was, and to demonstrate, explosively, what it could do. People need to be taught to respect the dangers inherent in an item, whether it’s chemicals, weapons, or even a bicycle. By respecting the danger, we can keep ourselves relatively safe, while still exploring, still learning. Will we get hurt occasionally? Yeah. But that’s just another part of the learning process.