May 2010 Archives

The Feeding of an Obligate Carnivore

Catherine and I have been cat owners for only about six months. We’d both grown up as dog people, but the rules at our condominium complex are written in such a way as to specifically exclude dog ownership, and my wife fell for a hard luck story she saw on Craigslist. We’d planned to take possession of a young mother cat, named Juniper because that is where she was found (and what she’d been eating) by the people who cared for her through her pregnancy. We ended up taking Juniper and one of her kittens, whom we named Ivy due to her love of climbing things.

To ease the transition of the cats as we moved them into their new home, we originally began feeding them what their previous caretaker had fed them. Specifically, Friskies and Meow Mix. However, Catherine had been reading a lot about cat nutrition before we took possession of the cats, and her own background in Zoology made her a bit uncomfortable with the ingredients list on the cans. There was an awful lot of vegetable matter in the food, and what protein there was often came from sources like ‘chicken by-product meal’ and ‘animal digest’. In other words, plenty of roughage that a cat wouldn’t eat in the wild, and what protein there was tended to be poor quality.

Yet, we fed the cats on it for several weeks. Their world had just changed drastically, and we felt a change in food on top of that would be unreasonable. Which is when we discovered what I consider the most disgusting part of this ‘food’. It smells the same exiting the cat as it does exiting the can. Aside from being a terrible realization (the litter box is in my office, the cats are tidy), it mostly went to show that there was really very little in this food that the cat was actually able to derive nutritional value from! It’s basically what would happen if you ate nothing but white rice at every meal. You’d feel full, certainly, but you’d have little energy, and you’d gain very little nutrition from your diet.

Why is most of this food so poor quality? Simply because the core ingredients (corn, soy, and animal by-products) are the by-products of the food system that feeds us. Incidentally, this is also why, when JD Roth of The Simple Dollar was asked to weigh in on a book that claimed pets were a sustainability nightmare, he argued against the points in the book. We’re feeding our pets industrial waste, so while there is a bit of a loss (packaging, shipping, etc.), it’s probably better than simply shipping that stuff to the dump.

He is right, to a point, but I still believe that there is no excuse to feed your pets in a way that completely ignores their need for balanced nutrition. With dogs, this industrial waste food is a little better. Dogs are at least omnivores; they can glean nutritional value from corn (though soy should still be avoided). Catherine’s dog, who has lived with her parents for 5+ years now, subsisted for some time on rice and green beans. Admittedly the limited diet was at least in part because of other problems, but it was still (with a bit of protein from time to time) nutritionally sufficient.

With cats, it’s different. Cat nutrition is not particularly hard, and they don’t need a whole lot of food every day, but the nature of a cat’s diet should be very different from a dog’s. First off, feeding your cat dry food is probably a bad idea. Our cats are drinkers, and will drink from a cup or bowl if one is available, but not every cat is. Cats in the wild drink very little free water, taking in almost all of their liquid from the food that they eat. If your cat is eating dry food, and they don’t drink water freely on their own, they can develop problems like kidney stones.

There are quality brands of wet and dry cat food. Some people really like the Natural Balance brand. Our cats didn’t care for it, and I felt the green pea content was simply too high; a can smelled mostly of green pea when opened. The brand we settled on was Evo from Innova, whose cat foods are completely grain free and generally 90% or more meat. We do keep around a bit of their dry formula as well. Our elder cat still loves dry food, though she only gets it very rarely (or when she figures out how to open the cupboard where she knows it’s kept). Evo is expensive for cat food, costing, on the higher end, about $1.90 per day for a cat just shy of 10 pounds.

While we do still occasionally feed Evo, and plan to move the cats back onto an Evo diet for a few weeks this summer when we’ll be out of town, we usually try to follow a more traditional model of feeding our cats. Historically, people who kept cats fed them bits of meat and bone, or simply let them hunt for themselves. I’m not advocating the latter, due to the effects that roaming cats can have on the biodiversity of a region. Our cats only go outside when on a leash.

Our feeding strategy begins with Cornish game hens (one hen per week feeds two cats), to which we add additional meat (we use a mix of beef, chicken and pork, mostly chicken) as well as liver. The game hens are important, since they contain the bone (and hence calcium) which is key to the cat’s bowel health. Without enough bone (or egg shell), a cat will likely have very runny bowel movements, which is uncomfortable for the cat, and likely to make a huge mess of your litter box. Plus, bone is very filling for the cat, so they typically seem far more sated when they’ve had bone. Note: It is extremely important that this bone not be cooked. Cooked bone is not pliable, and could really hurt your cat when they try to eat it.

We feed our cats about 5 ounces of food a day each. This translates to 35 ounces of meat per cat per week, or 70 ounces for the both of them. We typically prepare 2 weeks worth at a time, though we’re considering upping this to four. We have a good sized freezer, allowing us to stock up on meat when it goes on sale.

Our mix for preparing one week (for two cats) looks about as follows:

  • One 23 1/3 oz Cornish Game Hen (if the hen is bigger or smaller, we just modify the values below; see the rough sketch after this list)
  • 3.5 oz Liver (chicken or beef)
  • 3.5 oz other secreting organ (kidneys are great)
  • 40 oz other meat (we do a mix of 2 parts beef, 2 parts pork, 1 part thigh meat chicken, 1 part breast meat chicken)
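
The arithmetic is simple enough that we just eyeball it, but here is a rough, purely illustrative sketch (in Python, using the amounts from the list above) of how the ‘other meat’ absorbs the difference when the hen is bigger or smaller, so the weekly total stays at about 70 ounces for two cats:

TARGET_WEEKLY_OZ = 70.0   # 5 oz/day/cat * 7 days * 2 cats
LIVER_OZ = 3.5            # liver stays fixed
OTHER_ORGAN_OZ = 3.5      # other secreting organ stays fixed

def other_meat_needed(hen_weight_oz):
    # Whatever the hen doesn't cover, the beef/pork/chicken mix makes up.
    return TARGET_WEEKLY_OZ - hen_weight_oz - LIVER_OZ - OTHER_ORGAN_OZ

print "Other meat for a 23.33 oz hen: %.1f oz" % other_meat_needed(23.33)  # 39.7, i.e. the 40 oz above
print "Other meat for a 26.00 oz hen: %.1f oz" % other_meat_needed(26.00)  # 37.0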

If you’re having trouble finding ‘secreting organs’, then you can simply replace them with liver, but you’ll want to occasionally feed a can of quality commercial food to ensure that your cats are getting all their necessary vitamins. I always start by breaking down the hen: cutting off the breasts the same way you’d break down your Thanksgiving turkey, then removing the wings. I then remove the legs, cutting them in half, then cut the torso in half, back to front. I then break the halves of the bird into four pieces, leaving me with fourteen bony pieces of various sizes to split between two cats. For breaking down the torso, I suggest a good pair of kitchen shears; they make the job go much faster. Some weeks we will also add in a can of sardines (water packed), splitting the fish between a few meals and feeding them whole to the cats; the oil in the fish helps with coat health.

I then break down the rest of the meat into half to one ounce pieces, which we mix together and divide into bags. We place three meals for each cat into plastic bags, then proceed to freeze all but what we need for that day and the next. We’ve found that three meals is a good compromise for us between going through bags too quickly and minimizing the amount of raw meat in our fridge, since cats are very sensitive to food that isn’t quite fresh.

This was a difficult process when we first began it. It would take hours, and we’d be exhausted by the time it was done. Now, we can do two weeks in about two hours, and the cats clearly prefer the raw meat diet. They seem to have more energy than they used to, and they’ve been on a full meat diet for several months with seemingly excellent health.

They don’t drink a lot of water from the bowl we leave out, but the litter box contains a good number of urine clumps when we scoop. They don’t poop as much as they used to, and their bowel movements tend to be smaller and more solid, suggesting that they are able to process nearly all of what they’re eating. And as the person who has to sit five feet away from the cats when they are defecating, the near lack of odor is definitely preferred.

The raw diet is also supposed to promote tooth health, since chewing on the bones and connective tissue should clear plaque from their teeth. I’m not sure I noticed bad breath from our cats before, so I can’t say that this has improved that, but the fact that they do chew their food now does make me think there is something to that. Before, with the traditional pâté that most cat food comes in, our cats would literally just lap up their dinner without doing any chewing. I’m positive that that didn’t help their oral health, even if I’m not sure about the raw diet helping.

Converting the cats was surprisingly easy. At first, we tried to mix the raw meat in with cans of Evo. The younger cat got it immediately; the elder cat tended to just lick off the pâté and walk away. We eventually tried just giving her the meat and not providing any Evo, and she seemed to catch on pretty fast. Once we had them eating meat, we started working in bone. This wasn’t really a problem, except that Ivy, the younger cat, had a bit of trouble with the larger leg bones when she was still really young (mind you, she’s still under a year old). She’d gnaw off the ends of the bone, get a bit of the marrow from inside, but would ultimately leave some behind.

We occasionally introduce new meats to the cats. We tried duck, but the bones were too substantial to replace the Cornish game hens at this point (the cats are both still pretty young), and we might try a bit of rabbit from the local farmer’s market. Ultimately, we find it easier to feed just the trinity of beef, chicken and pork. They’re the easiest to get, they’re the cheapest, and our cats don’t always respond well to new meat. We are also considering moving toward the ‘prey model’ of feeding, where we’d buy freeze-dried mice, like you’d feed to pet snakes, and feed those directly. The nice thing about that approach is that the meals are already balanced for necessary nutrition; however, we have no idea if it would work for our cats, and we don’t have a local pet store where we can buy a few mice to try. Plus, our cats play with felt mice, and I definitely don’t want them playing with their food.

We’ve been happy raw feeders for the last six months or so, and our cats have been very happy with the change. Cost-wise, I believe it saves us a bit of money compared to the high-end cat food we were feeding before, and breaking down the meat goes smoother every time we do it. It’s been absolutely worth it for us, and our cats definitely seem healthier than they did on the grocery-store cat food and, to a far lesser extent, than on the high-end commercial cat food.

Portland Code Camp 2010 Recap

If I had to choose a single word to sum up the Portland Code Camp event, it would be ‘inspiring’. I sat through some excellent presentations, which certainly recharged my interest in technologies like F# and Clojure, but the scale that the organizers of this event managed to reach was also awesome. We were lucky, in that we were able to meet with the organizers for a short while in the evening and get some really amazing advice that we hope to use to make our own event a success this fall.

What were some of the impressive things about this event? Well, they had the Mayor of Portland on hand to talk about technology companies in the area, with questions submitted via Twitter. Presenters and volunteers were provided with work-quality polo shirts, instead of T-shirts that I would likely never have worn again. There were hundreds of people present, and while I haven’t seen a final count just yet, I’m certain they reached their goal of 600+ people. In all, it was by far the most professional code camp that I’ve seen.

If it had a weakness, it’s the same weakness that all Code Camps I’ve been to have had: they feel too heavy with Microsoft technologies, which is a turn-off for a certain segment of the coding population, including myself, though (and perhaps because) I deal with these technologies on a daily basis. However, having spoken with the organizers, I know they were trying to work with the Bar Camp Portland people to bring that open-source perspective into the Code Camps. I hope that our event, since there are basically no open-source-heavy events to compete with, might help bridge that gap.

I also took the opportunity to test the mind mapping method of taking notes, trying the VYM software package I found in the Ubuntu repositories. I’ll have to try a few other software packages, but I found VYM easy to use for simple mapping, though I couldn’t discover keyboard shortcuts for flagging entries or changing colors. I didn’t have time to look very hard, though.

I began with the F# talk given by Michael Hale, a PM at Microsoft working on Visual Studio and F#. I’ve been hearing about F# for several years and have been meaning to look into it, having been exposed to functional programming via LISP in college, and believing in the potential of functional programming for making concurrency far easier to solve. F# kind of reminds me of JavaScript, in that it’s a functional-family language (ML by way of OCaml, in F#’s case, rather than LISP) dressed in more conventional-looking syntax, though admittedly, F#’s syntax is further from C than JavaScript’s.

F# is still a .NET language, so its object model is the same as in every other .NET language, which makes sense in that F# can be merged seamlessly with other .NET code. Plus, Michael gave us several examples of code written in a ‘normal’ imperative style, and showed how they could be rewritten functionally. In some ways, I think the syntax of F# is clearer than the languages from which it is derived. Rather than nesting function calls in parentheses as you do in LISP, you can basically pipe commands together, the only weakness of that method being that the pipe can only be applied to the last argument of a function. However, that’s only a minor inconvenience. Best part about F#? Not only does it work in Mono, but Microsoft actually provides an install script for Mono. Awesome. And it encourages me to spend some more time with F# in the near future.

My next hour was spent learning more about the Reactive Framework, which is a pretty interesting way of looking at events (I just hope Microsoft doesn’t enforce any patents I’m positive they’ve applied for on this technology). Basically, with the Reactive Framework, you subscribe to a sequence of events, and then you can easily mash events up to do things like “When X happens, followed by Y, do Z.” With Rx, this can be done with a couple of lines of code, instead of dozens of lines of state machine and state tracking code. Given that Rx has been ported to JavaScript, and to my favorite framework, I’ll definitely do an upcoming post on Rx and YUI3.
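
To make that idea concrete, here is a toy sketch in plain Python (this is not Rx and not any real API; every name here is made up) of the difference in spirit: the “X followed by Y” relationship becomes a small reusable combinator, instead of state flags scattered through your event handlers.

class EventStream(object):
    """A bare-bones stand-in for an observable event source."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def fire(self, value):
        for callback in self._subscribers:
            callback(value)


def followed_by(first, second, action):
    # Run `action` whenever an event on `second` arrives after one on `first`.
    state = {"armed": False}

    def on_first(value):
        state["armed"] = True

    def on_second(value):
        if state["armed"]:
            state["armed"] = False
            action(value)

    first.subscribe(on_first)
    second.subscribe(on_second)


def clicked(event):
    print "click!"

# "When mouse_down happens, followed by mouse_up, do clicked."
mouse_down, mouse_up = EventStream(), EventStream()
followed_by(mouse_down, mouse_up, clicked)
mouse_down.fire("down")
mouse_up.fire("up")  # prints "click!"

The point isn’t the code itself; it’s that the little bit of state tracking lives in one small combinator instead of leaking into every handler, and Rx ships a whole library of such combinators.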

Speaking of YUI, my talk went well. I had a dozen or so people come, some of whom left early (I assume I was being too basic, but that was my intent), and I had some really good discussion. I might have done too many comparisons to jQuery, but jQuery is the 800-pound gorilla in the JavaScript room right now, and since Microsoft endorsed jQuery last year, most people programming in the MS ecosystem have little to no exposure to any other options. It is something I’m going to consider revising for September, however, when I hope to also do a section on creating custom YUI3 modules.

I also had the opportunity to meet Ryan Grove, YUI Core Team member, and it was nice to finally meet someone from the team in person, and to hear that he felt I’d done a good job of describing the framework. I freely admit to being a bit of a YUI cheerleader, and I fully intend to continue telling people about it and trying to convince them to give it a shot.

After my talk, I decided to go to a conversation about Clojure and the Semantic Web. To be clear, I was more interested in the Semantic Web discussion, which barely happened, but Clojure is an interesting JVM-based LISP. Unfortunately, the version in Ubuntu 10.04 seems to be just a bit out of date, and some of the code examples the presenter showed didn’t work. Still, it might warrant a bit more looking, though at this point I’m a bit more interested in F#.

I ended the day learning about the Mobile Web from a developer who seems to know what she’s talking about. This was a relevant session for me, as I’m planning to do a mobile version of one of our websites this summer. There was a ton of great, useful information in this session, and I think it definitely provided me with a solid framework for starting this project. If I have a complaint about the talk, it’s that it sometimes felt like a sales pitch for her class. I can’t really begrudge her mentioning the class, but there did seem to be a lot of ‘we go more in depth on this during my class’ kinds of comments. Still, Gail was very knowledgeable, and certainly got me excited to get going on mobile web development, even though doing it right is going to be more work than I’d hoped it would be.

In all, it was another successful event. I got some great information, met some pretty awesome people, and got back in touch with some people from events past. It was a really long day (some 14 hours), but I’m exceedingly glad to have gone, and if our event can be even a quarter as good as this one was, I’ll be thrilled.

Portland Code Camp 2010 - Intro to YUI3

This post will be going up at almost exactly the time that I complete my Introduction to YUI3 talk at this year’s Portland Code Camp. I want to thank everyone for joining me at this presentation, and I hope to see some of my attendees around the YUI community.

The sample application is based on a Django project for a simple checkin system. The code in its completed state (at least, what I hope to complete) is up on github. Whatever I do complete tomorrow will be available as a ‘portland2010’ branch on that project after I present.

As a tool to help me complete the code in a reasonable time frame, I did create several snippets for the snipmate plugin for VIM, which I think are compatible with TextMate’s plugin system.

Either way, I encourage the YUI community to fork the gist linked above and I’ll gladly merge in additional snippets.

I’ll post more of a post-mortem about the presentation next week. Also, I am planning to prepare a more advanced YUI3 presentation for the Palouse Code Camp that I’m helping organize for September.

Freedom From Pornography Is No Freedom At All

Recently, another Gawker Media employee traded blows with Steve Jobs. However, this time, the battle was short and conducted via e-mail, instead of ongoing and legal. The discussion began with Ryan Tate, the Gawker employee in question, asking Steve Jobs about a recent advertisement that claims the iPad is a ‘revolution’. Tate’s complaint centered on the belief that revolutions are about freedom, which the iPad does not encourage.

Jobs’ reply to this call-out on freedom was, as was to be expected, very defensive of his latest toy.

Yep, freedom from programs that steal your private data. Freedom from programs that trash your battery. Freedom from porn. Yep, freedom. The times they are a changin’, and some traditional PC folks feel like their world is slipping away. It is.

Okay, ignoring the awesome Bob Dylan reference (which Tate made first, but still, great), the rest of the reply is purely ridiculous, as every single ‘point’ he makes can be easily substituted with “freedom from things that I deem unacceptable.”

Programs stealing my personal data? That’s bad, I absolutely agree, but what if access to my contacts database makes my Twitter app drastically more useful? What if I want a different interface for my contacts than what the Cult of Steve has chosen to provide me? Yes, it is possible that providing access to personal data on my phone can be misused, but it’s also very possible that providing that same access will make my experience drastically better. People have done a ton of work on Android which makes it possible for blind people to use a touch-screen Android device fairly effectively. I doubt you’ll ever see anything similar on iPlatform. Apple would have to do it, and I doubt they will.

Freedom from programs that trash my battery? What about the freedom to choose to use programs that might hurt my battery, but are useful enough that I can suffer through that inconvenience? I don’t know anyone with an iPhone who doesn’t need to charge it daily anyway (and I know my Android phone needs a daily charge, occasionally with a top-up during the day, depending on how much I’m using the music player and display). That’s the thing: what tends to kill your battery, in my experience, is almost always just USING the damn phone. The display is a huge drain, the other huge drain being the radio, particularly when you’re in an area with poor or weak signal. Battery drain is a problem, but it’s not something anyone should trade their freedom of choice away to solve.

Most distressing, however, is the claim of ‘freedom from porn’. I’m not going to defend pornography, at least not directly. Most pornography on the Internet is disgusting and degrading, though not all of it is. I’m not going to dwell on the fact that even refusing pornography apps doesn’t stop Safari from loading porn, or from being set to display pornography as its homepage. Hell, even in the Apple Store, some people will set the default home page on the iPads to hardcore pornography (no idea why they don’t blacklist that on their wifi). Hell, I’m not even going to dwell on the fact that there are plenty of almost-pornographic apps still being sold.

No, those things, which are all relevant, aren’t what’s really scary about Jobs’ comment. The fact is, it’s not about Porn. It never was.

Whenever someone in a position of authority begins talking about the ‘evils of pornography’, or anything of the sort, you should begin to get worried. Because, odds are, they don’t really care about pornography. But in American society, where we’ve developed a very puritanical (and frankly hypocritical) view of sex, the vast majority of people will never even risk being seen defending pornography. Because it’s dirty. Because it’s shameful. Because it’s wrong for no reason other than we’re supposed to think it is.

Recently, a music industry spokesman said, at a conference in Stockholm, that child pornography was a wonderful tool to use to push for censoring and filtering the Internet. Now, I will give Steve credit for avoiding the child porn argument (which is indefensible); however, I’m of the opinion that he’s using pornography as a similar tool. I am not saying that Apple should be forced to allow pornographic applications to be distributed via the App Store. Google won’t carry that material via the Android Market either, and forcing a distributor to carry any particular product is not something I’m okay with. However, the control that allows Apple to limit the availability of pornography on the iPhone/iPad is bigger than that limited scope. The App Store is the sole mechanism for getting apps to the general public (whereas the Android Market is simply the best way to target those users; other distribution channels are possible), and Apple’s policies are vague, allowing them to selectively approve and deny anything they want on seemingly arbitrary criteria.

Apple wants to be the steward of experience and information, and their bulwark in this is pornography. However, their ‘freedom from porn’, their refusal to provide a non-Apple-controlled distribution mechanism for apps, and the unknowable rules they choose to enforce on their own store make Jobs’ offer of freedom nothing short of platform tyranny.

Detecting Removable Storage Mounts using DBus and Python

As part of my workout regimen, I tend to prefer machines at the gym that use the StarTrac system, which dumps data snapshots of my performance (heart rate, speed, calorie burn rate, etc.) to a binary file. In a future project, I plan to decode this file and perhaps do something with the data, but in the meantime, I’m trying to recreate the uploader function used by the eFitness website the rec center has contracted with for handling this data. That uploader is written in .NET, and takes advantage of some P/Invoke calls unique to Windows to detect when new removable media is added.

Luckily, the web service the website uses has a public WSDL, and it’s a pretty straightforward SOAP web service. However, this post isn’t about all that. When collecting StarTrac data, I plug a simple USB thumb drive into a box attached to the exercise machine, and it updates a file on the drive every 15 seconds. The path on the drive is easy enough to know (same folder, easy pattern to the file names), but how do I detect when the user has attached the device? And where Linux has mounted it?

The answer, as with just about anything communication-related on Linux these days, is dbus. However, even knowing that you can get that information doesn’t do much for the how, which is why dbus-monitor is so important. Running dbus-monitor against the session bus (I assume the user is logged in, since I’m using desktopcouch to store data) on GNOME, the interesting block was this:

string "org.gtk.Private.GduVolumeMonitor"
string "0x822b808"
struct {
    string "0x822b808"
    string "DISK_IMG"
    string ". GThemedIcon drive-removable-media-usb drive-removable-media drive-removable-drive"
    string ""
    string "file:///media/DISK_IMG"
    boolean true
    string "0x822c948"
    array [ ]
}

This data came in on interface org.gtk.Private.RemoteVolumeMonitor, member MountAdded. I’ll cover it in a snippet below (which I plan to contribute to python-snippets). There is one problem I need to solve here: this will report every new mount, not just new USB thumb drives. Now, I could parse the third member of that struct, but that’s GTK-specific data. It could change, and relying on it would make the code potentially harder to port to KDE or elsewhere. Perhaps that last hex value string has the information I need, but I really have no idea. I don’t see anything obviously useful in any of the other sets of member data that I feel I could trust…

However, listening for this event is easy:

import dbus
import dbus.glib  # Hooks dbus-python up to the GLib main loop
import gobject


def mountDetected(sender, mount_id, data):
    # data[4] is the mount URI, e.g. "file:///media/DISK_IMG";
    # stripping the leading "file://" leaves the local mount path.
    print "New drive mounted at %s" % data[4][7:]

sessionbus = dbus.SessionBus()
sessionbus.add_signal_receiver(handler_function=mountDetected,
                               signal_name="MountAdded",
                               dbus_interface="org.gtk.Private.RemoteVolumeMonitor",
                               path="/org/gtk/Private/RemoteVolumeMonitor",
                               bus_name=None)

# Signals are only delivered while a main loop is running.
gobject.MainLoop().run()

I probably don’t need to specify both the interface and the path, but I’m a total newbie to dbus, so I did for completeness. My only issue is that I will see CDs mounted using this as well; for now I just have to hope they don’t contain the folder I’m looking for, but hopefully I can find a better way, even if it involves looking at multiple dbus events and doing some internal correlation.
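
In the meantime, one simple (and untested) way to narrow things down would be to check whether the new mount actually looks like a StarTrac stick before doing anything with it. The folder name below is just a placeholder, since the real layout is whatever the machine writes out:

import os

STARTRAC_FOLDER = "STARTRAC"  # hypothetical; substitute the real data folder name

def mountDetected(sender, mount_id, data):
    mount_path = data[4][7:]  # strip the leading "file://" from the mount URI
    data_dir = os.path.join(mount_path, STARTRAC_FOLDER)
    if os.path.isdir(data_dir):
        print "StarTrac drive mounted at %s" % mount_path
    else:
        print "Ignoring mount at %s (no %s folder)" % (mount_path, STARTRAC_FOLDER)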

An Appeal for the EFF

The Electronic Frontier Foundation is again doing a fund drive trying to bring in more money. This time, however, they’re running a little contest. Whoever raises the most money for the EFF by June 30, 2010, will win two tickets to Defcon 18 this summer.

I’m a supporter of the EFF, because they work hard to protect rights that are important to me. I’m a blogger and a coder. I have an interest in maintaining fair copyright, avoiding draconian surveillance, and fighting against the abuse of patents.

I’m not going to claim the EFF is perfect, but their work is mostly centered around ensuring that we use our technology for freedom instead of having it used against us, an issue deeply personal to me. Plus, I’ve always been heavily into information security, and would love to attend Defcon.

I would immensely appreciate it if you, my reader, would be willing to donate even a few dollars to support the EFF, and potentially get me a trip to Defcon this year. Even if I don’t win, the work the EFF will be able to do with even a small donation is important, and your tax-deductible donation would be a great help.

Click Here to Donate

On Ogg, Vorbis, and Theora

A few weeks back, I read an interesting take on the Ogg format, which basically claimed it was a terrible container for streaming video and audio. Personally, I know very little about the problems they’re trying to solve here. However, I’ve been using Ogg (and specifically, Vorbis) for years, and it has never really gained much traction (outside of the Free Software community, and a lot of games). That lack of traction suggested that maybe, just maybe, there was a legitimate technical reason behind the lack of adoption.

Fortunately, the creator of the Ogg format took umbrage at the analysis above (probably in part because it was Slashdotted), and decided to offer a point-by-point rebuttal. It comes down to the fact that, as far as Monty is concerned, Ogg works very similarly to other container formats, even ones that the original author held up as gold standards, like Matroska. Clearly, if there are any weaknesses, they’re not likely with Ogg.

And yet, the standard for audio interchange and online music quickly became MP3, a format well known to be patent- and license-encumbered, instead of the (to the best of anyone’s knowledge) unencumbered Vorbis. Why? The two formats are really similar in performance characteristics (sound quality vs. bitrate). Yet MP3 is by far the standard. Hell, the Ubuntu One Music Store, or rather the store that backs the U1 Music Store, doesn’t even offer Vorbis as an option.

I think the most likely cause of this was simply that MP3 was the format that everyone learned about digital music through, specifically due to Napster, since Napster only worked with MP3, and Napster was big almost two full years before Vorbis’ first release. Plus, the first runs of portable audio players only supported MP3 as well. Between these two factors, how could Vorbis possibly have taken off? Note: I don’t count M4A/AAC. It’s still, to this day, only used on the iPod and with iTunes. Yes, it’s big, but that’s a factor of the iPod’s success, which is a completely non-technical issue. Before M4A, MP3 was (and is) king. The other problem Vorbis faced was simply that Tremor, its fixed-point version suitable for embedded use, wasn’t available until 2002, almost a full year after the iPod had begun to dominate the market, and to date, no iPod has included a Vorbis decoder.

By 2002, when Vorbis had all the pieces necessary to be a real contender (a better-sounding release and a fixed-point version), MP3 was so deeply entrenched that Vorbis didn’t have a chance to unseat the king. Plus, the MP3 patent holders have, to date, sworn not to enforce their patents on open-source projects, and they charge their fees to the hardware/software producers, so to most people, the encumbrance of the format is apparently a non-issue.

Then came Theora, a video format based on the VP3 video codec. It’s been in progress since 2004, and frankly has been mostly usable since that time, even though it wasn’t considered final until 2008. This means that the extensions to VP3 that became Theora didn’t even begin until after the current king of video interchange, H.264, was completed in 2003.

And why is H.264 king? Well, most people encode in H.264 because the standard mechanism for distributing video via the Web, Flash, uses it. And now Safari, Chrome, and IE9 are all committed to using H.264 for video. Firefox, trying to make a stand, only supports Theora at this time (to be fair, Chrome supports Theora as well).

MPEG LA, the organization that administers the pool of patents around H.264, has promised not to charge royalties on use of the format for freely distributed content until 2016. What will happen then? Well, we’ll see in 2015 (when they’re likely to make a new announcement). But either it will remain free, or a lot of people will be frantically re-encoding to try to avoid going into violation.

Plus, while using the format may be free (in certain circumstances), creating software or hardware to encode or decode the codec is absolutely not free. So, anytime you have something that uses this hardware, you’re paying that cost somewhere. With Flash, at least, that cost is being paid by the people buying the developer tools from Adobe (not to mention buying the encoders). Google, Apple, and Microsoft are all paying licensing fees to distribute H.264 decoding code with their browsers. Mozilla knows that it can’t do that without limiting the ability of redistributors (like Ubuntu) to distribute Firefox, which is unacceptable to them.

However, the current great hope for dealing with the encumbrance issues around H.264 isn’t Theora, but the hope that Google may open-source the VP8 codec they acquired with On2 later this year. While that would be an awesome development, the question still remains: is VP8 really any better than Theora?

From a technical standpoint, I have no idea. But I don’t think this has anything to do with technical issues. A person contacted Steve Jobs to ask about Theora on the iPad, in light of Jobs’ comments about Flash’s lack of openness, given that Flash uses H.264 for video just like Safari does. His response was simple:

All video codecs are covered by patents. A patent pool is being assembled to go after Theora and other “open source” codecs now. Unfortunately, just because something is open source, it doesn’t mean or guarantee that it doesn’t infringe on others patents. An open standard is different from being royalty free or open source.

This is clearly a case of ‘better the devil you know than the devil you don’t’. No one’s willing to trust Theora because they don’t know for certain whether it’s covered by any patents. Meanwhile, H.264 is known to be encumbered, even though it could just as easily (at least as likely as Theora) be covered by additional patents we don’t know about yet.

Plus, apparently people have been blowing smoke about a patent threat against Theora for years. So far, Xiph’s lawyers don’t see any credible threats on the horizon.

Personally, I’m with Monty. For one thing, Theora is an extension of VP3, which is older than H.264. If someone were going to make a patent claim on Theora, they would need to do it before they lose the ability to enforce any such claim. Plus, MPEG LA doesn’t want anyone to develop competing codecs, so keeping up the fiction that you’d be hard pressed (they won’t say “can’t”) to create a video codec that doesn’t violate their patents greatly helps their bottom line. Unfortunately, they have a lot of money, which gives them an outsized advantage in the legal system.

Ogg, and its children Vorbis and Theora, have been the unfortunate victims of one of the most effective FUD campaigns I’ve ever seen. That campaign has succeeded in large part because hardware support has been lagging behind. Until we start seeing Vorbis and Theora in more hardware (and Android is definitely helping with Vorbis), it’s going to be difficult for them to gain any traction. And I tend to agree with those who feel that we can’t build open standards on closed platforms, despite what Steve Jobs might feel.

Broadcom B43 Drivers on Ubuntu


My recent Dell Mini 10 purchase came with a Broadcom BCM4312 Wifi interface. And it worked great…with the stock Ubuntu 8.04 install. That wasn’t going to work for me; I wanted the (then Alpha) 10.04 install. Shortly after upgrading, however, my Wifi stopped working entirely. So I found this bug, happy the problem wasn’t only mine. Hell, for a while, I had to use NDISwrapper to get Wifi at all.

Eventually, I downloaded the compat-wireless package from Linux Wireless, and started running my wireless interface in PIO mode, which worked okay. I could get on the wifi at a handful of places where I really needed it (home, office), but not much else. It was a challenge. Hell, I had trouble on our enterprise Wifi, which uses the same SSID across our entire campus; I could get on at a small handful of locations, but not others. No idea why.

Eventually, I was getting fed up with this, so I hit up the Ubuntu forums, and finally found the answer that stabilized my Wifi experience. I had to create a b43.conf file in my /etc/modprobe.d folder, with the following:

 options b43 pio=1 qos=0

Apparently the QoS code was occasionally forcing me to take the wireless interface down and bring it back up in order to use it. Since that change, my wifi has been almost seamless (I still get intermittent disconnects, but they come right back up; that’s either the sign of a hacking attempt or of improperly configured Wifi, and I’m leaning toward the latter). It’s been great. And, it seems that in a forthcoming Ubuntu kernel, the bug should be fixed for real, which I’m really looking forward to. I (and many others) wish this had been resolved prior to Ubuntu 10.04 shipping, but with any luck, it won’t be a regression for 10.10 and beyond.