Mad, Beautiful Ideas

No man is an island, entire of itself; every man is a piece of a continent, a part of the main. — Devotions Upon Emergent Occasions, John Donne, 1624

In the last few years in particular, it seems that there has been increased discussion about the topic of Billionaires. Specifically, discussion about whether or not there should be any. Many have weighed in on the matter, from actual Billionaires like Bill Gates and Mark Cuban, to Anarchic Socialists who dream of a world without money.

In the middle, there seems to be a growing number of people who want something in between: a return to pre-1980s top federal tax rates, as Representative Alexandria Ocasio-Cortez suggested on CNN back in January, or Senator Elizabeth Warren's far more ambitious 2% tax on all fortunes over $50M. These stand in contrast to the various "supply-side" economics proposals that have been floated since the 1970s, the kind that nearly destroyed the State of Kansas a few short years ago.

The center- and right-leaning objections to these proposals have been simple: "People worked for that wealth, and they shouldn't be denied it." There is some truth to that position: People did work for that money, and often there was a significant risk that allowed them to earn it. However, this argument simply ignores several important details.

Let's take Bill Gates, and his story, as an example, though it's worth noting that the core points here are basically true of every Billionaire.

First, Gates didn't build Microsoft (the largest source of his fortune) alone. Excluding the Nokia acquisition in 2014, by the time Gates largely resigned from Microsoft that year, the company had around 100,000 employees. Given natural retirement and the general churn of employees between jobs in tech these days, it's probably a conservative estimate to say that Microsoft had well over a million employees between its founding in 1975 and Gates's departure in 2014. Yet only 3 people became Billionaires off of Microsoft (Gates, Allen, Ballmer).

Yes, Microsoft has produced tens of thousands of Millionaires: 12,000 by 2005, and that number is likely quite a bit higher by now. Though it's mostly that earlier cohort that is likely to include many who earned more than $10M, often largely due to pre-IPO stock grants and Microsoft's strong market performance over the decades.

Did Bill Gates warrant more pay than the average Microsoft employee? Sure; he had more responsibility and had taken the risk of founding the company. But he didn't just get more, he got orders of magnitude more. Three to five orders of magnitude more.

Second, we talk about Billionaires like they came from nothing. Bill Gates's father was a prominent Seattle lawyer; his mother served on the boards of several banks. He attended a very expensive private school. He was at Harvard when he dropped out to found Microsoft. He grew up in substantial privilege, and he could afford to take the risk on a company like Microsoft in 1975 because if it failed, he knew he wasn't going to starve. Hell, Jeff Bezos of Amazon got his parents to put up $250,000 in initial funding, and it's never been intimated that he was seriously risking his parents' future in taking that money.

Knowing you have an out if things go really badly sits there in the back of your head, even if you tell yourself you'd never ask your parents for more money. It allows you to take risks you might not otherwise, because there is an out.

So, I do understand why there is some balking at a high tax rate for earnings over $10,000,000 a year. Even successful entrepreneurs will typically see those earnings hit in a single big event (company sale, IPO, etc.), and as a result company sales will be a lot less lucrative. I mean, it's still at least a $10M payday, but we're still talking about a massive change in the potential payouts for entrepreneurs. Though anyone who wouldn't found a company because they would only make ~$10M is, frankly, kind of an asshole.

This would be a huge shift in the way America thinks about wealth and the role wealth has in a greater society.

We need to be rethinking this though.

I have worked for Google for 6 years. Between base salary, stock, bonuses, and other benefits, I earn somewhere north of $300,000 a year. Somewhat to my surprise, this actually puts me a bit above the median salary at Google. It also puts me and my wife above the 97th percentile of wage earners in the US. However, the economic distance from where we are now to where we were back when I was a civil servant making around 1/5 of what I make now is far, far less than the distance to the earnings of the super rich.

Take Sundar Pichai, Google's CEO. While his base salary is only $650,000 (though that doesn't include his $1.2M security budget), he's been awarded literally hundreds of millions of dollars in stock over the last few years. So much so that he's actually started turning down newer stock grants.

The result of all of this has been the intimation that a large number of entrepreneurs will leave the US if these things become policy. Maybe that's true. However, the things that these programs could pay for, like Medicare For All, also stand to remove impediments for a lot of people to start their own businesses. Plenty of people have medical problems that require fairly high maintenance costs, enough that having their health insurance tied to their employment makes starting a business a massive risk.

Does the likelihood that Medicare for All may free a ton of people to start businesses offset the loss of a relatively few businesses that may grow quite large? Probably not directly. But it's worth noting that tax laws are changing internationally to make it a lot harder for companies to move around their earnings in order to avoid taxes, as much of Tech has done with Ireland over the years.

However, at the end of the day, the numbers that have been thrown about as the targets for these higher tax rates are high enough that they will never impact 99.9% of the population. I also don't believe that many entrepreneurs would choose not to start a business because they'd "only" stand to earn around $10M.

These proposals don't seek to deny people rewards for success. For taking risks. They simply seek to acknowledge the fact that we live in a society, that we do not succeed or fail purely on our own, but that there are an enormous number of factors involved in every success. If people are really intent on preventing this wealth from being taken by the government in taxes, maybe they'll start paying better wages, so we might finally see real wages increase again for the first time since before I was born.

PS: Did you know Bill Gates Sr. co-wrote a book on why we should tax the super rich more?

I've been thinking a lot lately about Social Media.

It's probably worth noting that in the more than seven years since I last posted to this blog, I have spent six of those years with Google, two of them working on Google+. I actually like Google+ a lot, and am disappointed, though sadly not surprised, at the recent decision to shut down consumer-focused Google+ access.

This post isn't really about Google+, though. While it's true that Google+ is not immune to what I'm talking about here, my thinking on Social Media today applies pretty evenly to every Social Media site I've ever used (and that dates back to the late 90s).

My feelings on this are also deeply influenced by what the Internet was when I was growing up. The Internet of the late 1990s and early 00s was very different in a great many ways. But what I've been thinking about most recently is the way that communities tended to be somewhat diffuse, specific, and focused in a way that modern Social Media doesn't really allow for.

If you were interested in cars, there was a set of car forums you could participate in. Politics? Computers? Furry Porn? The communities were out there. They still are, and they are certainly much easier to find than they used to be, but there is one major difference. Today, those communities exist on Facebook, Google+, or on hashtags or lists on Twitter.

The problem with that is that it means that all of your communities that you participate in are tied to a single identity.

Merging identities for productivity is great (I still dislike that I can't easily merge my work calendar and my personal calendar free/busy data with reasonable privacy defaults). However, merging identity in social contexts is a very, very different problem.

We all present ourselves differently to different groups. The way we behave with close friends is different than with co-workers, or church groups, or our parents. For some people, this can actually be dangerous. Even if it's not dangerous for you (it's not for me), when everything you believe is basically a click away from every conversation you participate in, it means your communities are constantly bleeding into each other, which can be, if nothing else, exhausting.

If this had the side effect of breaking down echo chambers, maybe it would have been worth it. But clearly, it didn't do that.

In fact, I think it might have made it worse. When everything you might disagree with anyone on is always a half-step away, I suspect it causes people to be more defensive, and thus more likely to double-down on disagreements, and more inclined to apply purity tests. I think it drives people further into their echo chambers, because it becomes impossible to step out even for a minute, and there starts to be little incentive to do so.

And giving people space to step out of their echo chambers is the only proven way to get anyone to actually change their mind about anything.

Now, I know that right now, giving people that space has gotten really hard. The US Government has built Child Internment Camps in Texas. Nazis are marching in US streets and murdering protestors. Our President is openly fascist, and don't think the overt racists don't notice.

While I may question the tactics of harassing Mitch McConnell while he's out to dinner (it makes him look like a victim to some people who might otherwise be convinced), I understand why people are doing it.

And this post has gotten far more political than I intended when I started writing it. But that's almost the point. Social Media, the fact that it is broken, and very likely the ways in which it is broken, are very likely linked to where we're at politically today. Yes, there have always been Nazis in America, and Nazis are really skilled at exercising their power in an outsized manner. However, Social Media has proven an effective tool for disinformation (targeting both sides, as those spreading it are interested in the chaos). That disinformation has helped sow distrust that plays into the already polarizing nature of these services.

I'm not sure what the way to fix this is. Allowing people to manage multiple identities on these mega-services is difficult, as it can become hard for the user to keep track of the current context they're in (we've all sent chat messages to the wrong people before). With smaller, more focused services, we had better visual signals. Doing this at scale for the large services is likely unworkable. Try telling your visual designers they need to build something in a dozen different color schemes that users can switch between at will (and even just switching color schemes doesn't really solve the problem).

Giving community participants more freedom on how they present themselves to that community is an important step to allowing users to move between communities. People who are regularly moving between communities are exposed to more ideas, more different ways of viewing the world, more different kinds of people in general. That is healthy.

Online Communities in the 90s and 00s had plenty of problems. We pretty much all assumed that the people we were interacting with were cisgender, heterosexual white men between 16 and 25, and we rarely questioned that assumption unless someone made an issue of it. Unfortunately, particularly at the time, we seemed to usually be right, which had the side effect of pushing people who didn't match that list of characteristics to pretend they did in order to match the community norms. We, the Default People, defined those norms, after all. And trust me, the Default People like to complain when challenged. I've seen countless influential tech people hemorrhage followers when they start getting Political on Twitter.

For people who fall outside that default (a term I'm only using sarcastically, and am stopping using because the idea of a default is inherently limiting), I do think modern Social Media has made finding "their people" easier. Community discovery is just so much better than it used to be, and these sites make it easier to walk the social graph and build these communities. That has unquestionably done a lot of good for a lot of people, even if it sometimes forces those people to fully present themselves, even in contexts where it may not have mattered otherwise. This need to fully present oneself at all times may well have contributed to the many discussions around representation and opportunity for women, people of color, and the transgender participants in our broader communities.

While I believe it's got to be possible to allow people more freedom to limit how they present themselves in smaller, more focused communities that we can move between and discover with some ease (easily creating new isolated sub-identities to use within those communities), it is vitally important that we do so in a way that combats the idea of any group of people being default (unless, I suppose, the community is explicitly targeted based on demographics).

I actually think this has something to do with the rise of Slack and Discord, particularly among younger users. Discord servers tend to be somewhat focused, and while you can see the mutual servers you have in common with other users, you can't just see all the servers they belong to, giving you a lot more flexibility (though since you only have one identity across all servers, you can be incidentally linked to people across communities who may or may not respect any separation you'd prefer to keep between those communities). I suspect Slack is similar, but honestly, I've never really used it. I've barely used Discord.

There are so many confounding factors in whether things are better or worse, or how they are better or worse. Where Social Media, and how it's evolved over the last decade, precisely fits into both the positives and negatives is really hard to judge. Maybe some community isolation and mobility wouldn't help anything (and would add discoverability impediments that could actually harm those in need of smaller communities). The privilege of having belonged to the default group (at least for the English-speaking world online) absolutely taints my thinking, so I must acknowledge that my gut reaction may be entirely flawed.

Still, things can be better, and I am increasingly convinced that it will take a major shift in the way we, both the users and the platform owners of these large social media sites, think about how we want to build our communities moving forward. The status quo certainly doesn't seem to be working for anyone.

The last few weeks have been very long, but still very, very good. Almost exactly two months ago, I found myself with the opportunity to leave my employer of nearly four years, Washington State University. WSU has proven to be an excellent incubator for me over these last few years, but I know full well I had been ready to go for months, even prior to beginning my job search in earnest. At some point, I had begun to feel that the University had become an impediment to my further professional growth, and I increasingly found myself strongly disagreeing with the direction of the higher leadership at the institution, which seemed to be continually making decisions that I felt were neither sustainable nor fiscally responsible for a state-run institution.

While I was ready to move on, the decision to seek a new job was also strongly driven by a new opportunity my wife had created for herself. Due to an unfortunate situation with her advisor for her graduate program, she decided to stop pursuing a Ph.D. at WSU, and instead complete her Master's degree in Zoology and complete her Ph.D. elsewhere. Earlier this year, she was given an excellent opportunity at the University of Louisiana at Lafayette, working under Dr. Darryl Felder, a researcher focusing on decapod crustaceans. It is an amazing opportunity that will have us moving, six days after this post, to the heart of Cajun country.

As for me, I have spent the past six weeks or so working for Meebo. Meebo has been in a period of really aggressive growth over the first half of this year, with a half dozen people joining the front-end JavaScript team alone, including myself. It's been an exciting place to be, and though Meebo's engineering is based in Mountain View, California and New York City, I have been lucky enough to be brought into the company as one of the first full-time remote engineers.

Working remotely is definitely a change, though my experience in the open source community over the last decade or so, and especially on the YUI project over the last three or so years, had taught me a lot about working with people you communicate with primarily via e-mail and chat. Still, it has been an adjustment, as my desk is ten feet from my bed, and as my fellow YUI contributor and recent Meebody, Tony Pipkin (@apipkin), recently tweeted:

New office attire: basketball shorts and a plain white t

At Meebo, I have transitioned to being a pure JavaScript programmer. When I need a server-side component, I pass those tasks off to someone else, which is a bit awkward. I have to be a lot more proactive about making sure that my server-side counterpart is aware of my requirements early enough that they can be scheduled. And since I'm not in Mountain View, I need to communicate really clearly, with written specifications, because ambiguous language can result in the wrong thing being implemented.

I've been assigned to the Ads product at Meebo, which means that anywhere you go with the Meebo Bar, when the ad pops up, that's code I now own running. I have long been convinced that the best model we have at scale for monetizing content is ad sales (though it doesn't scale down to, say, the size of this blog), but there is an incredible amount of nuance to that business that I had no idea existed. Comments for another post, however.

In six days, Catherine and I will watch as everything we own gets loaded onto a truck, before we follow it out of town for a drive across the country with our two cats. The kind of change we're looking at has grown to be incredibly intimidating, even though it's exciting. I start work on the 25th, right after we get down there, and possibly a week before our possessions arrive (a 6-14 day delivery window is really inconvenient).

I'll be looking to get involved in a developer group down in Lafayette, and I'm looking forward to getting familiar with the area. And I definitely plan to start blogging regularly again come August.

Scott Berkun1 is a former Microsoft engineer who severed ties with the mothership and went full-time as a public speaker some years back. His book, Confessions of a Public Speaker2, is sort of his manifesto on how he felt comfortable making that change and how he feels he's found success.

Scott acknowledges that he feels comfortable giving away these secrets and hard-earned knowledge simply because he knows that most people will never do the necessary work to become a truly great speaker. I know that I'm not apt to do that work immediately myself. But I feel that presentation is important, and I often present at regional Code Camp events and for WSU's Application Developer's group, something I began doing in part because I was tired of going to events and sitting through talks that I felt had little value, either due to poor presentation, or just because I felt I knew more than the presenter.

More than that, I think that it's important to share information within our community. Between techniques and tools, we are certainly spoiled for choice, but without discussion and presentation, the majority of people developing software today have no chance to get exposed to any idea that isn't backed by a marketing budget (this is a notable problem in the .NET and Java communities, but that's another post).

If you've done anything with public speaking in the past (and odds are you've taken a class at some time that did something with public speaking), you've no doubt heard much of this advice before. Practice, prepare more content than you think you need (but be prepared to cut content on the fly if necessary), practice, learn to harness your nervous energy, practice, show up early, etc. However, Berkun goes into a depth on this material that I'm not sure I've ever seen before.

He debunks common myths, like "people would rather die than speak in public," by showing where such myths came from and the inherent ridiculousness in such statements. He presents many cases from his own career of things not going well at all, like giving a presentation at the same time as, and across the hall from, Linus Torvalds, whose crowd was overflowing into the hall while Berkun had fewer than a dozen people sitting in his enormous conference hall.

While Berkun does stress that preparation is the key to making any public speaking gig succeed, it's the flexibility to deal with surprises that makes the best speakers as good as they are. Quick thinking doesn't trump preparation, but it's necessary sometimes to avert disaster.

One bit of advice that I'd like to share is what to do in that circumstance where no one shows up to hear you speak. Berkun suggests getting the crowd to move and sit near one another so that you can at least pretend the space is smaller than it truly is, while also making it easier to engage directly with the audience, perhaps turning it more into an informal directed conversation than a full-blown presentation with slides.

It is clear that Berkun came from technology, and he still primarily speaks about tech. But despite his background, and the examples from his own career that reflect it, this is absolutely a public speaking book, and I think it's accessible to anyone who wants to improve their public speaking, even if they're not interested in turning it into a career.



This past weekend, President Barack Obama was able to announce1 to the world that Osama Bin Laden had been killed, ten and a half years after he took responsibility for the single worst day of attacks on American soil in our history.

I am not going to discuss the morality of killing Bin Laden. We are a nation which practices the death penalty, and whether the operation that finally caught up with Bin Laden captured him or killed him, he was going to die. At least this way, we will not be subjected to a mockery of a trial, as the world was given following the capture of Saddam Hussein2.

Some have made much of the fact that Bin Laden had been at the compound in northern Pakistan which was assaulted on Saturday for some time, certainly long enough that JSOC3, the special forces unit directly answerable to the President which was designed around this sort of mission, had time to build a replica of the compound and train in it for a full month. But I think there's a simple answer to why, in the end, this turned out to be relatively easy.

Osama Bin Laden was a great many things, but I do not believe he could ever have been categorized as a fool. He chose to attack in the manner he did in 2001 because he knew that in no way could Al-Qaeda stand up against the American Armed Forces. By taking principal responsibility for the attack, he put himself in our line of fire, began living on borrowed time, and prepared himself for martyrdom.

For a time, there was clearly value in remaining alive to release statements and further antagonize the West, but in order to be a martyr, it would eventually become necessary to die.

This is why I believe he had lived so long in northern Pakistan with a relatively small retinue of defenders. When dealing with a strike team like JSOC, five militants would stand about as well as twenty, but it would be easier to live, and live in relative comfort, with fewer. No doubt there were other activities in progress, training new lieutenants for instance. Plus, a martyr does not walk into death; they must wait for it to find them.

However, even if you don't believe, as I have come to, that this death was, in some way, prepared for by Bin Laden, there is no true victory in his death for us as a nation.

Ten years ago, we as a nation had certain harsh realities thrust upon us. It was demonstrated that we were not as insulated as we'd believed from the realities of global politics, or the terrible truth of the distrust and resentment created by our government's historic policy of convenient involvement in other nations' affairs. The world did not change that day, but Americans' view of our place in it certainly did.

And what do we have to show for it today? The death of an enemy whom 66% of Americans age 13-17 couldn't even identify (which, if anything, shows just how irrelevant Bin Laden had become). The legacy of the PATRIOT Act. The formation of the TSA. The erosion of our civil rights. Three Middle Eastern wars, two of which were justified as linked to those September 11, 2001 attacks.

I do not mean to downplay the actions of JSOC and the SEAL team responsible for Operation Geronimo. It was a well-executed military action, particularly given that the only loss suffered was a single helicopter, to mechanical failure. They executed their assigned mission, by all accounts, professionally and expertly.

I am not even terribly concerned with the lack of respect for the deceased shown by the decision to dump the body into the sea, even as attempts were being made to follow all other tenets of Islam surrounding the handling of the dead. The concern that Bin Laden's grave would become a place of pilgrimage for extremists was an understandable one, though the claim that they couldn't find anyone to take the body is absurd, given the size of Osama Bin Laden's family.

I do not wholly misunderstand the jubilation felt by many at the news, particularly in the city of New York. I myself was in New York City, standing on the roof of the World Trade Center, in July of 2001. When those towers fell, I was awash in a surreal feeling over that experience. But I didn't know anyone who lost their lives in the attack. I haven't watched the growing health problems of the first responders. And I certainly haven't lived with a daily reminder of the tragedy in the form of a damaged skyline and gaping hole at ground zero.

No, I do not begrudge the celebration, especially in New York.

And while this will no doubt drive many in Al-Qaeda to a new level of fervor, improved communication and analysis of intelligence, disruption of what command structure Al-Qaeda has, and a new level of proactiveness and willingness to respond among Americans have greatly reduced their chances at success. Combined with the real and reasonable security improvements, it would be untrue to say we have nothing to fear, but the risk is likely better mitigated today than at any time in our past.

I just can't help but think that, until we as a nation decide that we will not trade essential liberty for the illusion of security (and much of it has been illusion), that enemy has still won. Terrorism is not defeated by killing terrorists. It is defeated by refusing to be terrorized.