Mad, Beautiful Ideas

There has been a lot of buzz about the idea of Tuition-Free Public Universities this year, as this idea has worked its way into what is almost certainly going to be the DNC platform for the election in 2020. I get that. While I have been fortunate enough to have had a job good enough to pay off not only my own, but also my wife's, student loans by my early 30s, a lot of people are drowning in debt, and either un- or under-employed.

I am incredibly glad that this discussion appears to be focused on Public Universities, as that at least suggests that if this comes to pass, it won't just be a way to enrich low-performing private colleges (or Real Estate funds masquerading as colleges), at the expense of students.

I think that approaching the problem of the rising cost of a university education by moving tuition onto the taxpayer is premature, because we have problems in our Education system nationally that such a program is likely to make much worse, and in so doing, make many of our biggest national challenges even harder.

To begin, I'd like to take a look at one of the nations with free college that people often point to as an example in these discussions: Germany. The German model does work reasonably well in Germany, that much is true, though there have been concerns for years that the system is unsustainable. In fact, several German states attempted to institute nominal tuition fees in the mid-2000s, though they had to roll that back due to massive pushback. Which is simply to say that Germany's system has some scaling issues that they haven't worked out yet, and we need to be aware of that.

While Germany is a much smaller nation than ours, we have a higher GDP per capita, which does mean that, assuming we can build a taxation structure that stops 500 people from 'earning' $1,200,000,000,000 in a single year, we could theoretically make such a thing feasible.

So why do I oppose this? I believe that any system of free tuition will make income inequality, particularly as it applies to students of color, worse. If we don't address the inherent inequities in our primary and secondary schooling systems, we stand to do more harm than good.

Our Public Schools are principally funded by property taxes. Black Neighborhoods tend to have lower property values, even after you control for basically everything (homes are comparable, neighborhoods have the same amenities, crime rates, etc.). On average nationally, this amounts to around $48,000 less in value per home. Because, generally speaking, White People don't want to live in "Black" neighborhoods. Even relatively affluent, middle class black neighborhoods.

As a result of this, even when you compare relatively poor neighborhoods, which have frequently self-selected for race (or been helped along by red-lining), poor white schools have, on average, $1,500 more per student than poor black schools. For non-poor (but still segregated due to neighborhood dynamics) schools, that figure nears $2,000 per student. More funding translates to better resources and smaller class sizes, all things that lead to better educational outcomes.

Failure to address this basic inequity means that Black students remain at a disadvantage when entering Universities, because their primary and secondary education likely wasn't as good. Pair this with the likely outcome that already overstressed public universities will need to raise their admissions standards.

There are a lot of factors that play into the rise of tuition costs. Certainly, ready availability of loans is part of it. One piece that I don't think gets enough attention is the increase in student populations, which all require extra resources above what state funding levels (which have been steadily decreasing) provide. In the 1970s, only about 47% of the population went to college. Today, that number is well over 80%, though that is down from a peak of over 90% in 2011.

Since the 1970s, the US Population has also increased by ~100 Million people, putting the number of people who've attended tertiary education at ~94 Million then, versus 240 Million now. According to Washington State University's Institutional Research figures, even as overall student participation rates have been dropping nationwide, WSU has seen a roughly 15% increase in its enrollment numbers, while tuition rates have dropped about 8% for in-state students, and risen 2% for out-of-state students (I also have no idea how primarily state-funded schools are intended to navigate this free-tuition mandate with regard to in-state versus out-of-state students, but I don't think anyone else does either). It's also worth noting that, on a longer time scale, WSU's population since 2001 (when I entered college), has increased 37%, while its in-state tuition rates have increased by ~270%.

Free tuition, by removing a primary means of controlling the cost of growth, will likely lead to higher admissions standards, which will only widen the gap that poor people already struggle to overcome, and which impacts people of color in the US even more than it does Whites. Until we are more equitable in our primary and secondary education, our tertiary education systems will only serve to widen an already too-wide gap. The tools we use to gauge students have already been demonstrated to be gameable and disadvantageous to poor or non-white students; this will only get worse as minimum entry standards rise.

A university education is too expensive. This is unquestionable. It comes from many factors, from increased demand due to larger student populations (and the idea, pushed hard on high school students, that you can't succeed without a college degree), to inefficiencies in administration, to reductions in state-level support in many places. Efforts to increase student access to a college education have instead largely increased the debt load for students who aren't seeing the increased opportunity that was implied when they were encouraged to take on that debt. More open access to loans has contributed to higher costs, though I tend to think that is more a factor of increased utilization than institutional greed, outside of the for-profit colleges.

Returning for a moment to Germany, it's worth noting that many students in that country choose (or are pushed toward) secondary schools, when they enter the fifth grade, that indicate whether or not they are expected to be on a path toward university. While German students have, since the 1970s, been migrating from the more general Hauptschulen to the more advanced Realschulen, over half of the students in Germany tended to attend the Hauptschulen for a minimal general education, with barely over 10% of students in the university-bound Gymnasien by 2000. In the decades since, the university population has continued to grow, but it still looks likely that the percentage of German students who end up in Universities is less than 20% of the population. Compare this to the roughly 70% of American High School graduates who were enrolled in college in 2018.

I can't speak to the quality of German Hauptschulen or Realschulen compared to US High Schools. However, the idea that Americans would accept that we were going to offer free tuition to all, but were going to cap enrollment at 20% of the High School Graduate population instead of the 70% we have today, is laughable. And it runs counter to what many people would expect, though it's the reality of the system people most frequently point to when discussing this issue.

Prioritizing education is critical to the future of our nation. However, we have to shore up the base of our system. Make it more evenly distributed. Ensure that the opportunities we provide are based more on merit and ability than on the confounding factors of the circumstances of birth and generational wealth. We can't solve our education problems from the top down; we need to start from the bottom up, ensuring that everyone has access to the same opportunities and controlling for race and other factors. Otherwise, we stand to more deeply entrench our problems.


I want to call out a great thread from Michael Harriot about America's history of White Supremacy that is absolutely worth a read. While I'd been skeptical of free tuition for many of the reasons written in this post prior to Michael's thread, he called out several resources, particularly about the racial gap in American schools, that I believe improved my arguments and showed that those funding differences are even starker than I'd thought.

I was reminded recently of the controversy around Vampire: The Masquerade's fifth edition, released last year, from the early drafts referencing Neo-Nazis when discussing Clan Brujah to, even bigger, the use of the Gay Purges in Chechnya as a plot point for Camarilla activity in the country.

This reminder came in the form of my recent reading of Over The Edge, a role-playing game set in an alternate history where magic, psychic powers, and other weird phenomena exist, but aren't widely known. Like Vampire, this world has a long history, and that requires reframing certain events in ways that are likely to offend some. For Over The Edge, this passage stood out to me immediately:

In the meantime, the Pharaohs had arranged the discovery and colonization of the New World, seeing to it that religious misfits, debtors, and desperate adventurers came to populate the northern continent. The Pharaohs arranged the slaughter of the natives so as to have a land without history where they would have maximum power to experiment. (The scheme to mix New and Old World cultures in South America failed miserably.) — Over The Edge (2nd Edition), Jonathan Tweet, 1997

I haven't fully read the 2019 Third Edition of Over the Edge, but a quick skim does suggest that the above hasn't really been retconned, though it definitely doesn't get the attention it got in the Second Edition.

Now, OTE Second Edition was released in 1997, and the atrocities it reframes were mostly centuries old (though obviously there are far more recent issues with the treatment of indigenous peoples in the Americas deep into the 20th Century, with lesser evils continuing to this day), so perhaps this never generated so much attention because of where we were culturally, or due to the distance from the events being portrayed.

However, there are other historical events in Vampire that are similarly reframed. The Spanish Inquisition is framed as a mass revolt of Humanity, using the power of the Church, against Vampires. The various factions of the World of Darkness all played a part in World War II, though the text is quick to point out that those events were initiated from purely human causes, thus demonstrating some degree of sensitivity about too aggressively re-purposing historical atrocities within the context of the story.

Finally, I am reminded of the NBC TV Show Grimm, which revealed in the very first Season that Hitler himself was Wesen (a type of human with an animalistic nature that most humans are unable to perceive).

Screengrabs from NBC's Grimm of Nazi Leader Adolf Hitler in Human and monstrous Wesen forms.

By Season 5, Grimm goes so far as to literally describe Hitler as trying to create a world ruled by Wesen, something which paints a weird view of the world, particularly given the show is also dabbling in a shadow government of "Royal Families" which regularly works in both the Human and Wesen worlds.

Grimm's decision here is particularly interesting, because the writers, partly from not having a lot of time (due to TV narrative constraints), but also potentially from not really wanting to, end up leaving big questions about the implications of some of these decisions unanswered. Why did Wesen Hitler target Jews? Who knows; it's never discussed. Trying to justify it would absolutely be risky. Not mentioning it at all doesn't feel like a better answer.

These issues aren't unique to stories set in our own world with a few things changed. All fiction, in order to be accessible to the reader and to have a sense of familiarity, needs to borrow from our shared culture, which means all fiction can be viewed through a historical lens.

Which is important. Fiction allows us to investigate the past, to develop empathy by examining other people's experiences and emotions through their stories. Role-playing takes that a step further by encouraging us to embody these characters, and games like Vampire or Monsterhearts, which force us as players to deal with the tension between monstrosity and humanity, can lead to some deeply compelling stories.

I suspect the fallout that the Vampire developers received came from a few sources. First, the atrocities published in the book were (and I think still are) ongoing. Second, Vampire has long drawn a larger proportion of Queer players than much of the hobby (particularly considering how the hobby was when Vampire was first published in 1991). These two factors likely made this feel like a more direct attack on that community from a product it loves. It didn't help that the story element seemed to suggest that Humans weren't behind this very real atrocity, particularly when the authors have emphatically insisted certain other atrocities weren't perpetrated primarily by the monstrous forces of the World of Darkness.

I understand White Wolf's decision to remove the references in the Camarilla book to the Chechen Anti-Gay Purges. It was the correct decision. Certainly in light of the backlash, but also because the inclusion of the detail was a mistake. Not only because it is so fresh, but because it did push an evil humanity was committing against itself onto the monsters of that world. Vampire may be a game about Monsters, but it's mostly a game about Humans struggling with the monster inside them. While for the Vampires that monster may be very real, some people's monsters are more figurative, and games like Vampire need to be careful to not imply that all human evil is derived from a supernatural source.

And I think, when it comes to alternate histories, that's the key thing. To never forget the awful things people do to each other, or excuse them away by implying that the people who do these things are something other than human.

They may be flawed. They may lack empathy. They may do monstrous things, but they are, at the end of the day, still Human.

No man is an island, entire of itself; every man is a piece of a continent, a part of the main. — Devotions Upon Emergent Occasions, John Donne, 1624

In the last few years in particular, it seems that there has been increased discussion about the topic of Billionaires. Specifically, discussion about whether or not there should be any. Many have weighed in on the matter, from actual Billionaires like Bill Gates and Mark Cuban, to Anarchic Socialists who dream of a world without money.

In the middle, there seems to be a growing number of people whose proposals range from a return to pre-1980s Top Federal Tax Rates, as Representative Alexandria Ocasio-Cortez suggested on CNN back in January, and Senator Elizabeth Warren's far more ambitious 2% tax on all fortunes over $50M, to the various "supply-side" economics proposals that have been floated since the 1970s and that nearly destroyed the State of Kansas a few short years ago.

The center- and right-leaning objections to these proposals have been simple: "People worked for that wealth, and they shouldn't be denied it." There is some truth to that position: People did work for that money, and often they took on significant risk to earn it. However, this argument simply ignores several important details.

Let's take Bill Gates, and his story, as an example, though it's worth noting that the core points here are basically true of every Billionaire.

First, Gates didn't build Microsoft (the largest source of his fortune) alone. Excluding the Nokia acquisition in 2014, by the time Gates largely resigned from Microsoft that year, Microsoft had around 100,000 employees. Given natural retirement and the general churn of employees between jobs in tech these days, it's probably a conservative estimate to say that Microsoft has had well over a million employees between its founding in 1975 and Gates's departure in 2014. Yet only 3 people became Billionaires off of Microsoft (Gates, Allen, Ballmer).

Yes, Microsoft has produced tens of thousands of Millionaires (12,000 by 2005, and that number is likely quite a bit higher by now), though only that earlier cohort is likely to include many who earned more than $10M, often largely due to pre-IPO stock grants and Microsoft's strong market performance over the decades.

Did Bill Gates warrant more pay than the average Microsoft employee? Sure; he had more responsibility and had taken the risk of founding the company. But he didn't just get more, he got orders of magnitude more. Three to five more orders of magnitude.

Second, we talk about Billionaires like they came from nothing. Bill Gates's father was a prominent Seattle lawyer, and his mother served on the boards of several banks. He attended a very expensive private school. He was at Harvard when he dropped out to found Microsoft. He grew up in substantial privilege, and he could afford to take the risk on a company like Microsoft in 1975 because if it failed, he knew he wasn't going to starve. Hell, Jeff Bezos of Amazon got his parents to put up $250,000 in initial funding, and it's never been intimated that he was seriously risking his parents' future in taking that money.

Knowing you have an out if things go really badly, even if you tell yourself you'd never ask your parents for more money, sits there in the back of your head. It allows you to take risks you might not otherwise take, because there is an out.

So, I do understand why there is some balking at a high tax rate for earnings over $10,000,000 a year. Even successful entrepreneurs will typically see those earnings hit in a single big event (company sale, IPO, etc.), and as a result company sales would be a lot less lucrative. I mean, it's still at least a $10M payday, but we're still talking about a massive change in the potential payouts for entrepreneurs. Though anyone who wouldn't found a company because they would only make ~$10M is, frankly, kind of an asshole.

This would be a huge shift in the way America thinks about wealth and the role wealth has in a greater society.

We need to be rethinking this though.

I have worked for Google for 6 years. Between base salary, stock, bonuses, and other benefits, I earn somewhere north of $300,000 a year. Somewhat to my surprise, this actually puts me a bit above the median salary at Google. It also puts me and my wife above the 97th percentile of wage earners in the US. However, the economic distance from where we are now to where we were back when I was a civil servant making around 1/5 of what I make now is far, far less than the distance to the earnings of the super rich.

Even Sundar Pichai, Google's CEO, whose base salary is only $650,000 (though that doesn't include his $1.2M security budget), has been awarded literally hundreds of millions of dollars in stock over the last few years. So much so that he's actually started turning down newer stock grants.

The result of all of this has been the intimation that a large number of entrepreneurs will leave the US if these things become policy. Maybe that's true. However, the things that these programs could pay for, like Medicare For All, also stand to remove impediments for a lot of people to start their own businesses. Plenty of people have medical problems that require fairly high maintenance costs, enough that having their health insurance tied to their employment makes starting a business a massive risk.

Does the likelihood that Medicare for All may free a ton of people to start businesses offset the loss of a relatively few businesses that may grow quite large? Probably not directly. But it's worth noting that tax laws are changing internationally to make it a lot harder for companies to move around their earnings in order to avoid taxes, as much of Tech has done with Ireland over the years.

However, at the end of the day, the numbers that have been thrown about as the targets for these higher tax rates are high enough that they will never impact 99.9% of the population. I also don't believe that many entrepreneurs would choose not to start a business because they'd "only" stand to earn around $10M.

These proposals don't seek to deny people rewards for success. For taking risks. They simply seek to acknowledge the fact that we live in a society, that we do not succeed or fail purely on our own, but that there are an enormous number of factors involved in every success. If people are really intent on preventing this wealth from being taken by the government in taxes, maybe they'll start paying better wages, so we might finally see real wages increase again for the first time since before I was born.

PS: Did you know Bill Gates Sr. co-wrote a book on why we should tax the super rich more?

I've been thinking a lot lately about Social Media.

It's probably worth noting that in the more than seven years since I last posted to this blog, I have spent six of those years with Google, and specifically spent two of those years working on Google+. I actually like Google+ a lot, and am disappointed, though sadly not surprised, at the recent decision to shut down the consumer-focused Google+ access.

This post isn't really about Google+, though. While it's true that Google+ is not immune to what I'm talking about here, my thinking on Social Media today applies pretty evenly to every Social Media site I've ever used (and that dates back to SixDegrees.com in the late 90s).

My feelings on this are also deeply influenced by what the Internet was when I was growing up. The Internet of the late 1990s and early 00s was very different in a great many ways. But what I've been thinking about most recently is the way that communities tended to be somewhat diffuse, specific, and focused in a way that modern Social Media doesn't really allow for.

If you were interested in cars, there was a set of car forums you could participate in. Politics? Computers? Furry Porn? The communities were out there. They still are, and they are certainly much easier to find than they used to be, but there is one major difference. Today, those communities exist on Facebook, Google+, or on hashtags or lists on Twitter.

The problem with that is that all of the communities you participate in are tied to a single identity.

Merging identities for productivity is great (I still dislike that I can't easily merge my work calendar and my personal calendar free/busy data with reasonable privacy defaults). However, merging identity in social contexts is a very, very different problem.

We all present ourselves differently to different groups. The way we behave with close friends is different than with co-workers, or church groups, or our parents. For some people, this can actually be dangerous. Even if it's not dangerous for you (it's not for me), when everything you believe is basically a click away from every conversation you participate in, it means your communities are constantly bleeding into each other, which can be, if nothing else, exhausting.

If this had the side effect of breaking down echo chambers, maybe it would have been worth it. But clearly, it didn't do that.

In fact, I think it might have made it worse. When everything you might disagree with anyone on is always a half-step away, I suspect it causes people to be more defensive, and thus more likely to double-down on disagreements, and more inclined to apply purity tests. I think it drives people further into their echo chambers, because it becomes impossible to step out even for a minute, and there starts to be little incentive to do so.

And giving people space to step out of their echo chambers is the only proven way to get anyone to actually change their mind about anything.

Now, I know that right now, giving people that space has gotten really hard. The US Government has built Child Internment Camps in Texas. Nazis are marching in US streets and murdering protestors. Our President is openly fascist, and don't think the overt racists don't notice.

While I may question the tactics of harassing Mitch McConnell while he's out to dinner (it makes him look like a victim to some people who might otherwise be convinced), I understand why people are doing it.

And this post has gotten far more political than I intended when I started writing it. But that's almost the point. Social Media, the fact that it is broken, and very likely the ways in which it is broken, are very likely linked to where we're at politically today. Yes, there have always been Nazis in America, and Nazis are really skilled at exercising their power in an outsized manner. However, Social Media has proven an effective tool for disinformation (which targets both sides, as its purveyors are interested in the chaos). That disinformation has helped sow distrust that plays into the already polarizing nature of these services.

I'm not sure what the way to fix this is. Allowing people to manage multiple identities on these mega-services is difficult, as it can become hard for the user to keep track of the current context they're in (we've all sent chat messages to the wrong people before). With smaller, more focused services, we had better visual signals. Doing this at scale for the large services is likely unworkable. You try telling your visual designers they need to build something in a dozen different color schemes that users can switch between at will (and even just switching color schemes doesn't really solve the problem).

Giving community participants more freedom on how they present themselves to that community is an important step to allowing users to move between communities. People who are regularly moving between communities are exposed to more ideas, more different ways of viewing the world, more different kinds of people in general. That is healthy.

Online Communities in the 90s and 00s had plenty of problems. We pretty much all assumed that the people we were interacting with were cisgendered, heterosexual white men between 16 and 25, and rarely questioned that assumption unless someone made an issue of it. Unfortunately, particularly at the time, we seemed to usually be right, which had the side effect of pushing people who didn't match that list of characteristics to pretend they did in order to match the community norms. We, the Default People, defined those norms after all. And trust me, the Default People like to complain when challenged. I've seen countless influential tech people hemorrhage followers when they start getting Political on Twitter.

For people who fall outside that default (a term I'm only using sarcastically, and am stopping using because the idea of a default is inherently limiting), I do think modern Social Media has made finding "their people" easier. Community discovery is just so much better than it used to be, and these sites make walking the social graph easier, helping people build these communities more easily. That has unquestionably done a lot of good for a lot of people, even if it sometimes forces those people to fully present themselves, even in contexts where it may not have mattered otherwise. This need to fully present oneself at all times may well have contributed to the many discussions around representation and opportunity presented to women, people of color, and the transgender participants in our broader communities.

While I believe it must be possible to allow people more freedom to limit how they present themselves to smaller, more focused communities that we can move between and discover with some ease (easily creating new isolated sub-identities we can use within these more focused communities), it is vitally important that we do so in a way that combats the idea of any group of people being the default (unless, I suppose, the community is explicitly targeted based on demographics).

I actually think this has something to do with the rise of Slack and Discord, particularly among younger users. Discord servers tend to be somewhat focused, and while you can see the mutual servers you have in common with other users, you can't just see all the servers they belong to, giving you a lot more flexibility (though since you only have one identity across all servers, you can be incidentally linked to people across communities who may or may not respect any separation you'd prefer to keep between those communities). I suspect Slack is similar, but honestly, I've never really used it. I've barely used Discord.

There are so many confounding factors present in whether things are better or worse, or how they are better or worse. Where Social Media, and how it's evolved over the last decade, precisely fits into both the positives and negatives is really hard to judge. Maybe some community isolation and mobility wouldn't help anything (and would add discoverability impediments that could actually harm those in need of smaller communities). The privilege of having belonged to the default group (at least for the English-speaking world online) absolutely taints my thinking, so I must acknowledge that my gut reaction may be entirely flawed.

Still, things can be better, and I am increasingly convinced that it will take a major shift in the way we, both the users and the platform owners of these large social media sites, think about how we want to build our communities moving forward. The status quo certainly doesn't seem to be working for anyone.

The last few weeks have been very long, but still very, very good. Almost exactly two months ago, I found myself with the opportunity to leave my employer of nearly four years, Washington State University. WSU has proven to be an excellent incubator for me over these last few years, but I know full well I had been ready to go for months, even prior to beginning my job search in earnest. At some point, I had begun to feel that the University had become an impediment to my further professional growth, and I increasingly found myself strongly disagreeing with the direction of the higher leadership at the institution, which seemed to be continually making decisions that I felt were neither sustainable nor fiscally responsible for a state-run institution.

While I was ready to move on, the decision to seek a new job was also strongly driven by a new opportunity my wife had created for herself. Due to an unfortunate situation with her advisor for her graduate program, she decided to stop pursuing a Ph.D. at WSU, instead finishing her Master's degree in Zoology and completing her Ph.D. elsewhere. Earlier this year, she was given an excellent opportunity at the University of Louisiana at Lafayette, working under Dr. Darryl Felder, a researcher focusing on decapod crustaceans. It is an amazing opportunity that will have us moving, six days after this post, to the heart of Cajun country.

As for me, I have spent the past six weeks or so working for Meebo. Meebo has found itself in a period of really aggressive growth over the first half of this year, with a half dozen people joining the front-end JavaScript team alone this year, including myself. It's been an exciting place to be, and though Meebo's engineering is based in Mountain View, California and New York City, I have been lucky enough to be brought into the company as one of the first full-time remote engineers.

Working remote is definitely a change, though my experience in the open source community over the last decade or so, and especially on the YUI project over the last three or so years, had taught me a lot about working with people you communicate with primarily via e-mail and chat. Still, it has been an adjustment, as my desk is ten feet from my bed, and as my fellow YUI contributor and recent Meebody, Tony Pipkin (@apipkin) recently tweeted:

New office attire: basketball shorts and a plain white t

At Meebo, I have transitioned to being a pure JavaScript programmer. When I need a server-side component, I pass those tasks off to someone else, which is a bit awkward. I have to be a lot more proactive about making sure that my server-side counterpart is aware of my requirements early enough that they can be scheduled, and since I'm not in Mountain View, I need to communicate really clearly and with written specifications, because ambiguous language can result in the wrong thing being implemented.

I've been assigned to the Ads product at Meebo, which means that anywhere you go with the Meebo Bar, when the ad pops up, that's code I now own running. Advertising is a nuanced business, and I have long been convinced that the best model we have at scale for monetizing content is ad sales (it doesn't scale down to, say, the size of this blog, however), though there is an incredible amount of nuance to that business that I had no idea existed. Comments for another post, however.

In six days, Catherine and I will watch as everything we own gets loaded onto a truck, before we follow that truck out of town for a drive across the country with our two cats. The kind of change that we're looking at has grown to be incredibly intimidating, even though it's exciting. I start work on the 25th, right after we get down there (and, incidentally, possibly a week before our possessions arrive; a 6-14 day delivery window is really inconvenient).

I'll be looking to get involved in a developer group down in Lafayette, and I'm looking forward to getting familiar with the area. And I definitely plan to start blogging regularly again come August.