The Long Road

Here, there are only choices lingering upon a festering past, redolent upon those rightly reviled promises of the kinder virtues to which we all of us originally aspire, yet inevitably fall short.

Do I seem bitter? That would be the easy charge, but no, what you hear is rather the pomp of those less temperate horsemen, of ire and rage and the alienation and envy that are now both my own necessity and inevitable imperative.

– Wraith 2017

Without the weight of external factors, narratives unwind in their own time and for their own reasons.

Sometime in the early part of my college years, after a respectable amount of both introspection and consideration, I made the conscious decision to not pursue a career in writing fiction.

I was, mind you, pretty sure I could pull off the commercial thing if I chose to, but there were (and are) a lot of things competing for my time, and in the end I decided the upsides of that path were just less important to me than the ability to, unfettered, wield writing solely as a tool of expression and personal catharsis.

In 2001, I published Wight.

Dictionary.com has this to say about the word:

noun
1. a human being.
2. Obsolete. a supernatural being, as a witch or sprite.
3. Obsolete. any living being; a creature.

The title was, of course, a deliberate play on all of the above meanings; as a first-person work of dark fantasy, it featured a narrator able to raise the dead. In keeping with the inevitable cliché of an author’s first (well, close enough) narrative work, it was also a coming-of-age novel.

Totally an accident, I swear. I really didn’t mean it that way.

Someone once asked me if it was allegorically autobiographical, to which I could only say, “Meh.” The narrator definitely isn’t me; he is bolder than I was at that age, but also dumber. For the most part, the events do not parallel anything in my own life.

While Wight was not autobiographical (even allegorically), it was an accurate representation of the emotional scaffolding and conceptual context that was the self of the me of that time and place.

That’s not insignificant; in fact, I could argue that that is far more important than any literal accounting of facts and dates and tedious timelines.

Of course – as it had to – it ended on a cliffhanger. I hate cliffhangers.

Fast forward twenty years.

I started putting together the framework of a sequel that would (surprise) take place twenty years after that rather rude cliffhanger ending I had dropped on my (probably) five readers in 2001.

Instead of being an acceptable, albeit somewhat maudlin, coming-of-age narrative, Wraith would be about the emotional scaffolding and conceptual context of this period of my life: a meditation of sorts on the falling apart of things, on how one reconciles with one’s own past, on how one tries to find new paths in the debris that is the inevitable consequence of the cataclysm of personal history.

That, like its younger sibling, has taken a few years to mature (no editorial deadlines, remember?), which in some ways has proven to be a good thing, as the passing of years has matured the thoughts and introduced new bitter flavors to the soil of that story.

This story, this “Third Book of the Dead”, is outlined and significant chunks are written, and it feels like it is close to its time to emerge into the world.

Like all children, however, it will emerge when it chooses to and not a moment earlier or later.

(Unless one chooses a Cesarean birth, but that seems overly harsh in the absence of a financial editorial requirement, right?)

There’s even a third one lurking somewhere in there, chronologically in between the two works, that would be called Wer; whether it is ever written will depend on certain things happening and certain other things not happening.

That history, dear reader, must therefore bide, at least a while.

The Game Design of a Constructed Language

I waved as I approached the Damocles’ quartermaster. “Skusatsu mo. Sele ma pasu. Hildo spreku?”

She cocked her head quizzically. “Tak.” Thank god, I thought. At last someone who speaks Hilde. “Kep kereru te?” she asked.

“Last Chance boto kapitana yestu me. Iltre siotusa haba ma pasu. Yarog, te butu uno?”

She shrugged. “Probleme yeta.” She disappeared into the open airlock, pushing out a moment later with a small box she tossed with the experience of someone who had lived a very long time in low g.

“Danku.”

World building, whether for a book, a movie, or a game, is all about the feeling of immersion.

You actually don’t want to bury your audience in detail – a little goes a long way. That being said, the snippets you do show need to be internally consistent and make sense at an intuitive level; your audience usually won’t explicitly call you out on it, but they will definitely pick up on any lack of internal consistency.

Thus, at the end of the day, the paradox of immersion is that the most straightforward way to build an immersive world is to construct all the details and then show maybe 10% of them. The rest isn’t wasted, though; it lives in the implicit connective tissue of the world you are trying to build.

Constructed languages, sometimes called “conlangs”, have been put together for a number of different reasons, ranging from attempts to bring humanity together linguistically, like Volapük or Esperanto, to supporting the construction of fictional worlds, such as Star Trek’s Klingon language or J.R.R. Tolkien’s myriad languages of Middle-Earth.

Video games have gotten in on the action as well, and true to form, often in such a way as to slip beneath the radar of the player while at the same time deeply enriching the world – Skyrim’s dragon language, Dovahzul, being a prime example.

One of my several side projects is the development of a multiplayer logistics game called ORG, set in a late 25th century where humanity has finally recovered from an appalling population crash to colonize even the farthest reaches of our solar system. I wanted to do more than make a generic science-fiction environment; I wanted something recognizably unique, with enough depth to allow for endless permutations, philosophical exploration, and narrative opportunities.

First, I mapped out in detail the more than thirty distinct national polities that were to serve as the bedrock of the narrative milieu. In so doing, I also had the opportunity to explore the almost endless political and economic variations humanity has tried, or might someday choose to try.

These included everything from an entire enclosed culture and state based on the often exploitative Human Potential Movement of the ’60s counterculture, to a society built on clone slave labor, to societies where labor unions have triumphed, to religious fundamentalists building a society living in terror of the possibility of technological singularity, to fascinating exercises in pure democracy, to…well, you get the picture.

(I admit it, I had a lot of fun seeing how many dystopic variations I could come up with. No economic or social theory was safe from my attempt to exaggerate it until its own victories would become the seeds of its own failure.)

Second, I began to explore some of the philosophical implications of a civilization like this. For example, what are the implications for culture when the frontier is dominated by a very small percentage of the human population, but power is overwhelmingly maintained at home on Earth? What are the psychological and cultural implications for a society that exists in an artificially maintained environment? (Andy Weir’s novel The Martian and The Expanse series are both great fictional explorations of this.)

Finally, I got to the question of linguistics. Language is a peculiarly underappreciated part of social functioning. We’ve all heard of languages that have multiple words for love (Greek), of the differences between gendered languages like French or Italian, of languages without articles like “the” or “a” (Russian), and of languages with minimal or no concept of tense or time (Amondawa).

While the idea of language defining how we think is sometimes exaggerated to strawman-like proportions, the way a language develops does reflect the needs and foci of its speakers.

One of the unfortunate side effects of technological communication, transportation, and modern commercialization has been a rapid reduction in the diversity of languages spoken in the world today, a phenomenon called “language death”. As the world metaphorically gets smaller through improved communication, transportation, and monoculture, the isolation that causes languages to evolve vanishes, taking with it the losers in the memeplectic struggle for linguistic dominance.

But what happens once we (hopefully) escape the tyranny of our gravity well in colonizing force? The distances of space are vast, even when we are just talking about the distances between the Earth and the Moon – let alone the distances from Earth to Mars, or even more staggeringly, Mars to Jupiter, or the truly mind-numbing distances between the outer solar system’s gas giants and the dwarf planets of the Kuiper Belt. What happens when you are, really, all alone with just the people around you? How does the fragility of environmental security affect the way a language develops?

It’s not just the fact of isolation: language evolves to match the needs of its environment. Urban versus rural, arid versus wet, nomadic versus settled are all examples of the various axes that can influence and guide a language’s evolution.

All these factors, plus the expected improvement in the capabilities of translation software (which, I should note from personal experience, is definitely not quite there yet), do suggest a reversal, or at least a slowing, of the trend of language death we see today.

Add to this the existence of long space voyages by trained specialists who live in even more isolation and environmental insecurity than someone on a relatively safe colony on Mars or Callisto (setting aside, of course, the contemporaneously raging Martian civil war and Galilean conflict), and one can readily see how such a language could evolve.

So that’s what I started to do.

ORG’s Hilde, or “Spacer’s Cant”, is a classic example of what in linguistic circles is termed a pidgin, or in the authoritative words of Wikipedia, “a simplified version of a language that develops as a means of communication between two or more groups that do not have a language in common. It is most commonly employed in situations such as trade.” (As the years go on, of course, it is starting to become a creole, or “a stable natural language that has developed from a pidgin.”)

I started with the historical background. I had decided that the natal location of my pidgin was going to be the asteroid belt between Mars and Jupiter, specifically a polity centered around the asteroid Hilda in the fascinating Hildas Triangle.

In the milieu of ORG, the Collaborated Union of the Hildas Triangle was settled by a sort of successor state to the modern-day, real-world European Union, called simply Union. Unlike the European Union, however, Union grew out of Poland and Romania, with satellite states in the form of Germany and a sans-Siberia Russia.

This meant the linguistic building blocks would be dominated by the phonemes (the sounds) and vocabulary of these nations’ languages. But this was going to be a pidgin, remember, so I decided to make the grammar more like Esperanto – that is, incredibly flexible, very stripped down, and readily learned. There wasn’t going to be any gendering of nouns or even suffixed tenses. Word order was (almost) freeform. Years of learning French, Spanish, Japanese, and Portuguese in school came back to tell me which parts of a language make learning it insufferably harder than it has to be.

On top of all of this, because this was intended as a trade language as much as anything, I tried to remove sounds that are hard to distinguish for speakers of the major language groups surviving in ORG’s 25th century. (This last I wasn’t quite able to pull off as much as I wanted to, but it’s why several phonemes don’t exist in Hilde, including f, v, and w. Vowels, as well, stick to a basic five, with a notable absence of true diphthongs – where you mash together two vowels to create a new sound.)

Then came the vocabulary.

One of the reasons English has purportedly been so successful as a common language for aeronautical operations is that it is relatively direct, short, and staccato – qualities that are very desirable indeed when piloting a massive 747. Spaceflight, one can logically reason, would have similar needs. Hilde, therefore, prefers short words, generally framed on both ends by consonants.
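
To make those constraints concrete, here is a minimal sketch of the kind of throwaway word-shape generator one could use to audition candidate Hilde vocabulary. It is written in Python purely for illustration and is not a tool from the project; the consonant list is an assumption (the text above only pins down the exclusion of f, v, and w and the basic five vowels), drawn loosely from the sounds that appear in the dialogue at the top of this piece.

```python
import random

# Illustrative Hilde phoneme inventory. The exclusions (no f, v, or w;
# only five pure vowels, no diphthongs) come from the text above; the
# exact consonant list is an assumption for this sketch.
CONSONANTS = list("bdghklmnprsty")
VOWELS = list("aeiou")

def make_word(syllables=2, rng=random):
    """Generate a candidate Hilde-style word: short, CV-patterned,
    and framed on both ends by consonants."""
    word = rng.choice(CONSONANTS)
    for _ in range(syllables):
        word += rng.choice(VOWELS) + rng.choice(CONSONANTS)
    return word

if __name__ == "__main__":
    random.seed(2475)  # arbitrary seed so the output is repeatable
    print(", ".join(make_word(random.choice([1, 2])) for _ in range(10)))
```

Most of what such a generator spits out gets thrown away, of course; its only job is to keep candidate words inside the phonotactic fence while a human picks the ones that sound right.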

The vocabulary, as well, reflects the needs of a spacer. There are a lot of words describing things like venting atmosphere (“atmosu”), docking clamps (“kleme”), velocity (“predkoso”), attitude thrusters (“aridiste”), and radiation exposure levels (“radso”).
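
On the bookkeeping side, even a lexicon this specialized starts life as little more than a gloss table. The sketch below (again Python, again a hypothetical convenience rather than an actual project tool) holds just the handful of spacer terms mentioned above, plus “danku”, whose meaning is inferred from the opening dialogue.

```python
# A tiny slice of a Hilde gloss table, limited to the words cited above.
# "danku" is inferred from the opening dialogue; a real working lexicon
# would be far larger.
HILDE_GLOSSES = {
    "atmosu": "atmosphere (as in venting atmosphere)",
    "kleme": "docking clamp",
    "predkoso": "velocity",
    "aridiste": "attitude thruster",
    "radso": "radiation exposure level",
    "danku": "thank you",
}

def gloss(word: str) -> str:
    """Look up the English gloss for a Hilde word."""
    return HILDE_GLOSSES.get(word.lower(), "<unglossed>")

print(gloss("kleme"))     # -> docking clamp
print(gloss("predkoso"))  # -> velocity
```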

In the end, constructed languages are immersive not only in the bold, obvious ways they appear on a page or in a game, but also in secondary effects, where misunderstandings have narrative implications or the language’s constituent pieces indirectly imply history.

In support of the narrative dictum “show, don’t tell”, a constructed language works by implication as much as it does by definition, and can be a useful tool for pulling the reader, viewer, or gamer out of their own context and into a new one.

The Terrible Game Design of the American Jury System

A few years back I received that dubious American honor of being selected for a second-degree murder jury trial. I came away with an absolutely great selection of quotes noted down for posterity, as well as a lingering distaste for prosecutorial abuse of power, but perhaps the most interesting thing I took from the experience was a sense of how the jury process compares to game design.

Specifically, very, very, very bad game design.

At its core, game design involves the creation of certain rules and structure intended to induce the player to accomplish a specified goal in an effective manner.

The game’s design rules are structured based on a set of assumptions as to the range of possible actions and motivations of the player; if these assumptions are incorrect, both the process (the gameplay) and the end result (satisfaction and a sense of accomplishment) are going to be in jeopardy.

At its core, the United States jury system also involves the creation of certain rules and structure intended to induce the juror to accomplish a specified goal (a fair verdict) in an effective manner (the jury process).

Lawyer: “About how many times had you ridden in a car before?”
Witness: “A lot?”
Lawyer: “I’ll accept that.”

The first game system the jury system indulges in – and, obviously, I am using the term “game” in the sense of game theory – is the actual jury selection.

For those not familiar with the American system of jury selection, while there are variations from jurisdiction to jurisdiction, the way it roughly works when a jury needs to be called up is like this:

  1. Your name is selected at random from the lists of voter registration and (sometimes) Department of Motor Vehicle records.
  2. You are then on-call for a certain period of time. Huge numbers of potential jurors are called up, so if they successfully get their 12 jurors plus alternates, anyone left over gets a pass for the next year.
  3. As potential jurors come closer to the front of the line, they will be physically called into a waiting room. Often, after a few hours of just sitting there, the jury is selected and again, they dismiss everyone remaining.
  4. When you get to the head of that line, you are called in a big batch into the courtroom.
  5. Potential jurors are called up and questioned by the lawyers and (sometimes) the judge. A prospective juror can be dismissed for cause (you loudly proclaim that you are a horrible racist and could not possibly be objective) or as part of a limited number of peremptory challenges each lawyer has.
  6. If you pass all these questions, congratulations, you are seated on a jury.
  7. (Unless the two sides settle before the case even concludes, in which case you sat there to no purpose other than to serve as a pawn in a game of Lawyer Chicken.)

So let’s take a look at this from the perspective of principles of good game design.

For a jury trial, the goal is to provide the “deciders of fact” (the jurors) the necessary information to make a reasoned choice. As in a video game, the input – that is, the information you have at your disposal – can be relied upon to heavily influence the final decision – that is, the “player” action.

The first step in achieving this is to find jurors who are (1) representative of the community, and (2) sufficiently unbiased that they could theoretically decide either way.

And here’s where the jury system design starts to spring leaks. Any final jury seated is in no way representative – the way the system is rigged, it can’t be:

  • Have a dependent and don’t have alternative care for them? You’re out.
  • Don’t vote? Yeah, you probably won’t even be called up in the first place.
  • Possess any subject-area knowledge relevant to the case? Yep, you’re out, because you actually…have experience in the area? What? So a judge isn’t disqualified from a case on patent law because they know patent law, but a medical professional would be disqualified from a jury because they know something about medicine. (Yes, I understand the theory behind this. It’s a fairy-land theory suitable only for works of fiction. Bad fiction, at that.)
  • Demonstrate any strong opinion about anything during the juror interview? Out. Can’t have that.
  • Willing to lie to get out of it? Out. (Probably)
  • Can’t financially support the time off work? Out as well, though in fairness, you really have to be a hard-luck case for this to fly. Still, it means that being poor can essentially by itself exempt you. The “pay” they offer is laughable and wouldn’t support a bad panhandling mime.
  • Lawyer for either side suspects your class, profession, race, gender, or age makes you statistically more likely to vote in a way they won’t like? Gone.

So much for any hope of “representative”. Failing grade on the first leg of the game design. Er, jury design.
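
To put the game-design framing in concrete terms, here’s a toy simulation of that selection funnel. Every number in it is invented purely for illustration – none of these are real statistics – but the structural point survives any plausible choice of numbers: when the exclusion filters correlate with who you are, the seated jury stops resembling the pool it was drawn from.

```python
import random

# A toy model of the jury selection funnel. All rates below are made up
# for illustration; they are NOT real statistics about real juries.
rng = random.Random(42)

def draw_citizen():
    """A member of the community, with two traits the filters interact with."""
    return {
        "hourly_worker": rng.random() < 0.40,   # assumed prevalence
        "sole_caregiver": rng.random() < 0.25,  # assumed prevalence
    }

def is_seated(citizen):
    """Apply (a crude version of) the exclusion filters listed above."""
    if citizen["sole_caregiver"] and rng.random() < 0.8:  # no alternative care
        return False
    if citizen["hourly_worker"] and rng.random() < 0.6:   # can't afford time off
        return False
    return rng.random() > 0.3  # everything else (voir dire, peremptories) lumped together

community = [draw_citizen() for _ in range(100_000)]
seated = [c for c in community if is_seated(c)]

for trait in ("hourly_worker", "sole_caregiver"):
    before = sum(c[trait] for c in community) / len(community)
    after = sum(c[trait] for c in seated) / len(seated)
    print(f"{trait}: {before:.1%} of the community, {after:.1%} of those seated")
```

Tweak the made-up rates however you like; as long as they correlate with real attributes of real people, “representative” goes out the window.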

What about the “sufficiently unbiased” criterion? Well, there’s a time-honored method for this: They ask you if you can be unbiased.

There’s a problem with this, though. It doesn’t work. As in, a double-blind study analyzing this very thing demonstrated that a potential juror’s response on this is literally as reliable as flipping a coin.

(In fairness, it could be worse; for similar studies about competence and confidence the finding was that for most of the curve, the more confidence someone evinced, the less competent they were statistically likely to be.)

In game design terms, this is like putting a player through an extensive dialog tree where the player has to respond in variable ways…and at the end of which, the responses are tossed and a completely random result is selected instead.

Lawyer: “Sir, what do you do for a living?”
Me: “I’m a video game designer.”
Lawyer: “So, what kind of games do you work on?”
Me: “The same kind of game as World of Warcraft.”
Judge: “I play that. But my character is much better looking than me.”

Even worse, because of the lawyers’ peremptory challenges – those dismissals by the lawyers where they do not need to show cause – the jury is going to be representative of how well the respective lawyers play chess.

(Now, there are some jurisdictions where peremptory challenges have been significantly curtailed, but these remain a minority of jurisdictions.)

So, having thoroughly “addressed” the problem of seating an impartial jury, what about the actual process?

The jury is generally recognized as the “finders of fact”, as opposed to the judge, who is the “finder of law”. Meaning, the jury is responsible for determining what happened and which witnesses are credible, while the judge is responsible for managing the process – what evidence can be shown, what questions can be asked, and so on.

While there are (rare) exceptions to this, generally a jury can’t ask questions of witnesses. Can’t ask for professional legal clarification of the law from the judge. Can’t do their own research (though of course, the judge can). And, most insane of all, can only operate within the set of charges determined by the prosecutor.

That last doesn’t sound that unreasonable, right?

But it is. Closed-door deals, prosecutorial political ambitions, and the like effectively give some people get-out-of-jail-free cards no matter their degree of culpability and, worse yet, can subject the system to a game of what I call “Lawyer Chicken”, where each side tries to bluff the other into folding rather than subject the case to the “whims” of a jury.

Witness: “I heard a noise.”
Lawyer: “What kind of noise?”
Witness: “Boom.”

In the case of the trial I was on, the charge was second-degree murder in what was, at heart, a case of vehicular manslaughter. The jury did not have the authority to say, “This doesn’t meet the level of second-degree murder, but it does meet the level of vehicular manslaughter.”

Nope. The prosecutor wanted to appear tough on crime, so literally gave the jury the choice of giving a guy who had hugely fucked up, but still due to terrible judgment rather than malice, decades in prison…or letting him completely off the hook.

The prosecutor knew no sane individual would want to let the guy off the hook completely, but she also knew that most juries would go for the charge that, well, fit the actual letter of the law. So she took it off the table to offer the jury a devil’s choice.

(As a side note, I found out later the judge could have overruled this, but as a matter of process, this is apparently rarely done.)

In game design terms, the jury trial system is effectively cheating the player (the juror) by giving them insufficient information and inadequate control over the inputs of the system to make a reasonable choice. The result is not only bad decisions (way more of a problem in a criminal trial than in a game), but also the risk of nurturing a sense of disillusionment with the trial system itself.

Finally, there is what I call the system of “Institutionalized Fictional Perjuries”.

These are the fictions in the form of code phrases that everyone in the room knows are complete and utter nonsense, but are perpetuated in order to fit certain legal fantasies.

For example, in the case of this trial, not surprisingly a number of cops were brought up and conversationally asked, “How did the defendant appear?” Every. Single. Answer…was identical. And, not surprisingly, matched precisely the legal definition of the signs of drunkenness.

(Here I feel obligated to note, yeah, it was obvious the guy was plastered, but the fiction of pretending the officers were actually giving their honest impressions in their own words when their words were identical and matched the prosecutor’s needs was…cloying.)

Another example in much the same vein: Legally, you can’t refer to written material that has not been subjected to the rigorous demands of evidence. A reasonable point of process, but one that is completely abused when, again, Every. Single. Witness….would say the code phrase, “If I might refer to my notes to see if that refreshes my memory?” Right. That’s what you’re doing. Uh-huh.

(I swear, if they had let us bring alcohol in, we could have made a brilliant drinking game out of this. One drink for every time the prosecutor came up with an excuse to show us the picture of the body. One drink every time someone had to “refer to their notes”. One drink every time an officer “in their own words” gave the precise legal definition of being drunk. Lots more where those came from, too.)

Lawyer: “Do you know what a red light means?”
Witness: “Um, stop?”

One final example – the linguistic abuse of the word “hypothetical”. Again, because of obvious legal requirements, you aren’t supposed to say things you can’t technically know – like whether this particular person could have X number of drinks over Y number of hours before being drunk.

Apparently, however, you can do exactly that, as long as you omit the person’s name but put the word “hypothetically” in front of it. That is, “Hypothetically, if a 182 pound man who was 5’9″ tall and 34 years old with brown eyes and no beard were to…” – that’s apparently legit.

For me, the supreme irony in this comedy of process was that the entire trial was a question of recognizing consequences – the crux of a second-degree murder charge – and yet we, as the jury, were forbidden to consider the consequences of our own decision: 20 years to life, or letting him go free, with no proper jury agency to choose anything in between.

In game design, the key is to give the player real choices with comprehensible inputs and real consequences. When this link is broken, players become frustrated and feel like the game is cheating them. In much the same way, the jury system fails to align its choices and inputs with the consequences, leaving the result subject more to whim and court politics than to the intrinsic promise of justice the system is supposed to deliver.

Oh, one last thing. To answer the inevitable question of what happened at the trial:

After two weeks of deliberation, it ended in a mistrial – basically the game industry equivalent of spending $10 million on a game only to cancel it a week before launch.

Apropos, I must say.

The Law of Unintended Consequences

In the process of researching the Virginia man’s quest to claim unclaimed land on the Egyptian-Sudanese border to make his daughter a princess, I ran across (not for the first time, granted) the Outer Space Treaty of 1967.

Basically, it holds that no celestial body can be appropriated or claimed by any individual, corporate body, or national entity, and that any development of such bodies must be spread for the benefit of all humanity.

Lofty, ideologically warm and fuzzy, completely theoretical in 1967 – and utterly, destructively idiotic for the long-term practical development of spaceflight.

A 1950s artist’s impression of a moon base. x-ray delta one

Here’s a quote from The Conversation by Steven Freeland, a professor of International Law at the University of Western Sydney. It sprains my brain trying to figure out how someone could honestly fail to see that this principle accomplishes exactly the opposite of the intended goal of promoting the development of humanity off this tiny-ish blue marble we are all on:

“[T]his principle of non-appropriation helps to protect outer space from the possibility of conflict driven by territorial or colonisation-driven ambitions. In this regard, it is a necessary requirement to promote the exploration and use of outer space for peaceful purposes.”

So, explain to me why anyone would spend any money, time, or effort whatsoever to develop spaceflight if they subsequently…can’t actually control the fruits of such development?

I mean, look, I am the first person to be skeptical of national political machinations, corporate malfeasance, and general douchebaggery as perfected by homo sapiens. I trust the principle of corporate beneficence about as far as I trust the principle of car salesmanship – an unapologetic booster of unfettered capitalism I am not.

That being said, this is no longer a theoretical proposition or a problem solely for a public policy student’s dissertation.

There are currently three active, theoretically feasible plans for mining asteroids, and this braindead treaty has actually thrown those plans into legal limbo, since it effectively bars such ventures.

Want to see permanent settlement of Mars? The Moon? Europa? Yeah, by international law that is actually illegal, because the second a nation, individual, or corporation uses local resources – well, water, silicates, whatever – it is in violation of the treaty.

To be sure, there are several potential methods of getting around this, such as operating under the flag of one of the nations in grey or yellow on the map below – those who have neither signed nor ratified the treaty.

[Map: signatories, ratifications, and non-parties to the Outer Space Treaty]

Alternatively, one could see a deliberate act of abandoning association with any signatory of the treaty – hello there, brand new Sovereign Republic of Ceres! Another possibility is simply that people ignore the treaty as unenforceable when it comes to that. We’ll see eventually, one way or the other.

Right now, this ill-considered treaty does nothing but discourage private development of spaceflight, introducing unnecessary uncertainty and unproductive roadblocks into a venture that ultimately benefits our entire species.

In other words, exactly the opposite of what the treaty was supposed to do.

Can You Own Culture?

Recently, I was faced with an interesting dilemma: a strongly worded request not to share photos of an ancient site, on the grounds that members of a group laying cultural claim to it also wished to reserve claim to the use of knowledge about it.

Now, I do understand where this is coming from. It is undeniably true that small populations and cultures under threat of being subsumed by other populations and cultures can feel under siege, and in that state there is a temptation to entrench and monopolize claim to the identifiable elements of one’s culture.

Ultimately, however, I believe this represents a profound misunderstanding of what culture is, how it is formed, how it changes, and the individual’s relation to it.

cul·ture [kuhl-cher]
noun
1. A particular form or stage of civilization, as that of a certain nation or period: Greek culture.
2. The behaviors and beliefs characteristic of a particular social, ethnic, or age group: the youth culture; the drug culture.

(There are, of course, other definitions of culture, but I want to be clear that this is the aspect of the term I am focusing on in this post.)

In fact, I would go further and propose the following, more memetically-specific definition of culture: “A culture is a set of prevalent memes found amongst a set of individuals who identify themselves as a group.”

Culture is, thus, nothing more and nothing less than a set of ideas, techniques, aesthetics, and styles held in common by a self-defined grouping of individuals. Cultures do not form Athena-like from Zeus’ head – they are born by blending with or fissioning from other cultures, as well as evolving to better match their environment.

In much the same way as parallel evolution happens biologically, so too do disparate cultures come up with very similar, even identical solutions, particularly when their respective challenges are similar. In other words, just because two different cultures have a similarity doesn’t mean one copied that aspect from the other. It might be, but it just as easily might not be. Original evolutions, moreover, are far less common than copied adaptations – witness the speed at which various art styles or technologies have been repeatedly spread between various populations.

How does this relate to the original question?

A cultural group may legitimately lay claim to a location and the access and direct usage of that location. In appropriate circumstances, control of commercial exploitation of associated specific images and iconography may also be justified.

A cultural group may not, however, lay claim to the knowledge, form, aesthetic or shape of cultural elements, whether this be a location, a practice, or an aesthetic style.

Yes, this means that people will sometimes copy or adapt cultural elements in a way that some will find offensive or disrespectful. At the end of the day, however, cultural elements are fundamentally memes and ideas. They will mutate, they will evolve, they will see usage of both a profound and profane nature.

Someone may say, “We claim/built this temple/church/sacred site and reserve the right to control access to it.”

Someone may not say, “We claim this symbol/idea/concept and reserve the right to control access to it.”

Ideas cannot be subject to monopolization, but are the birthright of everyone; to maintain otherwise is to deny our individual and collective right to learn, grow, adapt, create art, develop philosophy and construct out of the building blocks of today the aspirations of our tomorrows.

Sensitivity, respect, and courtesy are all things that should be striven for and held up as examples of ethical behavior. Unfortunately, such behavior has frequently not been much in evidence throughout the history of anthropological research, not to mention the myriad less academic cultural contacts over the centuries.

Regardless, the best thing one can do for the expression of culture is to recognize it as a living, breathing, evolving creature, and not some strange kind of immutable memetic fossil. Embracing growth, adaptation, and change is as necessary for cultural health as it is for biological health.

The Scourge of Hyperabstraction and Politicization of the Video Game Industry

There have been rumblings for years, but this last year has seen a blitzkrieg of media coverage focusing primarily on a few particular accusations:

  • Video games have a causative, not just a correlative effect on real world violence.
  • Studios are brutal places requiring their employees to forgo any semblance of a normal life.
    • Corollary: Working in the industry will turn you into a basement-dwelling misogynistic troll. Assuming you weren’t one to begin with.
  • Video games are inextricably misogynistic.
  • Video game studios are inextricably misogynistic.

Before I go further, let me be absolutely clear about my position:

  • Yes, there are studios that are indeed ruthless and unprincipled in their predatory abuse of their employees.
  • Yes, there are studios with ingrained subcultures of misogyny.
  • Yes, marketing departments do indeed tend to focus on the historically proven markets for video games, which are, in fact, young and male.

This being said, the media frenzy has lately gone off the deep end, and is now doing more harm than good in its witch hunt against the industry. Moreover, as with all witch hunts, this one has so unnerved its targets that it has become easier to stay quiet than to point out the sloppiness of the media assault.

Personal History
I have been at studios that did conduct themselves in each of the ways accused.

During my tenure in QA, for a time I worked twenty-hour days, six days a week, to the point that I wound up in the emergency room with viral pneumonia. Did I “have” to work the “optional” overtime? No, but it was well known that those who did not “volunteer” would not have their contracts renewed, so if you had any sense at all, you gritted your teeth and did what you had to do to make sure you could afford rent at the end of the month.

Similarly, was there a “Boys’ Club” atmosphere? Certainly, at least to some extent. As a guy, I am quite sure far more happened than I ever personally witnessed, but even there it was not a case of absolutely every guy being a misogynistic asshole or tolerating misogynistic behavior, which is, in fact, the too-frequent inference.

I have also been at more studios that do not conduct themselves in the ways accused.

Both Cryptic Studios, my last place of employment, and Gazillion Studios, my current place of employment, have staffs whose ages probably average somewhere in the mid-30s, meaning there are some 20-somethings, but there are also a notable number of 40-somethings. Both of these studios do sometimes have crunch periods, but they are, by comparison, mild – maybe a day on the weekend and a couple of extra hours in the evening for the month or two before a critical deadline. Rough, yes, but nothing all that different from any number of other industries. This represents a vast improvement over former practices in the industry, and should be acknowledged as such. Both make a focused, if sometimes imperfect, effort to offer a reasonable work-life balance.

Misogyny and the Dearth of Women Developers
Are there misogynistic individuals in even the best studios? From both personal experience and conversations with female friends in the industry, there absolutely are. Where said individuals happen to also be in positions of power, this will of course create considerably worse situations. However, it is a gross hyperabstraction to extend this to a claim that an entire studio is, as a whole, necessarily fundamentally misogynistic, which is the common inference.

One of the claims that particularly irritates me is that game studios are somehow solely at fault for there not being more female developers. This claim generally appears to rest on three related assertions:

  1. First, that developers don’t want to hire women.
  2. Second, that development studios are so misogynistic that no woman would ever want to work there.
  3. Third, that the games studios make aren’t the kinds of games women would want to play, ergo not the kinds of games women would want to develop.

Let’s get the third claim out of the way first. Yes, of course marketing departments are going to focus on demographics that are proven rather than hypothetical. The graveyard of studios who chased after imaginary markets is vast, and marketing departments know that. Are the rewards great for those who succeed in embracing new markets? Absolutely. The problem is, the chance of actually pulling that kind of coup off is very small. In other words, bad business. Moreover, this claim is really only valid for certain types of games – the player base of first-person shooters, for example, is indeed overwhelmingly male. MMORPGs such as World of Warcraft, simulators such as SimCity, and the myriad badly-termed “casual” social games do not share that vast over-representation.

(I should also note that there are, in fact, women who do play these games, but they are still a significant minority. This is not meant to indicate in any way that women don’t enjoy such games – simply that as it is now, they are in fact not the dominant demographic.)

The second claim – that all studios are fundamentally, pervasively, and structurally misogynistic – is simply crap. There are studios that fit this claim. There are also individuals, even individuals with power, in otherwise benign studios who are misogynistic. Neither of these realities is reasonable cause to tar and feather an entire industry.

The first claim I can speak to from personal experience, at least on the MMORPG side of things. Over the ten or so years I have been in the industry, I have been a hiring manager for probably close to half of that. I have gone through – literally – hundreds of resumes, sometimes for a single position. The cold hard fact is that the number of women who apply even for design jobs – let alone programming jobs – is depressingly small. As in, I can literally count on the fingers of one hand the number of women who have even applied to a job I was filling.

(Notably, as I recall, three offers over those years went out to women; two were accepted. This, out of probably a dozen separate positions over all the years I have been a manager.)

Possible Solutions
Now, this does raise a legitimate question: Why do so few women want to get into game development?

Some of it, to be sure, is a legacy of the early years of video games. Moreover, since the industry thrives on the relative certainty of sequels, new, untested ideas are rare, and so relics tend to persist far beyond the historical time when they made sense.

Some, as well, is a perception of hostility or outright misogynistic behavior. Some women undoubtedly do not want to go into an environment they suspect would be hostile to them, whether or not such is actually the case.

Personally, I suspect this is more than anything else an unfortunate function of basic sociodynamics. That is, standing where we are in time, you or I see our own experience over, say, five or even ten years and easily become frustrated at what we see as a lack of change or evolution, whether we are talking about a society or, as in this case, an industry. Moreover, we misread this perception as a failure, as if things “should” change faster.

The reality, of course, is that societies evolve and change excruciatingly slowly from the perspective of individuals, mostly due to the generational timescale and the “people like to hire people who are like themselves” effect, which, while applicable to gender and race, is just as applicable to personality, educational background, geographical region, and myriad other factors.

Is this frustrating? You bet. Is this one of the root causes of injustice and nepotistic practices? Without a doubt. Fixing it, however, is not something that can be done by waving a magic wand, whether in the form of regulation or of wishful thinking. It can be done, but only by fair and persistent cultural pressure over the course of decades.

There are things wrong in the industry. Even so, in just the ten years I have been in it there have been significant strides. Are we, as an industry, where we should be in these areas? No, but that’s okay, so long as we don’t get discouraged and continue to do what we can to improve the status quo as individuals and as studios.

What is not helpful, however, is the currently popular machine-gun approach of condemning every studio and every developer as an equal participant in unfairness, misogyny, or other injustice.

If we, as a society, want to see this – or any other – industry improve, we are best served by, yes, condemning those studios and those individuals who act badly, but also by holding up those studios and individuals who make an effort to behave fairly.

Moreover, the current presumption of guilt in the absence of proof to the contrary is not only itself unjust, but it is strategically foolish; it makes enemies of those who might otherwise be allies in this effort. Lack of action is not necessarily complicity; lack of action is, to be sure, not to be lauded, but neither should it be condemned the way we condemn the actual individuals who behave badly.

The Secrets of Introverts

We live in a world of extroverts.

For better or worse, some three-quarters of the population are what is commonly classified as extroverts, which means every introvert is inherently living in a world built by and defined by extroverts.

What is an extrovert?

An extrovert is a person who is energized by being around other people. This is the opposite of an introvert who is energized by being alone.  Being an introvert doesn’t mean you don’t like people, but it does mean you will quietly go completely insane if you are forced to interact with other people – especially unfamiliar people – for extended periods of time with no option for respite.

There are some great tracts on the web about introversion, written in a game attempt to explain introverts to what is a majority population of extroverts, but most of these essays miss some points that I feel are important.

Introversion is not a disease
Being an introvert is not a psychological condition that needs to be fixed.  It is not a disease, nor is it a personality deficiency. Introversion is, as well, not in fact purely social.  In the same way that introverts prefer a small group of close friends to a large network of acquaintances, introverts tend to specialize rather than generalize and prefer depth over breadth.

Note that this is not the same thing as being shy; shyness is a characteristic of social anxiety, which in extreme cases is in fact a social disorder in that it prevents the individual from effective social functioning.  Most introverts are not, in fact, shy, and are perfectly capable of interacting at a social level; they simply do not seek it out as their primary focus.

In other words, trying to “help” an introvert by “breaking them out of their shell” or “conquering their solitude” is – perhaps inadvertently – condescending in that it implies there is something wrong with not being gregarious.

Texting is a mark of respect
Mainstream society grants considerable privilege to bold, aggressive communication, and in fact, in certain areas such as sales – whether as a matter of commerce or simply interpersonal communication – this can be very effective.

Direct, aggressive communication has its strengths, but it also has weaknesses that are not as commonly acknowledged.  It can be invasive, pushy, rude, and can inadvertently strong-arm people into actions and verbal agreements they might come to regret.  In other words, it is good for the quick sale, but bad for the long game.

Where an extrovert will prize face-to-face or at least voice-to-voice communication – often because it is easier to force a response or action faster, and sometimes simply because talking is how many extroverts work through their own thinking – introverts will often value indirect communication.  In days past this meant snail mail, then email and instant messaging, and most recently texting.

I have heard it all: texting is a mark of casualness, of informality.  Texting is lazy.  I have even heard texting compared to cowardice.

For introverts, texting allows the recipient to respond when it is convenient for the recipient, not the sender.  It allows the recipient time to collect their thoughts and consider the best response.  It is, in other words, a mark of respect that values the substance of the communication over the speed of the communication, and respects the recipient’s time.

None of this is to say there is not a time and place for face to face communication, for phone calls and in-person talking.  Obviously, there is, and to state otherwise would be silly.  For matters that require discussion or debate, or communication between intimates, nothing else will substitute, but these are rarer situations than is sometimes imagined, and for everything else there is an array of options broader than is often assumed.

Voluntary social interaction is significant
For an extrovert, social interaction is like breathing; it is the medium through which relationships, likes and dislikes, status, and innumerable other social measures are gauged.  An extrovert may gush happily with an arch-enemy, or make enthusiastic promises of future interaction which they actually have little likelihood of genuinely following through on.  The extrovert isn’t lying – they just assume that everyone is using the same dictionary.

For an introvert, social interaction is like an embrace; it is a significant action with particular meaning.  An introvert is more likely to avoid interacting with someone they dislike, and far less likely to pretend amiability.  Where an extrovert may simply be flirting for fun, an introvert will tend to flirt with deliberate intention, even if the manner is playful.

In other words, if an introvert makes the specific, voluntary, non-task-oriented action to talk to someone, invite someone to something, or join a social activity, it most often implies a significant level of intent.  The intent may simply be an extension of friendship, or an attempt to deliberately network, or to pay off a perceived social debt, but it is rarely happenstance.  It is a signal, and if the signal is ignored, many introverts will simply shrug and walk away, assuming their offer to have been rebuffed.

Introverts are not insensitive
In fact, it is just the opposite.  Some interesting scientific studies have demonstrated that introverts feel more intensely, on average, than extroverts.  Their introversion and distancing are, in other words, mechanisms for controlling an intensity of sensory and emotional input that might otherwise become overwhelming.

On the flip side, this sensitivity means that introverts often pick up subtleties that others miss.  As well, introverts tend to be more cautious and aware of the risk side of any equation, whereas extroverts tend to respond more readily to the reward side.  Fortune can in fact favor the bold, but it also tends to kill them off a lot faster.

Creativity does not arise from teamwork
Recent research has overwhelmingly demonstrated that despite claims that open office environments encourage teamwork and the free exchange of ideas, their real value lies in being cheaper.  Training and creativity have been repeatedly and rigorously tied to solitary practice and individual effort.

Again, this does not mean that teamwork, brainstorming, and the exchange of ideas do not have a place; they absolutely do.  What it means, however, is that these things are most effective when they are confined to particular points along the creative process and not saturated throughout it…like soaking french fries in mayonnaise.

The Utility of Art

This evening I happened to read a journal entry questioning the utility of art. After all, what are paintings and sculpture and poetry compared to the material necessities of life?

I would suggest that in the trenches of survival, nothing remains a sustained force for long if there is not some usefulness, some value in it beyond the relief of boredom. Art is no different.

Societies are, at their heart, engines for organizing people toward the ends that people always tend toward. Cultures, I would propose, are the bonds and raw materials a society uses to convince us hard-bitten individuals and skeptics that unified purpose has, well, a purpose for us as individuals.

So where does that leave art?

Art is a building block of culture. It is the shared expressions, the shared histories and common contexts that shape our way of thinking, our priorities, our values. What decides whether aggression or diplomacy is valued? What determines whether war or education is prioritized? What shapes whether marriage is defined in terms of love or in terms of economics?

Of course, each of us has our own opinions and beliefs, but these opinions and beliefs do not arise in a vacuum. They arise out of the common context of our lives. The subtle messages that live in the romance novels, fantasy novels, and action movies we share as a common culture nudge us, as a culture, toward certain assumptions and a certain shared context.

This is the utility of art. This is the great task that the writers, the sculptors, the singers, and the musicians serve. They create this common context for us; they create our culture, giving our society the ability to mobilize the people, the ideas, and the resources to accomplish the ancient goal of survival.

Art is not a distraction from the business of survival and the goal of material success. It is vital to them, whether we admit it or not.

Stories

There is a quote that haunts me: The things we tell another are not the experience, but the story of the experience.

Stories are far more than just words on a page. Stories are the reality we react to, the reality we base our sense of worth and accomplishment upon, and the reality that we are remembered for.

Certainly, factual reality matters. If I strike someone in the face, no story will change the fact of their bloody nose. However, the story of what happened – no matter how divergent from the factual reality – will be what others react to. The understanding of why that punch was thrown will provide the justification and the reasoning that follows from that action.

Long after the fact is dead and in the ground, the story remains. It remains, too, not as something frozen in time, but as a living, breathing thing. Stories grow, change, adapt, and breathe in our minds until they take upon themselves a far greater importance than that original, long-forgotten fact.

Our lives unwind upon the stories we tell ourselves and the stories we tell each other. When we look upon the ruins and the monuments of our actions, there is always a part of ourselves – deeply buried – that remembers the truths at the center of the stories we have woven like veils about ourselves.

We lock the crypts of our souls with words like chains to bind our eyes to our own secret truths. Sometimes our stories are the things that set us free; sometimes they are the very things that destroy everything we had thought to save.

Expedience

At some point along the path to maturity, most of us internalize a set of standards and beliefs as to what is ethical and what is unethical behavior. Tacitly or explicitly, we profess that certain behaviors are simply wrong. Sometimes this is the result of careful consideration, but as often it is a belief that emerges from the chrysalis of time tempered by experience.

These are the goals of behavior we aspire to, the exhibition of which in others wins our praise, and the failure of which in others earns our condemnation.

In times of stress, uncertainty, or internal conflict, we waver. We see the light of our own definition of honorable behavior, and we flinch from it. Some of us overcome our temptations and grasp that light. In so doing we do not banish those temptations, but we take responsibility for our behavior and anneal ourselves into something better than we were.

Some fail, some fall. Failure to adhere to our own definitions of ethical behavior is probably inevitable for all of us at some point, so what matters is what we choose to do with those failures. Rarely, we see those failures for what they are, we learn from them, we adapt and steel ourselves so that we do not repeat our own mistakes.

More often we justify, we rationalize, we make excuses for how what happened was not our fault, how anybody would have done the same in our place, how it “just happened”, how we were somehow not responsible for those parts for which we might fairly claim responsibility.

Sometimes, too, we push aside our self-defined standards of ethical behavior when it is hard, when circumstances would lead to events turning against us were we to hold to our ideals. We lie, we compromise our ideals, we reinvent the event to shine a kinder light on our actions.

The codes of behavior we aspire to are not there for the easy times. They do not exist for the times when the adherence to them will bring us advantage. Our self-imposed standards are there for the hard times, when the adherence to them might bring us ruin in one form or another.

All of us at some point are faced with the choice of whether to stand by our ideals and maintain our self-respect, or to give in to expedience and convenience. We may excuse to ourselves the injuries we do to others; others may even accept those excuses, but in the middle of the night, when the world is still and sleep itself is still a dream, we cannot escape the truths of ourselves.