I. “Prom Week”

When I was in school, I attended a lecture given by Michael Mateas, who talked a bit about video games involving social interaction. Mateas was already well known for Façade, and was working on a new game called “Prom Week”.

“Prom Week” was developed to explore the gap that I’ve touched on (channeling Bogost) while discussing The Sims and Grand Theft Auto: the missing “social” aspect that the player is expected to infer.

Prom Week is a next-generation social simulation game driven by a novel artificial intelligence engine. The goal of Prom Week is to make social interactions truly playable. While games have increasingly gotten better at physical simulation, social interactions in games still tend to be scripted, using dialogue trees or other static structures to represent in-game choice. As a result, games are more often about combat or physics-based behaviors (which are easier to make playable) since these simulations are the only part of the system dynamic enough for interesting gameplay.

Prom Week uses the AI system Comme il Faut (CiF) to enable rich, emergent storylines by letting players use “social physics.” Just as games like Angry Birds support emergent solutions to physical challenges, Prom Week’s underlying simulation of social considerations (containing over 5,000 rules of social norms and behaviors) allows for emergent solutions to social challenges, like getting geeky Simon a prom date or convincing Buzz to give Monica a second chance. And unlike games like The Sims which use abstractions such as nonsense words or icons to represent language, Prom Week features actual English dialogue, and characters with detailed histories, likes, dislikes, permanent traits, and temporary statuses—all of which can be leveraged by the player to produce just the right prom night story.

The system underlying the game could be used for other purposes.

“In the case of Prom Week, we’re figuring out how to build computational models of social interaction, allowing players to explore the social consequences of their actions,” said Dr. Michael Mateas, director of the Center for Games and Playable Media at UC Santa Cruz. “Once you’ve developed a new capability like this, it can be used for all kinds of purposes, including education and training.”

It’s the training aspects that might be of most interest to those who, like Coburn, apparently don’t think the public is getting its money’s worth out of the Prom Week grant. Mateas says Prom Week’s social modeling technology has since been adapted for use in a scenario-based simulation used to help soldiers and police “cope in unfamiliar cultural contexts and [help] them to get their jobs done in a way that minimizes escalation and loss of life.” Another project is using the Prom Week technology to help prevent bullying among middle school children.


II. Algorithmic Prison

What a great phrase for the relative ease with which computerization allows the bureaucratic nightmare to flourish. Recall, again, the apprehension of the ’60s about computerization: the fear of being reduced to a punch card, lost in a big machine.

Some can no longer get loans or cash checks. Others are being offered only usurious credit-card interest rates. Many have trouble finding employment because of their Internet profiles. Others may have trouble purchasing property, life, and automobile insurance because of algorithmic predictions. Algorithms may select some people for government audits, while leaving others to find themselves undergoing gratuitous and degrading airport screening.


Algorithms also constrain our lives in virtual space. They determine what products we will be exposed to. They analyze our interests and play an active role in selecting the things we see when we go to a particular website.

Eli Pariser argues in The Filter Bubble, “You click on a link, which signals your interest in something, which means you are more likely to see articles about that topic,” and then “you become trapped in a loop.” The danger is that you emerge with a very distorted view of the world.


Algorithmic prisons are not new. Even before the Internet, credit reporting and rating agencies were a power in our economy. Fitch’s, Moody’s, and Standard & Poor’s have been rating business credit for decades. Equifax, the oldest credit rating agency, was founded in 1899.

When algorithms get it right (and in general they do a pretty good job), they provide extremely valuable services to the economy. They make our lives safer. They make it easier to find the products and services we want. Amazon constantly alerts me to books it correctly predicts I will want to read. They increase the efficiency of businesses.


Most of us would not be concerned if 10 or 100 times too many people ended up on the TSA’s enhanced airport screening list as long as an airplane hijacking was avoided. In times when jobs are scarce and applicants many, most employers would opt for tighter algorithmic screening. There are lots of candidates to hire and more harm may be done by hiring a bad apple than by missing a potentially good new employee. And avoiding bad loans is key to the success of banks. Missing out on a few good ones in return for avoiding a big loss is a decent trade-off.
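The asymmetry above can be put in back-of-the-envelope terms: if false positives (flagging an innocent person) are cheap for the institution and false negatives (a missed bad actor) are expensive, aggressive screening wins on the institution’s books. All rates and costs below are invented for illustration.

```python
# Hypothetical expected-cost comparison for an institution choosing between
# lenient and strict algorithmic screening. The rates and dollar costs are
# made up; only the asymmetry between them matters for the argument.

def expected_cost(fp_rate, fn_rate, fp_cost, fn_cost):
    """Expected cost per applicant, as seen by the institution (not the applicant)."""
    return fp_rate * fp_cost + fn_rate * fn_cost

# Lenient screen: few innocents flagged, more bad actors slip through.
lenient = expected_cost(fp_rate=0.01, fn_rate=0.10, fp_cost=100, fn_cost=10_000)
# Strict screen: ten times as many innocents flagged, far fewer misses.
strict = expected_cost(fp_rate=0.10, fn_rate=0.01, fp_cost=100, fn_cost=10_000)

print(lenient, strict)  # 1001.0 vs 110.0: strict screening is far cheaper for the bank
```

Note whose costs appear in the formula: the innocent applicant’s cost of being wrongly flagged is not in it at all, which is precisely why the algorithmic prison fills up.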

But we’ve reached the point where, in many cases, private companies and public institutions stand to gain more than they will lose if a lot of innocent people end up in algorithmic prison.


Even if an algorithmic prisoner knows he is in a prison, he may not know who his jailer is. Is he unable to get a loan because of a corrupted file at Experian or Equifax? Or could it be TransUnion? His bank could even have its own algorithms to determine a consumer’s creditworthiness. Just think of the needle-in-a-haystack effort consumers must undertake if they are forced to investigate dozens of consumer-reporting companies, looking for the one that threw them behind algorithmic bars. Now imagine a future that contains hundreds of such companies.

A prisoner might not have any idea as to what type of behavior got him sentenced to a jail term. Is he on an enhanced screening list at an airport because of a trip he made to an unstable country, a post on his Facebook page, or a phone call to a friend who has a suspected terrorist friend?

Finally, how does one get his name off an enhanced screening list or correct a credit report? Each case is different. The appeal and pardon process may be very difficult—if there is one.



III. Social Physics

I’ve decided to go ahead with Sandy Pentland’s Social Physics, and I’ll share notes here. My last major exposure to his perspective was a lecture he gave at Carnegie Mellon when I was there, where he touched on his thesis from Honest Signals (which I have not read). The talk was actually about a system called “Reality Mining.” I expect that, contra some of the reviews I’ve read, Pentland will carefully navigate the obvious “Smart City”-style problems in Social Physics. It would be pretty disappointing to see him slip into those ruts.

Another book in the wings: Sunstein’s new(ish) book, which I just purchased on a whim. It had a terrible score on Amazon, because conspiracies. I expect to enjoy it, as long as it doesn’t retread too much of his older material. (In my opinion, “Simpler” had a little bit too much of “Nudge” in it.)