In economics there’s something called the Edgeworth box, which attempts to illustrate how two people with different endowments and preferences might trade with each other.

I’ve always maintained that a similar construct could be used to illustrate how two people with slightly different idea/word-meaning values might communicate (i.e. trade and barter meanings).

This theory neatly explains how spectacularly I fail to communicate with programmers.  It’s because I completely lack any meanings or ideas that they want to trade in.

In the game industry of Seattle, everyone integrates easily through the same geek-chic humor we’ve had since the early 1990s, but programmers quickly separate themselves from the non-linear thinking of non-programmers.  When they saw us data wonks floundering because our managers kept asking for different slices of the data, five programmers slammed the same sticky note onto the Scrum board: Automate finding patterns in the data.  At that point, I had nothing of value to add to the conversation.

My abstractions about player values and the equilibrium of the game’s economy elicited the same sympathetic disdain they show artists debating the color schemes of a game.  They quickly disappear into the safety of their code.  (Meanwhile the concept artists are drawing ridiculously chibi-headed caricatures of the programmers as zombie fodder for their next project.)

The product they came up with is what is now hailed as the newly minted discipline of Data Science: the application of apps to data from apps to build better apps.

But instead of actually improving the outcomes, it simply puts the mantra of “fail cheaply and fail often” on steroids.  Although you can’t derive any moments from data that comes from a Cauchy-generating process, you can still subject it to a power series that will quickly reveal…you guessed it, something to find patterns in.
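If “no moments” sounds abstract, here is a minimal sketch (my illustration, with an arbitrary seed and sample sizes) of what happens when you take a running mean of Cauchy-generated data next to well-behaved Gaussian data:

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed for the illustration
n = 100_000

samples = {
    "normal": rng.standard_normal(n),
    "cauchy": rng.standard_cauchy(n),
}

for label, draws in samples.items():
    # Running mean after 1, 2, ..., n draws.
    running_mean = np.cumsum(draws) / np.arange(1, n + 1)
    # Check the running mean at 10, 100, ..., 100,000 samples.
    checkpoints = [10**k for k in range(1, 6)]
    print(label, [round(float(running_mean[c - 1]), 3) for c in checkpoints])
```

The normal running mean settles toward zero; the Cauchy one keeps lurching no matter how much data you add.  Point a pattern-finder at the latter and it will still, dutifully, report patterns.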

One of the places I saw this clearly was in the balance of soft and hard currencies in a builder game we were designing.

In the early 2000s everybody was enamored with agent-based simulations that could generate the kind of data you’d get from agents following a finite set of rule-based behaviors.  I don’t know why we thought this was so special.  Game designers have been using simulations on board games forever.
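For anyone who never saw one of these up close, the whole trick boils down to something like the toy below; the agent types and rules are invented for illustration, not taken from any studio’s code:

```python
import random

# Three hypothetical rule-based behaviors, keyed by agent type.
RULES = {
    "grinder": lambda coins: coins + 2,          # farms soft currency every turn
    "spender": lambda coins: max(coins - 5, 0),  # spends whenever possible
    "hoarder": lambda coins: coins + 1,          # accumulates slowly
}

def simulate(n_agents=100, n_turns=50, seed=7):
    rng = random.Random(seed)
    # Each agent is a (type, coin balance) pair, starting with 10 coins.
    agents = [(rng.choice(list(RULES)), 10) for _ in range(n_agents)]
    avg_balance = []
    for _ in range(n_turns):
        agents = [(kind, RULES[kind](coins)) for kind, coins in agents]
        avg_balance.append(sum(coins for _, coins in agents) / n_agents)
    return avg_balance

print(simulate()[-5:])  # average coin balance over the final five turns
```

Run a few hundred of these and the output looks enough like telemetry to feel like insight.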

For games that monetized based on currency shortages, however, you had to build separate simulators for currencies so that you wouldn’t accidentally nerf or buff a virtual good that you just spent the last five levels increasing the value of.  In mid-2012 Joris Dormans created an online simulator called Machinations to do just that.  You constructed sources and sinks of each currency, plus a few other logical conditions, to build an animation of the flows along with a graphic that simulated what you would see from an instrumented game in Beta.  Now with a model of the mechanics, a par sheet of probabilities, and well-constructed onboarding, I should be able to balance the economy, right?  Well…except for those pesky players who refused to play as scripted.
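To give a feel for what building such a model involves, here is a bare-bones sketch of the source/sink idea.  This is not Dormans’ tool, and every rate and cost below is invented:

```python
def run_economy(turns=60):
    """One soft currency with a source, a sink, and a paywall converter."""
    soft, purchases = 25.0, 0
    history = []
    for t in range(turns):
        soft += 10                 # source: gameplay rewards drip in
        if soft >= 12:
            soft -= 12             # sink: upgrade costs drain faster than rewards
        else:
            purchases += 1         # shortage: the player meets the paywall
            soft += 20             # converter: hard currency bought and spent
        history.append((t, soft, purchases))
    return history

for t, soft, purchases in run_economy()[:14]:
    print(f"turn {t:2d}: soft={soft:5.1f} purchases={purchases}")
```

Even this crude loop reproduces the basic rhythm: the balance drains by two a turn until the player hits the wall and either pays or leaves.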

No matter how well we balanced the economy in the builder game, results came back from tests in Canada showing that players were not making choices that could be classed as strategies or player types.

When our programmer got hold of Joris’ open-source program, he immediately bolted it to a genetic algorithm to maximize the flow of resources (including in-app purchases).  This would seem the right thing to do as a programmer, but it ignores a maxim that entrepreneurs and gamers live by: one plays to beat expected values, not to meet them.
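Here is roughly the shape of that experiment, a hedged sketch rather than our programmer’s actual code; the fitness function simply counts simulated paywall hits:

```python
import random

def paywall_hits(earn_rate, upgrade_cost, turns=200):
    """Run a toy economy and count how often the player meets the paywall."""
    soft, hits = 25.0, 0
    for _ in range(turns):
        soft += earn_rate
        if soft >= upgrade_cost:
            soft -= upgrade_cost
        else:
            hits += 1              # shortage: assume a purchase happens
            soft += 20
    return hits

def evolve(pop_size=30, generations=40, seed=1):
    rng = random.Random(seed)
    # Genome: (earn_rate, upgrade_cost), seeded at random.
    pop = [(rng.uniform(1, 20), rng.uniform(5, 40)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: paywall_hits(*p), reverse=True)
        parents = pop[: pop_size // 2]
        # Refill the population by mutating the fitter half.
        pop = parents + [
            (max(0.1, e + rng.gauss(0, 1)), max(1.0, c + rng.gauss(0, 2)))
            for e, c in parents
        ]
    return pop[0]

best = evolve()
print("earn_rate, upgrade_cost:", best, "->", paywall_hits(*best), "hits")
```

Left alone, the optimizer discovers that the “fittest” economy is one that starves the player into the paywall every single turn.  It has maximized the expected value perfectly, and designed a game nobody would keep playing.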

Once again, the charts produced by Machinations that most closely matched the actual data were those with the fattest-tailed distribution (the Cauchy).

No matter how many ways the problem was ‘diagnosed’, player retention dropped (with a freemium ‘Thud!’) right at the point the player met a paywall.

Now it was time to bring out the big guns of behavior analysis…psychology.