Last week I went oop north to discuss decision making with people. We talked about Bayesian decision analysis too. And as is my wont, I thought “can this apply to RPGs?”
Because RPGs involve constant decision making, by individuals and by groups. That’s often the point: it’s where conflict arises, and the decision-action-feedback cycle is where much of the satisfaction of participating in an RPG comes from.
Take the principle that internal decision making is, at base, perfect: entirely private, uncoloured by analysis or communication. But it’s also grounded in individual rational belief.
Then you move into the realm of the irrational: heuristics and biases. System 1 thinking is fast and intuitive, and it’s how most people make decisions; System 2 is slow and formalised. Analysts (psychoanalysts, or GMs) help the individual translate System 1 into System 2, making internal decisions explicit in the context of system.
But at this stage we’re talking about a one-to-one relationship between GM and player. More often than not the GM acts as interpreter for the internal decisions the player makes on behalf of their PC. The ideal state for a gaming group is group discussion and group decision making, but frequently it devolves into many parallel player-GM conversations. Some GMs actively encourage this (secret side meetings, passing notes, etc.) and some players lap it up, deviating from the group arc into their own. This goes hand-in-hand with My Guy Syndrome:
you disclaim decision-making power and responsibility by acting like “what my character would do” is inevitable and inviolable, even if it gets in the way of actually having fun in the game or being able to play the game at all.
My Guy Syndrome is the killer of fun; players who engage in that behaviour are rightly shunned.
But maybe we shouldn’t be too harsh: players disclaim decision-making power precisely because group decision making is hard, and because there is no rational model of group decision making. OTOH, My Guy is making perfectly rational internal decisions.
And the way to overcome this — surprise, surprise — is to shift the GM from a rational analyst role into a social facilitator role: treat group decision making as a social problem rather than a rational one, and navigate around individual biases. There are still issues of governance and process (e.g. do you have permission to make that move at this time?) and of values and uncertainty (when I make that move, what outcome is reasonable to expect — and will an unreasonable outcome spoil the game?).
This will be all very obvious to anyone who has GM’d a game. But these are my points:
One: “How To GM” is tacit knowledge
People learn social facilitation by doing, despite the glut of advice on how to interact socially; GMing arises from contextual experience. And GMing is like any other skill — you supposedly need 10,000 hours of practice to call yourself an “expert”.
Two: The GM is a decision-maker
Simply this: the GM doesn’t just facilitate a conversation, they also make decisions (on behalf of the game world) and react to player action. So the GM is also bound by System 1 and System 2 thinking. In this context:
- System 1 (intuitive) thinking is the GM making gut-feel calls based on experience and observation of how the game is going and what fits the situation. E.g. player voices action, GM asks for an attribute check and interprets the result (as yes/no/maybe).
- System 2 (formalised) thinking is the GM interpreting the game in the context of a rules framework, and basing a response on the rules. E.g. player voices action, GM places that in a system context with pass/fail thresholds and consequences, player rolls, GM enforces outcome.
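The contrast in those two bullets can be sketched as code. This is a hypothetical resolution function, not any particular game’s rules: the point is that System 2 replaces the GM’s gut-feel “yes/no/maybe” with explicit thresholds and codified outcomes.

```python
import random

def resolve_check(attribute: int, difficulty: int) -> str:
    """Hypothetical System 2 resolution: roll d20 + attribute
    against an explicit pass/fail threshold, with the GM's
    intuitive 'maybe' made explicit as a near-miss band."""
    roll = random.randint(1, 20) + attribute
    if roll >= difficulty:
        return "pass"
    elif roll >= difficulty - 3:
        return "pass with complication"  # the 'maybe', codified
    else:
        return "fail"
```

The thresholds and the near-miss band are invented for illustration; the shape — action, roll, codified consequence — is what distinguishes System 2 play from a pure GM judgement call.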
Now, going back to point 1: when people learn to GM they frequently learn to take those decisions intuitively, developing their intuitive sense of how to run games. Learning the explicit, rules-laden aspects of GMing anything crunchy is almost a separate skill.
Three: Apocalypse World makes everything System 2 Thinking
The difference between Apocalypse World and Everway is that AW pushes both players and GM into System 2 decision making all the time (or requires the GM to push the players into it) by codifying every consequential action as a move. It gets away with this because the number of choices is relatively small and the moves have consequences baked in, so the GM’s decisions are simpler to make — but even so, the GM’s response will always sit within a move’s outcome, or take the form of one of the codified MC Moves.
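AW’s three-tier move structure (2d6 + stat; 10+, 7-9, 6-) can be sketched as follows. The outcome text here is paraphrased, not quoted from the book, but the tiers and the “miss means the MC makes a move” logic are AW’s own:

```python
import random

def basic_move(stat: int) -> str:
    """Sketch of an Apocalypse World move: 2d6 + stat,
    with consequences baked into three codified tiers."""
    roll = random.randint(1, 6) + random.randint(1, 6) + stat
    if roll >= 10:
        return "strong hit: you do it"
    elif roll >= 7:
        return "weak hit: you do it, with a cost or complication"
    else:
        return "miss: the MC makes a move"
```

Note that even the failure branch doesn’t leave the GM to improvise freely — it hands them a constrained, codified choice, which is exactly the System 2 forcing described above.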
Compare that with Everway and other fairly minimalist or freeform games: Everway’s Karma, Drama and even Fortune resolutions are pretty much intuitive, based on principles and value expectations.
Four: Is “Social Contract” enough?
I’m a fan of the concept of Social Contract in the Big Model.
But the problem with the social contract is the assumption of consensus. We’ve already said that group decision making is non-trivial and that rational group decision making doesn’t exist. This means the Social Contract, Creative Agenda and other Big Model constructs are already facilitated objects, reliant on the facilitator to reach group consensus.
This formalising and externalising of system is common to a lot of storygames, where it often works because:
- scenes are between two players, both with their perfect internal decision-making, and all that matters is the conflict between their decisions, not the decision process
- also, scenes are often about vocalising options and getting consensus between two characters; the process of translating your System 1 thoughts to System 2 is part of the game
- Character biases are… sort of the whole point of storygame interactions.
Whew, that’s probably enough academic noodling for one post. TTFN.