2017 Workgroup Topic Proposals

Better Than Dialogue Trees: The Interface of Human Interaction

For nearly as long as we have had video games, those games have tried to emulate the experience of talking to (and developing a relationship with) a human being.

100% of those efforts (including the ones that I have built) have produced experiences that do not at all resemble actual human interaction.

But this problem runs deep. Dialogue trees are so much the default design for human interaction in games that when I have proposed a better interface to designers, the largest hurdle has been overcoming the assumption that there simply cannot be a better design for this; that nothing else is possible.

I believe there are other possibilities, unexplored. I believe that we will find them. We must! If we imagine a future, 100 years from now, where we are still playing video games, do we believe that those character interactions will still use Mass Effect-style dialogue trees?

I do not believe that, in part because the modern design of dialogue trees is so fundamentally inhuman:

  1. Dialogue trees assert that an NPC should say exactly one thing in response to each player choice. That design puts the power of choosing the NPC’s response on the player’s shoulders. In effect, the player is the one who chooses the NPC’s reaction. Modern NPCs are verbal vending machines.
  2. In modern games, generally only a very few relationship spectra (usually one) are measured (often, how close we are to having sex with the character). Yet real human beings track a wide variety of statuses in their relationships: trust, admiration, desire, intimacy, rapport… it is the multiplicity of values that makes human relationships rich. We do not reflect this richness in our game designs.
  3. Context matters. It is strange for a person to unpack their heart to someone they just met, yet this is so common in video game design as to be a trope.

The list goes on. Current NPC interaction systems do not produce interactions that feel like interacting with a person.
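To make the second and third objections concrete, here is a minimal sketch, using today's technology, of what a multi-spectrum relationship model with a context gate might look like. All names (`Relationship`, `will_confide`, the 0.6/0.5 thresholds) are hypothetical illustrations, not a proposal from the post itself:

```python
from dataclasses import dataclass

# Hypothetical sketch: the NPC tracks several relationship spectra
# instead of one "affinity" score (objection 2 above).
@dataclass
class Relationship:
    trust: float = 0.0
    admiration: float = 0.0
    desire: float = 0.0
    intimacy: float = 0.0
    rapport: float = 0.0

    def will_confide(self) -> bool:
        # Context gate (objection 3): the NPC only unpacks their heart
        # once trust and intimacy are both established, regardless of
        # how well the banter is going.
        return self.trust > 0.6 and self.intimacy > 0.5

rel = Relationship()
rel.rapport = 0.8          # friendly banter went well...
print(rel.will_confide())  # prints False: rapport alone earns no confession
```

The point of the sketch is only that the gate reads several independent values; any real design would need far richer state, which is exactly what the group would discuss.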

So: how might we do that? What is a better interface and a better design for interacting with an artificial human? How could we turn the player’s attention in an NPC interaction towards the same types of things they think about in a real-life conversation with a real person?

What kind of interface might better capture something closer to the true emotional & experiential nature of human relationships?

This group will not discuss the promise of neural networks, or of big data analysis, or of other magic-wand black-box approaches to this problem. We will deliberately keep the conversation out of the “technology will save us” domain. That is not to say that such advances are not coming; they clearly are. But there are lots of groups out there focusing on those approaches to this problem.

Instead, this group will focus on interface, on character data model, and on how we might generate meaningful gameplay, all around the subject of interactions with artificial humans, using today’s technology.

Let’s build a better relationship interface.

6 thoughts on “Better Than Dialogue Trees: The Interface of Human Interaction”

  1. Two of the most exciting projects I’ve ever worked on were around this topic and both were spearheaded by Emily Short. If you haven’t checked out Spirit AI yet, they are doing interesting things around human/character interface, even if it’s still closer to a tech demo. Anyway, I would love to chat with you about this whether or not the group happens!

  2. Worked on a project back in 2005 that addressed this very thing. Unfortunately it was a corporate training game that got buried so the project never went anywhere, but I can tell you exactly what we did that led to significant improvements from dialogue trees.

  3. I’ve worked on this too, and hope to return to it. IMO the character’s emotions are the center of the puzzle. All the dialog comes from semantic content, and semantic content is worthless without emotional tone and color. I *think* (but can’t yet demonstrate) that with an NPC with sufficient memories and opinions about things (in particular how they *feel* about things), we can move from semantic/emotional symbols to constructed dialog (not canned dialog strings) — but maybe that’s getting too black-box for you (no ANNs involved though).

    One key insight for me, still valid 15+ years later alas, is that until we take the emotional bull by the horns and figure out how to represent difficult, layered, qualitative, contradictory emotions within an NPC, we won’t make much real progress in social interaction (outside of narrow, sterile domains). There are various models of emotion, all IMO insufficient — too unary, state-driven, etc. Anyway. The emotional part I can do. The semantic part I can mostly do. Putting that into dialog? Remains to be seen.
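The "layered, contradictory emotions" this commenter describes can be illustrated with a small sketch: instead of one emotional state, the NPC holds multiple simultaneous feelings, even conflicting ones about the same subject. Every name here (`Feeling`, `feelings_about`, the example values) is a hypothetical illustration of the idea, not anything from the commenter's actual system:

```python
from dataclasses import dataclass

# Hypothetical sketch: layered, contradictory emotion as a list of
# coexisting feelings, rather than a single collapsed state.
@dataclass
class Feeling:
    subject: str      # what the feeling is about
    emotion: str      # e.g. "gratitude", "resentment"
    intensity: float  # 0.0 .. 1.0

npc_feelings = [
    Feeling("the_player", "gratitude", 0.7),   # saved my life...
    Feeling("the_player", "resentment", 0.4),  # ...but humiliated me doing it
]

def feelings_about(subject, feelings):
    # A dialogue generator could weigh ALL feelings about a subject,
    # not just the single strongest one.
    return [f for f in feelings if f.subject == subject]

print(len(feelings_about("the_player", npc_feelings)))  # prints 2
```

The design choice worth noting is that contradiction is represented directly (two entries about the same subject) rather than averaged away, which is what most unary, state-driven emotion models do.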
