A submission for Make games. 326

I found the AI had some trouble navigating the interior of the house. I had to use half-tile collisions due to how the art is drawn, which is less than ideal, and the AI still can't navigate around them correctly (a problem I've known my engine has for about three years). We'll do more with the npc a little later. So instead, today's update is something a little more frivolous:

I quickly hacked up some intro text that fades in. I can have different intro text pop up for each scene, and I can trigger similar text from an event whenever necessary. I like it because it doesn't stop you from messing about if you just want to get going (a golden rule of Erik is to remove control from the player as little as possible). I'm sure the aesthetics could be improved.
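The non-blocking part is the key trick: if the fade is just a pure function of elapsed time, the game loop keeps accepting input while the overlay animates. Here's a minimal sketch of that idea; the function name and the timing values are my own, not the engine's:

```python
# Hypothetical sketch: text opacity as a pure function of elapsed time,
# so the game loop never has to pause or take control from the player.
def intro_alpha(elapsed, fade_in=1.0, hold=3.0, fade_out=1.0):
    """Return text opacity in [0, 1] for a fade-in / hold / fade-out."""
    if elapsed < fade_in:
        return elapsed / fade_in              # ramping up
    if elapsed < fade_in + hold:
        return 1.0                            # fully visible
    if elapsed < fade_in + hold + fade_out:
        return 1.0 - (elapsed - fade_in - hold) / fade_out  # ramping down
    return 0.0                                # done; overlay can be removed
```

Each frame just draws the text at `intro_alpha(now - start_time)` and otherwise leaves the player alone.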

Next up: building an event system. In my little story outline, even within the single "scene" I designed for Act 1, there are several mini-scenes that the npcs are supposed to go through. At the beginning, Erik is separated from his party and has to go looking for them. When they see him, they should be glad to see him. And so on, for several other things going on. While I intend for the core minute-to-minute gameplay to be driven by the AI simulation, there are some simple things I need to be able to do to control the narrative somewhat. So I built a high-level event system. Each map has its own queue of events that it listens for, and various actions in the game can trigger those events. This will be used for controlling high-level narrative flow, and should be used sparingly. The opening sequence of the game is one of the few obvious places for such a system.

This:

{
  "":[
    {"type":"init","result":[
      ["console",{"text":"Game initialized"}]
    ]}
  ],
  "woods":[
    {"type":"init","result":[
      ["cam_focus",{"npc":"trask"}]
    ]},
    {"type":"camera_near","distance":20,"npc":"trask","result":[
      ["conversation",{"conversation":"intro1","npc":"yelda"}]
    ]},
    {"type":"conversation_end","conversation":"test","result":[
      ["cam_focus",{"npc":"erik"}]
    ]},
    {"type":"trigger","trigger":"exit_tutorial","npc":"erik","result":[
      ["cam_focus",{"npc":"trask"}],
      ["conversation",{"conversation":"test2","npc":"erik"}]
    ]}
  ]
}
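For anyone curious how a queue like this might be consumed, here is a rough sketch. The names (`EventQueue`, `fire`, `ACTIONS`) are my own invention, not the engine's actual API, and matching is simplified to exact equality on the extra keys:

```python
import json

ACTIONS = {}   # action name -> callable ("cam_focus", "conversation", ...)
LOG = []       # collects console output so the example is checkable

def action(name):
    """Register a callable under an action name used in the JSON."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@action("console")
def console(text):
    LOG.append(text)

class EventQueue:
    """Pending events for one map; every signal scans the whole list."""
    def __init__(self, events):
        self.events = list(events)

    def fire(self, event_type, **details):
        # Linear scan, as described in the post. Matched events are
        # treated as one-shot here and removed after running.
        remaining = []
        for ev in self.events:
            extra = {k: v for k, v in ev.items() if k not in ("type", "result")}
            if ev["type"] == event_type and all(
                    details.get(k) == v for k, v in extra.items()):
                for name, kwargs in ev["result"]:
                    ACTIONS[name](**kwargs)
            else:
                remaining.append(ev)
        self.events = remaining

# Wire up the "" (global) entry from the config above.
config = json.loads("""{"": [{"type": "init",
    "result": [["console", {"text": "Game initialized"}]]}]}""")
queue = EventQueue(config[""])
queue.fire("init")  # runs the console action once
```

A real version would need fuzzier matching (e.g. `camera_near` comparing a measured distance against the threshold rather than testing equality), but the shape is the same.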

Turns into a quaint little cutscene that plays at the beginning, along with some "gameplay" where the npcs notice Erik when he moves into view. It's fairly unoptimized: every time a part of the code signals a possible event, all pending events get checked. I don't know how long the list of events for a given map is going to grow, so it might not even be a problem. In fact it probably won't. But it could certainly be optimized by building some kind of search tree keyed on event type.

The worst event is the "camera_near" one, because the camera fires the event every frame (triggering a match attempt on ALL events in the current map) even when there are no events looking for it. An easy early optimization there would be to keep track of what event types are in the queue, and silently return if we know there are no camera_near events waiting.
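That early-out might look something like this, under the same assumptions as before (hypothetical names, exact-equality matching): keep a running count of pending event types, and bail before scanning when nothing is listening.

```python
from collections import Counter

class EventQueue:
    """Per-map event queue with a type-count early-out."""
    def __init__(self, events):
        self.events = list(events)
        # How many pending events exist for each type.
        self.type_counts = Counter(ev["type"] for ev in self.events)

    def fire(self, event_type, **details):
        if not self.type_counts[event_type]:
            return  # nothing waiting on this type: skip the scan entirely
        remaining = []
        for ev in self.events:
            extra = {k: v for k, v in ev.items() if k not in ("type", "result")}
            if ev["type"] == event_type and all(
                    details.get(k) == v for k, v in extra.items()):
                self.type_counts[event_type] -= 1
                for name, kwargs in ev["result"]:
                    print("run action:", name, kwargs)  # stand-in for real actions
            else:
                remaining.append(ev)
        self.events = remaining
```

With this, `camera_near` firing every frame costs one dictionary lookup per frame instead of a full scan, until an actual `camera_near` event is queued.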

I'm still not sure about this. It makes a lot of things really easy, but it's also going to be a temptation to hardcode narrative rather than let the AI simulation create one. Finding the balance between narrative and simulation is what this project is all about, though!