Quickfire 5 minutes of good, 5 minutes of bad on Spore, Gears of War 2 and Bioshock’s AI. A very interesting overview of the problems encountered, and of what kind of technology and design decisions were made.
Format: 5 minutes on good, 5 minutes on bad.
Eric – Maxis – Spore
Spore is really 5 games/stages – cell, creature, tribe, civilization and space – each placing different demands on the AI.
There were difficulties in the behaviour system. They checked out Lua and other scripting languages for designers to work with – but the engineers did a lot of the work anyway since it was so complex, so they decided to write it in C++.
The behaviour system had issues with the state, stimuli and priority of behaviours – if a creature defending its nest is attacked, it needs to get back to defending the nest at some point. There were also issues with groups, where synchronising dancing was very difficult.
Having a simulation and an avatar-based game in the same game was difficult too. There were meant to be 5 levels of detail (LOD) – as the player zoomed out, the simulation would cut back on what it had been doing before – and some of that is still in the game.
Success-wise, behaviour trees worked really well: all the behaviours were in one place so you could see what was going on, and transitions from one behaviour to another were easier. Player versus environment (one versus many) was also going to work better for the game – it means the player can seek out other creatures to interact with. There was also global pacing – global control that stopped, for instance, multiple tribes attacking at once.
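The behaviour-tree win described above can be sketched with a toy example – this is a hypothetical Python illustration, not Spore's actual C++ code. A priority selector re-evaluates its children every tick, which is why "transitions from one behaviour to another" come for free: a higher-priority behaviour (reacting to an attack) interrupts a lower one (defending the nest) with no hand-written transition logic, and the whole tree sits in one place.

```python
# Minimal behaviour-tree sketch (illustrative; node and action names are
# invented for this example). Statuses follow the usual BT convention.
SUCCESS, FAILURE, RUNNING = "success", "failure", "running"

class Selector:
    """Ticks children in priority order; returns the first non-failure."""
    def __init__(self, *children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != FAILURE:
                return status
        return FAILURE

class Sequence:
    """Ticks children in order; fails or pauses as soon as one does."""
    def __init__(self, *children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != SUCCESS:
                return status
        return SUCCESS

class Condition:
    def __init__(self, predicate):
        self.predicate = predicate
    def tick(self, blackboard):
        return SUCCESS if self.predicate(blackboard) else FAILURE

class Action:
    def __init__(self, name):
        self.name = name
    def tick(self, blackboard):
        blackboard["last_action"] = self.name  # stand-in for real work
        return RUNNING

# Creature: flee if attacked, otherwise defend the nest. Re-ticking the
# root each frame is what makes the interrupt/transition automatic.
creature = Selector(
    Sequence(Condition(lambda bb: bb.get("under_attack")), Action("flee")),
    Action("defend_nest"),
)
```

Because the tree is re-evaluated from the root every tick, setting `under_attack` on the blackboard switches the creature from `defend_nest` to `flee` on the very next tick.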
There was also good work done on revealing what the AI is thinking – when players could see it, they also thought the AI was better. More feedback from sounds and music also helps the tension. Relationships are visible too, which helps, and they are simplified – there is no need for the player to know the relationships between NPC nations.
Lessons learnt include using proven AI technology with new IP – the procedural creation was a lot of risk already. Focus on the player experience and the player’s mental model rather than the simulation “under the hood” – you want the complexity underneath, but what the player needs in order to play the game should be fairly simple. Being transparent is also a good lesson – show the player what the AI is doing, and if you can’t show it, simplify it so the player can understand it.
Matt – Gears of War 2
Winners or unwinners!
Winners – AI commands. Finite state machines containing behaviours were used before – one big, long, unreadable script – but the behaviours now live in different sets of AI commands. There is also a stack of commands, which can invoke each other. A high-level command like “Attack Steve” will invoke other commands and then call itself again (so it might move, recheck the original high-level state, and so forth, to carry on attacking).
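The stack-of-commands idea can be sketched as follows – a hypothetical Python illustration, not Gears of War 2's actual Unreal code; `AttackTarget`, `MoveTo` and the trivial `Actor` are invented for the example. The high-level command pushes a sub-command, gets re-ticked when the sub-command pops, and re-checks its goal each time – including whether the goal is still valid at all.

```python
# Illustrative command stack: high-level commands push lower-level ones
# and re-validate their goal whenever they come back to the top.
class Actor:
    def __init__(self, pos):
        self.pos, self.dead = pos, False

class CommandStack:
    def __init__(self):
        self.stack = []
        self.log = []  # every AI gets its own log of transitions
    def push(self, cmd):
        self.log.append("push " + cmd.name)
        self.stack.append(cmd)
    def tick(self):
        if not self.stack:
            return
        top = self.stack[-1]
        if top.tick(self) == "done":
            self.log.append("pop " + top.name)
            self.stack.pop()

class MoveTo:
    name = "MoveTo"
    def __init__(self, ai, dest):
        self.ai, self.dest = ai, dest
    def tick(self, stack):
        self.ai.pos = self.dest  # teleport stands in for pathfinding
        return "done"

class AttackTarget:
    name = "AttackTarget"
    def __init__(self, ai, target):
        self.ai, self.target = ai, target
    def tick(self, stack):
        if self.target.dead:  # goal no longer valid: cancel off the stack
            return "done"
        if self.ai.pos != self.target.pos:
            stack.push(MoveTo(self.ai, self.target.pos))  # invoke sub-command
            return "running"  # re-ticked (and re-checked) after MoveTo pops
        self.target.dead = True  # in range: finish the attack
        return "running"
```

Each tick only touches the top of the stack, and the moment the target dies the `AttackTarget` command cancels itself rather than carrying on – the "recheck the original high-level state" loop from the talk.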
Debugging tools were good – every AI has its own log, with all state transitions logged. However, not all AIs have unique names, so QA had to either guess which one or send them all. They used a tool called BugitAI, a command-line tool to dump the current AI logs, take a screenshot with the debug display shown, and log the player’s coordinates. There is a toolchest of tools like this for designers and QA.
Pathfinding needed to find cover nodes a lot. They modularised the path search, so that the cover-finding search could also be reused for other things – like the character at the middle back who revives people.
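The modularisation described here can be sketched as a generic search with a pluggable goal test – a hypothetical Python illustration (the nav graph, tags, and `find_nearest` are invented for the example; this is not Epic's code). The same breadth-first search finds the nearest cover node or the nearest downed teammate, depending only on which predicate is plugged in.

```python
from collections import deque

def find_nearest(graph, start, is_goal):
    """Breadth-first search over a nav graph; returns the path to the
    nearest node satisfying is_goal, or None if there is none."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if is_goal(node):
            return path
        for nxt in graph.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

# Toy nav graph with per-node tags (illustrative data).
graph = {"a": ["b", "c"], "b": ["d"], "c": ["e"], "d": [], "e": []}
tags = {"d": "cover", "e": "downed_teammate"}

# Same search, two different goal tests.
path_to_cover = find_nearest(graph, "a", lambda n: tags.get(n) == "cover")
path_to_revive = find_nearest(graph, "a",
                              lambda n: tags.get(n) == "downed_teammate")
```

The search code never mentions cover or reviving – only the goal predicate does – which is what makes it reusable across systems.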
Punchlist meetings – forcing designers and programmers together, to find out what was a priority to make the game fun.
There is a battle between scripting and AI – the designers do need the control, at least briefly. Script errors caused real problems, so they needed a debug tool for checking – it put a wireframe around anything that couldn’t get to its scripted point, or whose script failed. No one was quite sure which characters were scripted and which were normal AI.
Procrastinated behaviours – you need to put temporary behaviours in so the designers have something; a placeholder forces an action, rather than people not knowing what is designed, levels needing changing, etc.
It is hard to do smart things in a corridor – hard to flank when there is no flank. Gears 2 had long battles, but with no flanks: spaces that were not very open but very long, so the AI would just come from one direction.
Q – Finite state machines or something else in the future? – We’ll stick with FSMs; we think they’re working, so why fix them?
Q – With your stack-based sequence, are you constantly checking to see if the end goal is still valid? – There are tons of situations where an AI can’t carry on killing something because it dies. If a behaviour can’t do an action, there is a push or a pull to cancel that action on the stack.
Q – How much precomputation do you do, and how much will you do in the future with faster and faster computers? – Do as much as you can, all the time – precompute as much as possible, since there is usually more memory room than CPU room.
John – Bioshock
What went right?
The AI strike team – AI and animation programmers (2-4 total: 2 permanent, up to 2 temporary), animators (3) and designers (1-2). It was a huge problem that there wasn’t a dedicated AI designer for most of the life of the project. Sharing the same room, though, allowed short meetings and consultation – they met at least once a day to discuss progress.
Extensible systems meant there was as little rewriting as possible. They had some groundwork in the Unreal engine and the SWAT4/Tribes codebases. If you are doing a similar game to a previous game, reuse and don’t reinvent*
*although reinvention may be necessary
Schedule for experimentation! The prototyping of the little sister/big daddy was something that went well. 2 months of John’s (the lead’s) time was needed to do it properly.
Ragdoll recovery (or the “get up from ragdoll” system). The Havok animation system had some support, but no transition from Unreal collisions. It was possibly going to be cut at one point – at one time collision detection was turned off, then turned back on when the ragdoll came back – but characters got stuck in the terrain. They tried a few proactive approaches that didn’t work, but a reactive method worked well: use navigational space to find the nearest standable spot, and move the body to a place it can stand while the get-up animation plays.
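The reactive method can be sketched like this – a hypothetical Python illustration under invented assumptions (a 2D grid of walkable cells standing in for navigable space, Manhattan distance, linear interpolation standing in for the blend during the get-up animation); it is not Bioshock's actual Havok/Unreal code.

```python
def nearest_standable(navgrid, pos):
    """Closest walkable cell to pos (Manhattan distance) in a grid of
    0/1 cells, rows indexed by y. Returns None if nothing is walkable."""
    x0, y0 = pos
    candidates = [(abs(x - x0) + abs(y - y0), (x, y))
                  for y, row in enumerate(navgrid)
                  for x, walkable in enumerate(row)
                  if walkable]
    return min(candidates)[1] if candidates else None

def getup_positions(navgrid, ragdoll_pos, frames):
    """Positions for each frame of the get-up: slide the body from where
    the ragdoll landed to the nearest standable point, so the character
    never finishes the animation stuck inside the terrain."""
    x1, y1 = nearest_standable(navgrid, ragdoll_pos)
    x0, y0 = ragdoll_pos
    return [(x0 + (x1 - x0) * t / frames, y0 + (y1 - y0) * t / frames)
            for t in range(frames + 1)]
```

The key idea from the talk survives the simplification: the fix is reactive (query nav space after the ragdoll has settled) rather than proactive, and the correction happens during the get-up rather than as a teleport.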
What went wrong?
They planned on very little scripting because of SWAT2 – but in fact there were a ton of scripted moments with the AIs. It was painful adding this later, for the E3 demos – and of course for training and complex battles.
Interfacing with the designers is necessary. Much of the AI design was done in a haphazard way. They should have worked together – a few times the designers did come forward, but the AI strike team should have been the first people to go to. Tuning was outside the AI strike team’s area – the attributes should have been owned by the AI strike team, since that helps testing and keeps people working on the same combat system.
AI performance was the last issue – AI behaviours for throwing objects couldn’t spread their calculations over multiple updates/frames, which caused lag spikes. The code was optimised and limited, but that wasn’t a fix. An AI should always support spreading its work over multiple updates.
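The lesson here – AI work should be sliceable across frames – can be sketched as follows. This is a hypothetical Python illustration (the budgeted task wrapper and the stand-in "search" are invented for the example): an expensive computation is written as a resumable generator and given a per-frame step budget, so no single frame pays the whole cost.

```python
def expensive_search(candidates, is_good):
    """Stand-in for an expensive calculation such as working out a throw
    trajectory: yields after each candidate so the caller can pause."""
    for c in candidates:
        if is_good(c):
            yield ("done", c)
            return
        yield ("working", None)

class TimeSlicedTask:
    """Runs a generator-based computation a few steps per frame."""
    def __init__(self, search, per_frame_budget):
        self.search = search
        self.budget = per_frame_budget
        self.result = None
    def update(self):
        """Called once per frame; does at most `budget` steps of work.
        Returns True when the task is finished."""
        if self.result is not None:
            return True
        for _ in range(self.budget):
            try:
                status, value = next(self.search)
            except StopIteration:
                return True  # exhausted without finding anything
            if status == "done":
                self.result = value
                return True
        return False  # out of budget; resume next frame
```

Instead of one frame evaluating every candidate (the lag spike), each frame does a bounded amount of work and the answer arrives a few frames later.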