Emotion in Game Characters

Phil Carlisle, Indie Developer/Bolton University Lecturer

Phil will present an overview of the field and a review of recent research for developers who are curious about emotional reactions for game characters. He’ll also discuss this from a practical perspective, outlining the techniques that are ready to be applied and how.

A good session and overview of emotional modelling – not a huge amount of detail given the time, but with lots of books referenced for further reading.


Motivation? Wanting to create experiences and engage players – intelligence is kind of a side effect. The aim is artificial “performance”: creating something that acts human enough to convince people, rather than trying to actually be human.

To have convincing characters you need a lot of non-verbal communication – movement and expressions. There are lots of little things – gestures, gaze and posture – which are all important. Some can’t be done in games, like body contact (though maybe someday you’ll do that with robotics). Proxemics are interesting – how you feel about people being closer to or further from you – and they vary by culture: some people find moving closer aggressive, others defensive.

Phil is not doing verbal communication, so is working on a deaf and mute character – everything comes from non-verbal signals, such as blushing and how they look.

Example of emotion in Fallout 3 – Moira Brown – she sounds perky but looks like she’s just been rained off.

Example from Team Fortress 2 – the Heavy – much more convincing. They worked backwards from the voice: they knew what the expressions were going to be from the sound. They nailed the movement of the face, which is the most important thing. He isn’t even a realistic-looking person, so that doesn’t even matter.

Peter Molyneux – Milo – emotion recognition and research. This is going off on a tangent from the game stuff; it’s closer to embodied conversational agents, but you can see it is reasonably convincing – the facial movements and actions – even though the demo video itself goes wrong, recognising a fish as an orange.

Wall-E – the fun thing to realise is that it has some of the best animation in a CGI film, and the main character is almost completely without speech. All of the expression and emotion comes from the animation, which is the important point to take from it: do it with animation, it’s so much easier.

The Disney book “The Illusion of Life” is amazing for animation, since it’s how Disney’s animators set down the techniques they learnt for doing characters (it is a bit expensive though).

Looking into the research in the area – neuroscience and pattern recognition, based on Antonio Damasio. Somatic markers come from him: recognising things in the world and matching them to a pattern. We’ve evolved emotional reflexes, so the monkeys who saw the big bad dinosaur and ran away lived to pass it on.

Look into psychology, which is important for emotions. There are various models for emotions, personality and mood – a lot use scaled values for a set number of emotions.
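
As a concrete illustration of the “scaled values” idea (a minimal sketch, not from the talk – the emotion names, baseline values and decay rate are all placeholder assumptions): each emotion is a number in [0, 1] that events push up and that decays back towards a personality-defined baseline over time.

    import dataclasses

    @dataclasses.dataclass
    class EmotionState:
        # Baseline values stand in for personality; names are illustrative only.
        baseline: dict = dataclasses.field(
            default_factory=lambda: {"joy": 0.2, "fear": 0.0, "anger": 0.0})
        decay_rate: float = 0.1  # how quickly emotions relax per second

        def __post_init__(self):
            self.values = dict(self.baseline)

        def stimulate(self, emotion: str, amount: float) -> None:
            """Apply an emotional impulse, clamped to [0, 1]."""
            self.values[emotion] = min(1.0, max(0.0, self.values[emotion] + amount))

        def update(self, dt: float) -> None:
            """Relax every emotion towards its baseline."""
            for name, value in self.values.items():
                target = self.baseline[name]
                self.values[name] = value + (target - value) * min(1.0, self.decay_rate * dt)

    state = EmotionState()
    state.stimulate("fear", 0.8)  # e.g. a loud noise nearby
    state.update(dt=1.0)          # fear eases back towards its baseline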

Cognitive models – the OCC model, which appraises agents, events and objects. More detail on this later.
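
A hedged sketch of the OCC idea just mentioned (toy rules only, not the full OCC taxonomy – the thresholds and labels are my assumptions): appraise consequences of events, actions of agents and aspects of objects, and map each appraisal onto an emotion label.

    # Consequences of events -> joy / distress
    def appraise_event(desirability: float) -> str:
        return "joy" if desirability > 0 else "distress"

    # Actions of agents -> pride/shame (own action) or admiration/reproach (someone else's)
    def appraise_action(praiseworthiness: float, own_action: bool) -> str:
        if own_action:
            return "pride" if praiseworthiness > 0 else "shame"
        return "admiration" if praiseworthiness > 0 else "reproach"

    # Aspects of objects -> liking / disliking
    def appraise_object(appeal: float) -> str:
        return "liking" if appeal > 0 else "disliking"

    print(appraise_event(0.7))           # a desirable event -> "joy"
    print(appraise_action(-0.5, False))  # another agent misbehaves -> "reproach"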

Behavioural research – Darwin is someone you can’t refute, and he’s dead. “The Expression of the Emotions in Man and Animals” was done largely on his own family – a lot of the slides were of him doing things to his kids and them crying. Worth a look, however, even if the methodology of the time wasn’t the best.

Non-verbal communication has some good info in “Bodily Communication” (Michael Argyle) – a bit old, but worth looking at.

For facial expression and analysis, there’s Professor Paul Ekman’s book “Emotions Revealed” – he’s done it all his life. Some expressions of emotion, such as anger, are universal across cultures, probably from evolution. The book gives a good understanding.

Research into expression synthesis covers lots of areas, such as the work done by Prof. Nadia Magnenat-Thalmann at MIRALab in Geneva, and Arjan Egges, who has a paper on emotional modelling.

On embodied conversational agents – Milo’s area – Justine Cassell’s book “Embodied Conversational Agents” is worth looking at. There are lots of papers and conferences too.

Working on from these areas, the Oz Project combined behaviour trees with psychology. Façade, done by that group, is good and interesting, but it does look hideous – side note that academics need to pay artists for a bit to get some decent-looking graphics!

Ken Perlin has worked on procedural animation – worth looking at since he builds things up from very basic elements, and has lectures and demos of his work online.
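
In the spirit of that procedural-animation work, here is a minimal sketch (an illustration, not Perlin’s own code): layer a smooth pseudo-random signal onto an idle pose so a character never sits perfectly still. The noise below is simple hashed value noise rather than true Perlin gradient noise, and the joint names and amplitudes are made-up placeholders.

    import math
    import random

    def value_noise_1d(t: float, seed: int = 0) -> float:
        """Smoothly interpolated pseudo-random values in roughly [-1, 1]."""
        def hashed(i: int) -> float:
            random.seed(i * 374761393 + seed)   # toy hash via the RNG seed
            return random.uniform(-1.0, 1.0)
        i0 = math.floor(t)
        f = t - i0
        f = f * f * (3.0 - 2.0 * f)             # smoothstep fade
        return hashed(i0) * (1.0 - f) + hashed(i0 + 1) * f

    def idle_offsets(time_s: float) -> dict:
        """Tiny per-joint rotation offsets (degrees) driven by the noise."""
        return {
            "head_yaw":   4.0 * value_noise_1d(time_s * 0.3, seed=1),
            "head_pitch": 2.0 * value_noise_1d(time_s * 0.3, seed=2),
            "spine_sway": 1.5 * value_noise_1d(time_s * 0.2, seed=3),
        }

    print(idle_offsets(12.8))  # small, slowly varying offsets each frame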

Scott McCloud has a great book, “Understanding Comics”, about the iconic representation of emotions in comics, and suggests it doesn’t need to be realistic. He suggests you need more people trying more things – it can be more abstract, adding more emotion to a highly emotional medium.

Future work – by Phil – is looking at player perception of characters and working on models of emotion, and the procedural animation of actors.

On the implementation – first observation: players can be dumb (see players’ comments on YouTube videos).

Observation 2 – we need more observation of players. For example, what happens when someone comes up behind someone else (NPCs shouldn’t turn on the spot!), or what happens when someone they like or dislike joins the group (they should be happy or sad!).
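
For the first case, a minimal sketch of one way to avoid the snap-turn (my illustration, not Phil’s implementation – the speed limit and frame time are arbitrary): rotate the NPC towards whoever approached at a capped angular speed rather than instantly facing them. In practice you would also want a proper turning animation, not just a heading change.

    def turn_towards(heading_deg: float, target_bearing_deg: float,
                     max_turn_deg_per_s: float, dt: float) -> float:
        """Return the NPC's new heading after one frame of rate-limited turning."""
        # Signed shortest-path difference, in [-180, 180)
        diff = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
        step = max(-max_turn_deg_per_s * dt, min(max_turn_deg_per_s * dt, diff))
        return (heading_deg + step) % 360.0

    heading = 0.0
    for _ in range(10):  # someone approaches from directly behind (bearing 180)
        heading = turn_towards(heading, 180.0, max_turn_deg_per_s=120.0, dt=0.1)
    print(round(heading, 1))  # 240.0 – turned 120 degrees so far instead of snapping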

There are lots of models of emotion – they usually revolve around a number of emotion variables (1, 5, 22, etc.), with a behaviour tree to choose what to do. There’s a cycle of think-act-feel, and a blackboard can be used for feeding info around.
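
A hedged sketch of how those pieces could fit together (all names and behaviours are illustrative, not from the talk): a shared blackboard carries perceptions and the emotion values, appraisal updates the emotions, and a trivial behaviour-tree-style selector picks the first behaviour whose condition passes.

    blackboard = {
        "emotions": {"joy": 0.2, "fear": 0.0},
        "perceptions": [],
        "current_behaviour": None,
    }

    def feel(bb):
        # Appraise perceptions into emotion changes.
        if "loud_noise" in bb["perceptions"]:
            bb["emotions"]["fear"] = min(1.0, bb["emotions"]["fear"] + 0.8)

    def think(bb):
        # Behaviour-tree-style selector: first behaviour whose condition passes wins.
        behaviours = [
            ("cower", lambda: bb["emotions"]["fear"] > 0.6),
            ("greet", lambda: bb["emotions"]["joy"] > 0.5),
            ("idle",  lambda: True),
        ]
        bb["current_behaviour"] = next(name for name, cond in behaviours if cond())

    def act(bb):
        # In a real game this would drive animation and locomotion.
        print("playing behaviour:", bb["current_behaviour"])

    def tick(bb):
        # One turn of the think-act-feel cycle (appraisal first here, so the
        # decision sees fresh emotion values).
        feel(bb)
        think(bb)
        act(bb)

    blackboard["perceptions"].append("loud_noise")
    tick(blackboard)  # -> playing behaviour: cower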

On verbal communication – it’s really hard, so fake it using short vocal barks or a Sims-style gibberish language. There’s a ton of research, just nothing massively convincing.
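
A minimal sketch of the “fake it with barks” approach (the emotion names and line IDs are placeholders invented for illustration): rather than generating speech, pick a short pre-recorded line keyed off the character’s dominant emotion.

    import random

    BARKS = {
        "joy":   ["bark_laugh_01", "bark_cheer_01"],
        "fear":  ["bark_gasp_01", "bark_whimper_01"],
        "anger": ["bark_grunt_01", "bark_shout_01"],
    }

    def pick_bark(emotions: dict) -> str:
        """Choose a bark line for whichever emotion is currently strongest."""
        dominant = max(emotions, key=emotions.get)
        return random.choice(BARKS.get(dominant, ["bark_idle_01"]))

    print(pick_bark({"joy": 0.1, "fear": 0.7, "anger": 0.2}))  # e.g. "bark_gasp_01"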

In summary: we express emotions through our bodies, so we can model and express this in AI characters if we want them to be more believable – and it can be done very simply!

Questions

Do you do just happy or sad?
The number of emotions depends entirely on how much data you have to express them. Without an animation for disgust, you can’t show a character being disgusted by a smell.

How do you deal with blending animations?
It would have to be a really catastrophic event to change emotions that quickly from good to bad. You can also scale the change in emotion caused by an event.
