This thesis addresses the analysis and generation of expressive movements for virtual human characters. Building on previous results from three research areas (perception of emotions and biological motion, automatic recognition of affect, and computer character animation), a low-dimensional motion representation is proposed. This representation consists of the spatio-temporal trajectories of the end-effectors (i.e., head, hands, and feet) and the pelvis; a minimal illustrative sketch of this representation is given after the list below. We argue that this representation is both suitable and sufficient for characterizing the expressive content underlying human motion and for controlling the generation of expressive whole-body movements. To support these claims, this thesis proposes:
i) A new motion capture database inspired by physical theater theory. This database contains examples of different motion classes (i.e., periodic movements, functional behaviors, spontaneous motions, and theater-inspired motion sequences) performed by several actors in distinct emotional states (happiness, sadness, relaxedness, stress, and neutral).
ii) A user study and an automatic classification framework designed to qualitatively and quantitatively assess the amount of emotion-related information encoded in and conveyed by the proposed representation. We observed that, despite slight performance differences relative to the conditions in which the entire body was used, the proposed representation preserves most of the motion cues salient to the expression of affect and emotion.
iii) A simple motion synthesis system capable of: a) reconstructing whole-body movements from the proposed low-dimensional representation, and b) producing novel expressive end-effector (and pelvis) trajectories. A quantitative and qualitative evaluation of the generated whole-body motions shows that they are as expressive as the movements recorded from human actors.
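To make the proposed representation concrete, the following is a minimal sketch (not the thesis implementation) of how the end-effector and pelvis trajectories could be extracted from full-body motion capture data. The joint names, the LowDimMotion container, and the input layout (a frames x joints x 3 array of 3D positions) are illustrative assumptions, not details taken from the thesis.

    # Sketch: reduce full-body motion capture data to the spatio-temporal
    # trajectories of the end-effectors (head, hands, feet) and the pelvis.
    # Joint names and array layout are hypothetical.
    from dataclasses import dataclass
    from typing import Dict
    import numpy as np

    SELECTED_JOINTS = ("head", "left_hand", "right_hand",
                       "left_foot", "right_foot", "pelvis")

    @dataclass
    class LowDimMotion:
        """One (num_frames, 3) position trajectory per selected marker."""
        trajectories: Dict[str, np.ndarray]

    def extract_representation(joint_positions: np.ndarray,
                               joint_index: Dict[str, int]) -> LowDimMotion:
        """Keep only the end-effector and pelvis trajectories.

        joint_positions: array of shape (num_frames, num_joints, 3).
        joint_index: maps each joint name to its column in joint_positions.
        """
        return LowDimMotion(
            trajectories={name: joint_positions[:, joint_index[name], :]
                          for name in SELECTED_JOINTS}
        )

Under these assumptions, the representation reduces a full skeleton (often 20 or more joints) to six 3D trajectories, which is the low-dimensional signal the classification study (ii) analyzes and the synthesis system (iii) reconstructs whole-body motion from.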