The present invention concerns a human emotional/behavioural/psychological state estimation system (1) comprising a group of sensors and devices (11) and a processing unit (12). The group of sensors and devices (11) includes: a video-capture device (111); a skeletal and gesture recognition and tracking device (112); a microphone (113); a proximity sensor (114); a floor pressure sensor (115); user interface means (117); and one or more environmental sensors (118). The processing unit (12) is configured to: acquire or receive a video stream captured by the video-capture device (111) and data items provided by the skeletal and gesture recognition and tracking device (112), the microphone (113), the proximity sensor (114), the floor pressure sensor (115) and the environmental sensor(s) (118), along with data items indicative of interactions of a person under analysis with the user interface means (117); detect one or more facial expressions, a position of the eye pupils, a body shape, and features of the voice and breath of the person under analysis; and estimate an emotional/behavioural/psychological state of said person on the basis of the acquired/received data items, of the detected facial expression(s), position of the eye pupils, body shape and features of the voice and breath of said person, and of one or more predefined reference mathematical models modelling human emotional/behavioural/psychological states.
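The architecture above can be sketched in code as a minimal, purely illustrative model: a container holding one reading per device in the sensor group (11), and a stand-in for the processing unit (12) that compares a crude feature vector against predefined reference state profiles. All names, fields, and the nearest-profile scoring are hypothetical assumptions for illustration; the patent does not specify the reference mathematical models or the feature set.

```python
from dataclasses import dataclass, field
from typing import Any

# Hypothetical container mirroring the sensor group (11): each field holds the
# latest reading from one device; field names are illustrative, not from the patent.
@dataclass
class SensorFrame:
    video_frame: Any = None          # video-capture device (111)
    skeleton: Any = None             # skeletal/gesture recognition and tracking device (112)
    audio: Any = None                # microphone (113)
    proximity: float = 0.0           # proximity sensor (114)
    floor_pressure: float = 0.0      # floor pressure sensor (115)
    ui_events: list = field(default_factory=list)    # user interface means (117)
    environment: dict = field(default_factory=dict)  # environmental sensor(s) (118)

def estimate_state(frame: SensorFrame, reference_models: dict) -> str:
    """Toy stand-in for the processing unit (12): score each predefined
    reference state profile against a crude feature vector and return the
    best-matching state label."""
    features = {
        "proximity": frame.proximity,
        "pressure": frame.floor_pressure,
        "ui_activity": float(len(frame.ui_events)),
    }
    # Negative squared distance to a reference profile: higher is a closer match.
    def score(ref: dict) -> float:
        return -sum((features[k] - ref.get(k, 0.0)) ** 2 for k in features)
    return max(reference_models, key=lambda state: score(reference_models[state]))

# Illustrative reference "mathematical models" of two emotional/behavioural states.
reference_models = {
    "calm":     {"proximity": 1.0, "pressure": 0.5, "ui_activity": 0.0},
    "agitated": {"proximity": 0.2, "pressure": 0.9, "ui_activity": 5.0},
}
reading = SensorFrame(proximity=0.3, floor_pressure=0.8, ui_events=["tap"] * 4)
print(estimate_state(reading, reference_models))  # → agitated
```

A real implementation would replace the scalar features with detected facial expressions, pupil position, body shape, and voice/breath features, and the distance scoring with whatever reference mathematical models the system employs.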