This article is part of a series written by collaborators of ANEEB's Departamento de Ensino e Ação Social. Support the author by reading the article on their LinkedIn.

For many years, emotions were dismissed and their scientific study set aside on the grounds that they were illogical and dysfunctional. Lately, however, that mindset has changed, and emotions have come to be seen as crucial for understanding human behaviour and for improving how we learn, teach, diagnose mental illness and deal with one another. Even so, there is still no unanimous definition of emotion in the scientific community. It can nevertheless be described as a neural impulse that triggers a response to a stimulus or a thought. This response, as we know, can be displayed through facial expressions, such as smiling, or through certain behaviours, such as crying. It can also be detected through physiological signals such as heart rate or breathing. That is where biomedical engineering comes in, with solutions ranging from biomedical instruments to computer programs that allow us to collect, interpret and classify emotions.

Focusing on facial expressions, intelligent systems have been developed that allow us to interpret what people are feeling at each moment. Most of these systems are non-invasive and open up many possibilities, such as monitoring an individual's health, evaluating students' engagement in the classroom, helping to diagnose symptoms of certain diseases, and developing robots that need to interact with humans. However, this is not so simple: emotions are expressed differently from person to person, and facial expressions are no exception, so many of the systems built so far are not very reliable. One of the latest advances in this area is a machine-learning model based on a technique called Mixture of Experts (MoE), in which the computer "learns" which sub-model is best adapted to the person being studied and uses it to classify his or her emotion, instead of relying on a single model as was done until now. This suggests that the future of emotion recognition will be personalised in order to achieve more accurate results, opening many doors. It could, for example, allow us to interpret the feelings of autistic children and understand which environments or activities improve their communication abilities, as well as recognise those that cause them more stress and anxiety. That would be a big step for the therapy of this kind of condition, and it could also apply to other behavioural problems, such as social anxiety and depression.
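The mixture-of-experts idea can be sketched in a few lines. Everything below is a toy stand-in, not the cited model: the two "experts" are hypothetical per-person emotion classifiers, and the gating rule is a simple linear score where a trained network would normally sit.

```python
import numpy as np

def softmax(z):
    # Turn raw scores into probabilities that sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical experts: each maps a toy facial-feature vector to raw
# scores over three emotion classes (happy, sad, neutral).
def expert_a(x):
    return np.array([x[0], x[1], 0.5])

def expert_b(x):
    return np.array([x[1], x[0], 0.2])

def gate(x):
    # The gating network decides how much to trust each expert for this
    # particular person; a toy linear rule stands in for a trained net.
    return softmax(np.array([x[0] - x[1], x[1] - x[0]]))

def mixture_predict(x, experts, gate_fn):
    weights = gate_fn(x)  # one weight per expert, summing to 1
    # Blend the experts' class probabilities according to the gate.
    return sum(w * softmax(e(x)) for w, e in zip(weights, experts))

labels = ["happy", "sad", "neutral"]
x = np.array([0.9, 0.1])  # illustrative feature vector
probs = mixture_predict(x, [expert_a, expert_b], gate)
print(labels[int(np.argmax(probs))])  # prints "happy"
```

The key design point is that the gate is itself learned from data, so the system adapts its weighting of experts to each individual face rather than applying one fixed model to everyone.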

Furthermore, physiological responses such as heart rate, breathing and galvanic skin response give us another way to interpret human emotions. One of the most promising applications of this kind of method is the detection of potentially dangerous levels of stress while driving, which can be responsible for accidents. The idea is to incorporate sensors into the vehicle that measure heart rate and skin response; when these values cross a certain limit, the system warns the driver to stop the car or to calm down. It can also be used in cases where the driver is falling asleep. Another application is to monitor the physiological response of individuals to certain environments or behaviours, in order to develop better methods of education, work and interaction with people with diseases, for example. One study examined children's physiological responses to different tones of voice, to understand the best way to educate them and to get them to do what their parents ask. These are just some examples of the remarkable applications of these techniques.
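The threshold-based warning described above can be sketched roughly as follows. The limit values, sensor readings and smoothing window are illustrative assumptions, not parameters from any real in-vehicle system:

```python
# Hypothetical in-vehicle stress monitor: raise a warning when smoothed
# heart-rate and skin-conductance readings both exceed chosen limits.
HR_LIMIT = 110.0   # beats per minute (illustrative threshold)
GSR_LIMIT = 8.0    # microsiemens (illustrative threshold)

def moving_average(samples, window=3):
    # Smooth raw sensor readings so a single noisy spike cannot
    # trigger a false alarm on its own.
    if len(samples) < window:
        return sum(samples) / len(samples)
    return sum(samples[-window:]) / window

def stress_alert(hr_samples, gsr_samples):
    hr = moving_average(hr_samples)
    gsr = moving_average(gsr_samples)
    # Warn only when both signals agree that the driver is stressed.
    return hr > HR_LIMIT and gsr > GSR_LIMIT

calm = stress_alert([72, 75, 78], [4.1, 4.3, 4.2])          # False
stressed = stress_alert([112, 118, 121], [9.0, 9.4, 9.8])   # True
```

Requiring both signals to exceed their limits is one way to trade sensitivity for fewer false alarms; a real system would calibrate the thresholds per driver, which is exactly where the personalisation trend discussed above comes in.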

In any case, emotions are definitely something to take into account nowadays, as a crucial way to understand human behaviour and to help us find better ways of dealing with each other in the most varied scenarios. Let us hope we continue to see advances in this area of science and, who knows, in the near future we may see biomedical emotion detectors determining the truth in court, revolutionising the way we educate our children, or an app on our phone that, through facial recognition, can reveal whether that special person is also in love with us.


Selvaraj J., Murugappan M. & Yaacob S., Classification of emotional states from electrocardiogram signals: a non-linear approach based on Hurst, BMC (Part of Springer Nature), 2013. Accessed 02/05/2020.

Kulkarni S., Reddy N. & Hariharan S., Facial expression (mood) recognition from facial images using committee neural networks, BMC (Part of Springer Nature), 2009. Accessed 02/05/2020.

McClure E., Pope K., Pine D., Leibenluft E. & Hoberman A., Facial expression recognition in adolescents with mood and anxiety disorders, The American Journal of Psychiatry, 2003. Accessed 02/05/2020.

Matheson R., MIT News Office, Machine-learning models capture subtle variations in facial expressions, SciTechDaily, 2018. Accessed 02/05/2020.

Carey B., Stanford University, Engineers develop video game controller that can sense players' emotions, SciTechDaily, 2014. Accessed 02/05/2020.

Cardiff University, Using the right tone of voice may be key to getting teenagers to cooperate, SciTechDaily, 2019. Accessed 02/05/2020.

Correia A., Universidade Fernando Pessoa, A competência no reconhecimento da expressão facial da emoção: estudo empírico com crianças e jovens com perturbação do espetro do autismo, 2014. Accessed 02/05/2020.
