What is fear?
The answer seems simple, yet we have discovered that fear comes in many shapes and manifests itself in different ways. I learned that children access and express their fears more readily than adults, who have more complex ways of explaining this emotion. Unsurprisingly, the older you get, the more ideas you will have about the meaning, the roots and the function of fear. When we try to describe the word and its meaning in an academic context, we see that a vigorous debate about its definition has been playing out across fields such as neuroscience, psychology and education. Scientists have been trying to define this emotion for decades, yet there is no single correct answer; there are many different theories and ideas on the topic. Theses and hypotheses about what fear is have been constructed by scientists and deconstructed over time, with researchers revising their theories and sometimes declaring them wrong.

Below I introduce three of the most influential contemporary scientists and give a brief insight into their perspectives on fear.

Joseph E. LeDoux is an American neuroscientist whose research focuses primarily on survival circuits, including their impact on emotions such as fear and anxiety.

Ralph Adolphs is an American Bren Professor of Psychology and Neuroscience and Professor of Biology who studies the neural and psychological basis of human social behavior.

Kay M. Tye is an American neuroscientist and professor at the Salk Institute for Biological Studies.

Looking at three very different opinions on the topic, it becomes clear that there are many contrasting and quite subjective notions of what fear is. One might find it difficult to draw a conclusion when overwhelmed by the diversity of the answers. Joseph E. LeDoux, for example, makes his position clear: “it is not fear which can be seen as universal, but danger. And this experience of being in danger is obviously very personal and unique.”5 I mention his ideas because I agree with his approach to an extent: in my opinion fear is an emotion experienced all over the world, even though it is experienced in many different ways. Ralph Adolphs states that “fear is a psychological state with specific functional properties. Science is going to revise this picture and it will show that there are many different kinds of fear, that depend on a variety of neural systems.”5 One of the theories I find most interesting comes from Kay M. Tye. She says that “fear is an intensely negative internal state” and that “it resembles a dictator that makes all other brain processes its slave.”5 What a strong message.

In the next chapter I will introduce the power relations between religion and newly established structures such as AI and other technologies. The reason I am discussing this in my thesis is simple: our Christian, institutionalised Western world is still very much influenced by and built upon the stories written in the Bible: the Garden of Eden, Adam and Eve and the Fall of Man. These are origin stories about fear and temptation, anger and the devil. I question this given system and want to highlight the relevance of newly imposed power structures such as AI.
Garden of God - How AI became an incredibly powerful tool for detecting our emotions and maybe even our fears
“The story of the Garden of Eden is a theological use of mythological themes to explain human progression from a state of innocence and bliss to the present human condition of knowledge of sin, misery, and death. The Garden of Eden, also called the ‘Terrestrial Paradise’, or simply ‘Paradise’, is the biblical ‘Garden of God’ described in the Book of Genesis and the Book of Ezekiel.”(Genesis 13:10)6 7

In 2020, we find ourselves living under a new powerful system: technology. I call this power structure the new ‘Garden of God’, a constitution that uses AI, among other technologies, to influence our day-to-day lives and infiltrate our society with a network of fears. The Bible contains the first written documentation of fear: the fear of the Lord. It states: “Who shall not fear thee, O Lord, and glorify thy name? for thou only art holy: for all nations shall come and worship before thee; for thy judgments are made manifest.” (Revelation 15:4)8 Ideas about fear in the 21st century still draw upon this documentation of fear in the Bible, which has been preached to us for centuries. Big companies are trying to understand human emotions more than ever before, because technologies such as AI give them immense knowledge of human emotions and their facial expressions, and with that a position of power and control. One can draw an analogy with what is stated in the Bible: “The LORD God commanded the man, saying, ‘You may surely eat of every tree of the garden, but of the tree of the knowledge of good and evil you shall not eat, for in the day that you eat of it you shall surely die.’” (Genesis 2:16-17)9

Leading tech companies, such as Microsoft, IBM, Amazon and Affectiva, have already started developing algorithms to detect facial expressions and link them to our emotions. Experts suggest that this industry will be worth billions of dollars by 2024. “Despite these predictions, many scientists think that claims that software can read emotion may be premature.” 10
This App knows how you feel
One of the pioneering companies in this field is Affectiva. I recently watched a TED Talk from 2015 by Rana el Kaliouby called “This App knows how you feel - from the look on your face”. Rana el Kaliouby (born 1978) is an Egyptian-American computer scientist and entrepreneur in the field of expression recognition research and technology development, a subset of facial recognition designed to identify the emotions expressed by the face. She is the co-founder, with Rosalind Picard, and CEO of Affectiva. In her talk she states: “Our emotions influence every aspect of our lives, from our health and how we learn, to how we do business and make decisions. Our emotions also influence how we connect with one another. We live in a world that’s devoid of emotion in the digital realm. What if our technology could sense our emotions? What if our devices could sense how we felt and reacted accordingly, just the way an emotionally intelligent friend would?” 11
Knowledge is God
This led her and her team to create technologies that can read and respond to our emotions. Rana says that “the human face is one of the most powerful tools, which we all use to communicate social and emotional states. Like when we are enjoying something, when we are surprised, when we are curious, etc.” 11 Rana and her team are teaching a computer how to read each of these emotions in someone’s face, but this comes with a challenge: “Teaching a computer to read these facial emotions is hard, because these emotional expressions can be fast, they’re subtle and they combine in very different ways.” 11 Rana and her team feed their algorithms a huge collection of images gathered from open-source programmes, ranging from smiling faces to people of different ethnicities, ages and genders. “The algorithm looks for all these textures and wrinkles and shape changes on these faces and basically learns that all smiles have common characteristics, all smirks have subtly different characteristics. And the next time it sees a new face it essentially learns that this face has the same characteristics of a smile, and it says ‘aha, I recognise this.’” 11 With this huge amount of data they created the Affectiva app. “We call each reading within the app an emotion data point. So far we have amassed 12 billion of these emotion data points. It is the largest emotion database in the world. We have collected it from 2.9 million face videos, people who have agreed to share their emotions with us and from 75 countries around the world. This data is used today in understanding how we engage with the media, for example understanding voting behaviour, and also empowering or emotion enabling technology. I think that 5 years down the line all our devices will have an emotion chip and we won’t remember what it was like when we couldn’t just frown at our device and our device would say: ‘hmm you did not like that, did you?’ We recognise that there are potential risks and potential for abuse, but personally having spent many years doing this, I believe that the benefits to humanity from having emotionally intelligent technology far outweigh the potential for misuse.” 11
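
To make the kind of learning Rana el Kaliouby describes a little more concrete, here is a minimal sketch in Python of how a classifier of this sort could be trained. It is an illustration only, not Affectiva’s actual pipeline: the scikit-learn model, the placeholder feature vectors and the expression labels are all assumptions made for the sake of the example, standing in for real face images and a real feature-extraction step.

```python
# A minimal, illustrative sketch of expression classification, NOT Affectiva's pipeline.
# Assumption: a face-analysis step has already turned each face image into a numeric
# feature vector (e.g. landmark distances, wrinkle and texture descriptors).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: 600 faces, 64 features each, labelled with an expression.
# In a real system these vectors would come from face images of many ages,
# genders and ethnicities, as described in the talk.
X = rng.normal(size=(600, 64))
labels = np.array(["smile", "smirk", "neutral"] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

# The classifier learns which feature patterns the labelled smiles, smirks and
# neutral faces have in common.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# A new, unseen face: the model compares its features to what it has learned
# and answers, in effect, "aha, I recognise this".
new_face = rng.normal(size=(1, 64))
print(clf.predict(new_face)[0], clf.predict_proba(new_face).round(2))
```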

Rana el Kaliouby’s closing statement makes clear that the potential for misuse and the power of AI in the field of emotion recognition are known, yet that they are set aside because the ‘positives’ supposedly outweigh the negatives. Only after we have opened ‘Pandora’s box’ will we see the detrimental effects we have chosen to ignore. “Then the LORD God said, ‘Behold, the man has become like one of us in knowing good and evil.’”(Genesis 3:22b-24)12

Today, in 2020, five years after the TED Talk quoted above, we do not have an emotion chip in every device. Nevertheless, Affectiva remains one of the pioneering companies using artificial intelligence on a mission to humanize technology and to learn everything about our emotions. Knowing this makes me feel afraid. It is an incredible set of data, gained through our online actions. As explained above, companies like Affectiva are able to track and document the slightest change in our facial expression, infer the emotion we are feeling right now and draw conclusions from it, whether those conclusions are correct or not. If we think our digital devices know everything about us already, clearly there is still more to come: they will know things about us that we do not even know about ourselves. Imagine how tremendously the world will change when AI is able to detect and read our fears. As mentioned before, scientists are very sceptical and criticise companies like Affectiva for using the face as the only source for identifying human emotion, which creates potential risks for all of our lives. So what happens if private companies can use these emotions against us? How will this affect future generations?
Universal (?) Emotions and Skin Deep
“The human face has 43 muscles, which can stretch, lift and contort it into dozens of expressions. Despite this vast range of movement, scientists have long held that certain expressions convey specific emotions.”13 “Perspectives on emotions from evolutionary theory were initiated during the mid-late 19th century with Charles Darwin’s 1872 book ‘The Expression of the Emotions in Man and Animals’.”14 He theorized that “emotions were innate, evolved traits universal to the human species. Darwin largely argued that emotions evolved via the inheritance of acquired characters. He pioneered various methods for studying non-verbal expressions, from which he concluded that some expressions had cross-cultural universality. Darwin also detailed homologous expressions of emotions that occur in animals. This led the way for animal research on emotions and the eventual determination of the neural underpinnings of emotion.” 14 15

Photographs illustrating emotions of grief from Charles Darwin’s work “The Expression of the Emotions in Man and Animals”, published by J. Murray, London, 1872. Image courtesy of the Beinecke Rare Book and Manuscript Library, Yale University.


One of the pioneers in the field of studying emotions and their relation to facial expressions is Paul Ekman, an American psychologist who has dealt with this topic for decades. “Through a series of studies, Ekman claims that he has ‘found’ a high agreement across members of diverse Western and Eastern literate cultures on selecting emotional labels that fit facial expressions.” 16 Disgust, joy, sadness, fear, anger and surprise are among the expressions he claims to be universal. To further prove his theory that emotions are universal, he worked together with Wallace V. Friesen, and they “demonstrated that the findings extended to preliterate Fore tribesmen in Papua New Guinea, whose members could not have learned the meaning of expressions from exposure to media depictions of emotion.” 17 From the findings of this research Ekman proposed that there are six emotional expressions universal to people all over the world: happiness, sadness, surprise, fear, anger and disgust. “In 1978 Paul Ekman and Wallace V. Friesen published a system called Facial Action Coding System (FACS) to classify human facial movements by their appearance on the face.” 18

Facial Action Units (AUs) of upper and lower face
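
To give a sense of how FACS is used in practice, the sketch below encodes a few commonly cited Action-Unit combinations for the six basic emotions and checks which of them are present in an observed face coding. The exact AU prototypes vary between sources, so the mapping here is an illustrative assumption rather than the authoritative FACS specification.

```python
# Illustrative sketch: matching observed FACS Action Units (AUs) to basic emotions.
# The AU combinations below are commonly cited approximations (e.g. happiness as
# AU6 "cheek raiser" + AU12 "lip corner puller"); exact prototypes differ by source.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},
    "sadness":   {1, 4, 15},
    "surprise":  {1, 2, 5, 26},
    "fear":      {1, 2, 4, 5, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},
}

def label_emotions(observed_aus: set[int]) -> list[str]:
    """Return every basic emotion whose prototype AUs are all present in the coding."""
    return [emotion for emotion, prototype in EMOTION_PROTOTYPES.items()
            if prototype <= observed_aus]

# Example: a face coded with brow raisers (AU1, AU2), brow lowerer (AU4),
# upper-lid raiser (AU5), lip stretcher (AU20) and jaw drop (AU26).
print(label_emotions({1, 2, 4, 5, 20, 26}))   # -> ['surprise', 'fear']
```

The example output also hints at why such codings are hard to interpret: the fear prototype contains every Action Unit of the surprise prototype, so a single coded face can match more than one emotion at once.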

Emotions and Emotion Recognition
The article “Why faces don’t always tell the truth about feelings” states: “These ideas stood largely unchallenged for a generation. But a new cohort of psychologists and cognitive scientists has been revisiting those data and questioning the conclusions.” This matters all the more because fear holds a commanding moral position within the society of the 21st century, especially when it comes to emotion and facial recognition. “Many researchers now think that the picture is a lot more complicated, and that facial expressions vary widely between contexts and cultures.” “Researchers are increasingly split over the validity of Ekman’s conclusions. But this debate and doubts have not stopped companies and governments accepting his assertion that the face is an emotion oracle and using it in ways that are affecting people’s lives.” 19

These kinds of software, used in Western countries for example to read the emotions of a defendant, are prone to misuse and racial bias, and are therefore also dangerous. In recent years, psychologists have therefore tried to approach these emotions in a way that can actually be considered empirical and universal. Scientists agree more or less on the fact that “basic emotions, however many there may be, serve as the foundation for the more complex and subtle emotions that make up the human experience.” 20

In emotion recognition studies, scientists used to focus primarily on the face. Research into expression has relied on showing subjects posed photographs of faces representing different emotions. But these are faces without context, and because they are static they lack the temporal information that can be read in a moving face. This can lead to biases and subjectivity. Because scientists no longer want to rely on this, they are now trying to focus on faces in context, which means including external factors such as the situation a person is in, the people the person is surrounded by, the context of their body, etc.

“The ‘Theory of Constructed Emotion’ offers a radical new take on what emotions are, where they come from, and how they shape our lives. Presented by psychology professor and neuroscientist Dr. Lisa Feldman Barrett in her bestselling book ‘How Emotions Are Made’, it also contradicts many of our most firmly held ideas about how human emotions work.” 21 As one of the leading experts in this field, Lisa Feldman Barrett says that “emotional expressions are much more nuanced and variable than people have assumed.” 22 This is very interesting, since emotions, and especially the emotion of fear, are among the most powerful tools for controlling certain aspects of society. She also explains: “The interest in facial expressions and the modern science of emotion really began in the 1960s with a scientist named Silvan Tomkins and two of his postdoctoral fellows Paul Ekman and Carroll Izard. They were very interested in testing the hypothesis, that everyone around the world smiles when they’re happy, etc. and correspondingly that everyone around the world recognises those facial movements as expressions of emotion.” 22 But that is just not true. More factors need to be taken into consideration in such studies, such as “the context sensitive and temporally changing patterns. So whereas scientists used to focus primarily on the face, now they’re focusing more on faces in context.” 22 “In order to avoid biases and subjectivity in their studies, researchers are turning to technology to find more natural ways of studying emotion nowadays.” 22 With technologies evolving and AI becoming more precise, we can only hope that these softwares become less biased and more accurate. With a rising interest in this field and more critical thinking when it comes to emotion recognition, I see great potential for this to actually happen. Another study I want to mention is by Rachael Jack, a psychologist and Professor of Computational Social Cognition, and her colleagues, who dealt with the question: “Do facial expressions reliably communicate emotions?”

In her research she uses a computer which randomly generates dynamic facial expressions. Rachael Jack states: “We’ve used this to look at main differences between cultures in terms of the six classic emotions: happy, surprise, fear, disgust, anger and sadness and we found that East Asian facial expressions of these emotions tend to be signaled primarily with the eye region. So the eyebrows and differences in the eyes. Whereas with Westerners the face movements tended to vary more in the mouth region.” 22 In the video they also state: “Differences like this are leading some researchers to believe the expressions are not as universal as once thought and that some behaviours might be learnt in childhood.” 22
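
The sketch below illustrates, in a highly simplified way, the reverse-correlation logic behind experiments of this kind: random Action-Unit combinations are “shown” to an observer, the observer’s emotion label is recorded, and the Action Units that most often accompany each label are tallied afterwards. The observer here is a simulated stand-in with made-up rules, not data from the published studies, and the real work uses a far richer generative model of dynamic faces together with human participants.

```python
# Minimal sketch of the reverse-correlation logic behind studies like Rachael Jack's:
# generate random facial animations, record which emotion an observer reads into them,
# then see which Action Units were most often present for each label.
# The "observer" below is a placeholder rule, not real data from the published studies.
import random
from collections import Counter, defaultdict

ALL_AUS = list(range(1, 28))          # candidate Action Units to animate
EMOTIONS = ["happy", "surprise", "fear", "disgust", "anger", "sad"]

def simulated_observer(aus: set) -> str:
    """Placeholder observer: labels a random animation with one of six emotions."""
    if {6, 12} <= aus:
        return "happy"
    if {1, 2, 5} <= aus:
        return "surprise"
    return random.choice(EMOTIONS)

au_counts = defaultdict(Counter)
random.seed(0)

for _ in range(5000):
    # Randomly generate a "facial expression" as a set of six active Action Units.
    animation = set(random.sample(ALL_AUS, k=6))
    label = simulated_observer(animation)
    au_counts[label].update(animation)

# Which Action Units were most strongly associated with each perceived emotion?
for emotion in EMOTIONS:
    print(emotion, au_counts[emotion].most_common(3))
```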

Relating to Rachael Jack’s idea, Lisa Feldman Barrett explains: “In early childhood emotion knowledge is being bootstrapped into the brain, into the brain’s wiring. This is also what happens when you change cultures and when you move from one culture to another. You have to learn what is the nod of a head or the raise of a lip or the shake of a head. You know, what do these things actually mean in this culture.” 22 This is a very interesting thought. What I learned from the visual research and the collection of children’s drawings is that children from the same age group and the same cultural background have similar thoughts and ideas on what fear is, whereas children from a different cultural background drew quite contrasting stories. “Despite these complexities the idea that facial expressions could be used as a way to read emotions has excited big business.” 22

“Decoding emotions is also at the core of a controversial training programme designed by Ekman for the US Transportation Security Administration (TSA) and introduced in 2007. The programme, called SPOT (Screening Passengers by Observation Techniques), was created to teach TSA personnel how to monitor passengers for dozens of potentially suspicious signs that can indicate stress, deception or fear. But it has been widely criticized by scientists, members of the US Congress and organizations such as the American Civil Liberties Union for being inaccurate and racially biased. Such concerns haven’t stopped leading tech companies running with the idea that emotions can be detected readily, and some firms have created software to do just that.” 23

This is the ‘Fall of Man’.
(Now Adam and Eve are banished from the Garden of Eden.)