Fun With Stress 1.2

Sensory Input

Reading time: 7 minutes

Illustration prism: BlenderTimer. Illustration owl: GDJ. Composition: Erik Stout

In part 1.1 we discovered that our memories are labeled and buttonized (turned into buttons). These processes are carried out by our brain, but before it can do so, it has to be provided with input. This chapter focuses on input from our external environment – the world outside of our body – which is all the information we can perceive with our senses. We’ll explore what it is, how it can be perceived, how it travels from our sense organs to our brain, and how it is then transformed into something meaningful.

With our senses we can perceive a lot of information, but not everything. We cannot, for example, directly perceive ultraviolet light, magnetic or electric fields, or infrared radiation. And even of the types we can perceive, we only pick up a limited part of the whole range. An average adult, for instance, can hear sounds between 20 Hz and 16,000 Hz. A cat, by contrast, has a hearing range between 100 and 60,000 Hz, and a bat between 3,000 and 120,000 Hz.[1] This is just to illustrate that our human perception is limited: we are literally unable to perceive all the available information on Earth with our senses. In other words, by default we cannot observe and perceive everything there is in the physical realm.[2]
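For readers who like to tinker, here is a minimal sketch in Python of how those hearing ranges compare. It uses only the figures quoted above; the function name and structure are mine, purely for illustration:

```python
# Toy model of the hearing ranges quoted above (all figures in Hz).
HEARING_RANGES = {
    "human (average adult)": (20, 16_000),
    "cat": (100, 60_000),
    "bat": (3_000, 120_000),
}

def who_can_hear(frequency_hz: float) -> list[str]:
    """Return every listener whose range includes the given frequency."""
    return [
        listener
        for listener, (low, high) in HEARING_RANGES.items()
        if low <= frequency_hz <= high
    ]

for freq in (15, 440, 25_000, 130_000):
    audible_to = who_can_hear(freq) or ["nobody on this list"]
    print(f"{freq} Hz is audible to: {', '.join(audible_to)}")
```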

Illustration: shyla_marsare

Thanks to our senses we’re able to navigate through and experience the world outside of ourselves. Our various sense organs are designed to pick up different types of Sensory Information: eyes for picking up Visual Information (seeing); ears for Auditory Information (hearing); nose for Olfactory Information (smelling); tongue for Taste Information (tasting); and skin for Tactile Information (feeling). These types of information have different characteristics, but one thing in common: they have to come into physical contact with us – that is, with the sense organs in our body – before we can perceive them. That is evident for touch sensations via the skin and taste sensations via the tongue. With smell, too, it’s quite clear that an odour needs to reach our nose in order for us to smell it. With light and sound, however, it’s a little less obvious how they literally get in touch with our eyes and ears.

Eyes are light-sensitive organs, meaning that we can only see objects well when they’re illuminated by lightwaves. As soon as light from a source – the sun or an artificial one – bounces off an object into the lenses of our eyeballs, shapes and colours can suddenly be distinguished easily. In the dark that’s much more difficult, due to the lack of a light source. So if we’re in a dark room with a red closet in it, we can’t see it. Yet as soon as the light is turned on and the lightwaves illuminate the room and bounce off the closet into our eyes, the red closet is immediately identified. Thus how quickly things are seen depends on whether or not they’re illuminated by a light source, and on whether we’re close enough to distinguish shape and colour.

Ears, on the other hand, are sensitive to soundwaves – which are really vibrations. As soon as these reach our eardrums, we can experience those waves as sound. However, soundwaves need a conductor in order to travel, whereas light doesn’t – and light also travels much faster than sound. Soundwaves can only travel if they can piggyback on some form of solid, liquid or gaseous matter. Metal sewage pipes in old buildings are great sound conductors; hit the pipes in the basement with a large pipe wrench and people on the top floors will believe the building is falling apart. Sonar uses water as a soundwave conductor, and with speech and music, the gases that make up the air conduct the soundwaves we produce by means of our vocal cords or musical instruments. Therefore in space, which is effectively a vacuum, sound cannot be heard, since there’s no conductor for the soundwaves to piggyback on.
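To make both points concrete – that sound needs a conductor and that light is far faster – here is a rough back-of-the-envelope sketch. The speed figures are approximate textbook values (they are not from this article), and the one-kilometre distance is arbitrary:

```python
# Approximate speeds in m/s; textbook values, not taken from the article.
SPEED_OF_LIGHT = 299_792_458          # light needs no conductor at all

SOUND_SPEED_BY_MEDIUM = {
    "air": 343,        # speech and music
    "water": 1_480,    # sonar
    "steel": 5_900,    # those old sewage pipes
    "vacuum": None,    # no matter to piggyback on, so no sound
}

def sound_delay(distance_m: float, medium: str) -> float | None:
    """Seconds for sound to cross the distance, or None in a vacuum."""
    speed = SOUND_SPEED_BY_MEDIUM[medium]
    return None if speed is None else distance_m / speed

DISTANCE = 1_000  # one kilometre
print(f"light: {DISTANCE / SPEED_OF_LIGHT:.6f} s")
for medium in SOUND_SPEED_BY_MEDIUM:
    delay = sound_delay(DISTANCE, medium)
    print(f"sound in {medium}: " + (f"{delay:.3f} s" if delay is not None else "never arrives"))
```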

Any information that enters the body through our senses is a form of Physical Input. If a particular form of sensory information does not come into physical contact with our sense organs, we’re not able to perceive it. So if our favourite band is playing a concert in London while we’re in Philadelphia,[3] we will not be able to see or hear the concert, perceive the inevitable smell of stale beer or feel our fellow mosh pitters. Even though modern technology makes it possible to see and hear that concert ‘live’ from Philly – or anywhere else in the world – the experience is not the same without fellow fans, the stale beer smell and the mosh pit bruises.

Image: Pexels

Thus we can come into physical contact with different types of sensory information. However, taking sound as an example: as soon as a soundwave reaches our eardrums, it still doesn’t mean anything; a soundwave that never reaches a nervous system and brain remains meaningless. In order to create meaning, two more steps have to be taken: the auditory information from the soundwave needs to be transported from the ear to the brain, and the brain then needs to process that information in order to attach meaning to it.

The auditory information of a soundwave reaching our ears is transported by particular Sensory Nerves to specific areas in our brain designated to process that type of information. The same goes for visual information that comes in through the eyes, taste information that enters via the taste buds in the tongue, olfactory information that comes in through the nostrils, and tactile information that finds its way in through the skin: all these types of sensory information need to travel from the sense organs via sensory nerves to the brain before they can be processed and have any kind of conscious impact on us. Before any sensory information reaches our brain, it is still Impersonal Sensory Information: the same for everyone who perceives it, and without meaning.
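As a simplified overview of those routes, here is an illustrative sketch. The pathway names are standard anatomy, heavily simplified; the table itself is just my illustration, not a diagram from this series:

```python
# Broad-brush routing table for the journey described above.
# Organ -> sensory nerve -> processing area; real anatomy is far messier.
SENSORY_ROUTES = {
    "visual":    ("eyes",   "optic nerve",                       "visual cortex (occipital lobe)"),
    "auditory":  ("ears",   "vestibulocochlear nerve",           "auditory cortex (temporal lobe)"),
    "olfactory": ("nose",   "olfactory nerve",                   "olfactory cortex"),
    "taste":     ("tongue", "facial & glossopharyngeal nerves",  "gustatory cortex"),
    "tactile":   ("skin",   "spinal & trigeminal nerves",        "somatosensory cortex (parietal lobe)"),
}

for modality, (organ, nerve, area) in SENSORY_ROUTES.items():
    print(f"{modality}: {organ} -> {nerve} -> {area}")
```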

Now, assuming that all our neural wiring is functional and up to date, all the different types of sensory information eventually reach our great processor: the brain. There each type is guided to its designated area, where it is judged significant or insignificant. The latter category contains the vast bulk of our daily sensory information, which we pick up but immediately forget, like the colour of the ground we’re walking on. When information is judged significant, however, it is passed down the line for further processing to a central part of the brain called the Emotional Brain or Limbic System.

Illustration from: kids.frontiersin.org

There all the significant sensory information is weighed against our memories with questions such as: Have I been in a similar environment before when this particular sensory information was perceived? If so, was whatever happened there pleasurable or painful for me? If it was pleasurable, like the taste of ice cream on a summer day, a label is created saying: “Let’s do that again!” If it was painful, like falling off a bike and ending up with a bloody knee, the created label says: “Let’s not do that again.” Here we see emotions being attached to our memories: ice cream being pleasurable and good, a fall being painful and bad. Pleasurable memories are usually attached to the emotion ecstasy, painful memories to one of the emotions anxiety, anger or grief.[4]

As soon as an emotion has been attached to a memory, an emotional button has been created. The next time we see or smell ice cream on a summer day, an emotional button is pushed that says: “Ice cream! I love ice cream, let’s go and buy some!”, and directs us to the ice cream man. The next time we need to descend a steep hill on a bike, the emotional button might say: “Remember what happened last time? Maybe better to take a detour.” Here we see quite clearly how the pushing of our buttons actually determines our actions and behaviour: the positive emotion ecstasy creates desire for pleasure; the negative emotion anxiety creates aversion to (and avoidance of) pain.
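If we were to caricature the whole pipeline – significant input, comparison with memory, emotional label, button, action – in code, it might look something like the sketch below. Every name and structure in it is mine and deliberately naive; it illustrates the logic of the text, not how a brain actually works:

```python
from dataclasses import dataclass

# Caricature of: significant input -> memory check -> label -> button -> action.
@dataclass
class Memory:
    trigger: str   # e.g. "ice cream on a summer day"
    emotion: str   # one of the four basic emotions, see note [4]
    label: str     # "Let's do that again!" / "Let's not do that again."
    action: str    # the behaviour the pushed button steers us towards

MEMORIES = [
    Memory("ice cream", "ecstasy", "Let's do that again!", "buy ice cream"),
    Memory("steep hill on a bike", "anxiety", "Let's not do that again.", "take a detour"),
]

def push_button(sensory_input: str) -> str:
    """Weigh significant input against memory; return the action it steers us to."""
    for memory in MEMORIES:
        if memory.trigger in sensory_input:
            print(f"button pushed ({memory.emotion}): {memory.label}")
            return memory.action
    return "no button pushed; filed as insignificant"

print(push_button("the smell of ice cream on a summer day"))
print(push_button("the colour of the pavement"))
```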

The limbic system therefore acts as our personal filter, through which every bit of significant sensory information is translated into something meaningful – for us. By means of this system we are able to make sense of the world. Yet as soon as we have made the world ‘sensible,’ it has become our personal version of the world; what started as impersonal sensory information has been transformed into our personal belief system – also known as our personal frame of reference. This is important, because the moment we have ‘personalized’ the outside world, by default we no longer see it as other people do. So what we call ‘reality’ is always something that has already passed through our personal filter, and it can no more return to its unfiltered state than water can become plain water again after it has passed through ground coffee in a coffee filter. It is outside the scope of this study to explore this more deeply, but suffice it to say that without this realisation, misunderstanding and violent conflict are inevitable.

In part 1.3 we’re going to investigate how input becomes meaningful, and what that implies.

For now,
Jolly Greetings
Erik Stout

[1] Anything below 20 Hz is called infrasound, anything above 16,000 Hz ultrasound. Elephants can hear sound frequencies as low as 1 Hz; a porpoise, as high as 150,000 Hz.

[2] Some fun examples: a buzzard can see small rodents from a height of 15,000 ft. A cockroach can detect the movement of amoebas twenty yards away. A rabbit’s tongue contains 17,000 taste buds and a pig’s 15,000, whereas we ‘merely’ have 9,000. It is as yet not proven that consuming these animals improves our ability to taste.

[3] For people of 45-plus, the link to 1985 should be evident. For the younger generations: google ‘the resurrection of Freddie Mercury.’

[4] Anxiety, ecstasy, anger and grief are often referred to as the four basic emotions. Every time our buttons get pushed, the underlying emotion can usually be traced back to one of these four, and they all trigger a stress response in our body.