Wednesday, September 28, 2011

Do You Have to Be Ugly to Hear Well? – Owls and Body Plan Symmetry

Biology concepts – body plan, bilateral symmetry, cephalization, form follows function

Paradox alert – the most complex organisms in nature are the best at reducing complexity. How is that? Nature tends toward symmetry through evolution. Lower organisms tend not to have much symmetry, while more complex organisms usually have a symmetric body plan.


Sponges have no symmetry, some lower animals have radial symmetry, while higher animals (including us) have bilateral symmetry. 
Bilateral symmetry results in a front end (anterior) and a back end (posterior).
As far as animals go, sponges have little or no symmetry (yes, sponges are animals). As you move up the ladder of complexity, you first see radial symmetry (jellyfish, sea anemones, adult starfish), then bilateral symmetry (one side of an animal is mirrored by the other side).

Having a mirror image means no additional planning. It’s like building a second building using the plans from the first - no added cost. By repeating units (metamerization, as in worms and arthropods) or mirroring structures in bilateral symmetry, the animal can become more complex without needing a more complex organizational plan, and therefore with fewer possible mistakes in development. Paradox averted.

Symmetry leads to distinct front and back ends and movement in a certain direction (see the lobster above). This leads to cephalization (development of a head). Cephalization in turn leads to more bilateral symmetry, including of the head itself. The two halves of the face are close to being mirror images.

Bilateral facial symmetry is thought to be important in determining what people think is pretty. Asymmetries in facial characteristics, even if not noticed consciously, may play a role in determining who we believe to be attractive. For example, men with more facial symmetry have more sexual partners and are more likely to have partners outside their primary relationship….. apparently, symmetry has little to do with decency.


The symmetry of different aspects of the face may be a sign of health.
Facial symmetry might be even more important evolutionarily, as small asymmetries that begin in utero (in the womb) could indicate an inability to resist the harmful effects of environmental or infectious perturbations. This would be a sign of weaker genes and would discourage potential mates. In research on some South American tribes, the more symmetric males tended to have more children survive to adulthood and those children had fewer diseases. It may be that in the search for the healthiest possible mates (therefore the most desirable genes), we use symmetry as a discriminator. Modern medicine has made much of this moot, but instinct is hard to kill.

Now for our exception. One animal has abandoned the move toward symmetry in order to improve its ability to hear. Owls truly value substance over style; they break the rule of symmetry in order to survive. Nocturnal owls must locate their prey in the dark, and for this they rely on hearing more than sight, so their auditory sense is truly a survival mechanism.

To hear a mouse, or a ferret, or a cheeseburger from a long distance away is one thing, but owls also need to pinpoint where that sound is coming from. Many animals (including humans and owls) can detect small differences in the time that sounds reach each ear. This is one of the beauties of bilateral symmetry - we have two ears.

If a sound originates from your left, it reaches your left ear before it reaches your right. Your brain senses this time difference and calculates how far left of center the object must be. Humans and owls are equally good at this; we can detect a time difference of less than 10 millionths of a second (0.00001 sec)! In addition, if the sound is closer to your left ear, the sound reaching it will be just a little louder than the sound reaching your right ear. Your brain can sense this difference as well.
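To get a feel for the arithmetic the brain is doing, here is a minimal sketch in Python (a toy model, not anything an ear or a brain actually computes). It uses the simple geometry in which the extra path to the far ear is d·sin(θ), so the time lag is ITD = d·sin(θ)/c and the angle can be recovered as θ = arcsin(ITD·c/d). The ear spacing and speed of sound are assumed round numbers, not measurements.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air (approximate)
EAR_SPACING = 0.15       # m; rough human ear-to-ear distance (an assumption)

def azimuth_from_itd(itd_seconds):
    """Estimate how far off-center a sound is from the interaural time
    difference (ITD), using the simple model ITD = d * sin(theta) / c."""
    x = itd_seconds * SPEED_OF_SOUND / EAR_SPACING
    x = max(-1.0, min(1.0, x))   # clamp for sounds almost directly to one side
    return math.degrees(math.asin(x))

# A 10-microsecond head start at one ear is a barely off-center source:
print(azimuth_from_itd(10e-6))    # ~1.3 degrees
# A sound well off to one side produces a much larger ITD:
print(azimuth_from_itd(300e-6))   # ~43 degrees
```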


There is a large disparity in the vertical position of the ears
in many species of owls. Does it make them unfit or ugly?
However, owls take this a step further, since the third dimension is of more importance in their world. Owls need to know if their prey is above or below them (and by how much) in order to hunt efficiently. With ears at the same height, a sound from below or above reaches both ears at the same time – no help in locating dinner.

On the other hand, if the ears were located at different heights, the sound would reach one ear before the other, and location information could be obtained. This is what evolution has done for many species of owl. Perhaps not the prettiest solution, but beautiful nonetheless.

Looking at the picture of the owl skull, you can see that the right ear operculum (opening) is placed well above that of the left. This asymmetry does wonders for their sense of hearing, but leaves them with a lopsided head. Lucky for their love life, most owls’ head and facial feathers tend to even out this disparity.
           
The ear asymmetry isn’t the owl’s only body design modification. The combination of the elements listed below allows owls to hear a mouse burrowing under six inches of snow up to 100 ft. away, or hear it squeak from a half mile away! It would seem that owls are designed to pick up noise.

Head turn – The asymmetry of the ears and their location on each side of the head allow the owl to localize a sound to a certain degree, but the precision is increased by the owl turning its head from side to side. The head position at which the sound reaches both ears at the same time tells the owl that the prey is directly in front of its nose. The owl can do this without moving its body because of its extraordinary ability to turn its head 270 degrees in either direction. That would be like turning your head to the right and ending up looking at your left shoulder!
           
Facial disk – In general, the bigger the facial disk on an owl, the more it relies on hearing as opposed to sight to locate prey. The disk is shaped to collect sound and funnel it toward the ears, much the same way that our outer ears collect sound for us, or how a satellite dish collects TV signals. 


The size of the facial disk on an owl gives you an idea how much it relies on hearing to catch its prey. The barn owl on the left and the masked owl in the center rely on hearing most, while the Northern Hawk Owl on the right uses primarily its eyes to hunt.

An owl’s facial muscles change the shape of the facial disk in order to fine-tune the sound entering the ears and better locate prey. This hints at how important hearing is to some owls: they have muscles to change the shape of their sound-collecting face, yet they can’t move their eyes.
           
Spatial auditory mapping – The signals relating to where a sound is coming from are coordinated in the owl’s medulla (an evolutionarily old area of the brain). This part of an owl’s brain is three times as large as a crow’s, which hints that something special may be going on there. The up-and-down location information generated by the asymmetry of the ear positions is integrated with the left-and-right information generated by the ears’ positions on the sides of the head and by the head turn. Distance is also estimated from the time and intensity differences of sound wave arrival, especially as the head is turned.

For asymmetric owls, the left-right directional cues lie in the interaural (inter = between, and aural = ears) time difference (ITD), while elevation cues are processed via interaural level (loudness) differences (ILD). These two cues are processed in different parts of the brain and then converge to form an aural map. A recent study has compared the size of the auditory nuclei and evolution of these nuclei in asymmetrical- and symmetrical-eared owls. 
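As a cartoon of how two independent cues can be fused into one location, here is a short Python sketch. The conversion factors are invented placeholders – in a real owl the ITD and ILD maps are tuned neural circuits, not linear formulas – but it shows why one cue gives left-right and the other gives up-down.

```python
# Toy fusion of the owl's two cues into a single location estimate.
# The scaling constants below are invented for illustration only.

def azimuth_from_itd(itd_microseconds, us_per_degree=2.5):
    """Left-right angle from the interaural time difference (ITD)."""
    return itd_microseconds / us_per_degree

def elevation_from_ild(ild_decibels, db_per_degree=0.25):
    """Up-down angle from the interaural level difference (ILD) --
    only informative because the ears sit at different heights."""
    return ild_decibels / db_per_degree

def locate(itd_us, ild_db):
    """Combine the two cues into an (azimuth, elevation) estimate."""
    return azimuth_from_itd(itd_us), elevation_from_ild(ild_db)

# Sound arrives 50 microseconds earlier at one ear and, by our made-up sign
# convention, 2 dB louder in the downward-pointing ear: about 20 degrees to
# that side and 8 degrees below.
print(locate(itd_us=50, ild_db=-2))   # (20.0, -8.0)
```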

The Canadian research team found that all auditory nuclei in asymmetrically-eared owls are larger than the same nuclei in symmetrically-eared owls, even those not involved in ILD, ITD, or the convergence of the signals. They hypothesize that the enlarged nuclei result not only in increased locating ability, but also in an extended hearing range in asymmetrically-eared owls. Comparison of phylogenetic trees indicates that increased locating ability preceded the increase in hearing range, and that both have arisen more than once in different groups.


Since the sound reaches the ears with different qualities, the owl can map the sound to a position in its visual field, based on time differences (panel b) and intensity differences (panel c). This position is mapped directly onto the owl’s visual map.
All this sound information is then translated onto the owl’s visual map in its brain (the location of objects in space as the owl sees it – most mammals have one of these). The owl actually sees the position of the sound, as if the little mouse’s voice were a big red flag waving at the owl. This cooperation between hearing and vision must be important for owls, because owls that go blind early in life can no longer hunt by sound.

In each example above, the owl has manifested a functional ability, and in each instance this ability has been honed to increase that function. The form may not be elegant, it may not be easy, it may break a rule of biology, but it must be the way it is to perform its function. This is a basic tenet of biology – form follows function. How something looks is related more to its job than to the overall esthetics or appeal of the organism. That is why it is so easy for organisms to break rules – function sometimes demands it. Owls are built for hearing, and as a result, they look like owls. Amazing…..and beautiful.

Gutiérrez-Ibáñez, C., Iwaniuk, A., & Wylie, D. (2011). Relative size of auditory pathways in symmetrically and asymmetrically eared owls. Brain, Behavior and Evolution, 78(4), 286-301. DOI: 10.1159/000330359

For more information, classroom activities and laboratories, see:

Body plan –

http://faculty.clintoncc.suny.edu/faculty/michael.gregory/files/bio 102/bio 102 lectures/animal diversity/lower invertebrates/sponges.htm


cephalization –

owl hearing –
http://islandwood.org/forkids/owls-at-islandwood/how-can-we-see-owls/owl-ears-1/asymmetrical-ears 

Wednesday, September 21, 2011

Your Ears Hear, But Can You Hear Your Ears?


The auditory mechanism is very small
Of your 10 or 11 senses (remember last week? – sight, hearing, taste, smell, touch, hot, cold, pain, kinesthetic awareness, balance, and maybe proprioception), hearing and sight have the most complex mechanisms. This is amazing, as the entire auditory apparatus in your ear is smaller than a peanut M&M. Into this space are packed more than a dozen individual structures needed to convert pressure waves into what we hear as sound. And even more amazingly, the same structures can also produce sound!

In a normal auditory response, sound waves contact the tympanic membrane (eardrum) and vibrate the ossicles of the middle ear (the three smallest bones of the human body: the malleus, the incus, and the stapes – great crossword answers, by the way). The movements of these bones transfer the vibrations to the fluid-filled cochlea. The vibrations create a travelling wave.

Look at the pictures below of the cochlea along its length and in cross section. Three fluid-filled chambers work together to change the fluid wave into an electrical impulse. The wave travels from the ossicles up the scala vestibuli (the top chamber) and then back down the scala tympani (the bottom chamber).

The left cartoon shows the cochlea unrolled, while the right drawing is the rolled up cochlea in cross section.

The Organ of Corti transduces waves to nerve impulses.
In the middle chamber is the Organ of Corti, lying on the basilar membrane. Depending on the frequencies contained in the travelling fluid wave, the basilar membrane vibrates in specific places along the length of the cochlea. Where the basilar membrane vibrates, the inner and outer hairs of the Organ of Corti rub along the tectorial membrane, and the rubbing triggers electrical impulses in the nerve cells attached to the hairs. This creates the nerve impulse that the brain interprets as sound of a certain frequency. I told you it was complex – and I’m giving you the simple version.
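If you want numbers to hang on that place-to-frequency idea, here is a small Python sketch using Greenwood’s function, f(x) = A(10^(ax) − k), with commonly cited human parameters (A = 165.4 Hz, a = 2.1, k = 0.88). Treat it as a textbook approximation of the map, not a measurement of any particular ear.

```python
def greenwood_frequency(position):
    """Approximate best frequency (Hz) along the human basilar membrane.
    position runs from 0.0 at the apex (low tones) to 1.0 at the base
    (high tones); parameters are the commonly cited human values."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * position) - k)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{x:.2f} of the way from apex to base: ~{greenwood_frequency(x):,.0f} Hz")
# runs from roughly 20 Hz at the apex up to roughly 20,000 Hz at the base
```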

Now comes the shocker. This same mechanism can be reversed to send a wave OUT of the inner ear, and both waves are occurring all the time. Why, you ask? The outward wave is a side effect of a mechanism that allows us to hear accurately.

Here (or hear) is how it happens. The traveling wave is moving in cochlear fluid, the hair movements are stiff, and the space that the wave is moving in is cramped. All these factors cause a loss of energy in the wave, so much so that the frequencies could be lost and our hearing would be inaccurate and not very sensitive.

The inner hairs of the Organ of Corti generate the electrical impulses, but the outer hairs can actively move on their own – the outer hair cells themselves shorten and lengthen, a bit like tiny muscles – as opposed to the inner hairs, which are passively moved by the traveling wave.

The active movement of the outer hairs is in the opposite direction of the liquid wave in the cochlea. This keeps the wave swirling, instead of dying (no, I don’t quite understand how this happens either). The energy loss in low energy waves is therefore counteracted, and the energy of strong signals can be amplified. Hence, this mechanism is called the cochlear amplifier.
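A physics toy can make the “counteracted damping” idea concrete. The sketch below (Python, with made-up constants and nothing cochlea-specific) simulates a damped oscillator: left alone, its wave dies away, but adding a small active push that works against the damping keeps the wave alive – which is the essence of what an amplifier built into the system does.

```python
# Toy damped oscillator: the "active_gain" term plays the role of an
# amplifier pushing against the damping. All constants are invented.

def late_amplitude(steps=2000, dt=0.01, damping=0.4, active_gain=0.0):
    """Simulate x'' = -x - damping*x' + active_gain*x' and return the
    largest amplitude seen in the second half of the run."""
    x, v = 1.0, 0.0          # start displaced, at rest
    peak = 0.0
    for step in range(steps):
        a = -x - damping * v + active_gain * v
        v += a * dt
        x += v * dt
        if step > steps // 2:
            peak = max(peak, abs(x))
    return peak

print("passive only:     %.2f" % late_amplitude())                  # wave has largely died away
print("with 'amplifier': %.2f" % late_amplitude(active_gain=0.4))   # wave keeps its strength
```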

If the generation of this opposite wave by the amplifier is not uniform across the entire cochlea (and often it is not), then the wave produced will travel back to the ossicles and eardrum and generate a new traveling wave in the cochlea. This new wave, generated and then detected by your own cochlea, is called an otoacoustic emission (OAE).

Why would anyone search for sounds coming from the ears? It’s like conducting research to see if pigs really fly. But for OAE’s, it seems there was a reason. In the 1940’s, an astronomy graduate student named Thomas Gold deduced that the damping of the traveling wave would be too much to overcome without some sort of compensatory mechanism to amplify the vibrations. Although he did not discover the cochlear amplifier mechanism, he did predict that one would be found.

Gold’s ideas were loudly rejected and he soon returned to cosmology research, but not because of the controversy. Dr. Gold had many controversial ideas. His 1968 hypothesis that the newly discovered pulsars were rotating neutron stars was initially rejected, but proved correct. However, he missed the target when he suggested that the dust on the surface of the moon was many feet thick and that astronauts landing on the moon would sink in and disappear. According to Dr. Gold, “Science is no fun if you’re never wrong.”

Dr. David Kemp picked up Dr. Gold’s work in the 70’s and, in 1978, published evidence for the existence of the amplifier, with an acoustic emission as a byproduct. He showed that the movement of the eardrum in OAE’s is very small; a movement of just 10^-10 meters (about the width of a hydrogen atom) will produce a strong OAE. He told me that some people can hear their own OAE’s, but they are too soft for other people to hear. Most people learn to ignore them, like how you don’t feel your backside against the chair after sitting for a while. This is called sensory adaptation, and we may talk more about it in a few weeks.

It is interesting enough to know that OAE’s exist, but scientists have taken advantage of them to help identify us and to track our hearing. OAE’s are also unique to each individual, so some companies are developing security identification instruments based on OAE recognition. More importantly, OAE’s can only be generated if the cochlea is functional, so you can monitor inner ear function with OAE’s.

Many studies have begun to use OAE's in research on auditory function and health. One study published in late 2012 showed that damage to the inner ear caused by the cancer drug cisplatin could be detected by monitoring OAE's. The same study provided evidence that Ginkgo biloba extract can reduce the ototoxicity produced by cisplatin, as revealed by OAE production.

Conventional hearing tests like the one shown on the left can't be used with infants, so OAE's offer a new testing mechanism.
Remember the old hearing test where you would raise your hand when you could hear the tone? Well, babies and some people with disabilities can’t do this. Scientists can induce OAE’s by broadcasting clicks into the ear, and then listening with a microphone for the OAE. It is a new test for the function of the cochlea.
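The signal-processing trick that makes this practical is synchronous averaging: the emission repeats identically after every click, while the background noise is random, so averaging many time-locked recordings cancels the noise and leaves the emission. Here is a minimal Python sketch of that idea; the “emission” waveform and all the numbers are invented for illustration, not real ear data.

```python
import math
import random

SAMPLES = 200     # samples recorded after each click
N_CLICKS = 5000   # number of click recordings averaged together

def emission(i):
    """Made-up damped sine standing in for the tiny otoacoustic emission."""
    return 0.01 * math.exp(-i / 50) * math.sin(2 * math.pi * i / 20)

def record_one_click():
    """One recording: the repeatable emission buried in much larger noise."""
    return [emission(i) + random.gauss(0.0, 0.1) for i in range(SAMPLES)]

# Average the recordings sample by sample, time-locked to the click.
recordings = [record_one_click() for _ in range(N_CLICKS)]
averaged = [sum(col) / N_CLICKS for col in zip(*recordings)]

# The residual noise shrinks by about sqrt(N_CLICKS), so the averaged trace
# follows the emission shape that no single recording could reveal.
print("single recording, sample 5: %+.3f" % recordings[0][5])
print("averaged,         sample 5: %+.4f  (true emission %+.4f)"
      % (averaged[5], emission(5)))
```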

Furthermore, different frequency OAE’s are generated by the outer hairs at different distances along the Organ of Corti; therefore, a lack of certain frequencies in the OAE response would correspond to a dysfunction in that part of the cochlea.

OAE tests cannot replace hearing tests, as many hearing problems have nothing to do with the cochlea. For example, there can be dysfunctions in the nerves that carry the signals to the brain, downstream of the OAE. Tinnitus, a ringing in the ears that affects some 36 million Americans, can be caused by many things, including damage to the hair cells or even antibiotics, but is not directly related to OAE’s.

But this is not to say that OAE monitoring is not helpful in tinnitus. A recent study in Poland shows that OAE function is often affected in people with tinnitus. The results suggest that damage to the basal (high-frequency) region of the cochlea may result in an ear that then hears the ringing of tinnitus.

But OAE’s do help monitor the impulse-generating portion of the system and can locate problems in specific aspects of hearing. The next time no one else hears that noise you swear was there, maybe you should just chalk it up to your ears talking to you.

Next time we will take a look at an animal that is built to be the best listener on Earth - even if it has to break a biological rule to do it.


Fabijańska A, Smurzyński J, Hatzopoulos S, Kochanek K, Bartnik G, Raj-Koziak D, Mazzoli M, Skarżyński PH, Jędrzejczak WW, Szkiełkowska A, & Skarżyński H (2012). The relationship between distortion product otoacoustic emissions and extended high-frequency audiometry in tinnitus patients. Part 1: Normally hearing patients with unilateral tinnitus. Medical Science Monitor, 18(12). PMID: 23197241

Cakil, B., Basar, F., Atmaca, S., Cengel, S., Tekat, A., & Tanyeri, Y. (2012). The protective effect of Ginkgo biloba extract against experimental cisplatin ototoxicity: animal research using distortion product otoacoustic emissions. The Journal of Laryngology & Otology, 126(11), 1097-1101. DOI: 10.1017/S0022215112002046


For more information, classroom activities, and laboratory activities:

Auditory mechanism –

Cochlear amplifier and OAE’s –

Thomas Gold –
http://physicsworld.com/cws/article/news/19733

Wednesday, September 14, 2011

Why does your telephone have two holes? – Perspective on our Senses

We all know that we have five senses: seeing, hearing, tasting, touching, and smelling. Anyone disagree? Wanna put some money on it?

 


The five senses. Are these all there are? Notice how
four of the five are located on your face.

Our senses are the ways that we receive information about the world. Everything we know, feel, and interact with comes to us in just a few select ways. And the ways we send out information are even more limited.


Do you consider all your senses of equal importance – which one would you hate most to lose? Neurophysiologically, humans are sight (visual sense) dominant. Thirty to forty percent of our cerebral cortex is devoted to vision, compared to 8 percent for touch and just 3 percent for hearing (auditory sense).

Even though touch claims only 8% of our brain’s real estate, the tactile sense is really humans’ second dominant sense. Why do you think babies stick everything in their mouths? Our lips and tongue are the most sensitive areas for touch; the little ankle biters are just gathering information in the best way they know how at that point in their development.


Helen Keller was the first blind/deaf person in
America to graduate from college.
One of the most famous examples of how our senses affect our lives, but don’t have to control our lives, is Helen Keller. At the age of 19 months, Helen lost her sight and hearing as a result of an infection (probably scarlet fever or meningitis).

Helen lost her most dominant sense (sight), but retained and made good use of her other dominant sense (touch). She even learned to speak by using touch – placing her fingers on her teacher's lips and throat helped her to mimic the movements and vibrations. In general, Helen lost her ability to transduce (change energy from one form to another) waves of energy. Light waves could not be detected or changed to electrochemical nerve impulses, and neither could sound waves.

Even without her abilities to sense waves, Ms. Keller retained her ability to sense chemicals and change those molecule/receptor interactions into nerve impulses that could be interpreted as taste (gustatory sense) or smell (olfactory sense). And she still had her important sense of touch.


There are many components included in our sense
of touch: pressure, pain, hot, cold.


Our sense of touch is actually a system of different inputs. Some scientists don’t lump them together, and state that humans actually have 10 senses. We have sensors that detect pressure (touch), hot, cold (yes, in terms of receiving information, hot and cold are different), nociception (pain), kinesthetic awareness (stretch receptors in our muscles tell us where our limbs are in space), and a vestibular sense, or balance (in our ears, the semicircular canals tell us where our head is in space).

Finally, some scientists consider the coordination of our inputs (proprioception) to be an 11th sense. Any physical task that requires visual inputs of your position, balance inputs from your semicircular canals, and kinesthetic inputs from your muscles in order to make the proper responses would use proprioception – for example, most circus acts….. or motherhood. Since proprioception is a coordination of senses and not a direct intake of information about the world, I will let you decide if you think it belongs in the same category as the others. Do you still think we have only five ways to bring in information?


Telephones are designed this way
because we talk and hear in two
different places.
Now let’s consider the opposite activity- sending out information. I offer no answer, but I like pondering the reasons why evolution developed some systems just for inputs and different systems for outputs. Take the telephone question in the title of this post. We take in and interpret sound waves through our ears, but we make sound waves with the muscles of our throat and diaphragm. Imagine what your smart phone might look like if your ears and mouth weren’t located so close to one another!

Not only are our input and output systems different, but the ways we transmit information are even more limited than the few ways we extract information. The principal way we send out information is by our muscular movements. Our muscles move us into and out of other people’s visual field, and our body’s posture, action, and expressions can also transmit information visually.

Muscles move our larynx to control the frequency of the sound waves that are generated by our diaphragm muscle pushing air out of our lungs. Our muscles also control our physical interaction with others; they can feel the pressure when we move to touch them. Sometimes our muscles generate enough pressure to cause pain.

Maybe we could provide more information if we were a cannibal’s meal (“She might not have had good taste, but she sure tasted good!” - Don Johnson in A Boy and His Dog). Bill Cosby always said he wouldn’t eat tongue because he didn’t want to taste anything that might taste him back.


The vomeronasal organ, if adults have one,
is located forward of the olfactory bulb.
Finally, we might also communicate by the pheromones we produce. These are chemicals sensed by the vomeronasal organ (VNO), a part of the smell sense not associated with the olfactory bulb (see cartoon). Pheromones certainly affect social and physiologic behavior in lower animals; scent trails laid down by ants help the next ant find food or home. Hunters take advantage of pheromones to attract male deer or elk, as spiders use them to attract male moths to their web. But pheromone function in humans is more controversial. 


There are many companies more than willing to sell you pheromone concoctions aimed at increasing the physical attraction between a guy and a gal, but the latest research is equivocal at best. Adult humans may retain a small VNO (up to 70% of adults show one), and a gene for pheromone reception has been found to be expressed in the VNO. Infants may sense and discriminate their mothers from other adults using pheromones.

Related to possible pheromone receptors in the VNO, recent research has shown that the human nose actually contains solitary chemosensory cells, with their own receptors and signal pathways. This means that you have taste cells in your nose for umami, sour, and bitter tastes! The purpose of these is not to taste the compounds that excite the receptors, but to signal that irritants are present. The signal pathways then trigger trigeminal reflexes to get rid of noxious irritants - sneezing, watering eyes, runny nose, etc. Bitter receptor cells are most plentiful in the VNO, while the others are spread out evenly throughout the nasal mucosa.


Let’s review.
1. We have considerably more than five senses, but the actual number is a matter of some dispute. We settled on 10 senses for this post, but some scientists go all the way up to 17; and this doesn’t include seeing dead people or having common sense!
2. Our systems for inputs from the world don’t overlap with our outputs of information to the world. Our retinas and visual cortex don’t give off light, our smell receptors don’t produce odors, and our touch sensors don’t push on other people.

However, there is one of our senses that actually is a two-way street – you knew the exception was coming, didn’t you?  I won’t give it away, but we will HEAR about this exception in the next post.


Braun T, Mack B, & Kramer MF (2011). Solitary chemosensory cells in the respiratory and vomeronasal epithelium of the human nose: a pilot study. Rhinology, 49(5), 507-512.



For more information, classroom activities or laboratories about the senses, proprioception, or pheromones:

Senses –

more than five senses –

proprioception –

pheromones –