Alice in Wonderland syndrome

“‘What a curious feeling!’ said Alice; ‘I must be shutting up like a telescope.’” “‘Curiouser and curiouser!’ cried Alice (she was so much surprised, that for the moment she quite forgot how to speak good English); ‘now I’m opening out like the largest telescope that ever was!’”
Lewis Carroll, ‘Alice’s Adventures in Wonderland’

There are many illnesses and issues that can cause sensory anomalies, but one of the most interesting is Alice in Wonderland Syndrome (AIWS). The typical symptoms of AIWS are micropsia (objects appear smaller than they are) and macropsia (objects appear larger). In addition, body parts sometimes seem larger or smaller. Individuals with AIWS often describe their perception of the world as like looking through the wrong end of a telescope.

AIWS was identified by British psychiatrist John Todd in 1955 as a “singular group of symptoms intimately associated with migraine and epilepsy, although not confined to these disorders.” Todd named the syndrome after Lewis Carroll’s book, Alice’s Adventures in Wonderland. Carroll (Charles Lutwidge Dodgson) reported “bilious headache” (migraine) in his diaries, along with the effects of AIWS: “… experienced, for the second time, that odd optical affection of seeing moving fortifications, followed by a headache.”

Here’s some of what we know about AIWS.

  • Cause: AIWS is not considered an optical problem or a hallucination. It is likely caused by an issue (such as blood flow) in the parietal lobe, where perception of the environment is processed.
  • AIWS triggers: infections, migraine, stress, drugs (including some cough medicines), and brain injury are some of the more common.
  • AIWS in children: studies show that between 6 and 10 percent (perhaps a bit more) of children have an onset of AIWS, at an average age of 8.5 years, with many growing out of the syndrome in adulthood. In 65% of cases, AIWS occurred in people under 18 years of age.
  • Migraine is the leading cause of AIWS in adults (27.6%) and the second most common in children (26.8%).
  • Symptom spectrum: micropsia (objects appear small) and macropsia (objects appear large) are the most common distortions caused by AIWS, but other symptoms include perceptions that body parts are larger or smaller, or that the individual is falling through or merging with the floor. In addition, other senses such as smell and taste are sometimes distorted.
  • Other symptoms reported in AIWS: kinetopsia, auditory hallucinations and verbal illusions, hyperacusia/hypoacusia, dyschromatopsia, zoopsia, and complex visual hallucinations.

In a study of 166 cases of AIWS, the most common causes were:

  • Migraine – 27.1%
  • Infections – 22.9%
  • EBV – 15.7%
  • Brain lesions – 7.8%
  • Medications – 6%
  • Drugs – 6%
  • Psychiatric disorders – 3.6%
  • Epilepsy – 3%
  • Peripheral nervous system disease – 1.2%
  • Others – 3%

There’s some speculation that AIWS may have provided inspiration for the arts, such as writing and painting. A host of notable figures are known to have suffered from migraines, including Carroll, Picasso, Emily Dickinson, Vincent Van Gogh, Thomas Jefferson, and Gustav Mahler.


Illusions occur when sensory data is misinterpreted by the brain. The illusions we perceive are evidence that our minds construct our perceptions and may become confused as they try to process information. The brain works on certain assumptions about what it has perceived or will perceive in nature. When those assumptions are broken, the brain uses what it has and constructs the best perception it can – an illusion. Some illusions are subjective (for example, the now famous blue or gold dress); different people may perceive what they see or feel differently. Most illusions tend to be optical (visual), but there are also tactile, auditory, taste, and scent illusions. There are so many types of illusions that I can only cover a small number of them.

Visual/ Optical Illusions
An optical (visual) illusion is one in which images are perceived abnormally because of an overload of information or an underlying assumption that proves false (the brain organizes sensory information in specific ways which then prove false, so the brain uses the information given and tries to fill in or construct the rest). There are three main types of illusion:

  • Literal optical illusions: create images different from the objects that make them
  • Physiological illusions: effects on the eyes and brain of excessive stimulation (brightness, tilt, color, movement)
  • Cognitive illusions: when the eyes and brain make unconscious inferences

Let’s look at three examples:

The celebrity illusion: plays on the strength of the fovea and the weakness of peripheral vision. The fovea is only about 2% of the visual field – the center of our vision where we see clear and crisp images. Outside of the foveal view, our actual vision is a little blurry and our brain constructs a picture from that information; when the brain is given excessive data (as in the celebrity illusion), it tries to compensate. This is probably compounded by the images being faces, which are very important to us and into which our brains invest a lot of energy.

Forced Perspective:
Perspective is an old and important perception – it helps keep us alive by telling us how far away things (like predators) are – but it’s built on certain assumptions. When those assumptions are broken, we experience forced perspective.


Color assumptions: When we perceive colors we tend to think that they are universal – red is always red, blue is always blue – but that’s not correct. How we perceive colors depends largely on context. In the classic example below, the brown square on the top center of the block is the exact same color as the “orange” square in the front center. Our brain uses light references to tell us how we should perceive the color, not what the actual color is.


Auditory Illusions
Auditory illusions can be either sounds which are not present in the stimulus (filling in) or “impossible” sounds. A simple example: you may perceive a voice coming from a dummy when watching a ventriloquist, since the words seem to synchronize with the dummy’s mouth movements.

The Shepard tone is a well-known example of an “impossible” sound: it cycles through a limited set of tones, each separated by an octave, and the illusion sounds like a tone that rises continuously forever (the auditory equivalent of the Penrose stairs illusion).
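The trick behind the Shepard tone can be sketched numerically. In this sketch (the 55 Hz base, five octaves, and Gaussian loudness envelope are my own illustrative assumptions, not parameters from any particular recording), each semitone step raises every component, but the loudness envelope fades the highest component out while a new lowest one fades in, so after twelve steps the audible set of pitches is back where it started:

```python
import math

BASE = 55.0     # assumed base frequency in Hz
OCTAVES = 5     # assumed number of octave-separated components

def components(step):
    """Frequencies and relative amplitudes for one semitone step."""
    out = []
    for k in range(OCTAVES):
        f = BASE * (2 ** (k + step / 12.0))
        # Gaussian envelope over log-frequency: loudest mid-range,
        # near-silent at both ends - this hides the octave wraparound.
        x = math.log2(f / BASE) / OCTAVES   # 0..1 position in the range
        amp = math.exp(-((x - 0.5) ** 2) / 0.02)
        out.append((f, amp))
    return out

# After 12 semitone steps every component has doubled, so component k
# of step 12 matches component k+1 of step 0 exactly - the tone set is
# perceptually unchanged, and the scale seems to rise without end.
```

Because both frequency and amplitude of each component at step 12 coincide with those of the next component at step 0, the cycle is seamless.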

One important point to know about auditory perception is that it often depends on presumptions, which the brain can quickly learn to overcome.

Taste Illusions
There are several types of taste illusions, but the classic involves the effect of color on taste. Using either a blind taste test or changing the color of white wine to red confuses even wine judges. And changing the color of a sweet drink (for example, coloring a lime-flavored drink red) was suggestive enough that people perceived a completely different flavor.

Olfactory/Scent Illusions

The sense of smell is very old and may not be as easy to fool as our other senses. There’s very little information available on olfactory illusions, and some argument over whether they exist. The one type of illusion I can think of is when unrelated molecules smell the same – for example, benzaldehyde (the smell in almonds) and cyanide. It’s difficult to call this an illusion, since the brain isn’t being overloaded and no presumption is being warped, but it clearly represents a case where the brain can’t tell the difference between two molecules.

Tactile Illusions
There are several types of tactile illusions. Phantom limb syndrome is one, but since it is in effect a disorder, let’s look at another type. The cutaneous rabbit illusion can be induced by tapping two or more separate regions of the skin in rapid succession. For example, a rapid sequence of taps near the wrist, then near the elbow, can create the sensation of sequential taps hopping up the arm, even though no physical stimulus was applied between the two actual locations.

Web Applications

It’s always good to have a professional graphic artist for Web development. I’ve seen unintended optical illusions on sites that distracted from the content – and that hurts usability.

Related Articles

Balance – Equilibrioception

Every step we take dances on the edge of disaster – one miscalculation at the moment when we are shifting weight from one foot to the other, and we fall. Balance is a sixth sense and a crossmodal perception. It has nothing to do with hearing, yet it clearly begins with sensors in the inner ear. We call it equilibrioception.

Our sense of balance depends on the integration of three sensory systems:

  • Vision: seeing helps us determine our body’s position in reference to the world (gravity). Note: some blind people have issues with balance
  • Proprioception: (see related post) uses the skeletal systems (the muscles and joints and their sensors) to determine the position of the body
  • Vestibular system: the section of the inner ear composed of the semicircular canal system, which indicates rotational movements, and the otoliths, which indicate linear acceleration.

The vestibular apparatus (shown below) includes the utricle, saccule, and three semicircular canals (anterior, horizontal, and posterior). The utricle and saccule detect gravity (information in a vertical orientation) and linear movement. As we move our heads, fluid moves through the canals and tells us the relative position of the head and its movement. The otoliths act as a kind of accelerometer, helping determine the speed of the body’s or head’s movement. The vestibular system then sends signals to the neural structures that control eye movements, and to the muscles that keep an animal upright.


Human balance perception is not quite terrestrial; that is, we have some perceptual systems or strategies that other land animals do not. For example, a human can stand in a bus holding onto a pole and have little or no trouble with balance, but a horse standing in a horse trailer has significant problems with the movement of the vehicle, even though it has four legs (four points of stability). We almost certainly owe this extra bit of balance expertise to our tree-dwelling ancestors.

Web Applications
Virtual reality comes to mind again. It should be noted that professional level flight simulators use hydraulic mechanisms to provide a sense of pitch and acceleration by moving entire simulator rooms. Without some feedback of this sort, virtual reality will always be semi-virtual.

Related Articles


Synesthesia literally means “joined perception” (“syn” together & “aesthesis” perception) and is a blending of two or more senses simultaneously perceived into one anomalous event – a cross-wiring between brain areas that are normally segregated in nonsynesthetic individuals.

For example, synesthetes may hear, smell, taste or feel pain in color. Others taste sounds or perceive letters and numbers in color. And the perception is the same every time for an individual, although they may differ from person to person.

The most common type of synesthesia is called grapheme-color synesthesia and involves seeing monochromatic letters, digits and words in unique colors.


An example of how nonsynesthetic (left) and synesthetic (right) individuals might see a set of numbers.

The neural mechanism causing synesthesia remains unknown, but it seems to be a dominant trait and may be located on the X-chromosome (supported by the fact that synesthesia often appears to be inherited). Some developmental scientists speculate that all humans are born synesthetic, but that normal development results in more segregated perception areas of the brain. Even so, synesthesia is not considered a neurological disorder.

The prevalence of synesthesia is in question, with some estimates being one in 2,000 and others as high as one in 200. Women are more likely (as much as three to eight times) than men to have synesthesia, and synesthetes are more likely to be left-handed.

Some Characteristics
Synesthesia is more common among artists, musicians, and novelists (8 times more likely) and some very famous people are known synesthetes including: Vincent Van Gogh, Marilyn Monroe, Leonard Bernstein, Duke Ellington, Itzhak Perlman, Stevie Wonder, Billy Joel, and Lady Gaga.

Franz Liszt was quoted as telling his orchestra instructions such as,

“O please, gentlemen, a little bluer, if you please! This tone type requires it!” or, “That is a deep violet, please, depend on it! Not so rose!”

And some synesthetes like the physicist Richard Feynman possess “conceptual synesthesia” where they see abstract concepts: units of time, mathematical operations, shapes. Feynman once said,

“When I see equations, I see the letters in colors – I don’t know why. As I’m talking, I see vague pictures of Bessel functions from Jahnke and Emde’s book, with light-tan j’s, slightly violet-bluish n’s, and dark brown x’s flying around.”

One interesting characteristic of synesthesia is that it seems to enhance memory – the secondary synesthetic perception is remembered better than the primary perception.

And one perplexing characteristic is the typical experience of seeing characters in two colors at the same time: the original printed color and the synesthetic color.

Synesthesia as human experience
Neuroscientist Vilayanur Ramachandran gave an excellent TED Talk where he discussed three examples of connection between cerebral tissue in the brain, one of which was synesthesia. In the talk Ramachandran made the case that the type of cross-wiring which occurs in synesthesia actually happens with us all, but we don’t recognize it. One example would be the way we experience sound coupling in movies, with images and spoken words coming together as one experience. Ramachandran provided a simple example. He called the characters below letters of a Martian alphabet and named them “booba” and “kiki.” Then he asked his audience to guess which was which – they almost unanimously said that booba was the image with rounded shapes and kiki was the one with jagged shapes.


We are all somewhat synesthetic.

Web applications
I see an interesting potential for usability that might come from better understanding synesthesia – better consumption of visual and textual information. One simple example can be found in BeeLine Reader, an application that renders text in a color gradient to make it more readable.
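The gradient-text idea can be sketched in a few lines. This is my own illustrative sketch, not BeeLine Reader’s actual algorithm; the blue-to-red endpoints and per-word interpolation are assumptions:

```python
def lerp_color(c1, c2, t):
    """Linearly interpolate between two RGB triples (t in 0..1)."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c1, c2))

def gradient_spans(text, start=(0, 0, 255), end=(255, 0, 0)):
    """Wrap each word in an HTML span whose color sweeps from `start`
    to `end` across the line - a rough sketch of gradient-colored text,
    not BeeLine Reader's actual method."""
    words = text.split()
    n = max(len(words) - 1, 1)
    spans = []
    for i, word in enumerate(words):
        r, g, b = lerp_color(start, end, i / n)
        spans.append(f'<span style="color:rgb({r},{g},{b})">{word}</span>')
    return " ".join(spans)
```

The color sweep gives each line a distinct visual signature, which is the property such reading aids exploit to help the eye track from the end of one line to the start of the next.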

Related Articles

Gaze detection

Our perception is, more often than we might guess, framed by our brain’s immersion in social interaction. One such perception is “gaze detection,” i.e., the sense that someone is looking at you. This is sometimes known as the “Psychic Staring Effect” (or scopaesthesia).


National Geographic’s photo of young Afghan refugee Sharbat Gula’s piercing gaze. Is she looking at you?

In 1898, psychologist Edward B. Titchener wrote that a class of his students believed they could “feel” someone staring at them from behind, which would force them to turn around. Since then, many have claimed that the Psychic Staring Effect is actually a psychic phenomenon, but that notion has been discredited many times. The real reasons for gaze detection are much more interesting.

The evolutionary importance of gaze detection 
There are very good reasons for humans to be hypersensitive to gaze detection. The ability to tell where someone is looking is a critical piece of non-verbal communication that can keep us alive by providing an early warning of an impending attack – we are “hard-wired” to err on the side of caution. Gaze detection also serves as an important social survival tool, helping us determine whether someone is interested in us. Gaze detection is followed by direct eye contact, which is one of the most powerful non-verbal signals we can tap: it can convey intimacy, trust, intimidation, and influence. Even infants gaze at their parents to get attention and secure social bonds. The evolution of gaze detection is thus evident – it’s an essential survival tool.

Factors of the gaze detection system

  • Gaze detection is an indicator that our peripheral vision may provide more information to our brains than we are consciously aware.
  • Gaze detection may be triggered by head and body positions. Reading specific body language clues likely alerts our brain to pay closer attention to the eyes.
  • Brain imaging has shown that superior temporal sulcus brain cells are activated when we see that we are being stared at.
  • The gaze detection system is particularly accurate at a distance. The human eye makes it easy to distinguish the dark center from the rest of the visible (white) eyeball. This makes gaze detection accurate to within just a few degrees – we can tell if someone is looking at us or over our shoulders.
  • Gaze detection leads to direct eye contact, which provides crucial and complex communication for survival and reproductive success.

Web applications
Only two possible uses come to mind:

  1. There may be some application for images in marketing through the Web – it’s difficult to get users to look at advertising photos, a direct gaze may help.
  2. Direct eye contact may prove powerful in virtual reality applications/games.

Related Articles

Saccadic eye movements

A saccade (from the French for “twitch” or “jerk”) is a quick, simultaneous movement of both eyes between two or more phases of fixation in the same direction. Saccadic eye movements are extremely fast, usually voluntary movements of the eyes, allowing them to accurately “refix” on an object in the visual field and change the retinal focus from one point to another. Some saccadic eye movements can be involuntary.

Saccades are among the fastest movements produced by the human body, with peak angular speeds of up to 900°/s. An unexpected stimulus can trigger a saccade in about 200 milliseconds (ms), and the saccade itself lasts from about 20–200 ms, depending on its amplitude; 20–30 ms is a typical movement for reading.
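The amplitude-duration relationship above is often summarized by a linear “main sequence” fit; the specific coefficients below (~2.2 ms per degree plus ~21 ms) are a commonly cited average, not exact constants, and vary between studies:

```python
def saccade_duration_ms(amplitude_deg):
    """Approximate saccade duration from its amplitude using the linear
    'main sequence' fit. The coefficients (2.2 ms/deg, 21 ms intercept)
    are commonly cited averages and an assumption of this sketch."""
    return 2.2 * amplitude_deg + 21.0

# A typical reading saccade spans roughly 2 degrees:
# 2.2 * 2 + 21 = 25.4 ms, consistent with the 20-30 ms figure above.
```

Plugging in a large 30-degree gaze shift gives roughly 87 ms, which sits comfortably inside the 20–200 ms range quoted above.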

We do not look at the world with fixed steadiness, although our brain tells us otherwise. Our eyes move around, locating interesting parts of the scene and building a three-dimensional mental map. Our eyes saccade – jerk or twitch quickly – stop, scan, and then move again. The fovea (the high-resolution portion of vision, covering 1–2 degrees of the visual field) is one of the main reasons for saccadic eye movements – we must move our eyes to resolve objects in our minds.

Saccadic masking
One of the most interesting points about saccadic eye movements involves what we don’t perceive as our eyes move. One might think that no information passes through the optic nerve to the brain while the eyes move in a saccade – that is at least how we experience it – but that’s not correct. Saccadic masking (or saccadic suppression) begins just before the eyes move and keeps us from experiencing a blurred or smeared image. You can experience the saccadic masking effect with a very simple experiment: look in a mirror, look at your left eye, then change your gaze to your right eye – you won’t perceive any movement of your eyes, which is evidence that the optic nerve has momentarily ceased transmitting or that the brain simply refuses to process the transmission.

Spatial updating and Trans-saccadic perception
One of the continually amazing things about perception is that our brain often perceives information that isn’t there. Spatial updating occurs when you see an object just before a saccade, and allows you to “make another saccade back to that image, even if it is no longer visible.” The brain somehow “takes into account the intervening eye movement by temporarily recording a copy of the command for the eye movement” and compares it to the remembered target image.

Trans-saccadic memory is the process of retaining information across a saccade. Neurologists think that perceptual memory is updated during saccades so that information gathered across fixations can be compared and combined, creating what researchers believe is a type of visual working memory.

Saccadic Dysfunction
There are a number of disorders that can produce abnormal eye movements. One is nystagmus (also known as “dancing eyes”), a condition of involuntary eye movement (side to side, up and down, and others) that may reduce or limit vision.

Web Development Application
Our understanding of saccadic eye movements has had a remarkable impact on Web usability in the form of eye tracking studies. By employing technology that monitors eye movements and can pinpoint precisely where a user is looking on a page, usability testers can study and better understand how people interact with text and online documents.

Related Articles


“Mere color, unspoiled by meaning, and unallied with definite form, can speak to the soul in a thousand different ways.”
Oscar Wilde

There are so many things to say about our color perception. For example, the human eye is capable of seeing between 7 and 10 million colors. Question: why is that massive range of perception so important?

We can define color vision as the ability of our eyes and brain to distinguish objects based on the wavelengths (or frequencies) of the light they reflect, emit, or transmit. We can point out that the typical human eye is only capable of perceiving light at wavelengths between 390 and 750 nanometers (the “visible” spectrum for humans), and that our perception of colors is a somewhat subjective process – we all see the same illuminated object or light source a bit differently.
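The wavelength-frequency equivalence mentioned above follows from f = c / λ, and the visible-range check is a simple comparison (the 390–750 nm bounds are the figures used in this article; sources vary slightly):

```python
C = 299_792_458  # speed of light in m/s

def wavelength_nm_to_thz(wavelength_nm):
    """Convert a light wavelength in nanometers to frequency in THz
    via f = c / wavelength."""
    return C / (wavelength_nm * 1e-9) / 1e12

def is_visible(wavelength_nm):
    """True if the wavelength falls in the human visible range
    (390-750 nm, per the figures above)."""
    return 390 <= wavelength_nm <= 750

# The visible band spans roughly 400 THz (deep red, 750 nm)
# to roughly 770 THz (violet, 390 nm).
```

So the millions of distinguishable colors all come from less than one octave of electromagnetic frequency.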

We can delve into the neurological process of three types of cone cells in the retina (up to 7 million) gathering information about visible wavelengths of light that correspond to short, medium, and long wavelengths (blue, green, and red, respectively). We can even frame the definition of vision as the process of perceiving color:

“‘Vision,’ by common usage, suggests a process; but it is now known that it is built out of many processes, subdivided into at least 32 areas. Before the eye’s input gets to the cortex it goes through two vision systems. One sees motion, and the other color. They largely come together in the primary visual cortex. Nonetheless, the overall visual system continues this division into higher areas of the cortex. Vision goes into a what stream that identifies things and into a where stream that locates their positions. And even this description is a gross simplification.” – Sagan & Skoyles, Up From Dragons

But the perception of color is much more complex …

What we see as color is, in a sense, an absence of colors. White light from the sun hits an object – let’s say a blue object – and all of the color in the white light (the full spectrum) is absorbed by the object with the exception of blue (and perhaps ultraviolet and infrared). The blue is reflected to our eyes. The cones in our retina are stimulated at a specific frequency, and the signal is sent to our brains for interpretation. We see blue – unless, of course, we’re color blind.

But all color is perceived in context, as neuroscientist Dr. Beau Lotto explains in his TED Talk.

Dr. Lotto sums our experience up like this, “The light that falls on your eye (sensory information) is meaningless, because it could mean literally anything. … There’s no inherent meaning of information, it’s what we do with that information that matters.”
Dr. Lotto also provides an example of why we evolved to see colors – pure survival. The example he gives in his TED Talk involves seeing a predator in the jungle. But there are alternative hypotheses.

Theoretical neurobiologist Mark Changizi has speculated that the reason humans see color is that it gives us an advantage in sensing emotion or health in the skin of others. Other neuroscientists suggest that our ability to see many shades of green helps us differentiate between plants, choosing edible ones and avoiding poisonous ones.

No matter how we see the world, there always seems to be another view (as followers of biocentrism might say). Here are just a few interesting facts that we know about color:

  • Color blindness: men have a higher chance of being color blind than women – 1 out of 12 as opposed to 1 out of 255. The most common type of color blindness is the inability to tell the difference between red and green.
  • Tetrachromats: about 1.36% of the world’s population (only women) have a fourth type of cone cell in their retinas (resulting in tetrachromacy), giving them true four-color vision and allowing them to see more than 100 million colors.
  • Depression & color: research published by Dr. Emanuel Bubl demonstrated that the retinas of depressed patients were less sensitive to contrast – making colors appear duller.
  • Shades of green: human cones are more sensitive to green frequencies than to any other. Humans are omnivores, so differentiating between shades of green plants helps us find edible plants and avoid poisonous ones.
  • Ultraviolet colors: the human eye is capable of seeing ultraviolet when the lens is removed – some people are born with aphakia, the absence of the lens of the eye. The great impressionist painter Claude Monet developed cataracts in his old age, and after struggling to paint (with his colors washed out) he decided, at age 82, to have the lens of his left eye completely removed. The operation allowed him to see familiar colors again, but it also let him see, and paint in, ultraviolet (colors the rest of us cannot see).

Water Lily Pond by Claude Monet – circa 1926 – includes ultraviolet colors

Web Development Application

Understanding color can be very helpful for Web development, and underestimating its importance can be crippling – for example, not understanding the limits of users with color blindness. The best bet for avoiding issues is to have a knowledgeable graphic artist on your team.

Related Articles

Field of View

Field of view is the angular extent of what can be seen with the eye. Various animals have different visual fields. Predators generally have more forward-facing, binocular-oriented vision, whereas prey have side-facing visual fields with greater range (for defensive vision). Eyes positioned on the sides of the head are common in prey species and increase an animal’s total field of view, but often at the expense of sharper binocular vision.

A deer’s field of view can reach 280 degrees. A rabbit’s field of view can be 360 degrees, with just a small blind spot for a short distance behind its head, but with limited binocular vision. A cat has a 200-degree field of view, but with an impressive 140 degrees of binocular vision. Nature has evolved and found advantage in many variations.


Humans have a general static field of view of about 135 to 180 degrees horizontally, with about 120 degrees of binocular vision. However, with eyeball rotation (about 90 degrees) the field of view extends to 270 degrees. In addition, the vertical field of vision for humans is about 50 degrees in the upper visual field and 70 degrees in the lower visual field.


Peripheral vision is a part of vision that occurs outside the very center of gaze. There is a broad set of non-central points in the visual field that is included in the notion of peripheral vision.

In addition, field of view is, in a way, limited by the fovea, the part of the human eye responsible for sharp central vision (the only part of the retina that permits 100% visual acuity), which covers only about two degrees of the visual field. Our wide, 120-degree field of view for binocular vision is the basis for stereopsis and is important for depth perception; the remaining peripheral 60–70 degrees does not provide binocular vision.
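The two-degree foveal figure can be turned into a concrete on-screen size with basic trigonometry, w = 2·d·tan(θ/2); the 600 mm viewing distance below is an assumed typical desktop distance, not a figure from the text:

```python
import math

def foveal_width_mm(viewing_distance_mm, fovea_deg=2.0):
    """Width of the region covered by the fovea at a given viewing
    distance, from w = 2 * d * tan(theta / 2). The 2-degree foveal
    angle comes from the text; the distance is caller-supplied."""
    return 2 * viewing_distance_mm * math.tan(math.radians(fovea_deg / 2))

# At an assumed ~600 mm screen distance, the 2-degree fovea covers
# only about 21 mm of screen - roughly a few words of body text,
# which is why the eye must constantly saccade across a page.
```

This is also why foveal limits matter more than total field of view for ordinary screen layouts: only a coin-sized patch of the page is ever seen sharply at once.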

Web Development Application

Currently, applying knowledge of field of view to Web development is of only minimal importance; understanding the limits of the foveal view matters more. But as our use and understanding of virtual-reality technology increases, it will doubtless require a significant understanding of field of view.

Related Articles