Thursday, April 3, 2014

An Exercise in Self-Implication: Does the Type of Thinking I Teach Foster the Experience of Awe?

In what was yet another serendipitous moment of the world speaking to me, I returned home from my trek across the Southwest--where I had tried to see if I could experience awe every day--to find a series of links sent to me by my friend Greg. It appears there is a fair amount of interest in the academic community in testing for the effects that the experience of awe produces, particularly the way it might change people's outlook on the world. My own experience had led me to conclude that it does, indeed, change one's outlook, but I was curious to see how far the researchers' claims would go.

Dacher Keltner from the Greater Good Science Center has done studies that have shown "awe to be a potentially powerful emotion that might help students develop empathy" by reorganizing the participants' sense of self so that they feel more connected to the world. The speculation is that awe might make adolescents less narcissistic and self-absorbed. I am always interested in these kinds of studies, but, I confess, I often wonder about some of the claims. If one group looks at a T-Rex skeleton and another looks down a long hallway, can you really claim that one group "feels part of a larger whole" because of that experience?


In another recent study, researchers Melanie Rudd and Jennifer Aaker of Stanford University, and Kathleen Vohs of the University of Minnesota, examined whether awe can expand perceptions of time availability. They found that participants "who felt awe, relative to other emotions, felt they had more time available, were less impatient, were more willing to volunteer their time to help others, and more strongly preferred experiences over material goods." Can you really claim that listening to Beethoven's "Ode to Joy" will make people change that much on the spot? Still, I find it interesting that these kinds of studies are being developed.

There have been people, however, since the 1990s who have been promoting the experience of awe as an important "habit of mind." Art Costa is probably the best known of these thinkers, and he developed a list of habits of mind that "are the characteristics of what intelligent people do when they are confronted with problems, the resolution to which is not immediately apparent." This list was his response to Jean Piaget's belief that the "principal goal of education in the schools should be creating men and women who are capable of doing new things, not simply repeating what other generations have done...Intelligence is what you use when you don't know what to do." "Responding with wonderment and awe" or "searching for wonderment and awe" has been one of Costa's 16 habits of mind from the very beginning of his work.

These past few months I have been on a subcommittee for "21st Century Learning Skills" as part of a strategic plan to be implemented at the school where I teach. Costa's work on habits of mind seemed to me to be an important and missing addition to the debate over "skills acquisition." But I think there is a missing precursor before we can even begin to talk about skills or dispositions. The work of Carol Dweck, popularized in her book Mindset, has shown that we must consider how our particular mindset creates a certain culture of learning. In short, this kind of 21st century learning is not done through curriculum design, training regimens or program additions; it is done by creating a culture that supports it. If we do not understand the present culture of learning we have created, we will reduce our chances of making the necessary adaptive changes.

So, I set myself a little thought experiment of trying to describe the foundational beliefs about the nature of thinking that the schools I have been involved in inculcate. Could I describe the kind of thinking that the learning culture of my school embraced most whole-heartedly?  What is the cognitive bias of that culture of learning?

To my aid came the work of Guy Claxton in his wonderfully engaging book Hare Brain/Tortoise Mind, which champions the "slow ways of thinking." In the beginning of the book, Claxton describes a certain kind of thinking he calls "d-mode"--"d" standing for either default or deliberation. Many of the facets of what Claxton describes as the basis for "d-mode" overlap with the following list that I created.


What is the innate bias in the dominant type of thinking that I have been engaged in since I was in school?

The culture of learning that I am part of-

--favors the analytic; is primarily concerned with taking things apart and naming them
--believes in knowledge that is rational and is distrustful of knowledge from other sources
--is much more concerned with answers than with questions (though it states the opposite)
--tends to reward thought that converges down toward an answer
--values short-term memory and recall very highly
--gets nervous when there is no answer or multiple answers 
--values proof over exploration
--values structure that is straightforward and easily comprehended
--values explanation (sometimes at the expense of detailed observation)
--rewards ability to explain precisely why a particular action is chosen
--requires rational justification and evidence for any proposal (but is skeptical of hunches)
--rewards the ability to sound like a critic and make judgments
--favors exposition and persuasion over exploration and insight
--judges the value of the thinking by its demonstrated utility
--praises clarity and coherence (shies away from and/or fears confusion)
--values quickness, urgency, time pressure and production
--values production over presence (but has graduation speakers urge people to value presence)
--creates lists as a form of organization so that items can be "ticked off"
--values punctuality and segments time into confined boxes
--values effort and being busy (and gets nervous when people are playful)
--rewards precision and direction (sometimes tolerates the implicit but is skeptical about indirection)
--loves generalizations, rules, principles, universals, traditions and familiar routines
--gravitates towards patterns and is made nervous by anomalies 
--likes to categorize things, label them and put them in order
--values talking and being in control over listening and being messy/undisciplined
--prefers "concrete" precise definition to metaphor or analogy
--is biased towards thinking that is not conscious of itself (devalues meta-cognition)
--sees intelligence as a personal possession, of which some people have more than others
--values knowledge over understanding
--sees all of the above as exhibition of mastery and control

I am not suggesting that this "d-mode" way of thinking is not useful or helpful. Quite the opposite, in fact. In certain situations, these habits are the cornerstones of good decision-making, well-being and future learning.

However, I am suggesting that this way of thinking was not useful for me at Delicate Arch (though it was vital in getting me to Delicate Arch).

The kind of thinking described above does not put us in the present moment, it does not connect us to others, it does not connect us to our surroundings; it may not even enhance our well-being in terms of how we feel about ourselves.

The kind of thinking described above puts us in control of things, and it may create a particular sense of "self" that the experience of awe actually strips away. Awe challenges us in fierce and wonderful ways to reconfigure what we thought we knew and to create new mental models that we can assimilate into a new way of looking at the world. Awe demands that you put aside the self you have created in order to control the world, and to create a new self that is connected to the world in very different ways.

One of the things that I noticed while I was on the committee for 21st century skills was that all the literature seemed to echo exactly what people on the committee wanted to include in their lists--creativity, ingenuity, innovation, entrepreneurial spirit, originality, vision, design thinking...and so on. What my experiences with awe have led me to wonder is whether the type of thinking that we have committed ourselves to in schools is actually sufficient to foster any of these skills. In some cases, it may even undermine and counteract them.

(And I do confess that I have begun to wonder whether what we are trapped in is like an old "Bert and I" Maine humor routine about how to get to Millinocket--you can't get there from here.)

What, then, would the type of thinking look like that would foster this kind of experience and the nurturing of those skills? That would seem to be worthy of another blog entry. It would also give me some clues as to what I think the strategic plan for my school ought to include.

Sunday, March 30, 2014

The Paradigm Shifts Induced by the Experience of Awe

In the weeks after visiting Delicate Arch outside of Moab, Utah, my wife, Nicki, and I continued our trek through virtually every National Park in southern Utah, Arizona and New Mexico. Each day became a new adventure into a different landscape: Dead Horse State Park, Canyonlands, Capitol Reef, Bryce, Zion, the Grand Canyon, Sedona and Red Rocks, the Petrified Forest and the Painted Desert, finishing underground with the bats in Carlsbad Caverns.

Many questions arose for me around this concept of "awe."  Would the repetition of the experience of awe become what Woody Allen depicted with the Orgasmatron in his movie Sleeper? Was there something about the way we exist as human beings that makes us more or less prone to the experience of awe? Finally, is the experience of awe tied to what we are as human beings and is it something that should come naturally on a regular basis?

But before I start to explore that topic, I think I should offer some kind of definition of what I think awe is--or what it does. As I have read about awe, I have found that there are a multitude of definitions, spanning such a long period of time, that it is a difficult word for people to agree on. Awe is often stretched and chopped on a kind of linguistic Procrustean bed in order to make it fit the situation. Sometimes it is associated with wonder, or reverence, or surprise, or fear, or the apprehension of the sublime. There is even a campaign afoot to certify it as the eleventh scientifically accepted emotion. One definition that I like is used by Jason Silva of National Geographic, which he has taken from a Stanford University study: "an experience of such perceptual vastness you literally have to reconfigure your mental models of the world to assimilate it." Obviously, he is using vastness because nature is his primary text, but what makes a particular impression on me is that he understands that one of the prime ways an event becomes an experience is precisely that it "reconfigures your mental models." I also think awe changes the person who experiences it in potentially deep and dramatic ways; that is why it is so important for experience-based learning.

I first came across the concept of awe when I was studying in Divinity School where, as you might guess, awe is a regular topic of conversation at morning coffee. In fact, my first essay in grad school was an exegesis of Genesis 28:16, where Jacob has a dream in which God speaks to him at Bethel: "When Jacob awoke from his sleep, he thought, 'Surely the Lord is in this place, and I was not aware of it.' He was afraid and said, 'How awesome is this place! This is none other than the house of God; this is the gate of heaven.'" Two important qualities of awe emerge in this short passage. First, considered through the lens of a paradigm shift, awe often taps, and is first perceived by, the unconscious; in Jacob's case this is portrayed by his awakening from a dream. But when he acknowledges with his rational mind what has happened, he is changed. He was asleep, and now he is awake: he has become aware. His experience of awe changes his perception of his physical place in the world, but it also changes his understanding of himself. The experience of awe is a challenge to the world as he presently constructs it, and Jacob is, rightly, "afraid." The first paradigm shift that awe engenders is a re-working of the relationship between the unconscious and the conscious. The unconscious takes a firm hold on the steering wheel at the beginning of the journey of awe.

A second major paradigm shift involves our conception of time. The Greeks had two words for time that capture the shift that happens in the experience of awe--chronos and kairos. Chronos is just what it sounds like, chronology. It is "clock time," and it can be represented by a number. In short, it is the measurement of time. Kairos, however, is a different conception of time. It is "event time." If chronos is quantitative, then kairos is qualitative. Chronos tells you it is March 28, 2014; kairos tells you it was the day you first went for a walk after surgery. Chronos is a number for measurement; kairos is an experience of an event that requires interpretation.

An experience of awe is always an experience of time as kairos. It is not that time actually stands still as you are swept up completely in the moment; it is that the sense of chronos that dominates our daily life is replaced, for a period of time, with a sense of kairos. Digital watches embody chronos. Each number that flips by is an exact measurement of that moment. I remember that when digital watches came into existence and replaced the analog watch, I refused to wear them. It took me a long time to realize that a clock with hands that sweep through a circle keeps time in a closer relationship to kairos. That kind of time has a relationship to something outside itself--it is part of an hour. The question this all raises for me is this: is it true that the more chronological we become, the further away we place ourselves from the capacity for awe? Clocks are the enemy of awe; digital clocks are the enemy doubling down. (There is one exception, in my experience, that is actually quite captivating--watch the Millennium Clock Tower in the National Museum of Scotland.)

Perhaps the most powerful paradigm shift that seems to be part of the experience of awe is the way it often transforms your relationship to your world and to yourself. My experience at Delicate Arch (and at Weeping Rock in Zion National Park, deep underground in the Green Lake Room at Carlsbad Caverns, and with myriad other natural wonders) was such an entrance into the DKDK zone because there was a novel, surprising vastness (though that could be internal as much as external) that forced me to stop and think about my relationship to what I was seeing. Different people have different emotional reactions to those kinds of situations; awe is never experienced the same way by any two people. For some it might be amazement, humility, fear, reverence or fascination, but the effect is that you are jerked out of your conscious self in chronological time and forced to implicate yourself, in the sense that you are brought into an intimate connection. In short, the way you thought the world was constructed has been challenged and there is a forced, involuntary re-evaluation; and this can happen anywhere at any time.


I am presently taking a MOOC at HarvardX (with around 10,000 other people from all over the world) with Professor Bob Kegan entitled Unlocking Immunity to Change. The course explores a technique that counteracts the impulse to stasis, as well as why we fail when we try to change. One of the first things you have to do is choose a goal that is "adaptive" rather than "technical." An adaptive goal is one that requires a change in mindset, attitude or beliefs; it is a goal that cannot be solved with a technical fix. Another way to look at this is that technical problems can be solved with your reasoning skills and through thinking. Adaptive problems require changing people's beliefs and are addressed with your stomach or your heart. Rearranging the deck chairs on the Titanic to achieve more space would be a technical problem. Coming to the realization that you need to get off the Titanic would be adaptive.

The other part of my goal setting for this course requires me to answer four questions: Is it true for you? Is there room for improvement? How important is it to you? Does it implicate you? It is this last question that has generated a number of questions from the class participants. Many of them want to know exactly what "implicates" means. In fact, the instructors warn, "A common mistake is people choosing a goal that does not implicate them." It is as if they are choosing a goal that has a chronos solution when, in fact, they need to look to kairos for some answers. Sometimes I think that self-implication may be the most difficult experience-based learning concept to explain.

My goal for the course is, "How can I increase the number of times I experience awe?" After you set your goal, the course instructors ask for a rationale for why you have chosen this goal. I just wrote in a quote from Albert Einstein that I love, “The most beautiful emotion we can experience is the mysterious. It is the fundamental emotion that stands at the cradle of all true art and science. He to whom this emotion is a stranger, who can no longer wonder and stand rapt in awe, is as good as dead.” As they say in Fiddler on the Roof-- TO LIFE!

Sunday, March 23, 2014

Cultivating Awe @ Delicate Arch

I spent two months this fall seeing how many times I could put myself in a place where I might experience "awe." I had a hunch that what we sometimes call "awe" might be something that had gone underdeveloped in our thinking about why and how we can make some experiences deep, memorable and life-changing. Usually when I hear the word "awesome" in contemporary culture, it seems to be a cheapened and even falsified use of the word. Somewhere in the early 1980s, with the creation of The Official Preppy Handbook and the rise of the film Valley Girl, the word became synonymous with "totally" and was often followed by "dude." In fact, sometimes I think that people use the word as a kind of wish fulfillment; we actually desire and even need awe in our lives, but we don't actually have the feeling very often.

Awe, I am speculating, may be like other concepts--empathy, serendipity, availability--that I have explored earlier in this blog. These are all ideas that, if we can identify and develop them as skills and dispositions, might give us an increasing number of ways to make our learning truly transformative and life-long.

I began outside of Moab, Utah, at a place I had read of long ago in Edward Abbey's book Desert Solitaire--Delicate Arch in Arches National Monument. To get there you follow a well-worn path up sandstone and slickrock for about two miles, enduring a number of false "peaks" and promises that make you think you're there. The actual arch appears alarmingly quickly as you follow a narrow path that hugs the side of a cliff. But then the path ends, the cliff wall recedes and ....

One of the things that I have discovered about awe is that it is highly subjective. One person might look at this scene and become mute and immobile; another might give it a glance and check their iPhone. When I arrived there was only one other person there, a man from Seattle who had been to this spot decades before and had returned because he had just retired and was not sure what to do with the rest of his life. He was looking for inspiration, and we sat silently for a long time, periodically sighing. But then some recent college grads arrived and, after a cursory glance at the scene before them, sat down and analyzed the "awesome" party they had been to the night before.

And, finally, a middle-aged couple weighted down with photography equipment turned the corner from behind the cliff and, in what seemed to be the blink of an eye, the man had set up his paraphernalia, turned to his wife, and was saying something about the glare from the sandstone and his need for a device that would calibrate the light for him. I was reminded of Annie Dillard's words of wisdom from Pilgrim at Tinker Creek when she realizes that she experiences different ways of seeing. After describing how sometimes she has to verbalize and analyze what she is seeing, she writes, "But there is another kind of seeing that involves a letting go. When I see this way I sway transfixed and emptied. The difference between the two ways of seeing is the difference between walking with and without a camera. When I walk with a camera I walk from shot to shot, reading the light on a calibrated meter. When I walk without a camera, my own shutter opens, and the moment's light prints on my own silver gut. When I see this second way I am above all an unscrupulous observer."

For me, Delicate Arch set the stage for a series of future events as I became obsessed with trying to see if I could recreate the feeling I had there, but it was not until a good deal later that I could even begin to process and give language to what it felt like. One of the characteristics of awe might well be that it is pre-verbal, and that it resists capture as well as duplication. In fact, as I look at the picture at the top of this blog post, I find it so inadequate as to be laughable. Again and again I found myself in the next months standing next to someone at Dead Horse State Park, or Bryce, or Zion, or peering into the Grand Canyon saying, "I can't describe this; no picture of this will make any sense."

If I try to put words to the feeling, I would say that, first, time ceased to feel chronological; to look at my watch would have seemed comical. My focus became sharper; I lingered on certain vistas for longer periods of time. At the same time, I found myself involuntarily asking all kinds of questions about the relationship between myself and my surroundings, not out of a desire for explanation but out of something more like connection. Second, that desire for connection came, paradoxically, I think, from the vastness and the novelty of what I was seeing. I would describe it as a simultaneous moving outward and inward. Oddly, a line from Fitzgerald's The Great Gatsby floated through my head: "I was within and without, simultaneously enchanted and repelled by the inexhaustible variety of life." And I chuckled to myself as I imagined sharing this moment with a more cynical Nick Carraway. And third, in that same vein, I found myself being more self-aware but not self-conscious. In the same way that time had changed, my sense of self was more open, more what I have called in an earlier blog post "available." All of these characteristics mimic what Mihaly Csikszentmihalyi found in his study of peak experiences in his book, Flow.

Abbey described his first experience of Delicate Arch this way, "The beauty of Delicate Arch explains nothing, for each thing in its way, when true to its own character, is equally beautiful. If Delicate Arch has any significance it lies, I will venture, in the power of the odd and unexpected to startle the senses and surprise the mind out of their ruts of habit, to compel us into a reawakened awareness of the wonderful--that which is full of wonder...The shock of the real. For a little while we are again able to see, as the child sees, a world of marvels." Awe awakens us to new possibilities, but, at the same time, it also challenges our customary way of moving through the world.

So here is something else I learned: you do not find awe, awe finds you. And perhaps that is one reason why it is so subjective as an experience. But what I want to explore further is whether, even though I know you cannot create awe, you can prepare for it in a way that will increase the chances that it will find you. Are there things that we do, mindsets that we embrace, that actually divorce us from what might be a daily dose of awe? These were questions that came much later, however; they could not have been of the moment.

Tuesday, March 18, 2014

On Felt Experiences, Rituals, Saying Yes and Being Present

For the past two months, I have been practicing being fully present in the moment. But it wasn't until I went, again, to Sleep No More, the immersive theater piece that I have written about in an earlier blog post, that some realizations about how to be fully in the present came together in powerful ways.

Sleep No More is a theater production with twenty-one characters; multiple plot lines, loosely based on Shakespeare's Macbeth and Hitchcock's film Rebecca, that rotate three times each night; and a six-floor hotel as its setting. It also completely eliminates the "fourth wall" in a way that intentionally pushes the audience out of its comfort zone. In short, it is a wonderful creation for exploring how to live in an experience-based way.

Jim James at the McKittrick Hotel, home of 'Sleep No More' in New York.

Some of the power of Sleep No More is that it is performed without language. The absence of the ability to speak (it is forbidden for both the audience and the cast) means that what you know most deeply over the course of the evening you know in your body first. In this case, unlike most of daily life, your body gives you the most immediate information about your surroundings. I tend to live in my head most consciously, and this theater piece confronts and confounds that impulse. During the course of my sabbatical year I have been experimenting with a process called Focusing--one that I described in detail in an earlier blog post. Perhaps the most revolutionary concept in the Focusing process is the ability to access what is called the "felt sense."

Focusing begins with the recognition of a "felt experience." The racing of our heart, our palms beginning to sweat, the aching in the pit of our stomach are all seen as our body recognizing something important occurring before our conscious mind can access what is happening. For example, the witches know that Macbeth is approaching through a felt experience, "By the pricking in my thumbs,/ Something wicked this way comes." Embodied cognition (as it is now called by some cognitive psychologists) is, for some people, a deeply powerful way to know something. While some literary critics see the witches as having supernatural powers, I think that they actually are just good at listening to their bodies. Acknowledging what we know in our bodies brings you into the present in dramatic ways.

A second factor increasing the capacity for being present was offered by the actor playing Macbeth in the talk-back after the play. He explained that "since the text is an unwritten one--it is physical and it is repeated three times every night--it then becomes a routine that is like a ritual. You have to follow the ritual because that is what allows you to be fully present with the people who surround you."

Finally, the actor playing Hecate immediately pointed out that, even though there is a ritual, things never go as planned; there are always changes that have to be made spontaneously. What was most revelatory to me, however, was what she said next, "You are not going to be able to fix what has been changed, so you have to accept the changes, not deny them. You have to say 'Yes!' to whatever happens."

One unusual aspect of Sleep No More is that in a show where everyone is wearing masks (see the picture above), the actors choose audience members at different times in the show to engage in a private "one on one." When someone asked how the actors chose the audience members to engage in one on ones, they answered, "There is usually just something about the way they are engaged with everything; they are the people who are most present."

Two months ago my wife had basal cell cancer and a subsequent surgery on her nose, after which she was ordered to be immobile for the next six weeks. My role in this was just to be there--to be present. So, the daily routine for a month revolved around changing bandages twice a day. We would sit down at the dining room table--now covered with bottles of hydrogen peroxide, Aquaphor gel, Xeroform strips, boxes of sterile gauze pads and what would eventually be hundreds and hundreds of Q-tips--and methodically go through the ritualized procedure of taking off the old bandage and setting a new one in its place.

I was really more like a sous chef in this process--I laid out the materials and then was responsive whenever needed to supply the correct item. Since the procedure always had its little quirks, I had to simply watch and say whatever I thought was needed in the moment. As days went by, I began to notice that the consistent verbal chatter that I had been offering began to recede--oftentimes because I was being told it wasn't helpful. It felt like I had become the soccer coach I always abhorred and avoided--the one who constantly yelled at the players while they were performing.

So, in response, I settled into simply watching intently and stopped talking. If I had to describe it, I would say that I was much more aware of the immediate surroundings, of where everything was on the table and how far away it was when I had to reach for it. I also became more aware of my own place in the surroundings, of my breathing, and of any time my body moved. Finally, because there were always little things that would go wrong during the bandage changing, I was aware of the purpose of each part of the process and of how things had to be improvised to go forward. It was always true that "What's done cannot be undone"; you could not fix what was happening, you could only accept it and move forward.


Over the month, as I became more and more grounded, centered and connected, the bandage-changing process took on a levity and a lightness. What happened, I believe, is that we both became more fully present with each other and with the moment. And that feeling began to grow out from those moments to the rest of the day. Days would go by in which we had simply been with each other.

The power of being productive holds great sway over my own life; I am trained to get things done, and often to do them quickly and efficiently. But I have begun to notice that the power of presence seems to have changed me in ways that make the world look a bit different. The world, I think, may be looking at me differently as well. I had been to Sleep No More four times before the other night, but I never had a "one on one" with any of the actors; the other night I had three.

Tuesday, August 27, 2013

Women's Ways of Knowing and Divergent Thinking

I got a note the other day from one of my former students, Carl, who was reflecting on our class together last year. He wrote, "I miss our history lessons! All the times we went off-topic and started talking about interesting things, haha!" Carl was remembering the times we were practicing connectivity on the tangent board (see this earlier blog post for a discussion of the role of tangents in fostering and inculcating divergent thinking) or explicitly practicing divergent thinking. Carl was someone who came into that class as a wonderful analytic, convergent thinker who had been well trained in the basic techniques of the sciences in Europe. 

What surprised him most about the class, however, were the times when we diverged and connected people, events and concepts that seemed far ranging and even, "off-topic." Sometimes these connections would take the form of creating analogies that seemed to contain portions of what we were exploring. Other times, we would have "metaphor practice" to try to construct a metaphor that described an historical event as fully as possible using our own experience. At the highest levels of thinking, analysis and metaphor meet as the critical and creative forces that make for original thought. To permanently separate them is to create a false dichotomy that chokes off imagination. I like to think that I was just trying to get Carl ready for a career in science (or whatever) by following the dictum of evolutionary biologist R. C. Lewontin, "It seems impossible to do science without metaphors."

As I was responding to Carl's note I was reminded of one of the first times I ever thought about divergent thinking in this manner. I had made the conscious move to change schools from a rural all-boys boarding school to an urban co-ed day/boarding school. Other than in summer schools, it was the first time I had taught girls, and it was remarkable to me how much more the girls' thinking was connected both to their own experiences and to other stories while they talked in class. 

This led me to start listening for "how" someone was saying something rather than only "what" they were saying. It is this added level of listening that is one of the cornerstones of shifting from thinking about teaching as being solely about the transmitting of information to examining teaching as also encompassing the exploration of how what is being transmitted is being received. Once you make this transition, it is like Alice falling down the rabbit hole or Dorothy waking up in Oz--a world full of talking scarecrows and mad hatters that consistently resists full comprehension. (I explored another form of this phenomenon in an earlier post on "The Hedgehog and the Fox.") It has been one of the most profound paradigm shifts in my understanding of what I am doing in a classroom.

The confusion induced by my paradigm shift was compounded as I realized that having a full class of people who were ALL using different techniques to process what we were discussing was overwhelming and exhausting. But it also felt exciting every day because even if you had taught something before (I am up to having taught The Great Gatsby over forty times), you could never predict HOW a student was going to read a book. 

I began to investigate this phenomenon and, in a serendipitous moment, I discovered the Stone Center at Wellesley College. It was the work of Jean Baker Miller, the founder of the Stone Center, on the psychology of women that first drew my attention, but in the same year I had shifted schools a book appeared called Women's Ways of Knowing that made me re-think (and subsequently expand) the way I had been thinking about teaching. The discovery that Blythe McVicker Clinchy and her colleagues documented in that book was that there were identifiable epistemological levels to the way students engage material they are learning. I remember being so excited that I made my newly formed interdisciplinary course in Philosophy and Literature read the whole thing.

Clinchy discerned that there were levels of understanding that could be described and that were common to all students. But the real discovery, from my point of view, was that men and women appeared to have different techniques of making meaning at the upper levels of the stages. In short, men and women showed similar approaches in early stages of learning until they came to the level of "procedural knowledge." When people are in this stage they are asking questions about the accuracy and worth of the information they are receiving. Is Nick a reliable narrator in Gatsby? Does Jefferson really believe what he wrote in the Declaration?

In other words, learners in this category were engaged in a reasoned reflection about the nature and authenticity of authority. Clinchy posited that in this stage there were "separate knowers" and "connected knowers." The former detached themselves from what they were studying, tried to remain objective and were often willing to argue and debate about whether something was reasonable. These learners were predominantly male. Connected knowers, however, were more likely to try to empathize with the source's point of view, to see the source in its real-world context and to connect the source to their own experience. These learners were predominantly female. Obviously, I wanted my students to be able to do both, but it seemed to be true that most students, like Carl, favored one form--separate or connected--over the other.

But it was when I started teaching and coaching at a school out West that Clinchy's findings became even clearer. And it was not surprising to me that it was when I shifted from coaching boys to coaching girls in soccer that the difference between separate and connected knowers became most vivid--and most useful. Sports--like theater and music and all of the arts--are performance-based activities and, as a result, you get immediate, real-time feedback about whether something has been learned or not. Did the ball do what you wanted it to do? Did you hit that note, or not? 

Furthermore, the more confident you were, the better you performed. But where did that confidence come from? What was its foundation? What was immediately apparent to me was that it was different for girls than for boys. With a girls' team, the more they empathized, saw what they were learning in a larger context and connected it to their own experience, the more confident they grew, the more they developed technically and strategically, the higher their level of intrinsic motivation and the more it meant to them. From that moment on my teaching, and my coaching, became both more varied and more focused at the same time, whether it was with boys or girls.

The point, however, is not that one way of procedural learning--separate or connected--is better but rather that it is good to know the epistemological strengths of the people you are teaching (and coaching) as well as where you think they could grow in the future. 

Finally, Clinchy's last level of learning--constructed knowledge--understands that learning is a process based on construction, destruction and reconstruction. These most sophisticated learners have a high tolerance for paradox and ambiguity, and they develop a narrative sense of self that tries to "establish a communion with what they are trying to understand." And being meta-cognitive--understanding the techniques that you yourself use to learn best--seems to me to be a fundamental goal of any teacher or coach.

By the way, it turned out that Carl was a superb connected learner as well; he just hadn't done it much before.

Thursday, August 15, 2013

Divergent Thinking about the Purpose of Studying and Doing History

One of the hardest things about starting the school year is remembering the things I need to forget. Once you have been teaching for a while (and this only gets worse the more you do it), you build up such a reservoir of tacit, assumed understandings about what you are teaching that it is even more important to remind yourself that you have to see what you are teaching from the student's point of view. If you don't, you will never move them forward. I need to forget that I already know a lot about doing history, and look at it from their point of view. For example, if I don't let students explore what they think about history and what their past experiences have been in classes, then I will not have an accurate benchmark to know where to begin.

For example, many of them are really Henry Ford historians--"History is more or less bunk. It's tradition. We don't want tradition. We want to live in the present, and the only history that is worth a tinker's damn is the history that we make today." (Chicago Tribune, 1916). But some of them might be Arnold Schwarzenegger historians--"Ba Ba Ba BOOM. You're History." (The Terminator, 1984).

Another vital skill I have to remind myself to introduce is the idea of "divergent thinking." I wrote about this briefly in an earlier post where I paid the price and induced a frightened non-engagement in my class for a week because I forgot how crucial this skill is to creating an experience-based learning environment. So, what is "divergent thinking?"

The objective of divergent thinking is to generate a lot of ideas in a relatively short period of time. As two-time Nobel Prize winner Linus Pauling famously said, “The way to get good ideas is to get lots of ideas, and throw the bad ones away.”

Years ago there was a study that tested “divergent thinking” on a group of people. These people were in kindergarten. The percentage of people scoring at “genius level” for divergent thinking was--ready?--98%. When that group was tested five years later that number was down to 32%. When these children were fifteen years old, the number was down to 10%. By the time they were twenty-five, the number was down to 2%. When I ask my students what they ascribe this downward trend to, they are quick to say, "School." Regardless of whether they are right, it is kind of damning that they think this in the first place.

One of the major reasons we create environments that are not experience-based is because we are so focused on convergent thinking. We have a lot to cover, and we need every second to transmit that information to our students. In each of the past few years I have made sure to "shadow" a student through their class day as an exercise to try and see the world from their point of view. What I find is that the explicit or implicit goal of virtually every class is to converge down to a formula, a piece of information, a previously held interpretation. In short, the objective is to know something, but the process is almost always a converging down on an answer, not an opening up to an exploration. Divergent thinking is something that fosters the latter, and I always have to remind myself to include it as part or all of classes early in the year. And then I have to remember to keep doing it.

I thought I would practice a little divergent thinking myself on the topic of the "purpose of studying and doing history." A few rules are vital in divergent thinking: avoid judging what you are thinking, try to be additive and play off what you just thought, and be as playful as you can. As Plato said, “What, then, is the right way of living? Life must be lived as play.”  

So, here goes:

Some Reasons to Study and Do History: 

1)    George Santayana- “Those who cannot remember the past are condemned to repeat it.” 

This seems to be the most common starting place for history teachers in staking a claim for the relevance of their discipline. However, I find that few historians actually subscribe to it. The "condemned" part seems both didactic and prescriptive. History certainly has to do with the past, but the past can't be repeated. At least that is what every historian I know thinks. This, of course, was the downfall of that non-historian, Jay Gatsby--"Can't repeat the past? Why, of course you can." 

2) You can't repeat the past, but there are cycles and patterns that can help you identify where you came from.  

Arthur Schlesinger was very big on this idea; he saw the identification of these cycles as leading to a more ideal society. For Schlesinger, the tension between pragmatism and idealism is part of the American character. It is important to note, however, that the identification of cycles is never helpful as a predictive mechanism. That is a fundamental difference between social scientists and historians. The former are trying to be predictive; the latter never are. Historians are wary of generalizations and dwell in particular settings, whereas social scientists are using particulars to achieve general theories and rules. Understanding the difference between the social sciences and history is crucial and often muddled in a way that confuses students.

3) Maybe the past cycles, or maybe it provides models and analogies for the present and the future.  

Richard Neustadt was a big proponent of this. Much of this argument sees the past as a powerful analytic tool for making policy decisions. It uses case studies to examine whether something happening now is analogous to something that happened in the past. For historians, though, models are like lenses on a camera: they bring some things into focus while blurring others. It is a trade-off: you see some things more clearly, but you miss other things completely when you use any model or analogy.

4) Perhaps the way to see the past most clearly is to see it through the lens of myth? 

Rollo May wrote a good deal about this. Myths, in this view, are not falsehoods; they are stories that are either living or dead. The way to understand the unconscious of another world is to understand its myths. As May wrote, "A myth is a way of making sense in a senseless world. Myths are narrative patterns that give significance to our existence [...] myths are our way of finding meaning and significance. Myths are like the beams in a house; not exposed to outside view, they are the structure which holds the house together so people can live in it." Myths tell us what we have internalized in our unconscious in ways that we are unaware of--another version of the DKDK zone.
(Divergent thinking should not be confused with brainstorming, by the way, although they are related; brainstorming is just one of many possible techniques for producing divergent thinking.)

5) If myths aren’t true or false but, rather, living or dead, then we gain self-knowledge by understanding change over time in the mythic as well as the “historical” sense.   

The unveiling of underlying collective, cultural myths gives one a greater control over one’s life in the present. In other words, an understanding of one's deeply held myths is essential to both national and individual mental health.

Here is an experiment I have been running in the thirty years since I first read May's work. What book has EVERY American read or had read to them? My findings have been that I can say four words to you and you will all give the same answer. Ready? Here are the words--"I think I can." 

Answer--The Little Engine That Could. Every year, the students who are "American" howl with delight that they all shout out the same thing. The "non-American" students just look quizzically at that behavior. The reason is that the "myth" of that story, and its multiple lessons on persistent striving and being the underdog, is so deeply internalized in our culture that it tells us, as a country, who we are. Interestingly, from an historian's point of view, it may be one of the myths most in jeopardy of dying right now.

6) Mark Twain looked at the past in a kind of poetic way--"The past does not repeat itself, but it does rhyme." 

What I am realizing in this divergent thinking exercise is that studying and doing history provides a context for our lives. It reminds me of the philosophers--I remember reading Ludwig Wittgenstein in grad school--who believe that the origin of meaning is really in context. In other words, meaning comes primarily when we are able to put something--a word, an event in our lives--in context. Without context, you have no meaning, only action. We need history in order to provide meaning for our lives. History is like the landscape of a scene you are looking at; you need that landscape to provide context that will tell you where you are. Perhaps history is to one's life what perspective is to a painter. I confess, however, that there are times that I think I am teaching history to people who are the least historical people (teenagers) in the least historical country (America) in the world.

7) And, of course, William Faulkner saw the past as always with us when he wrote, "The past is never dead. It's not even past." 

I think this is true, and it is most obviously seen in the idea of history as being actually similar to memoir--something I talked about in an earlier post. But Joan Didion is perhaps the person who resonates most deeply on this topic, and provides a nice closure to this first set of divergent thinking posts. She writes in "On Keeping a Notebook," “I think we are well advised to keep on nodding terms with the people we used to be, whether we find them attractive company or not. Otherwise they turn up unannounced and surprise us, come hammering on the mind's door at 4 a.m. of a bad night and demand to know who deserted them, who betrayed them, who is going to make amends.”

Put Faulkner and Didion together and you end up with a poignant plea for history as a necessary signpost to self-knowledge and deeply understanding who one is as a person. But that understanding only comes when something means something. Studying and doing history provides the necessary context that allows that kind of deep meaning to emerge.

What this little piece of divergent thinking has shown me is the power of connection, and the way in which you create ideas through the act of connecting. Next time, I will play off some of these ideas to explore reasons to study history based on the role of narrative, the relationship of stories to intelligence, and empathy.

Tuesday, August 6, 2013

A Meditation on the Difference between Purpose and Relevance

I remember my college freshman economics book using the example of someone dealing marijuana in order to explain a particular principle in the study of economics. It is significant, however, that I remember that the book used marijuana dealing as an example, but not the principle. The book was making a pandering play at trying to gain my interest--and perhaps the interest of my hall mate who actually was dealing marijuana--but it misfired because of a misunderstanding of the difference between relevance and purpose. Relevance is, unfortunately, dependent upon the perception and perspective of the viewer, and that perspective, also unfortunately, is oftentimes myopic and shortsighted.

The poet and essayist Wendell Berry beautifully expresses the problem with relevance in teaching and learning-- “Of all the issues in education, relevance is the phoniest.  If life were as predictable and small as the talkers of politics would have it, then relevance would be a consideration.  But life is large and surprising and mysterious, and we don’t know what we need to know.  When I was a student I refused to take certain subjects because I thought they were irrelevant to the duties of a writer, and I have had to take them up, clumsily and late, to understand my duties as a man.  What we need in education is not relevance, but abundance, variety, adventurousness, thoroughness. A student should suppose he will need to know much more than he can learn.” Relevance has the negative side effect of actually closing us down to what we might most deeply need. But how do we combat this tendency?

Experience-based learning gains much of its energy and direction not from trying to be relevant to the student, but from trying to identify with great precision the purpose of what is being learned. Each summer I try to spend some time thinking through the purpose of whatever disciplines and skills I am teaching that coming year, and I have been pleasantly surprised by how my thinking has grown over the years with the repeated returns. This focus on purpose has had an effect on my relationship with my students as well--most immediately when they want to know, "Why are we studying this?" and "When am I going to use this?" To me those are legitimate questions to which we, as teachers, all ought to have sophisticated answers at the ready. Learning sticks with you when it has meaning, and meaning is directly related to purpose.

As one of my favorite cognitive psychologists, Robert Sternberg, has written concerning the major factor in whether people achieve expertise (see the previous post for an investigation of how that operates with historians), "It is not some fixed prior ability (that determines whether one achieves expertise), but purposeful engagement." Purpose is both the engine and the compass of experience-based learning, and if you can't articulate the purpose of something, then the "abundance, variety, adventurousness and thoroughness" that Wendell Berry talks about is never really embraced. Furthermore, Carol Dweck, author of Mindset, has posited that a sense of purpose is a foundational precondition for creating a "growth mind-set" that is, in turn, a key to intrinsic motivation. In short, when it comes to having a sense of purpose, the stakes are high.

The other day my friend Dan sent me an article that approaches this same idea from a different angle. What happens if you don't have a sense of purpose in your activities? A sense of purpose, the creation of meaning and a feeling of control are all linked together. If having a defined sense of purpose gives you a greater feeling of control over a situation, what happens when you start to feel like you are not in control of your life? British epidemiologist Michael Marmot has concluded that you risk a significant increase in the amount of debilitating stress you endure. As the article puts it, "Although professionals may bemoan their long work hours and high-pressure careers, really, there’s stress, and then there’s Stress with a capital 'S.' The former can be considered a manageable if unpleasant part of life; in the right amount, it may even strengthen one’s mettle. The latter kills. What’s the difference? Scientists have settled on an oddly subjective explanation: the more helpless one feels when facing a given stressor, they argue, the more toxic that stressor’s effects. So the stress that kills, Dr. Marmot and others argue, is characterized by a lack of a sense of control over one’s fate. Psychologists who study animals call one result of this type of strain 'learned helplessness.'" If we want to avoid toxic stress and the motivational desert of "learned helplessness" that often results from it, we need to be able to articulate to our students the purpose of what we are doing together.

In clarifying our understanding of the importance of purpose we might also rescue the concept of stress with a small "s." Toxic stress is debilitating, but what often happens, as a result, is that people try to avoid all stress as much as possible. There is another kind of stress, however--"understandable stress"--that is the basis for the creative anxiety we feel in many of our most beloved activities. The "butterflies" that someone gets before a music recital, a dance performance, or in the locker room before the big game is a kind of stress that increases the depth of the learning and, for many people, the enjoyment of the activity. We are encountering a problem, but one that we think we can solve. That process, cognitive psychologists have found, is what triggers learning, not the addition of relevance. It's the feeling you have when you have heightened sensations that help you FOCUS more clearly and INTENSELY. Anxiety or stress in this sense is creative because it puts you more fully in the moment, more alert, and more attentive to what is needed in that moment.

Experience-based learning actually tries to induce that kind of understandable stress in order to foster creativity. It is the creation of an environment where you try to coax people into the DKDK zone. The DKDK zone is where the most transformative learning occurs, and it is often characterized by understandable stress when properly managed. How would you know what you were capable of if you only did what was comfortable--and what you thought at the time was relevant? Wendell Berry is right--"we don't know what we need to know," and we will need to know much more than we can learn. The problem with teaching and learning seen through a lens of relevance is that it provides a way for both students and teachers to avoid stress and anxiety, because it is always seeking a link or connection to what is already known. Transformational learning, by contrast, inculcates the ability to tolerate--and embrace--the sense of understandable stress inherent to the DKDK zone, a stress that is actually a necessary spark to creativity.

So, in the next post I will introduce the concept of divergent thinking as a way of mitigating toxic stress. And then I will practice that technique concerning the PURPOSE of studying and doing History; that will be a messy operation, I suspect.

I am just finishing Roger Schank's new book Teaching Minds: How Cognitive Science Can Save Our Schools, and in it he issues this warning about the task of articulating purpose that we are about to undertake. He writes, "We say things to students like "You will need this later." But this is usually a bold-faced lie. You don't need algebra later. Making up nonsense convinces nobody." Now, THAT is throwing down the gauntlet.