No way out but through

A-Bomb group leaders, via NY Times/Bettmann/Corbis

Last week’s New Media Faculty-Staff Development Seminar (NMFS) here at Virginia Commonwealth University discussed Vannevar Bush’s epochal (and, in its way, epic) “As We May Think.” The essay truly marks a profound shift, appearing just as WWII was about to conclude with a display of horrific invention that still has the power to make one’s mind go blank with fear. From Resnais’ Hiroshima Mon Amour to a film that can still give me nightmares, The Day After, the mushroom cloud that signifies this invention hung over my childhood and adolescence–and I don’t expect it will ever go away. Now that we know how, there is no unknowing unless civilization erases itself.

But as myth, fiction, and science continue to demonstrate, each in its own way, the real problem lies to hand every day, in thousands of examples: human ingenuity. It’s easy to get distracted by the name “technology,” as if it’s what we make, rather than our role as makers, that’s to blame. But no, it’s the makers we should lament. Or celebrate. Or watchfully, painfully love.

The state of man does change and vary,
Now sound, now sick, now blyth, now sary,
Now dansand mirry, now like to die:
Timor mortis conturbat me.

William Dunbar, “Lament for the Makers”

What shall we do with these vexing, alarming, exhilarating abilities? We learn, we know, we symbolize. Sometimes we believe we understand. We find a huddling place. We explore, and share our stories.

Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems.

Vannevar Bush, “As We May Think”

For several iterations through the seminar, that word “presumably” leapt out at me, signalling a poignant, wary hope as well as a frank admission that all hope is a working assumption and can be nothing more. This time, however, the word “review” glows on the page. Re-view. Why look again? How can repetition make the blind see? Ever tried to find something hiding in plain sight? Ever felt the frustration of re-viewing with greater intensity, while feeling deep down that the fiercer looking merely amplifies the darkness? (Ever tried to proofread a paper?)

We console ourselves with the joke, attributed to Einstein, that the definition of insanity is to do the same thing again and again while expecting different results. Yet we hope that thinking, mindfully undertaken, may contradict that wry observation. We hope that thinking again can also mean thinking differently, that a re-view strengthened by a meta-view can yield more insight and bring us a better result than the initial view did. Look again. Think again. And, in Vannevar Bush’s dream of a future, a dream that empowered epochal making, looking again and thinking again would be enriched, not encumbered, by a memory extender, a “memex”:

[Man] has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory.

What is this experiment? When exactly did we sign the papers giving our informed consent to any such thing?

Our ingenuity is the experiment, the problem, the hope. Our birthright may also be our death warrant. Is that the logical conclusion?

Yet, in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome.

The word “science” signifies more than simply the methodological revolutions emerging in Renaissance Europe. For me, it signifies knowing. We in the humanities enact our own experiments in knowing, exerting our own ingenuity both constructively and destructively. We too are makers.

Re-view. Analyze more completely. “Encompass the great record and … grow in the wisdom of race [i.e., species] experience.” As we may think, and create and share “momentary stays against confusion.”

No way out but through.

Optimism

3. Hopefulness and confidence about the future or the successful outcome of something; a tendency to take a favourable or hopeful view. Contrasted with pessimism n. 2.

So the Oxford English Dictionary. I picked sense 3 because it seems most resilient in the face of abundant evidence that this is in fact NOT the best of all possible worlds (pace Leibniz, at least as he’s pilloried by Voltaire).

It seems to me that educators, no matter how skeptical their views (skepticism is necessary but not sufficient for an inquiring mind), are implicitly committed to optimism. Otherwise, why learn? And why teach?

Satan Overlooking Paradise

I think of this as I begin another semester thinking with faculty and staff across the university (last term Virginia Tech, this term Virginia Commonwealth University) about the possible good we could co-create, and derive, from interactive, networked, personal computing. To be pessimistic (not skeptical, pessimistic–they are not synonyms) about personal, networked, interactive computing is to be pessimistic not about an invention, but about invention itself–that is, about one of our most powerful distinctions as a species.

Computers have become woven into our lives in ways we can barely imagine, but the best dreams about the texture of such a world are hopeful, and stimulate hope. Are we there yet? Of course not. But to be pessimistic about computers is to be pessimistic about humanity. And while that’s certainly a defensible position generally speaking, it seems to me that education is an activity, a co-creation, a calling, that runs clean counter to pessimism.

Last week in the seminar we read Janet Murray’s stirring introduction to The New Media Reader. A colleague from the School of Dentistry. A colleague from the library. A colleague from the Center for Teaching Excellence. Colleagues from University College. And more. Once again, I read these words:

We are drawn to a new medium of representation because we are pattern makers who are thinking beyond our old tools. We cannot rewind our collective cognitive effort, since the digital medium is as much a pattern of thinking and perceiving as it is a pattern of making things.

Indeed–yet this is not to deny the meta level at which we consider our consideration, and think about our blind spots so we can find more light:

We are drawn to this medium because we need to understand the world and our place in it.

Yes–and now the world we need to understand is also a world transformed, for good and for ill, but potentially for good (why not?), by the medium itself. Recursive, yes–but more deeply, a paradox, not an infinite regress. That’s the hope, anyway. And educators are committed to hope.

To return to [Vannevar] Bush’s speculations: now that we have shaped this new medium of expression, how may we think? We may, if we are lucky and mindful enough, learn to think together by building shared structures of meaning.

That mindfulness is the meta level. I am optimistic about that meta level. As a learner, I have to be. If mindfulness is impossible, then it’s truly turtles all the way down, and who would care?

How will we escape the labyrinth of deconstructed ideologies and self-reflective signs? We will, if we are lucky enough and mindful enough, invent communities of communication at the widest possible bandwidth and smallest possible granularity.

Lucky, and mindful. Chance favors the mindful mind.

We need not imagine ourselves stranded somewhere over the evolutionary horizon, separated from our species by the power of our own thinking.

Or separated from our history, or from our loved ones–though clearly Hamlet (to name only one) demonstrates that mindfulness alone is no guarantee of anything. But what is on the other side of the horizon? What do we find when we return to the place we left and see it for the first time?

The machine like the book and the painting and the symphony and the photograph is made in our own image, and reflects it back again.

To which I would add: the syntax and punctuation in Murray’s sentence above enact the pulses of ways we may think. Those pulses and the ways they enact are poetry. What more complex shared structure of meaning is there?–unless it’s true that all art aspires to the condition of music. Poetry “begins in delight and ends in wisdom,” Frost writes. He continues: “the figure is the same as for love.” Can the shared structures of meaning emerging from our species’ collective cognitive effort begin in delight and end in wisdom, too? Can the figure our collective cognitive efforts make be the same as for love? I think: I hope so. I think: it better be. I think: how can I try to help? The seminar is one answer, a crux of hopes, the discovery of an invisible republic of optimism.

The task is the same now as it ever has been, familiar, thrilling, unavoidable: we work with all our myriad talents to expand our media of expression to the full measure of our humanity.

And by doing so, that measure increases. May we use that abundance wisely, fairly, and lovingly within this mean old brave new world.

With luck and mindfulness, I am hopeful that we can.

for my Jewish mother, Dr. Janet Murray, with love and deepest gratitude

So let’s recap

Soaring into the eye of the gods

In mid-March I got an email telling me I was nominated in the search for a senior leadership position at Virginia Commonwealth University: Vice Provost for Learning Innovation and Student Success. I was intrigued. I looked at the leadership profile. I was mightily interested. Can’t hurt to apply, I said to myself. So I did.

The hectic, rewarding pace of life went on. Janet Murray came to VT as the third Distinguished Innovator in Residence. (Very exciting.) The Center for Innovation in Learning prepared its first call for Innovation Grant proposals. (Ditto above.) Learning Technologies began its metamorphosis into Technology-enhanced Learning and Online Strategies. I traveled to Richmond, Boston, Bethlehem (Pennsylvania), and in June, to Rome (Italy) for conference presentations and faculty seminars. And to my wonder and delight, my candidacy continued to advance in the VCU search.

On June 7, as I sat in my lodgings in Barcelona, I spoke with VCU’s Provost, Dr. Beverly Warren, who offered me the job. As a literature scholar, it is my duty of course to tell you that the last time I was in Barcelona, in October of 2010, I was offered the Virginia Tech job. I guess Barcelona is my lucky town in the narrative of my professional life. (No one who’s been there will be in the least surprised.)

On June 24, back in the States, I signed the contract.

On August 1, I formally began my work, though I’d been ramping up at VCU and ramping down at VT since my return to the US. On this same day, my wife and I closed on our new home in Richmond.

Oh, and the conference in Rome was wonderful, far beyond my already-high expectations. The city and country were also pretty stupendous (litotes alert). As was Spain the week before, as was England the week before that. A summer of summers.

And sadly, the cloud over the trip was the death of my beloved mother-in-law on the same day that her youngest daughter, my wife, arrived in Madrid to join me in my travels. That grieving continues. If my experience with my parents at their passing is any guide, one learns to live with death, but one never gets over it.

I guess I’m a little behind in my blogging. Perhaps you can see why? The problem seems to be time, but it isn’t really. Time has become extremely compressed, yes, and spare time has become a vanishing commodity. My perception of time many days borders on the surreal as I adjust to the scale, scope, pace, and challenges of the new job–all very exciting, all very welcome, and all very demanding. Yet the real problem is, as ever, too much to say.

Time to write anyway. Not that I’ve been idle in that department, but I have been silent in this space, and I miss it. I did get 6000+ words done in an article on temptation in Paradise Lost, however–turns out I miss that kind of writing, too. Yes, Gardner writes, even if you haven’t seen it here for several months. Time to write anyway.

McLuhan and our plight

“Plight” is an interesting word. We are in a plight, meaning we’re in a tangle, a mess, a terrible fix, with “fix” itself an ironic noun in this context. Yet we also plight our troth, meaning “pledge our truth.” Plight-as-peril and plight-as-pledge both come from an earlier word meaning “care” or “responsibility” or (my favorite from the Oxford English Dictionary) “to be in the habit of doing.” Along a different etymological path, we arrive at the word meaning to braid or weave together. The word “plait” is a variant that makes this meaning more explicit. It’s not too far into poet’s corner before weaving, promising, and care-as-a-plight become entangled, at least in my mind, and perhaps usefully so.

The first McLuhan reading in the New Media Faculty-Staff Development Seminar is from The Gutenberg Galaxy, specifically the chapter called “The Galaxy Reconfigured or the Plight of Mass Man in an Individualist Society.” I don’t know if McLuhan is punning here, but it’s not implausible that the man who coined the term “the global village” and paid special attention to the role of mediation in human affairs–mediation considered as extensions of humanity–might think not only about the plight we find ourselves in but also about the plighting of troth we might explore or co-create or braid.

The trick (and McLuhan is nothing if not a trickster, as others have noted) is that the plighting cannot be straightforward or “lineal,” lest it not be a genuine pledge or an authentic weaving. His very writing is obviously a plight for many readers, but it’s also a brave (and sometimes wacky) attempt to do a plighting of the plaiting kind as a sort of pledge of responsibility. He writes these stirring words for our consideration:

For myth is the mode of simultaneous awareness of a complex group of causes and effects. In an age of fragmented lineal awareness, such as produced and was in turn greatly exaggerated by Gutenberg technology, mythological vision remains quite opaque. The Romantic poets fell far short of Blake’s mythical or simultaneous vision. They were faithful to Newton’s single vision and perfected the picturesque outer landscape as a means of isolating single states of the inner life.

From which I draw these conclusions regarding McLuhan’s argument (or plighting):

1. “Lineal” does not mean “synthesized” or “unified.” The straight path or bounded area leads only to fragmentation and reduction. It is not a weaving and cannot be. The lineal and the fragmented are perilously broken promises.

2. Mythological vision is a technology for enlarging awareness of complexity. Mythological vision is both plighted-woven and a means for plighting-weaving.

3. Fragmented, lineal awareness invents technologies of self-propagation that reinforce more lineality, more fragmentation, while giving the illusion of doing quite the opposite. Single-point perspective is not the same as a unifying vision or a simultaneous awareness of a complex group of causes and effects. It is, instead, reductive while pretending to be unified.

4. Even self-conscious or self-proclaimed liberatory movements such as Romantic poetry (or any number of other such apparently radical departures) may quail before the complexity and simply reinscribe a slightly shifted set of boundaries, thus perpetuating a reduction of complexity and a lack of awareness that dooms our technologies to reproducing our failures.

What technologies might reveal, restore, or help us co-construct a mythological vision, a species-wide simultaneous awareness of a complex group of causes and effects? It’s a political question that reaches into the realm of complexity science, art, and potentially even philosophy or (gasp) theology. Does Doug Engelbart’s idea of “augmentation” and complex symbolic innovation answer such a call? Does Bill Viola’s anti-condominium campaign? Is there an eternal golden braid to be had, or woven? What loom should we choose, or make?

Of Flutes and Filing Cabinets

Last week in our New Media Faculty-Staff Development Seminar, Nathan Hall (University Libraries) and Janine Hiller (College of Business) teamed up to take us through the Alan Kay / Adele Goldberg essay “Personal Dynamic Media.” Janine and Nathan took an inspired approach to their task. Nathan’s a digital librarian, and he brought his training and interest in information science to bear on Kay and Goldberg’s ideas. Janine’s work is in business law, so intellectual property would have been a logical follow-on for discussion. But wily Nathan segued into wily Janine’s swerve in a direction that in retrospect makes perfect sense but at the time came with the force of a deep and pleasant surprise: the information science of metaphor.

As I look back on the session, I have to admire the very canny way in which the info science/metaphor combination acted out the very nature of metaphor itself: the comparison of two unlike objects. Having made the comparison, of course, one begins to see very interesting disjunctions and conjunctions. The mind begins to buzz. Wholly novel ideas emerge, such as the metamedium of the computer being like a pizza. Seriously.

Janine shared with us a lovely TED video on metaphor …

… and challenged us in small groups to come up with our own metaphors for computing as a metamedium (think of them as seminarian family-isms). We very quickly got to pizza in our group, courtesy of the talented Joycelyn Wilson. (Amy Nelson riffs on that metaphor in her own blog post.) Another group found itself circling back, recursively but sans recursing (dagnabbit), to the powerful and complex metaphor of the “dream machine.” (Go ahead and revive that metaphor by thinking about it again. And again. Stranger than one might suppose, eh?) (Oh, and to get another link in, I believe it was 21st-century studies lamplighter Bob Siegle who led us there.) In our closing moments, we began thinking about metaphor as a metaphor for computing, and computing as a metaphor for metaphor. I do believe Alan and Adele would have enjoyed the conversation.

At the end, Nathan sketched out a continuum between the procedural and the conceptual/metaphorical that he had found in “Personal Dynamic Media.” At one end was the filing cabinet (cf. Memex, cf. info science). At the other end was the flute (a metaphor that Janine beautifully led us to unpack in our discussion). And then, a few minutes after the seminar was over and I was walking to the car, a connection appeared for me.

There is indeed an apparent dichotomy between filing cabinets and flutes, between quotidian documents and art, between the minutiae of our task-filled lives and the glorious expressive possibilities of musical performance, especially with an instrument like the flute (I am a mediocre but enthusiastic flautist) that one plays in such intimate connection with one’s body and breath. It’s simple, direct, a column of air that resonates within the instrument as well as within the hollow, air-filled spaces within one’s own face and chest.

What could be more pedestrian, ugly, and (depending on the tasks) repellent than a filing cabinet? What could be more liberating and beautiful than a well-played flute?

“Why is a raven like a writing-desk?” the Mad Hatter asks in Alice in Wonderland. The riddle is never answered. (Brian Lamb once answered it–“Poe wrote on both”–but alas his ingenuity came many decades too late for poor Alice.)

Why is a flute like a filing-cabinet? The question makes even less sense. At least, at first.

But considered within the world of Alan Kay’s aphorism that “the computer is an instrument whose music is ideas,” I find myself inspired to think that one may indeed make a flute of a filing-cabinet, awakening and ennobling the detritus of our dreary records and messy operational details with the quicksilver music and responsiveness of a well-played flute.

What if we could bring that vision into our lives? Our learning? Our schools? What if our filing cabinets were less like the warehouse in which the Ark of the Covenant is boxed and lost, and more like thought-vectors in concept space sounding something like the music of the spheres?

It may not be as hard as we may think–unless we actually prefer meaninglessness and stasis to delight and melody.

As Hoagy Carmichael once wrote, “Sometimes I wonder.”

Intuition: Use, Agency, Invitation

You’ve seen the ad copy. I have too. The hard sell for the soft, gentle learning curve promised for a new device is that the device is “intuitive.” That is, the device is easy to use: the interface design helpfully indicates how to make the device do what you want. You want to save a file? Click on the icon. Of course, in MS Word (and MS Office generally) the icon is a floppy disk. One used to save files on floppy disks. They used to look like that, too–the 3.5 inch not-floppy diskette. Yes, this is getting complicated already. Let’s stop the cascade by admitting that “intuitive” means “familiar,” and that “familiar” itself is more of a moving target than we’d like to think. And there’s a Gordian knot for another time. (Recommended reading: “The Paradox of the Active User,” a major addition to my intellectual armamentarium courtesy of Ben Hanrahan, a wonderful student in last year’s “Cognition, Learning, and the Internet” course.)

So let’s move on. “Intuition” (home of the intuitive) can mean something much deeper than “I bet that’s how I can do that.” It can mean “I bet this device ought to be able to do that.” In “Personal Dynamic Media,” Alan Kay and Adele Goldberg tell the story of one such intuitionist:

One young girl, who had never programmed before, decided that a pointing device ought to let her draw on the [computer] screen.

This kind of intuition is a creative intuition that isn’t about “ease of use” or “I bet I already know how to do that.” It’s an educated guess, a contextual surmise, and a leap of faith. Note the fascinating language in this description. She decided (moment of agency and commitment) that a pointing device ought to let her. This kind of intuition is something like the belief in “Mathgod” that Douglas Hofstadter describes so winsomely in Fluid Concepts and Creative Analogies. It’s also (no coincidence) what Jon Udell keeps talking about when he talks about how people “don’t have intuitions” about the World Wide Web. To connect Kay-Goldberg with Udell, to have intuitions about the Web would be to decide that the Web (and the Internet that supports it) ought to let one do this or that–meaning, “given what this system is and what it supports, this thing I imagine or invent should be possible.” Note that you have to know something about what kind of a thing, or network, or web you’re working with. Indeed. But note also that the paranoia, hebephrenia, or catatonia induced by the many double-binds that formal schooling presents to learners are responses that pretty much guarantee that such intuitions will simply not develop.

Try to imagine an entering class vigorously discussing among themselves: “Given the mission statement of our university, this thing I imagine or would like to invent with regard to my own learning ought to be possible.” Feel your brain cramping in both hemispheres? Do students read mission statements? If they did, would they seek to shape their learning in terms of it? Do the structures we build to support what we say we intend, we value, we desire, actually stimulate any such activity? Exactly. Learners in formal schooling are not very likely, most of the time, to decide that school ought to let one do this or that related to learning. And if they try to make such a decision, based on such an intuition, they are often hammered back into line. Not always, but often. And any such repression is too much.

But here’s the third level, and it comes next in “Personal Dynamic Media”:

She then built a sketching tool without ever seeing ours…. She constantly embellished it with new features including a menu for brushes selected by pointing. She later wrote a program for building tangram designs.

This level of intuition is the invitationist level. This intuition is an intuition not so much about the device per se but about the learning context, an ecosystem of device, peers, teachers, etc. Kay and Goldberg praise the young girl for building her own sketching tool “without ever seeing ours.” Another teacher might have said, “Did you do your homework? Did you consult the manual? Did you follow directions?” These are often important questions, but they miss the most powerful intuition engine of all: the invitation.

In “The Loss of the Creature,” an essay that articulates the paradox of the active learner with haunting precision, Walker Percy writes about the recovery of being, by which he means the recovery of the person as well as the recovery of the person’s experience. He believes both person and experience to be lost to “packages” which we simply “consume” with an ever-increasing anxiety that our consumption be certified as genuine by others. Worse yet, we become increasingly numb to our consumption, unaware that our souls are rotting from the inside out. As Kierkegaard observes and Percy reminds us, the worst despair is not even to know one is living without hope. No surface receiving our “cognition prints.” No mark of our learning or inquiry or existence left behind. We do not even think to ask.

Toward the end of the essay, Percy tells a story about two modes of experience, a story of music and being:

One remembers the scene in The Heart is a Lonely Hunter where the girl hides in the bushes to hear the Capehart in the big house play Beethoven. Perhaps she was the lucky one after all. Think of the unhappy souls inside, who see the record, worry about scratches, and most of all worry about whether they are getting it, whether they are bona fide music lovers. What is the best way to hear Beethoven: sitting in a proper silence around the Capehart or eavesdropping from an azalea bush?

However it may come about, we notice two traits of the second situation: (1) an openness of the thing before one–instead of being an exercise to be learned according to an approved mode, it is a garden of delight which beckons to one; (2) a sovereignty of the knower–instead of being a consumer of a prepared experience, I am a sovereign wayfarer, a wanderer in the neighborhood of being who stumbles into the garden.

A big house with a Capehart that looks like a casket ready for an embalmed Beethoven and his embalmed listeners. Or: a sovereign wayfarer in the neighborhood of being, and a garden of delight which beckons to one.

We need to work on our beckoning. Beckoning is what Bakhtin calls addressivity: the quality of turning to someone. From design to cohort to community and everywhere in between, especially in the schools that face our present times and equip us to invent our futures: how can we work on our invitations?

The Web is not the same as the Internet, and why that matters

There must be some kind of way out of here.

I have been following John Naughton ever since I found his book A Brief History of the Future in a secondhand bookstore in South Philadelphia in the fall of 2011. (My thanks to Kathy Propert for taking me there.) Naughton is Emeritus Professor of the Public Understanding of Technology at the Open University in the UK. He’s a blogger at the aptly named Memex 1.1, he’s Vice-President of Wolfson College in Cambridge, he’s an adjunct professor at University College, Cork, his latest book is the extraordinary From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet, and he’s a crackerjack journalist for The Guardian. This morning, Naughton’s blog linked to his latest Guardian column, “Kicking Away the Ladder,” which concerns among many other things the persistent, pernicious error of confusing the Internet with the World Wide Web. Naughton explores why that error matters, in fact why it may be a fatal error, one that could mean the end of the “open, permissive” infrastructure that has allowed these extraordinary telecommunications innovations we’ve witnessed over the last few decades to grow and flourish.

The essay is essential and sobering reading. Please go read it now (it’ll open in a new tab). I’ll be here when you get back.

Is Naughton overreacting? Not at all. The danger is clear and present. And he knows his history, so Naughton understands well what we have gained from the Internet and the World Wide Web. He knows how they were made, and what principles animated and informed their design. And he knows what we stand to lose in the face of the strategies controlled by those who understand elementary facts about internet and computing infrastructure, history, and design, facts that far too many people are too incurious even to inquire after. These facts are not difficult to understand. Their implications take a little more work to get your head around, yes, but it’s nothing that a basic program in digital citizenship couldn’t address successfully–assuming that program was about how to make open, permissive use of the open, permissive platform. That is, assuming digital citizenship is about the arts of freedom and not simply the duties and dull “vocations” of compliance and consumption.

I read parts of the essay aloud to my dearest friend and companion, the Roving Librarian, and she asked me a great question: “So, if you had to explain the difference between the Internet and the Web, how would you do it?” And as so often happens in the presence of a great and greatly foundational question asked in the spirit of mutual inquiry and respect and love, a cascade of thoughts was triggered. (Not a bad learning outcome, that.)

Here’s what I have so far. It’s coming out quickly and will need much development, but I need to write it down now. I welcome your comments and questions and elaborations and collegial friendly amendments. (No blame should attach to the Roving Librarian, by the way, for any mistakes I make. Lots of credit goes to her, though, for anything that’s worthwhile.)

The Internet is about data transmission. It’s a network that enables any node to transmit any kind of data to any other node, and any group of nodes (any network) to transmit any kind of data to any other group of nodes. It’s a network and a network-of-networks. It thus engages, stimulates, and empowers data exchange that’s one-to-one, one-to-many, many-to-many, and many-to-one. As Naughton points out in another essential essay, this structure permits unique and disruptive emergent phenomena, some of which will be disturbing and harmful, some of which will simply be puzzling or appear irrelevant (or be denounced as such), and some of which will be enormously beneficial. Naughton is not alone in his explorations. Clay Shirky indefatigably points out the enormous good that we can derive from the Internet. He points out the dangers, too, but when people call him names, they call him a “techno-utopian,” which as far as I can tell means he remains hopeful about our species’ powers of invention. Joi Ito, director of MIT’s Media Lab, emphasizes over and over again that the Internet is not so much a technology as the technological manifestation of a system of values and beliefs; not a technology, but a philosophy.

To summarize, then: the Internet permits open data transmission one-to-one, one-to-many, many-to-many, and many-to-one. Seems clear enough. And that Clay Shirky talk on social media and revolutions I linked to above makes my point very vividly and clearly. (In fact, I learned to explain things this way from Shirky, from his blog and his two books Here Comes Everybody and Cognitive Surplus and in other venues as well.)
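
Since the distinction is architectural, it can even be sketched in a few lines of code. Here is a minimal, hypothetical illustration of my own (not Naughton’s or Shirky’s), using nothing but Python’s standard socket library: two “nodes” on the loopback interface exchange bytes, and nothing at this layer knows or cares whether those bytes are a sentence or the first bytes of an image. The payloads are invented for the demo; that agnosticism is the point.

```python
# A minimal sketch of the Internet's layer of the stack: agnostic byte
# transmission between nodes. The payloads below are invented for the demo.
import socket
import threading

def node_b(server: socket.socket) -> None:
    """One node: accept two connections and echo whatever bytes arrive."""
    for _ in range(2):
        conn, _ = server.accept()
        with conn:
            conn.sendall(conn.recv(1024))  # no judgment about what the bytes mean

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # loopback; the OS picks a free port
server.listen(1)
threading.Thread(target=node_b, args=(server,), daemon=True).start()

# Node A transmits data of any kind: text, or the start of a PNG image.
for payload in (b"I am thirsty.", b"\x89PNG\r\n\x1a\n"):
    with socket.create_connection(server.getsockname()) as client:
        client.sendall(payload)
        print(client.recv(1024))  # the same bytes return, whatever they were
server.close()
```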

So, then, how is the Web different from the Internet? Naughton says that it’s an application that runs on the Internet. The innovation Tim Berners-Lee brought into the world about a decade prior to the turn of the century could not have been imagined or built without the open, permissive foundation that the Internet was designed to be.

But then comes the logical next question: how then is the Web significantly different from the Internet, aside from providing a layer of eye candy that makes the Internet more appealing and the metaphor of a “page” that makes the Internet seem more familiar? Gregory Bateson says that a unit of information may be defined as a difference that makes a difference. So what difference does the difference of Web make?

If I can’t answer that question, then no explanation of the difference matters, because it fails the “so what?” question.

And the answer that comes to me, mediated through the readings I’ve done in learning environments like these (I consider a course a learning environment, a carefully crafted cognitive space or occasion that’s also a foundation for collaborative building), mediated through Jerome Bruner, and mediated through Mike Wesch’s evergreen “The Machine is Us/ing Us” and all that his creation mediates (like Kevin Kelly’s essay), is that the crucial difference is the link.

That’s all, and that’s everything.

The link allows us (and once we’ve seen it happen, it invites and entices us) to construct a thought network out of (upon, within, on top of, emerging from) a data network. That’s all, and that’s everything. It is the essential move that turns sensation–a matter of data transmission along nerve fibers–into what, given enough interconnections and enough ideas about interconnections, becomes cognition, a level-crossing connectome out of which abstractions, concepts, and conceptual frameworks will emerge.

The Internet passes data agnostically (video, text, audio, whatnot) and the Web allows us to create conceptual structures out of data by means of simple, direct, open, thoughtful, permissive linking. The linking is idiosyncratic, like cognition, but like cognition, it is not merely idiosyncratic. The linking is never random–human beings can’t be random–though it may be surprising or the relation may be obscure (at first). Some sets of links are more powerful than others, but none is as powerful as the very idea of linking, just as the most powerful concept we have is the notion of concept, something I delight in exploring with students and colleagues when we get to Engelbart’s “Augmenting Human Intellect: A Conceptual Framework.”
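
To make that claim concrete, here is a companion sketch, again mine and again with invented URLs: Python’s standard html.parser pulling the anchors out of a page. Strip a page down to its hrefs and what remains is precisely the layer the Web adds to the Internet: transmitted data whose only job is to point at other data, an idea about relationship.

```python
# A sketch of the Web's layer: the link. Each href below is data about data,
# a pointer that turns a pile of pages into a thought network. URLs invented.
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect the href of every <a> tag encountered in the document."""
    def __init__(self) -> None:
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs
                              if name == "href" and value)

# A hypothetical page: the prose is data; the hrefs are ideas about relationship.
page = """
<p>Bush's <a href="https://example.edu/memex">memex</a> dreamed of
associative trails; Engelbart's <a href="https://example.edu/augment">
augmentation</a> built conceptual frameworks on top of them.</p>
"""

finder = LinkFinder()
finder.feed(page)
print(finder.links)  # the conceptual structure layered over the raw data
```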

The Internet transmits information. The Web enables (stimulates, encourages) a set of connections that, from the first link to the enormous set of links we now experience, symbolize ideas about relationship.

The Internet permits the pre-existing connectomes within each mind and among many minds working together to pass their nerve impulses freely along a meta-set of data connections, a network of networks, an Internetwork. The Internet is a protocol and a foundation for the data transmission that enables communication considered as information transmission.

But this is only the beginning, an open, permissive, and thus powerful light-speed beginning. The next advance occurs when information transmission can be made into a foundation for sharing not just perception but experience, for sharing not just neural connections but the experience of cognition that emerges within each mind. And that level of sharing means not just sharing information but empowering and stimulating new ways of creating and sharing meaningful structures of information. (More Engelbart here, obviously.) The link is not merely a link, but a concept that enacts itself–as concepts do when we build them, and build on them.

To sum up:

The Internet is like sensation or at most perception.

(A crucial first step, and we could have gotten a network that allowed us to look at only a few things in a few ways, a walled garden a la Facebook. Instead we got something open and permissive, like a neural network of small pieces loosely joined whose emergent power emerged from the possibility of connection, not from strict specialization or over-particular design. More like cells and atoms, in other words.)

The World Wide Web is like perception leading to thinking.

(It’s like making concepts. Here Vannevar Bush missed an opportunity that we’d need a Doug Engelbart to explore. What Bush described as “associative trails” are not a mere search history. They are links, yes, but links that reveal conceptual frameworks, that symbolize conceptual frameworks, that stimulate conceptual frameworks. They are not merely a scaffolding–though to be fair, Bush does describe the scaffolding in rich ways that probably do rise to the level of what I’m talking about here. The links are fundamentally social both in the intracranial sense–the connectome in my head–and the intercranial sense–built out of the social experiment we call civilization, and returning to it as another layer of invention and potential.)

The foundational commitment in both the Internet and the World Wide Web is the same: both are built as “open, permissive” structures (to use Naughton’s words). These structures are not unlike the distributed (neuroplastic) design of the brain itself, one that, as it happens, permits all the higher orders of cognition to emerge, higher orders built of “adjacent possibles” and “liquid networks” that in turn enable even higher orders of cognition to emerge. From this open, permissive, distributed structure emerges our distinctiveness as a species. And our links within the World Wide Web enact this emergence, represent this emergence, and thus stimulate further emergent phenomena as we create and share even more powerfully demonstrated ideas about shared cognition.

The Internet is like sensation. The World Wide Web is like thinking.

Or:

The Internet transmits data of all kinds: text, images, sounds, moving pictures, etc. The World Wide Web is a newly powerful word (or medium of symbolic representation, or language) that allows us to imagine and create newly powerful n-dimensional representations of the n-dimensional possibilities of “coining words” (making and realizing representations) together.

And:

A foundational commitment to an open, permissive architecture of creation and sharing enables the next layer (species, experience) of complexity and wonder and curiosity to emerge. This open, permissive architecture enables both the cognoplasticity of individual minds and the shared thinking and building that enables the macro-cognoplasticity of civilization.

There’s a fractal self-similarity involved that makes it difficult to tell Internet from Web, just as it’s sometimes hard to tell where I end and where you, or my history, or my friends, or my reading, begin. (Bakhtin’s “Speech Genres” maps these complexities most wonderfully–definitely worth extending your cognoplasticity in that direction, dear reader, with Professor Martin Irvine’s fine guide as a beginning.) But the difference is there, and it is vital. I suspect the problem is that the difference is not well conceptualized because the conceptual framework rarely rises beyond, or in a different direction from, the technical distinctions. But then technical distinctions are rarely explored in ways that reveal the conceptual frameworks they represent and stimulate–hence Naughton’s frustration as well as the importance of his observations.

Now let’s connect all this to Bruner and his ideas in Toward a Theory of Instruction, ideas that influenced Alan Kay and other learning researchers who helped to envision and build the personal, interactive, networked computing environment we now inhabit with varying degrees of openness and permissiveness.

In Toward a Theory of Instruction, Bruner distinguishes three levels of communicating (and thus three paths to learning):

1. The enactive: we communicate by doing something physically representational in view of others. If we want water, we mime the action of drinking or lapping up water, and do so in the presence of others whom we believe might relieve our thirst.

2. The iconic: we communicate by pointing to something that materially represents at one remove, while still being physically connected to, the thing we mean or seek to draw attention to. Instead of miming the act of drinking, we might point to a cup or a water fountain, perhaps making a noise of some kind to indicate the degree of urgency we feel. This level is considerably more advanced than the first level because it entails a more sophisticated “theory of other minds,” a belief (supported by learning in a social context) that we can communicate shared experience directly through a shared locus of attention that does not directly connect to our physical bodies. To point to a cup indicates an experience of shared experience. To mime drinking almost gets there, but one might do this in one’s sleep as one dreams of drinking water. The enactive doesn’t necessarily indicate a theory of other minds–though miming drinking in the presence of someone whom one believes to be paying attention may approach the iconic and cross over to it, as when someone mimes drinking from a cup.

3. The conceptual (Bruner calls this “the symbolic,” but since it’s easy to confuse “symbol” and “icon,” I’ll use “conceptual” most of the time): we communicate by means of a set of shared concepts or abstractions. Here we don’t mime drinking, and we don’t point to a cup. We speak or write, “I am thirsty.” This is a wild and crazy thing, no? A set of squeaks and grunts. A set of ink marks (or pixel shadings). Words. Every one of those three words “I am thirsty” enacts, represents, creates, and communicates a state of enormous cognitive complexity that’s hidden from us because of our mastery. The familiarity cloaks the miracle. You can’t drink the word “water,” but behold, the word may bring you what you desire, or cause you to help another human being. (Obviously I’m thinking of Hofstadter here as well, and I can recommend Fluid Concepts and Creative Analogies (with profound thanks to Jon Udell), I Am A Strange Loop, and for a rapid overview, his talk at Stanford on “Analogy as the Core of Cognition.” It’s all “metaphors we live by.”)

I think most education in our schools pretends to get to the conceptual but in fact stops at the iconic or perhaps even the enactive level. Pointing pointing pointing. Proctoring proctoring proctoring, the student always in the instructor’s presence. See-do, see-do, with “critical thinking” at a level of “see-do in the sophisticated complex way I your teacher have already imagined for you, and pointed to for you, as my expertise permits me exhaustively to define excellence for your seeing and doing.” A closed and impermissive architecture mediated through language, but not really conceptual and sometimes hardly even iconic–because it doesn’t support or represent emergent phenomena, what Bruner calls problem-finding:

Children, of course, will try to solve problems if they recognize them as such. But they are not often either predisposed to or skillful in problem finding, in recognizing the hidden conjectural feature in tasks set them…. Children, like adults, need reassurance that it is all right to entertain and express highly subjective ideas, to treat a task as a problem where you invent an answer rather than finding one out there in the book or on the blackboard. (157-158)

Like Facebook, our schools and the classrooms and curricula they provide form a walled garden full of “finding” by merely clicking on icons (including the face of the teacher, which when clicked upon may yield “what the teacher wants”), partly for administrative convenience, partly for administered intellectuality that hides our own conjectures (lest emergent conceptual frameworks undermine the power, authority, and wealth of the old architects), partly because it’s a good business model. Ah, the business model. Tim Berners-Lee put the Web in the public domain, and what kind of a business model is that?–unless one considers it an investment made to benefit the species, a mission we say we follow in higher ed, of course.

Did I say that remaining at the enactive or at best the iconic while feigning the conceptual is a “good” business model? I meant a great business model, especially if one enjoys exploiting others without leaving visible marks, since it’s education that gives us the constrictive framework of pointing that enables, encourages, and stimulates the narrow ways we are able to imagine thinking about business models. Or even, at the level of curriculum, to imagine thinking about thinking about business models. I’ll drop the sarcasm and say that’s really bad news. If education fails us because its “great business model” and massively convenient administrative structures cannot or will not allow its participants to work at a truly conceptual level, a truly problem-finding level where the lowest and highest arenas of problem-finding are centrally concerned with learning itself, then we are trapped. There will be no portals. (The cake is a lie.)

So back to the Internet, now mapped along Bruner’s levels. The Internet permits enactive communication. Data transfer in an open and permissive network-of-networks, like sensation in complexly open and permissive internal neural networks, permits a kind of data-telepresence that supports all sorts of miming-based communication.

The Web appears to be a graphical user interface for the Internet, but this is a dangerous misperception. Clicking on images (or even links, for that matter) is really no more than Bruner’s enactive level of communication. The Web is an environment for linking, which means it openly and permissively enables (encourages, stimulates), with each and every act and experience of linking and linked, an iconic level of communication that contains within it the potential of a powerful experience of the abstract and conceptual, an appeal (implicitly or explicitly) to shared experience at a symbolic level that depends on a more complex idea of other minds than the merely enactive or iconic levels of communication do.

People conflate school and education the way people conflate the Internet and the World Wide Web. Education appears to be synonymous with school, which is designed to be an environment for the focused and controlled delivery of content. This is a dangerous misperception that’s similar to the dangerous misperception that says the Internet and the World Wide Web are the same thing. I think the two misperceptions are related. One may cause the other. They may cause each other in a vicious circle. Hard to say. But the danger is the same. And Facebook is like Facebook because that’s the way we like to make a world, or have a world made for us, and school is school because we need to convince ourselves that any other way is stupid, wrong, or crazy.

At any given moment, however, there are people who, like Puddleglum in The Silver Chair, insist that there’s a better and different and more open and freer world above and outside the walls of the cave. And at some lucky moments, those people get to build something that reflects that belief. Something we can build on, too, and not simply react within.

Yes, this is like moving from Flatland into a three dimensional space. We face the same difficulty, too: how to imagine a dimension that we cannot explain in terms of the data of immediate (two-dimensional) perception? Thankfully, the two-dimensional world of Flatland has a word for “dimension,” which some Flatland Folk might become curious about. And once that curiosity is awakened, you never know, some of those folk may ask themselves whether the abstraction of “dimension” might be a portal into something real that they simply cannot experience except through that portal of abstraction.

Isn’t that something like how language works? If you think about it, doesn’t language itself seem to open up n-dimensional possibilities that lead us to co-create new realities out of nothing but thought itself? Like the poet, the lunatic, and the lover, “of imagination all compact,” as Shakespeare has a typically dense administrator pronounce, we “give to airy nothing / A local habitation and a name” (A Midsummer Night’s Dream V.i.16-17). The dense administrator, the mighty King Theseus himself, imagines this ability to be a bug, not a feature. Poor King Theseus! Luckily he married up when he found Hippolyta, who responds to her husband’s pontification with practical visionary good sense:

But all the story of the night told over,
And all their minds transfigured so together,
More witnesseth than fancy’s [imagination’s] images
And grows to something of great constancy;
But howsoever, strange and admirable. (23-27)

Minds “transfigured so together.” Too many linkings to be anything less than constant, strange, and admirable. A problem-finding education.

In Computer Lib/Dream Machines, Ted Nelson writes,

What few people realize is that big pictures can be conveyed in more powerful ways than they know. The reason they don’t know it is that they see the content in the media, and not how the content is being gotten across to them–that in fact they have been given very big pictures indeed, but don’t know it. (I take this point to be the Nickel-Iron Core of McLuhanism.)

Brilliant, but there’s more at the core: the big-picture-conveyance is not just delivery but itself a new symbol, a symbol of a specific instance (and a generalizable example) of the possibility of big-picture-conveyance. There is information about information itself, and the possibilities of conveying and sharing experience, being conveyed and shared in that big-picture-instance. Nelson’s word “conveyed” is still too close to “delivered.” McLuhan’s insight is still deeper: what is “delivered” is always a metastatement about the conditions and means of conveyance considered largely. To put it another way, symbols do not only contain and transmit meaning. Symbols also generate meaning, the way “link” is both a noun and a verb. A medium not only of figuration, but a figure and medium of transfiguration. Our “minds transfigured so together.”

As a species, among our many failings, we also have the wonderful endowment of brains that are bigger on the inside than they are on the outside. “Further up, and further in!” Truth, in-deed. A blogging initiative like the one going on at Virginia Tech right now at the Honors Residential College is an attempt to enable, stimulate, model, and encourage intra- and intercranial cognoplasticity, the experience of “bigger on the inside than on the outside,” thus extending the inside (of a small group selected for academic ability) to the outside (which must exist in fruitfully reciprocal relationship lest the experience be mere elitism, defensiveness, or mutually destructive “othering”). But there’s no way to do this in our newly mediated environment without asking people to narrate, curate, and share on the open web. Until one speaks a language, a word is only a sound (an enactment). Until one reads a language, a word is only a picture (an icon). Until one writes in a language (or medium), one cannot imagine or experience or help build the portal to the thinking-together, the macro-cognoplasticity, the networked transcontextualism, the planetary double-take, that represents the next dimension we need (and desire and dread, too). Our goal is to become first-class peers for each other. Conceptacular colleagues, not just rowers in someone else’s galley.

And here I conclude for now. We have, if we choose, the ability to maintain the open, permissive architecture of the Internet and the open, permissive architecture of the Web that resides within and emerges from the Internet. If we choose to preserve the open, permissive architecture we have been lucky enough to build and lucky enough not to wreck quite yet, we may move to the third level of communication Bruner notes: the conceptual, abstract, symbolic level. For the Web is a network of links, but to call it that is to approach the realization of the next level of understanding, the mode of conceptual communication and enactment (yes there is recursion here) Bruner terms the symbolic. The World Wide Web is not simply a collection of links but an enactment of, an icon of, and an idea about (a symbol of) the complexly open and permissive activity we call linking, out of which we build together the linked and linking and open-to-linking realities of our next stage of cognoplasticity as a species.

This must also be the figure an education makes. Education is the technology that amplifies and augments the natural process of learning. Education brings Flatlanders to consider “dimension” not just as an experience one enacts or points to (a line can go this way, or that; see; now let’s test you to see if you remember that) but as a symbol that can be abstracted from experience and thus (paradoxically) lead to greater, more complex, more possibility-filled and possibility-fueled experience. To use Hofstadter’s language, education must partake of, and stimulate, and empower, the experience and emergence and creation of strange “level-crossing loops.”

The Internet permits, and the World Wide Web enacts and pictures and symbolizes, that experience and that emergence and that creation, those possibilities. (And it all recurs and is recursive, each level leading from and to each level–but that’s a level I can’t get to in this post, except momentarily here.) Education must do the same.

It is no accident that computers and cognition and communication and education have been so intertwingled in the history of our digital age. From Charles Babbage, Ada Lovelace, and Alan Turing onward, all along the watchtower where the resonant frequencies are transmitted and received, a “wild surmise” about learning within and among the amphitheatres and launch-pads of shared cognition has accompanied each development in the unfolding n-dimensional narrative of unfolding n-dimensional possibilities and awakenings. It’s exhilarating in that tower, and exhausting as one strains to see distant shifting shapes. It’s cold, especially in the darkest moments. Or so I imagine.

All Along the Watchtower

[Correction: the Naughton column is from The Guardian. Naughton called it The Observer in his Memex 1.1 post, for reasons I don’t yet understand. At any rate, I’ve made the correction in paragraph one, above.]

Personal, not Private, Ted Nelson edition

It’s Ted Nelson day in VTNMFS-S13, the Virginia Tech New Media Faculty-Staff Development Seminar. I can see from the motherblog that Ted is already stirring up plenty of response, from delight to alarm, sometimes in the same post. As is his wont, of course. That’s part of why Ted Nelson is on the syllabus. That, plus his brilliance and zany rhetorical adventurousness. And what appears to be his utter fearlessness–fearlessness, not recklessness–in speaking truth to power.

Yes, Ted Nelson: a handful. Imagine him in your class, fellow teachers. A handful. Yet say what you will about Ted, he’s clearly someone who trusts the learner. Ted Nelson trusts the learner. Once more with greater emphasis, perhaps ruining the fantics, but here goes: TED NELSON TRUSTS THE LEARNER.

Most schooling does not.

Yes, we do not trust eight-year-olds to drive cars, vote, get married, drink, watch re-runs of Love, American Style and so forth. These activities are not developmentally appropriate. And for the boys in particular, there’s a real lag time in forebrain development, which means their judgment sometimes isn’t what it needs to be until, oh, later. (Sometimes much later.)

But do we trust students to learn?

Most schooling does not.

Again, there are some sound developmental reasons, but just between us (ok?), they can start to sound kinda funny after a while, as they become unquestioned assumptions that we shellac with layers of cracked and tobacco-colored varnish (read: curricula). For example, we obviously can’t trust learners to love and pursue the learning of math (or whatever subject strikes you as hard and frustrating) unless we require it. Right? So the answer is to require learners to learn math. Right? They wouldn’t do it otherwise. They’d just skip it, the way kids skip their spinach and go straight for the soft-serve processed sugary confections. Here’s what’s funny about that reasoning. It assumes there’s no good or practical way, short of a mandatory “forced march across a flattened plain,” to awaken enough intrinsic interest or curiosity to prime the pump for a learner to learn math because he or she wants to. No way to tap into the learner’s own ability (or, if they’re young enough, their propensity) to be intrigued. “Priming the pump” takes too long, is too complex, requires too much teacherly skill, won’t scale, and so forth. So we take a shortcut. We require students to take math. It’s good for them. And they won’t take it otherwise.

That’s a funny set of assumptions, and I don’t mean “ha ha funny.”

What I have observed over several decades of teaching is that we believe a required course somehow obviates the need for awakening internal motivation, interest, curiosity, etc. Those internal states are dodgy, difficult, complex, messy, hard-to-assess things. Why engage with that complexity when we can just say “hey you, take this or else” and be done with it? Why take a chance on love when in the end we don’t believe love matters as much as dutiful compliance? Duty is important, to be sure, and strong, well-aimed habits can carry one through the rough spots–but those same habits, as we shall see, can and often do lead to the tragedy Nelson limns in one of the bleakest parts of his essay:

A general human motivation is god-given at the beginning and warped or destroyed by the educational process as we know it: thus we internalize at last that most fundamental of grownup goals: just to get through another day. (my emphasis)

I ask you, hand on heart, do we really think we will raise a nation of quantitatively literate and engaged citizens because of required math courses? Sure, sometimes a required course is able to awaken an interest that would have lain dormant otherwise. Even a blind pig finds an acorn now and again. But is that really the best we can do? Would we put our faith in a pain reliever that seems to increase pain for 80-85% of the population while somehow miraculously relieving pain for 15-20%? (These are liberal estimates of benefit and conservative estimates of pain.) Hand on heart, looking into each other’s eyes: do we think required courses do much more than salve our conscience and give our students and us a way to play “let’s pretend” much of the time? The evidence I’ve seen, personally and professionally and across many different contexts, does not support the idea that a “forced march across a flattened plain” does more good than harm. Rather the opposite.

Ted Nelson trusts the learner. Most schooling does not. Blogging in my classes is many things for me, but at the top of the list is blogging as an exercise in trusting the learner–or as Carl Rogers puts it, “freedom to learn.” I assure you that this is not a safe thing to do. That’s part of the point. One can fail, and the failing matters. It’s not so much that school gives us a place where one can fail without that mattering. If failure doesn’t matter, in that sense, how is it really failing? Rather, school gives us a chance to experience a different truth about failure by uniting us in a community of delighted strivers and yearners: death alone excepted, failure isn’t final.

Which brings me to a first-time commenter’s response to part one of this exploration. Here’s the response:

But, what if in the course of life, something that seemed personal at 20, and blogged about freely, is something you would prefer to have in the private column of life at 25, 30, 40? I’m quite sure, many of us greying adults are very happy there are no digital footprints of our personal thoughts easily found on Google from our days in college.

I hear this observation a lot as I travel around talking to folks trying to make sense of schooling in a digital age. It’s an observation that emerges from a hard and worthy question. “I’m quite sure, many of us greying adults” indicates a steadfast conviction that those of us with the hoarfrost (or worse) turned a corner sometime in our early development and are very glad that no one can see what preceded that turn. What if we entered college as staunch (insert political party here) and then got smarter and wiser and became staunch (insert political party here)? Isn’t it great we don’t have any images of our younger selves at Young Republican meetings that can be found on Google? Isn’t it great there are no pictures out there of our youthful dalliance with Students for a Democratic Society? Aren’t we glad that we didn’t share photos on Facebook or on our blogs of those times we went to churches we thought we believed in when we just weren’t sophisticated enough to know that we simply believed in what our parents taught us to believe? (Thought in passing: one rarely encounters the idea that increasing intellectual sophistication can lead to religious belief as well as away from it, though I know that it can and has.)

Aren’t we glad we don’t have to run for President when YouTube shows us apparently drunk? (I’ll leave that as an exercise for the astute reader.)

Aren’t we relieved that there are no pictures like this on the Internet to embarrass our wiser, older selves?

Kids these days. I googled “spring break photos” and this one seemed safest to post here.

I mean, if this is what kids are voluntarily uploading to the Internet, or are having to endure being uploaded to the Internet, how in the world can we trust them as learners, give them a global printing press, and say “please put more stuff up on the Internet”? Won’t we just set them up for more future embarrassment? Aren’t we simply contributing to the grand and ultimate erosion of the right to be silly, the right to be debauched, the right to be young, be foolish, be happy (well, happily intoxicated anyway) without those lapses following them around like a blinding hangover for the rest of their lives?

Perhaps. Perhaps the young men and women in this photo will be forever barred from positions of responsibility and leadership because their employers will see this photo (or worse yet, rely on a sophisticated algorithm to find the applicant’s face in every photo on the Internet) and say “I could never hire such an irresponsible and reckless person, especially if they’re also too stupid or careless to prevent the publication of their stupid carelessness to the World Wide Cesspool.” People enjoy judging other people’s lapses. Perhaps such judges, much better at hiding their own lapses (lapses that often mutate and persist into adulthood), will see such photos and blackball everyone in them. Seth Godin tells such a story:

I almost hired someone a few years ago–until I googled her and discovered that the first two matches were pictures of her drinking beer from a funnel, and her listed hobby was, “binge drinking.”

Note, however, that the woman was doing a good thing in a bad way. She was crafting an identity and publishing it to the Internet. As Jon Udell argues, One Will Be Googled, and that should lead us not to resist publishing to the Internet, but to publish thoughtfully and well to the Internet so that the googling reveals things about us we want to be known. In the language of part one of this post, the googling should reveal us as persons while not tossing aside our privacy. The young woman in Godin’s story has of course done exactly the opposite. I have no idea who she is as a complete person. And it may well be that she advertises herself as a binge drinker because she hasn’t had much help–certainly not much help from schooling–in thinking about herself as a complete person, or in demonstrating that thinking in a generous, connected way on the Internet, not as a result of a prescribed and proctored curriculum but as an ongoing commitment.

I suspect her schooling hasn’t helped her, not because she was shown too much trust as a learner, but because she was shown too little. Perhaps she knew and did her duty in school: comply! prepare! speak when you’re spoken to, until we say you’ve learned enough to earn the right to speak in the modes of sophistication we have prescribed, and in a way that will shield you from the embarrassment of admitting you’ve failed or ever been mistaken, an admission that can be evaded unless the mistake was public, which it would have been if we had encouraged you to narrate your learning and publish that story to the Internet!

Of course I cannot say for sure that the recklessness of publishing your private enthusiasm for binge drinking to the Internet is in any way correlated with a failure of education–a failure linked to our desire to protect our learners from themselves, a desire that may itself emerge from a lack of trust. But I think it is reasonably clear that if we truly desire to protect our learners from themselves, we are failing. They are publishing to the Internet no matter what we say. Human beings typically want to connect with other human beings. Those energies will find an outlet. And my argument here is that we should not be protecting our learners from themselves. We should be trusting them, and aiding them in discovering and using (and teaching us, too) the arts of freedom.

Those arts are not simply the arts of abstinence. Milton writes, “I cannot praise a fugitive and cloistered virtue.” Me neither, though the result is sometimes a commitment to enduring conspicuous failure as one builds what Godin calls “a backlist.” John Dewey observes that “education … is a process of living and not a preparation for future living.” The extent to which we share with each other our growing processes of living, in all their complexities and inconsistencies and false starts and unexpected delighted discoveries and embarrassing stumbles, is the extent to which we are committed to the idea that we could build a better world together.

The idea of commitment is important here. “The figure is the same as for love,” as Robert Frost says of the poem. Love is the greatest risk of all, and at the same time the greatest extension of trust we can experience. Michael Wesch tells the story of his wife’s advice to him on his first, nervous, anxious, tense, wildly hopeful day of teaching: “love your students, and they will love you back.” Anyone who’s taught for very long recognizes the difficulty of such a commitment. Yet it’s not optional. Mike expands on this idea in his essay “From Knowledgeable To Knowledge-Able: Learning In New Media Environments” (2009):

Managing a learning environment such as this poses its own unique challenges, but there is one simple technique, which makes everything else fall into place: love and respect your students and they will love and respect you back. With the underlying feeling of trust and respect this provides, students quickly realize the importance of their role as co-creators of the learning environment and they begin to take responsibility for their own education.

Love doesn’t mean keeping your students safe by teaching them abstinence. Love means keeping them safe by teaching them the arts of freedom. Abstinence is one of those arts, of course, but only one, and in my view one of the lesser of them, perhaps the least of all. Abstinence in this sense would mean abstaining from learning itself, a devotion to the idea that “ignorance is bliss.” Education is devoted to the idea that any such bliss is not a bliss worth having, and certainly not a building material for a better world. At least, that’s what we profess, implicitly, by taking money for teaching, research, and service.

What is the alternative to the kind of risk I’m urging us to take? Or to put it another way, what is the risk of not taking these risks of narrating one’s person (not one’s privacy), as that personhood emerges and develops, in a community that conspicuously supports the goal of knowing even as we are known? What if we do not engage with the kind of love that trusts the learner with the intensity and urgency of a Ted Nelson?

In The Four Loves, C. S. Lewis describes the potential horrors (one might say, a “learning outcome”) of a refusal to engage with this higher love:

There is no safe investment. To love at all is to be vulnerable. Love anything, and your heart will certainly be wrung and possibly be broken. If you want to make sure of keeping it intact, you must give your heart to no one, not even to an animal. Wrap it carefully round with hobbies and little luxuries; avoid all entanglements; lock it up safe in the casket or coffin of your selfishness. But in that casket–safe, dark, motionless, airless–it will change. It will not be broken; it will become unbreakable, impenetrable, irredeemable.

Yes, our younger selves may break our hearts as we grey into middle and late adulthood. But the only way around that risk is to start building unbreakable hearts as we become adults. Is that what we want? Will such hearts even be able to recognize the world’s deep hunger, much less respond to it with any deep gladness?

For there is another terrible risk in allowing our younger selves to persist alongside our greying ones: those younger selves may judge us, and not too kindly. Emerson warns us of the risk of not learning the habit of trust:

A man should learn to detect and watch that gleam of light which flashes across his mind from within, more than the lustre of the firmament of bards and sages. Yet he dismisses without notice his thought, because it is his. In every work of genius we recognize our own rejected thoughts: they come back to us with a certain alienated majesty.

I am sorry for the androcentric nouns, but I am not sorry for the sentiment. Notice that Emerson does not say we should “detect and watch that gleam of light … from within” instead of “the firmament of bards and sages.” He says “more than,” and everywhere in “Self-Reliance” argues that it is that inward watching, that very habit of welcoming and not dismissing or rejecting one’s thoughts, that is the greatest lesson we may learn from the bards and sages we encounter in our education. We recognize our kinship, our responsibilities, through the inward watching and welcoming we learn from those who have done that before us–and those who naturally do it now, before schooling has diminished that trust:

What pretty oracles nature yields us on this text, in the face and behaviour of children, babes, and even brutes! That divided and rebel mind, that distrust of a sentiment because our arithmetic has computed the strength and means opposed to our purpose, these have not. Their mind being whole, their eye is as yet unconquered, and when we look in their faces, we are disconcerted….  Do not think the youth has no force, because he cannot speak to you and me. Hark! in the next room his voice is sufficiently clear and emphatic. It seems he knows how to speak to his contemporaries. Bashful or bold, then, he will know how to make us seniors very unnecessary.

What if we encountered our younger selves speaking clearly and emphatically in the next room of our “backlist”? What would they say to us? Would they commend us for the compromises we have made, some of them necessary, some of them pretended so for the sake of our supposed dignity, our need merely to make it to the end of another day? Would our younger selves embarrass us with their energy, their hopefulness, their strong and happy rejection of our rebel and divided minds? Would those younger selves haunt us with their alienated majesty?

What might we see at eighteen, or even at eight, that might forever elude our maturer sight? Could we lay in stores of our youthful visions to feed us, encourage us, and occasionally chide us as we encounter the wearying, discouraging complexities of our adult lives? Can we trust our younger selves, and the younger selves around us, and the younger selves for whom we invent the future, a great cloud of witnesses around us, past and present and future, not to point a reproachful finger at us, disdaining our cautions, indicting our impenetrable unbreakable hearts?

Have we no better fellowship to imagine or to build? Millions of grains of sand in the world. Why such a lonely beach?

Chords of Inquiry

Late one night when sleep wouldn’t visit, I stumbled across a stirring and revelatory documentary called Joni Mitchell: Woman of Heart and Mind. I’ve long loved Joni’s music and the sensibility behind it. I once gave a talk in which “Amelia,” my favorite of her songs, played a central role. I know few artists who are as consistently witty, poignant, and searching as Joni Mitchell. Funny, too. She has been an essential companion, even or especially when I had to strain to bridge what seemed the distance between her older, more sophisticated and artful life and the life I was trying to shape as an adolescent growing up in southwest Virginia, not so very far from where I’m typing these words.

Though she is most commonly typed as a “singer-songwriter,” in truth Joni Mitchell is as far from that folk-derived genre as I can imagine, largely because the structure of her songs is so unusual and exploratory–while also very often being as catchy and propulsive as a good pop song. How she can combine those apparently incompatible excellences is a good question. Perhaps it has something to do with her habitual use of open and unusual tunings for her guitar. When her version of “Urge for Going” was released several years back, commentators noted it was one of the very few Joni Mitchell songs to use the standard E-A-D-G-B-E tuning. The other songs, well, not so much.

Which brings me back to the documentary, and perilously near my point. At one moment early in the film, the topic of Joni’s tunings comes up, and Joni herself speaks to her renowned oddity in that department. What she says has haunted me ever since. She says that she thinks of her unusual chords as “chords of inquiry,” and presents them as if there’s a question mark after each one.

“Chords of inquiry.” A harmony that proposes exploration and curiosity. Notes resonating together but not reaching a conclusion or advancing an argument.

The phrase itself sounds such a music: “chords of inquiry.”

This is the music I yearn for and try to encourage in our Awakening the Digital Imagination seminar each time we convene–in fact, each time the seminar has convened from its very beginning (as a faculty-staff development seminar) back in 2009. It’s not an easy music to sound, especially with a pride of highly trained academics all ranging the veldt of the seminar meetings (online and in real space), all ready to engage with (necessary, certainly) critical thinking, subtle distinctions, spirited polemic, all the academics’ discursive tooth and claw. That which does not kill us, etc. And besides, this is what we (and I do mean we) were taught to do in graduate school. To inculcate a kind of ruthlessness, a kind of skepticism and scrutiny before which all wooly thinking would simply wither.

And yet, what of these chords of inquiry? I do think a provisional acceptance of the essential frameworks of each essay we read, a kind of readerly version of Keats’ “negative capability,” can animate a renaissance of wonder and is indeed a good spiritual discipline in itself. I think of the distinction I was taught at Baylor University by my late colleague Susan Colon, a distinction between “implicative criticism” and “argumentative criticism” she worked through in her review of Andrew Miller’s The Burdens of Perfection: On Ethics and Reading in Nineteenth-Century British Literature:

Implicative criticism, according to Andrew Miller, is writing in which the writer’s thinking is unfolded and made visible to the reader so as to generate a multiplicity of responses, all of them transformative. Its foil, argumentative criticism, seeks closure rather than disclosure; it elicits agreement or disagreement but not transformation.

Argumentative criticism is the coin of the realm in academia. We are rewarded for it, and give up our claims to depth of knowledge and sophisticated methodologies when we do not practice it. Yet implicative criticism is every bit as important, as any sympathetic reader understands. It may be even more important, ultimately, if we do indeed seek transformation. Implicative criticism does unavoidably put the self at risk, it’s true. And some things do need protection, and vigorous argumentation to pursue that need.

Yet among the many heard and unheard melodies that play through my mind, the chords of inquiry bring the deepest haunting and the most powerful insights. The writers we read in this seminar sound to my ears many deep chords of inquiry, as they imagine Doug Engelbart’s “thought vectors in concept space,” as they strive toward Alan Kay’s beautiful aphorism that “the computer is an instrument whose music is ideas.” Each chord followed by a question mark, like Vannevar Bush’s provocative little “presumably” as he ends “As We May Think.”

Unresolved, yet yearning, and musical for all that.

Personal, Not Private

What do we know, but that we face
One another in this place?

W. B. Yeats, “The Man and the Echo”


I spend a lot of time talking to academics about social media. I field many frequently asked questions and try to speak to many frequently voiced objections. Sometimes the effort is exhausting or even exasperating, particularly when the questions are really objections in disguise. Answers aren’t much use in that case. Other times, however, useful distinctions may emerge–useful to me, at least, and perhaps to others as well.

One of the typical questions has to do with how “personal” social media are, and how troubling that can be for academics. First, I have to unpack “social media” a bit, and begin to distinguish between blogs, Twitter, Facebook, and the rest. These are all “social media,” yes, but they are very different in practice, with different challenges and opportunities. After these distinctions, though, I’m still faced with the core question: what’s valuable about the personal element in these media? Why should I care? And why should I make myself vulnerable by sharing my personal life with the world?

There are many implications and assumptions hidden in the questions. Those who want to cleanse discourse of the personal seem to assume that “personal” means “irrelevant to anyone else,” or “ephemeral,” or “trivial.” The classic example is “what I had for breakfast.” (I’m on the wrong networks, obviously, as I myself don’t see breakfast tweets or blog posts or Facebook status updates.) Yet there’s also a thread of fear in these dismissals and objections, a fear or even a defiance that I acknowledge and take seriously. In this sense, “personal” also means “none of your business,” and “too dangerous to share.”

So I’ve begun to distinguish “personal” from “private.” The idea is that “private” means “don’t share on social media.” “Private” belongs to you, and you should always be vigilant about protecting your privacy. Without privacy, our agency is diminished, perhaps eliminated. Without privacy, we cannot generate or sustain the most intimate bonds of trust. Without privacy, our personhood is at risk.

But what of the personal, as opposed to the private? I believe the words are not synonyms. Instead, I believe private is a subset of personal.

I think those aspects of the person that are not private not only can be shared but ought to be shared. This is what we mean when we tell writers they should find their own voices. This is what we mean when we say we seek to “know as we are known,” as Parker Palmer insists. This is what we mean when we talk about “integration of self,” when we speak of our concern for “the whole person.” It is only when we bring the personal (not the private) to our discourse that we understand the rich complexity of individual being out of which civilization is built–or out of which it ought to be built. The personal keeps our organizations from becoming mere machines. The personal preserves dignity and community. The personal brings life to even the most mundane and repetitive operational tasks. We neglect or conceal the personal (not the private) at our peril.

I tell my students that I have only two rules for us in our work together: “passion encouraged; civility required.” The passion is always personal, as is the civility. The forbearance we show each other within our civility is a personal respect for the other, which also means a respect for the complexities of their privacy, complexities hinted at, though not made visible, primarily through the extent to which we share our personhood.

The Oxford English Dictionary entry for “person” offers many fascinating definitions, but the salient one for what I’m exploring here is definition 3a:

The self, being, or individual personality of a man or woman, esp. as distinct from his or her occupation, works, etc.

The personal is who we are “as distinct from [our] occupation, works, etc.” Our occupation and works are the result of effort, luck, ability, connections, a whole host of purposeful and chance occurrences. But we are not defined by our works and occupation. We are defined by something larger and more elusive, and more dynamic too. Sharing that larger, more elusive, and more dynamic aspect of selfhood is valuable, reminding ourselves and those around us that all of us are more than we appear to be in any particular transaction or encounter. Such reminders encourage humility. They also encourage a kind of exhilarating anticipation, as one never knows which humble or exalted personage may be one’s unmet friend, an angel to entertain unawares.

Sharing the personal, as distinguished from oversharing the private, means engaging with personhood in all its messy and glorious complexity, and all its potential, too. If, as Jon Udell reminds us, “context is a service we provide for each other,” the context is not merely informational, nor is it about matters that should remain private.

It is personal.

Cycle Rider

Photo by “Seb” (el_seppo).