                                    NETFUTURE
                       Technology and Human Responsibility
    Issue #73       Copyright 1998 Bridge Communications         June 18, 1998
                Editor:  Stephen L. Talbott (stevet@netfuture.org)
                         On the Web: http://netfuture.org
         You may redistribute this newsletter for noncommercial purposes.
    *** Quotes and Provocations
          Technological Zealotry: Heads I Win, Tails You Lose
          Can Photography Blind Us?
          Exchanging the World for a Map
    *** How Technology Dumbs Down Language (Stephen L. Talbott)
          ...and Undermines the Global Village
    *** Correspondence
          Where Were the Academics When It Counted? (Bob Jacobson)
          Response to Bob Jacobson (Langdon Winner)
    *** Announcements and Resources
          Canadian Teachers' Federation Shows Some Backbone
          Higher Education and Computer-Mediated Communication
    *** Who Said That?
    *** About this newsletter

    What Readers Are Saying about NETFUTURE

    "You may well consider my concern over a possibly lost issue of NETFUTURE a compliment to your skill in choosing topics and in expounding your views on them. In this age of information overflow on all fronts, not many people read texts over 5K that do not have graphics in them."

    (For the identity of the speaker,
    see "Who Said That?" below.)

    *** Quotes and Provocations

    Technological Zealotry: Heads I Win, Tails You Lose

    In order to save you much wasted time, I present this faithful digest of 179,645,392 conversations on the Net about technology:
    CRITIC: This machine is getting out of hand and carrying us where we don't really want to go.

    TECHNO-ZEALOT: Don't blame the machine. It's up to us to put it to good use.

    C: That's what I'm saying. It's up to us, so we'd better back off for a while and actually think about our relation to the machine.

    T-Z: Luddite!

    Can Photography Blind Us?

    It's amazing how often one runs into what I have called the "fundamental deceit of technology" -- the belief that a technical advance, by itself, is the solution to an essentially human challenge. (See NF #38.)

    Kenneth Brower, writing in the May, 1998 issue of Atlantic Monthly, describes the work of noted photographer Joseph Holmes. A "consummate and obsessive printmaker", Holmes has become an expert in digital printmaking. The story of the digitization of photography is a fascinating and disturbing one, but what particularly caught my eye in this article was Holmes' confidence that his servitude to technology will grow less burdensome with time.

    Currently, he spends most of his life in the studio. "His time in the field toting a camera has been reduced to three or four weeks a year. He does not like this imbalance but sees no way around it." However, he expects things to change:

    Digital-imaging technology is evolving by leaps and bounds, Holmes said. A number of companies are racing to develop the best printers, and each year's ink-jet machines are twice as good as those of the year before. "It's endless," he said. Hearing the weariness in his voice, I asked if he worried that his labors might be Sisyphean, given the way the technology keeps changing. He denied it. He was familiar now with computer systems. Once digital printmaking really started to work, he said, it would only get better and easier.
    Better and easier: yes, if you're talking about a particular technical achievement. But the technical achievement is not the photographer's achievement. Photographers, Brower notes elsewhere in the article, try to get a picture that's never been seen before. And as the technology improves, making today's stunningly enhanced effects routine, it will no longer be those effects that creative photographers strive after. They will have to shift continually toward the "cutting edge". Five years from now, barring a change in his philosophy, Holmes will be spending more time fiddling with his hardware and software, not less.

    The alternative, of course, is to realize that "what has never been seen before" is above all a function of the inner eye. Training the eye to see what it could not see before is the first requirement for every visual artist. It is teeth-grinding work. Today, however, our culture seems more determined to entertain the eye -- to give it the passive pleasure of new sensations supplied from without by means of novel technique. This is a formula for weakening the eye's independent powers, not for strengthening them.

    (Thanks to John Thienes and Craig Branham for pointers to the Atlantic Monthly essay.)

    Exchanging the World for a Map

    Back in May my wife and I traveled out of state to visit her family, leaving a young garden that couldn't survive the hot sun for more than a day or two without water. With a television available at my in-laws, I spent a while watching the Weather Channel, hoping to see the green and orange evidence that some vigorous showers had doused the Taghkanic Hills just west of the Berkshires.

    During those few days, as it happened, a stationary front lay across much of the country, extending from the Rockies to the Atlantic. It hovered wonderfully close to our garden, with thunderstorms kicking up along this front more or less continuously. In fact, with perfect timing for our thirsty plants, Doppler radar showed a massive collection of storms passing directly over our region.

    I was relieved. But just to be sure I called a neighbor the next day. Yes, she said, there had been storms all around, but they had mostly bypassed our little valley, which did not receive enough rain to quench a garden's thirst. She offered to water the garden herself -- an offer I gratefully accepted.

    That exposure to the Weather Channel, which I had scarcely seen before, made a strong impression on me. The vivid colors that seemed to reach out and grab me; the little verbal dramatizations repeated again and again by the program hosts; the impressive data collection, commanding expertise, and reassuring predictive reliability that apparently undergird the entire show; the reduction of atmospheric complexities to a relatively small number of graphic abstractions that almost anyone could follow and make some sense out of -- all this, I imagine, would receive justifiably high marks from those who set standards for the visualization of information.

    But my own reaction, as a novice weather watcher, was one of horror. This, with all its falseness, I thought, will surely conquer the public's imagination. What will become of our already tenuous knowledge of the weather? Here, it seems, is yet another domain where a handful of simplistic abstractions carry us away from the world we live in.

    Did I overreact?

    As I write these words, I am sweating at the bottom of several thousand feet of warm, humid, highly unstable air piled above me. A stationary front again lies nearby, having already triggered a thunderstorm this morning (saving me a watering chore). According to my weather radio, the situation is expected to grow more interesting later today: a line of intense thunderstorms over the eastern Great Lakes is racing toward us at sixty miles per hour, and will arrive here in late afternoon, possibly even producing a tornado.

    [A tornado did indeed arrive on that May 31, causing great destruction near Albany and passing about twenty miles north of here. Another tornado was reported by the National Weather Service just a couple of miles from our home, but apparently never touched down. My own exhilarating experience was of the most intense and sustained lightning storm I have ever been through.]

    At the moment, though, there is only a hazy sky and strong, gusty breezes generally out of the south or southeast, indicating that the front -- which slipped south of us yesterday -- is now pulling back northward.

    I love the drama of a developing storm. But it would take a practiced and skillful eye to read the full drama in this hour's bland sky -- an eye far more skillful than my own. If I had a television, how much easier it would be to remain indoors and find my drama ready-made, displayed in bright colors on the Weather Channel! I could hear, perhaps, about trees down in Buffalo or hail damage in Syracuse, and thrill time and again to the information that this grand display of nature's power is coming toward me!

    Yet, in reality, the storm's coming toward me is exactly what I would miss, for I would not be experiencing it. And even my later experience of the storm itself would be impoverished, for I would have lost its context. I would be reduced to a thrill-seeker whose cognitive functions were atrophying and whose connection to the world consisted of mere sensationalism.

    We need tools like those employed on the Weather Channel. But one of the most ignored truths of our day is that the tools for presenting information as effectively as possible are at the same time tools for distancing ourselves from the world as effectively as possible. What we call information is necessarily abstract. It is not the world, but a highly selected and eviscerated representation of it. The more powerfully we grab a person with this representation, the greater the wrench he must apply to himself in order to escape the map and regain the world.

    We are refining the tools of abstraction with great sophistication. It is not so clear that we are cultivating the necessary complement: skill in wrenching ourselves free from the abstractions.

    Brief Notes

    ** Commenting on rumors of an Anglo conspiracy to employ massive computing power to scan all the world's telecommunications for useful intelligence, Louis-Marie Horeau writes:
    Until a computer understands that the balance of the world can be threatened by the proximity of the words "Bill", "fly", and "Paula", it should be possible to chat in peace for a while. (From Le Canard Enchaine, via Monday Review, Jun. 15, 1998)
    ** Your daughter is in high school and you are attending the school's open house. After sitting through several twenty-minute presentations by her teachers, you come to chemistry class. The teacher tells you,
    I'm not a chemistry expert, but I have excellent computer skills, and by the end of this class, your sons and daughters will have them, too. Together we will explore the wealth of chemistry knowledge on the Internet. We have a Local Area Network (LAN, as we say) which will let us work collaboratively with other chemistry classes in the school, and a WAN (Wide Area Network) that will allow us to collaborate with universities and industry. The computers' capabilities will enable us to represent our findings dynamically and creatively, and then we will be able to publish those findings on the web. (A scene imagined by Dudley Barlow, Education Digest, Oct., 1997)
    (Thanks to Peter Kindlmann for passing this along.)

    ** On warfare in the twenty-first century:

    The greatest challenge presented by the new forms of conflict is their very informality. Decreasingly, will wars be the preserves of states and alliances of great states .... We are now in an era of long and ragged conflicts, community-based, open-ended, crude and cruel, and beyond the time limitations and technical constraints of much military and diplomatic practice in the advanced world .... The emerging informal or postmodern war has been recognized by specialists for nearly a decade (e.g., M. van Crevald, E. Luttwak). The trinity of power behind modern war -- army, government, and people -- recognized by Clausewitz in the last century, ceases to be relevant. (Robert Fox, "Beyond Clausewitz: The Long and Ragged Conflicts of the Coming Millennium", Times Literary Supplement, May 15, 1998, as summarized in Monday Review, Jun. 1, 1998).
    This is one of many areas where one might study whether highly rationalized, technical capabilities tend naturally to be countered by various irrational and chaotic developments. In my own view, these are two sides of the same coin, rather as the overly intellectualized individual tends to be driven subconsciously by various erratic and ugly emotional impulses. (I have developed this line of thought a bit in chapter 25 of The Future Does Not Compute; see http://netfuture.org/fdnc/ch25.html.)

    ** Following India's and Pakistan's recent nuclear exploits, the Pakistani prime minister said,

    We have taken the path shown by Allah. We have jumped through these flames without thinking through our own minds and calculating, but going into a decision made by our heart, the decision of courage.
    To which the American Physical Society's Robert Park responded: "The world can only hope that before 'jumping through the flames' into nuclear war, both sides will experiment with using their brains" (What's New?, May 29, 1998).

    I myself have little patience with what's going on in India and Pakistan, but even less patience with Park's cute response. (Part of my irritation comes from knowing that Park is unhappy unless he salts each issue of his newsletter with at least one jab at some imagined departure from "hard science" -- read, "materialistic dogma".) He needs reminding that both India and Pakistan pushed their brains to the limit in perfectly good scientific style in order to develop those nuclear weapons.

    The problem does not lie in the prime minister's appeal to the courage of the heart. It lies, rather (see preceding item), in the increasing tendency around the globe for the brain and heart to run on separate tracks. What's needed is a brain that helps to direct and strengthen the heart, and a heart that softens and brings wisdom to the sharp-edged reflections of the brain.

    Until we in the West learn to bring something more than our brain's scintillating technical capabilities to the mad development of our own technologies, we will not find effective ground for criticizing the mad, technology-based adventures of others.

    ** According to Science-Week (Jun. 12, 1998), a new cloned sheep called Polly (produced by the team that brought us Dolly) contains an added human gene in every cell. The consequences?

    In short order, using these techniques, laboratories will be cloning animals with human genes to produce hormones or other products for use in human clinical medicine. Second, cloned animals engineered to have human genetic diseases will be used for research into these diseases. Third, cloned, engineered animals will be produced with specific changes in their cell surfaces that will reduce the probability of organ rejection and thus give a great impetus to the use of animal organs in organ transplantation.
    It's long been a standard bedtime story of science that the Copernican Revolution removed mankind from its privileged place at the center of the universe. Come again? We who would engineer diseased animals for the greater glory of humanity -- we no longer claim privilege and centrality?

    At least the ancient claim of privilege could reasonably be said to imply a heightened burden of responsibility.



    *** How Technology Dumbs Down Language
    From Stephen L. Talbott (stevet@netfuture.org)

    You've doubtless noticed that web search engines now offer on-the-spot machine translation of foreign-language web pages. I'll spare you the usual examples of comical translation. What worries me is not how bad they are, but how we will go about making them better.

    It's actually quite easy: all we need to do is to continue using ever less evocative, less richly textured, less meaningful language. The more we can resort to a flat, abstract, technical, and contentless vocabulary, the more satisfactory the machine translation will be. If we could finally learn to speak and write in something like a programming language, we'd be blessed with near-perfect translations. Don't look for a Moby Dick or Leaves of Grass to be written in this language, however.
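The trade-off can be sketched with a toy example (the glossary and phrasing here are entirely hypothetical, not drawn from any real translation system): a word-for-word dictionary lookup does passably on flat, technical prose and collapses on anything else.

```python
# A toy word-for-word "translator" over a tiny hypothetical English-French
# glossary -- not any real machine-translation system. Flat technical
# prose survives the lookup; evocative language does not.
GLOSSARY = {
    "the": "le", "file": "fichier", "is": "est", "open": "ouvert",
}

def translate(sentence):
    # Strip trailing punctuation, lowercase, and look up each word;
    # words with no glossary entry come back as bracketed unknowns.
    words = sentence.lower().rstrip(".!?").split()
    return " ".join(GLOSSARY.get(w, "[" + w + "?]") for w in words)

print(translate("The file is open."))   # -> le fichier est ouvert
print(translate("Call me Ishmael."))    # -> [call?] [me?] [ishmael?]
```

Real systems are far more elaborate, but the principle holds at every level of sophistication: the flatter and more predictable the input vocabulary, the better the output looks.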

    But there's a second, complementary way for the translations to become more acceptable: the reader can lower his standards of acceptance. Commentator David Jolly tells us that, while computer translations were once the butt of jokes, they are now taken quite seriously. He goes on:

    But the real story is the Internet, because web-surfers aren't worried about a publication-quality document; they just want to be able to browse foreign websites. (CBS MarketWatch, May 13, 1998)
    Of course, when we're "just browsing" we're not particularly concerned about such things as depth of understanding, subtle distinctions, fidelity to the source, and the intimate and sympathetic penetration of another mind. These objectives, along with many others, fade into the background.

    They may need to fade into the background on occasion. The concern on the Net today is whether they are fading beyond retrieval.

    In any case, all this underscores the question that a few people began asking some years ago. In the convergence of human being and machine, which is more fateful -- the machine's becoming more intelligent and human-like, or the human being's becoming more machine-like? All the commentary, all the prognostication, all the excitement seems focused on the machine's generation-by-generation ascent -- which already suggests that the human descent is well advanced.

    Searching, Filtering, Blocking

    The risks of machine translation are presenting themselves on several fronts. To begin with, the widespread use of search engines encourages authors to write for "searchability." The idea is to avoid the unexpected (and therefore potentially more revelatory) word, and instead to appease the audience's expectations. They will, after all, search according to their expectations, and if they don't find you, what good will your words do?

    The same issues arise with filtering and blocking software. There is no way -- and in principle never can be a way -- to implement a dependable filter or blocker so long as our language remains alive and meaningful. The blocking software must rely to one degree or another on past word associations, automatically correlating certain words with particular subjects and meanings. The result is that those whose intentions are not, for example, pornographic, must avoid the "pornographic lexicon" or else suffer blocking.
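A minimal sketch of such a blocker (the word list is hypothetical, not taken from any real filtering product) makes the point: because it equates a word with a subject, it is evaded by the guilty and trips over the innocent.

```python
# A minimal keyword blocker of the kind described above (hypothetical
# word list, not any real filtering product). Correlating words with
# subjects means innocent uses of a listed word get blocked, while
# offending pages that simply avoid the list slip through.
BLOCKED_WORDS = {"breast", "xxx"}

def is_blocked(text):
    # Block if any whitespace-separated word appears on the list.
    return any(word in BLOCKED_WORDS for word in text.lower().split())

print(is_blocked("Hot new adult site"))               # False: evades the list
print(is_blocked("Support group for breast cancer"))  # True: wrongly blocked
```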

    But -- as the study of meaning and metaphor has made abundantly clear -- the renewal of language and the extension of human understanding depend on continual cross-fertilization between lexicons. Only in that way can we counter the tendency for our language to harden into unrelated, narrow, specialized usages that give us precision while eliminating expressive power. Such specialized lexicons are ideal for capturing, in the most prosaic terms, what we already know -- but disastrous for helping us to take wing and transcend the previous limits of our understanding.

    The concern for internationalization of web pages raises the same issues yet again. Colorful, inventive, richly textured language is not only difficult for foreigners to understand, but may also lead (we're told) to unintended messages and even insults. The standard advice is to avoid colloquialisms, unusual metaphors, and, in general, any unexpected use of language.

    While a genuine thoughtfulness may be at work in this advice, you will find that I make none of the recommended accommodations in NETFUTURE. My refusal is rooted in respect for the reader. To hear or read someone from a different culture calls for a heroic effort of imagination and sensitivity, and we do no favor to anyone by discounting this effort. Personally, I would not want to encounter a foreign author in a watered-down and patronizing form. Nor would I want to learn a foreign culture through a compromised version of its language. Only the fullest and most powerful use of language lends itself to the most profound grasp of the speaker and his culture.

    While I am not much of a stylist, I always try to do my best. I realize, though, that this stance, taken in the wrong spirit, quickly becomes arrogant. Certainly, for example, one-to-one communication calls for profound mutual accommodation. The accommodation -- the willingness to address the concrete individual in front of you -- is, in fact, nearly the whole point.

    But it happens that the mental effort and resourcefulness of imagination required for this kind of accommodation is exactly what the machine-reduction of language is now discouraging. You cannot accommodate to the world of the other person without first doing the hard work of entering it. The inability to achieve this work of imagination is surely implicated in the various ethnic conflicts currently roiling the globe.

    It is one of the characteristic paradoxes of the Net (a paradox lying, I'm convinced, at the core of the entire technological enterprise) that the tools designed to bridge the distance between peoples can operate in a deeper way as tools for destroying even the bridges we already had.

    Unspeaking the Creative Word

    Voice recognition systems offer still another venue for the attack upon language. But here it is no longer just the written word -- the word already substantially detached from us -- that is at issue. It is more directly we ourselves, in the fullest act of expression, who must adapt ourselves to the machine's limitations. We must train ourselves toward flatness, both in sound and meaning. But it is almost impossible to achieve a given quality of voice without first achieving more or less the same quality within oneself. Just how far is it healthy to practice inner qualities of machine-likeness?

    From ancient times the spoken voice -- the Word -- has been experienced as the primary agent of creation. Still today we may occasionally hear dim echoes of the Word's power, whether in song, or in dramatic presentation, or at times when we are spooked, or in those intense, interpersonal moments when everything hangs on the overtones of meaning and the soul-gripping tonal qualities in the voice of the other.

    I happen to believe that a lot hinges on our ability to rediscover, for good or ill, the powers that stream into the world upon the current of the human voice. It would, however, be a hard case to make to a computer-bred generation. And, with our adaptation to machine translation, it promises to become harder still.

    Speaking of efforts to reform and simplify language, philologist Owen Barfield has written,

    Those who mistake efficiency for meaning inevitably end by loving compulsion, even if it takes them, like Bernard Shaw, the best part of a lifetime to get there .... Of all devices for dragooning the human spirit, the least clumsy is to procure its abortion in the womb of language; and we should recognize, I think, that those -- and their number is increasing -- who are driven by an impulse to reduce the specifically human to a mechanical or animal regularity, will continue to be increasingly irritated by the nature of the mother tongue and make it their point of attack. (Preface to second edition of Poetic Diction)
    Barfield wrote that in 1951. If he were writing today, I think he would refer less to specific enemies of the mother tongue and more to the emergence of a global logic of distributed intelligence and connectivity. As we articulate more and more of our activities into the logical operations of the computerized global system, we will also -- unless we consciously resist the tendency -- sacrifice more and more of our creative world of meaning, from which alone the future can arise.

    (This is another illustration of my contention -- see NF #59 and 61 -- that the new threats of tyranny issue less and less from central, identifiable authorities, and look more and more like properties of "the system.")


    *** Correspondence

    Where Were the Academics When It Counted?

    Response to: "Report from the Digital Diploma Mills Conference" (NF-72)
    From: Bob Jacobson (bluefire@well.com)

    Langdon Winner's reflections on technology and education are, as usual, trenchant and alarming. But his report from the conference dealing with distance education has its entertaining moments. Listening to academics lament distance teaching because it isolates them from students is very funny: in my experience as a student, many professors used every possible subterfuge to accomplish this end.

    Moreover, though now they decry the advances of technology, when various of us in the scholarly community, in the 1970s and 1980s, tried to raise an alert, we were completely marginalized. There was no place for critical communication scholars in most universities. Even today, when the media's threat to independent thinking is so obvious, the faculty who staff departments of communication or media prefer to take the safe middle road in hiring, stressing administrative studies over critical studies.

    Except for the fact that democracy's very epistemological foundation is under attack, it would be tempting to reward the academics' cries of "Assault!" with very big alligator tears. But then, they're victims too, even if they do have tenure.


    Response to Bob Jacobson

    From: Langdon Winner (winner@rpi.edu)

    I agree with Bob that professorial wailing about isolation from students is often in bad faith. To their credit, people at the Digital Diploma Mills conference seemed genuinely concerned about both a general erosion in commitment to face-to-face education and the ways in which distance learning amplifies this woeful trend.

    Will concerns about computerization produce renewed efforts to expand and revitalize contact with students? As Bob suggests, faculty status in most universities is inversely proportional to the amount of time spent in contact with students. It's research, publishing, the lecture circuit, and international visibility that matter. As legislatures, governors, politicians, and the general public focus on some of the deplorable habits of university "teachers" in this light, they may well find some appealing targets, ones not unlike the welfare mothers of recent years. A poll of 35 governors reported in the June 19 Chronicle of Higher Education contained some bracing findings in this regard:

    To meet future needs, how important will it be for institutions to:

    Allow students to receive their education any time and any place via technology:
    * Very important or important: 83%
    * Not at all important or unimportant: 0%

    Allow private-sector companies (e.g. Disney or Microsoft) to compete with colleges and universities:
    * Very important or important: 37%
    * Not at all important or unimportant: 22%

    Maintain traditional faculty roles and tenure:
    * Very important or important: 3%
    * Not at all important or unimportant: 32%

    On Bob's point about the marginalization of critical studies of communication in universities, I can only join in the lament. In programs I see forming on communications and information technology, people reputed to have serious, probing questions about the social and political dimensions of these fields are simply not invited to the table. No, the enterprise must remain "upbeat"!


    *** Announcements and Resources

    Canadian Teachers' Federation Shows Some Backbone

    When I spoke at the Canadian National Library in Ottawa back in April, a woman named Marita Moll introduced herself to me. She is Head of Research and Technology for the Canadian Teachers' Federation, and she assured me that the CTF was that rarity I no longer expect to find on my travels: a major institution with a vested interest in education that was nevertheless resisting the demands to launch every high-tech educational vehicle it could get its hands on. As Moll put it in an op-ed piece:
    The educational system is being re-engineered on a grand scale to accommodate expensive, high-tech tools. Yet research has shown quite clearly that the desired skills are common outcomes of the much cheaper, much more inclusive, just as interactive, collaborative and effective, music and arts programs. The explanation for this discordance lies not in the objectives of education but in the politics of technologies. (Toronto Star, Mar. 26, 1998)
    I want to draw your attention to two resources available from the CTF. One is a sourcebook called Tech High: Globalization and the Future of Canadian Education. Edited by Moll, it contains papers from a number of authorities, including NETFUTURE's own Langdon Winner ("The Handwriting on the Wall: Resisting Technoglobalism's Assault on Education"). The book was co-published in 1997 by the Canadian Centre for Policy Alternatives (ccpa@policyalternatives.ca) and Fernwood Publishing (fernwood@istar.ca).

    Secondly, there's the "research and technology" portion of the CTF web site: http://www.ctf-fce.ca/e/what/restech/. You might particularly be interested in the papers listed as "critical.htm" and "critical2.html". The former, incidentally, leads off with this interesting little quote, taken from an 1889 issue of Scientific American:

    The improvement in city conditions by general adoption of the motor car can hardly be overestimated. Streets clean, dustless, and odourless, with light rubber-tired vehicles moving swiftly and noiselessly over their smooth expanse, would eliminate a greater part of the nervousness, distraction, and strain of modern metropolitan life.
    So you might want to look to the Canadian Teachers' Federation if you're needing some moral support for the struggle within your own, more faddishly inclined organization.

    Higher Education and Computer-Mediated Communication

    Sheizaf Rafaeli, Margaret McLaughlin, and Eli Noam are editing a special issue of the Journal of Computer-Mediated Communication dealing with higher education. The deadline for draft manuscripts is September 1, 1998. Here's a paragraph from the call for papers:
    We encourage submission of studies examining the form, suitability, implementation, acceptance, efficacy and utility of CMC innovations in higher education. Papers are solicited which address the dimensions of interest in deciding about CMC: Is CMC in the university about redistributing power and discourse, or is it about cutting costs? Does CMC make higher education less or more accessible, does it increase the reach and speed of diffusion, and/or does it impact the sociology of knowledge?
    For further information, contact Rafaeli (sheizaf@umich.edu) or McLaughlin (mmclaugh@rcf.usc.edu).


    *** Who Said That?

    To the NETFUTURE listserver and many on the Net he is known as Ijon Tichy -- a pen name drawn from a Stanislaw Lem character. His real name is Asaf Bartov, and he is a twenty-one-year-old Israeli currently serving his mandatory term in the Israeli Defense Force. A professional programmer, he left the school system at age fifteen to begin working for a software firm. (He finished his formal schooling through independent study.)

    Bartov's many interests include linguistics and grammar of natural languages, as well as natural-language parsing and artificial intelligence (making him a fitting candidate for inclusion in this particular issue of NETFUTURE). He is about to found a non-profit organization focusing on the education of youth. The aim is to

    widen the young people's horizons and expose them to elements in literature, philosophy, poetry, science, politics, sociology, and history that are not covered by the Israeli school system (and indeed by far too few school systems in the world, according to preliminary research I have conducted).


    *** About this newsletter

    Copyright 1998 by The Nature Institute. You may redistribute this newsletter for noncommercial purposes. You may also redistribute individual articles in their entirety, provided the NetFuture url and this paragraph are attached.

    NetFuture is supported by freely given reader contributions, and could not survive without them. For details and special offers, see http://netfuture.org/support.html .

    Current and past issues of NetFuture are available on the Web at http://netfuture.org .

