                                    NETFUTURE
    
                       Technology and Human Responsibility
    
    --------------------------------------------------------------------------
    Issue #37      Copyright 1997 O'Reilly & Associates        January 8, 1997
    --------------------------------------------------------------------------
                Editor:  Stephen L. Talbott (stevet@netfuture.org)
    
                         On the Web: http://netfuture.org
         You may redistribute this newsletter for noncommercial purposes.
    
    CONTENTS:
    *** Quotes and Provocations
          Dreams of a Global Village
          The Mystical Properties of Zero
          Radio Cyberdays
          Do Computers Kill People?
    *** Things That Bite Back: Tenner on Productivity (Stephen L. Talbott)
          That computer is costing you way more than you think
    *** About this newsletter
    

    *** Quotes and Provocations

    Dreams of a Global Village

    Way back in Wired 1.5 (November, 1993), Alvin Toffler (who is always referred to as "the futurist Alvin Toffler") complained that the U.S. should be dropping faxes and camcorders into Yugoslavia instead of food. In the simple-minded manner of popular futurists, he left unaddressed the question of whether more rather than less killing would occur in ethnic enclaves where every "foreigner" could be suspected of recording history.

    But maybe the futurist Alvin Toffler was just a few years behind the curve. Next time around, let's try dropping Internet computers along with certificates granting free access to the obedient playmates of the Virtual Dreams site. That might do it. There'd be little time for distractions like killing your neighbors. And, just to exercise those futurist genes, we can look forward a bit further to full-immersion virtual reality. Presumably our air drops will then play their role in finally implementing a Global Feelage.


    The Mystical Properties of Zero

    Professor Prabhakar Ragde, a NETFUTURE reader and mathematician in the computer science department at the University of Waterloo, offers a nice twist on the Year 2000 problem (NF #35), which he says

        may be the first mass disillusionment faced by those who up until that point have accepted the blind optimism of those who promote technology. A lot of "ordinary people" are going to be inconvenienced. Add this to the expected millennial madness -- apocalyptic prophets, a surge in New Age spiritualism, and the urge to introspection that the sight of a "0" seems to invoke in us -- and you have the potential for a considerable shift in public attitudes towards technology.

    An interesting thought. I would only add that any lasting shift--which I will hope for along with Professor Ragde--must be born of a deep, introspective recognition that we've been asleep in our relation to technology. Perhaps indeed the Year 2000 problem can help to trigger such a recognition in those who are already inclining to pay attention.

    Beyond that, however, I fear the potential of anti-technology fads fully as much as pro-technology fads. Neither is a matter of wakefulness. Mastering the machines in our lives is as different from smashing them as it is from yielding passively to them. The smashing, after all, will provoke its own backlash in turn, and amid our rudderless thrashing about we will find ourselves more and more in thrall to the intelligent machinery around us (which, in its own way, does seem to know where it wants to go).

    Once we have imposed proper discipline upon our hapless digits, we will still face the enduring Year 2001 problem: how do we subordinate the digital flood to our own purposes?
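
    For readers who have not met the technical root of the Year 2000 problem, here is a minimal sketch in Python (my own illustration, not drawn from the newsletter or from Professor Ragde) of the two-digit date arithmetic that causes the trouble, together with the sort of "windowing" repair that was commonly applied:

        # Legacy systems often stored only the last two digits of the year,
        # so "00" (meaning 2000) subtracts as if it came before "97" (1997).
        def years_elapsed(start_yy, end_yy):
            return end_yy - start_yy

        print(years_elapsed(97, 0))    # prints -97 rather than the expected 3

        # One common remediation, "windowing": expand two-digit years around
        # a pivot, so values below the pivot are taken to mean 20xx.
        def expand_year(yy, pivot=50):
            return 2000 + yy if yy < pivot else 1900 + yy

        print(expand_year(0) - expand_year(97))    # prints 3, as intended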


    Radio Cyberdays

    Responding to the historical quotes about radio and television in NF #36, a reader whose name I unfortunately mislaid sent me some electronically posted excerpts from an article in the January issue of Postmodern Culture (vol. 6, no. 2). The piece is by Martin Spinelli, Department of English, SUNY-Buffalo, and is called "Radio Lessons for the Internet." As reported by Spinelli, here's how Martin Codel, "a newspaper editor and later a radio theorist," described the promise of radio in 1930:

        That anything man can imagine he can do in the ethereal realm of radio will probably be an actual accomplishment some day. Perhaps radio, or something akin to radio, will one day give us mortals telepathic or occult senses!

    Spinelli notes how, in Codel's view, "reality and fantastic projection overlap....Before radio life possibilities were confined to what could be done in the material world; after radio there are no limits."

    Another commentator, Rudolf Arnheim, wrote a book in 1936 in which he said:

        Wireless without prejudice serves everything that implies dissemination and community of feeling and works against separateness and isolation.

    As Spinelli observes, this proposes radio as an "antidote for the very social fragmentation it encourages." A familiar ring, no?


    Do Computers Kill People?

    "Everything that happens anywhere in society," according to Phil Agre (Red Rock Eater News Service),

        happens on the Internet too, but everything that happens on the Internet is news, and when something bad happens on the Internet, the "line" instantly arises that the bad thing in question is a property of the Internet.

    Agre is commenting on the Internet-assisted spread of a presumably silly rumor about comet Hale-Bopp. The rumor, he points out, was also effectively countered by means of the Internet. He goes on to offer some useful advice about from-the-hip characterizations of the Net:

        Let's not let anyone essentialize the Internet and say "the Internet does this" and "the Internet does that" and "the Internet spreads rumors" and "the Internet causes social hierarchies to collapse and brings an era of peaceableness and decentralization to the world forever and ever amen," because those are not things that the Internet itself is capable of doing. Those are things that people do, or don't do, as they collectively see fit.

    All such statements of the "guns don't kill people" variety (or of the opposite, "guns do kill people" variety) are likely to provoke yet another installment of my periodic harangue about technological neutrality. This one is no exception.

    The argument that "guns don't kill people; people do" is unassailably correct--and comes down nicely on the side of human freedom to use technology as we choose. The theme of freedom--along with its correlate, responsibility--is one I've pressed repeatedly in NETFUTURE.

    But there's another side to the story. Every technology already embodies certain human choices. It expresses meanings and intentions. A gun, after all, was pretty much designed to kill living organisms at a distance, which gives it an "essentially" different nature from, say, a pair of binoculars.

    If all technology bears human meanings and intentions, the networked computer carries the game to an entirely different level. Its whole purpose is to carry our meanings and intentions with a degree of explicitness, subtlety, intricacy, and completeness unimaginable in earlier machines. Every executing program is a condensation of certain human thinking processes. At a more general level, the computer embodies our resolve to approach much of life with a programmatic or recipe-like (algorithmic) mindset. That resolve, expressed in the machinery, is far from innocent or neutral when, for example, we begin to adapt group behavior to programmed constraints.

    Putting it in slightly different terms: Yes, our choices individually and collectively are the central thing. But a long history of choices is already built into the technology. We meet ourselves--our deepest tendencies, whether savory or unsavory, conscious or unconscious--in the things we have made. And, as always, the weight of accumulated choices begins to bind us. Our freedom is never absolute, but is conditioned by what we have made of ourselves and our world so far. The toxic materials I spread over my yard yesterday restrict my options today.

    It is true, then, that everything comes down to human freedom and responsibility. But the results of many free choices--above all today--find their way into technology, where they gain a life and staying power of their own. We need, on the one hand, to recognize ourselves--pat, formulaic, uncreative--in our machines even as, on the other hand, we allow that recognition to spur us toward mastery of the machine.

    It is not, incidentally, that the effort to develop the latest software and hardware was necessarily "pat and formulaic." It may have been extremely creative. But once the machine is running and doing its job, it represents only that past, creative act. Now it all too readily stifles the new, creative approaches that might arise among its users. Every past choice, so far as it pushes forward purely on the strength of its old impetus, so far as it remains automatically operative and thereby displaces new choices--so far, that is, as it discourages us from creatively embracing all the potentials of the current moment--diminishes the human being. And the computer is designed precisely to remain operative--to keep running by itself--as an automaton dutifully carrying out its program.

    The only way to keep our balance is to recognize what we have built into the computer and continually assert ourselves against it, just as you and I must continually assert ourselves against the limitations imposed by our pasts and expressed in our current natures.

    It is not my primary purpose here to comment on the Internet as a rumor mill, but it is worth pointing out that there is a certain built-in Net bias to worry about. It may not be an "essential" bias, but it is a bias of the Net we happen to have built. For in the Net--from our design of its underlying structures to the deep grooves cut by our habits of use--we have nearly perfected the tendency, already partially expressed in printing technology, radio, and television, to decontextualize the word. More and more the Net presents us with words detached from any known speaker and from any very profound meeting of persons. At the same time, the Net offers us a wonderfully privatized blank screen against which to project our fantasies, personas, wishes.

    Obviously, there is much more to say. But not even the whole of it would be to argue that rumor must triumph over truth on the Net. It would only be to acknowledge that the Net has been constructed in accordance with certain tendencies of ours--not many of them wakeful, and therefore not many of them safe to hand ourselves over to without full alertness. The Net does have a given nature, even if that nature is, finally, our own. Not all things our own can easily be waved away upon a moment's new resolve, even if the will is there. "The spirit is willing, but the flesh is weak"--and the programs are hard to change.

    I need only add that Phil Agre, whose remarks stimulated this harangue, is shouldering more than his share of responsibility for cultivating a proper alertness among Net users.

    Finally, in the spirit of provocation I challenge anyone out there who is bumping up against the question of technological neutrality: rumor mills aside, demonstrate how the analysis I have offered is fundamentally inadequate as a first-order breakdown of the question.

    SLT



    *** Things That Bite Back: Tenner on Productivity

    From Stephen L. Talbott (stevet@netfuture.org)
         Notes concerning the book, Why Things Bite Back: Technology
         and the Revenge of Unintended Consequences, by Edward Tenner
         (New York: Alfred A. Knopf, 1996).  Hardcover, 341 pages, $26.
    
    I will offer some general comments about Tenner's book toward the end of these notes. First, however, I would like to summarize Tenner's take on why computerization has not more noticeably raised productivity in the service sector. The question is a vivid one, since "throughout most of the 1980s, it was hard to imagine services and white-collar work in general not becoming more productive. The cost of everything from random-access memory (RAM) to disk storage space seemed to be shrinking by half every eighteen months."

    Yet the cost reductions, combined with soaring investment, apparently yielded only the most paltry return:

        According to one study by the economist Stephen Roach, investment in advanced technology in the service sector grew by over 116 percent per worker between 1980 and 1989, while output increased by only 0.3 percent to 1985 and 2.2 percent to 1989. Two other economists, Daniel E. Sichel of the Brookings Institution and Stephen D. Oliner of the Federal Reserve, have calculated the contribution of computers and peripherals as no more than 0.2 percent of real growth in business output between 1987 and 1993.

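    A quick annualization of those figures (my own back-of-the-envelope arithmetic, not anything from Roach, Sichel, Oliner, or Tenner, and only as reliable as the rounded numbers in the quote) makes the imbalance vivid:

        # Convert cumulative percent growth over a span of years into a
        # compound annual growth rate.
        def annualized(total_growth_pct, years):
            return ((1 + total_growth_pct / 100) ** (1 / years) - 1) * 100

        print(round(annualized(116, 9), 1))   # investment per worker, 1980-89: ~8.9% a year
        print(round(annualized(2.2, 9), 2))   # output, 1980-89: ~0.24% a year
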
    Tenner's approach to this question--as to all others--is more extensive than systematic, so the best I can do is to list a set of independent bullet items:

    A few notes of my own:

    Computerization has, according to a wide consensus, increased productivity in manufacturing and distribution. The contrast, then, is between the domain where we work primarily upon materials and the flow of materials, and the rapidly growing service domain where interaction with other people in the context of human needs, purposes, and cognitive processes is primary. This is hardly a revolutionary insight, but it does encourage us to stop and consider the "productivity problem" in conjunction with the broad attempt over the past several centuries to bring the science and technology of materials and mechanisms to bear upon the living world, the social sciences, and the human being. This attempt has given us, among many other things,

    Behind all of these developments there lay a certain simplistic mindset born of the success and mental habits of a science triumphant over the material world, but stumbling badly when confronting life. What some of these habits are and how they might be mended is something we can at least begin talking about today, and one wishes that Tenner had been inclined to take up the conversation.

    For the record, Tenner does not oppose computer use, and is not anti-technology in general. But one of his central messages is that "a labor-saving technology needs a surprising amount of time-consuming vigilance to work properly." He believes we'd be better off if we recognized this fact and planned around it.

    His book covers a wide range of technologies, from agriculture to sports to medicine. As an encyclopedic compendium and guide to the literature, it is extremely valuable. I was somewhat disturbed, however, by the haphazard way in which the material was collated--and even more by his failure to seek out any deeper ground for understanding the vast array of "effects" he chronicles. His classification scheme, based on "revenge effects," "recomplication effects," and so on, is merely formal; it remains largely unrooted in history and disconnected from any serious effort to understand the human being who develops and uses all these tools.

    Nevertheless, Tenner does attempt to frame a few relatively general and systematic insights. I will summarize the three that seem most central:

    Complex systems, chronic effects, and the backlash of intensity: Tenner seems content with such abstract categories. To pursue them further, I think, one would have to follow them into the historically rooted human being--and then the categories would probably change. After all, to say that X-rays and smokestacks both present us with risks of a chronic sort is not to speak very revealingly about what X-rays and smokestacks have in common as symptoms of modern society--or about how they differ. But they are both products of human cognition and behavior, and upon the background of that commonality we might be able to trace more intimate connections between them. Jacques Ellul, in his equally far-ranging survey, The Technological Bluff, takes exactly such an approach.

    Nevertheless, Tenner's book may have particular value precisely because he is content dispassionately to catalog a wide range of technologies and effects, without additional fuss. While any further analysis will probably require reworking the catalog along different lines, he has performed a large service by marshalling such a massive literature in a form readily usable by others.

    Stephen L. Talbott



    *** About this newsletter

    Copyright 1997 by The Nature Institute. You may redistribute this newsletter for noncommercial purposes. You may also redistribute individual articles in their entirety, provided the NetFuture url and this paragraph are attached.

    NetFuture is supported by freely given reader contributions, and could not survive without them. For details and special offers, see http://netfuture.org/support.html .

    Current and past issues of NetFuture are available on the Web:

    http://netfuture.org/

    To subscribe or unsubscribe to NetFuture:

    http://netfuture.org/subscribe.html.

