                                  NETFUTURE
    
                       Technology and Human Responsibility
    
    --------------------------------------------------------------------------
    Issue #61      Copyright 1997 Bridge Communications       December 1, 1997
    --------------------------------------------------------------------------
                Editor:  Stephen L. Talbott (stevet@netfuture.org)
    
                         On the Web: http://netfuture.org
         You may redistribute this newsletter for noncommercial purposes.
    
    
    CONTENTS:
    *** Editor's Note
    
    *** Quotes and Provocations
          What Movies and Computers Have in Common
          The Digital Citizen: More Nonsense from Wired
          The Threat of Computer Illiteracy among Babies
          If Big Brother Has Been Dismembered, Are We Safe?
          Why I Do Not Use `Gender-neutral' Language
    
    *** About this newsletter
    

    *** Editor's Note

    This is the first issue of NETFUTURE to come to you courtesy of our new host, the International Federation of Library Associations. Please see the updated subscription information at the end of the newsletter. All existing subscriptions have been transferred to the new listserver. If you have posted NETFUTURE subscription information somewhere, I'd greatly appreciate your updating it; if you haven't posted anything, give it some consideration!

    I hope to see some of you at the Columbia University conference on Education and Technology: Seeking the Human Essentials, where I'll be giving a talk on Thursday, December 4. For more information, call 212-678-3987, or see the announcement in NF #56.

    SLT



    *** Quotes and Provocations

    What Movies and Computers Have in Common

    Yale computer scientist David Gelernter writes:
        When I was in school in the sixties, we all loved educational films. When we saw a movie in class, everybody won: teachers didn't have to teach, and pupils didn't have to learn. I suspect that classroom computers are popular today for the same reasons. ("The Myth of Computers in the Classroom," in Minutes of the Lead Pencil Club, edited by Bill Henderson)

    The Digital Citizen: More Nonsense from Wired

    Always specializing in missing the point, Wired magazine succeeds more spectacularly than usual in its December issue. The cover article by Jon Katz (a Wired contributing editor) celebrates the "first in-depth poll" of the digitally Connected, and reports that they are "optimistic, tolerant, civic-minded, and radically committed to change."
        Almost all conventional wisdom about digital culture -- especially as conveyed in recent years by journalists, politicians, intellectuals, and other fearful guardians of the existing order -- is dead wrong. The Internet, it turns out, is not a breeding ground for disconnection, fragmentation, paranoia, and apathy .... The online world encompasses many of the most informed and participatory citizens we have ever had or are likely to have.

    The poll, sponsored by Wired and Merrill Lynch Forum, assures us that the Connected are more likely than the Unconnected to know who the chief justice of the United States is; to believe in change, racial diversity, and a better future; to have confidence in the two-party political system; and to read books.

    All of which, we're told, puts the lie to the "countless tales of perversion, porn, hatemongering, violence, addiction, and other perils" that mainstream journalism is forever disseminating.

        The common stereotype of the Internet as a haven for isolated geeks who are unaware of important events occurring outside their cavelike bedrooms can now be exploded as an inaccurate myth. The same goes for the caricature of technology as a civic virus that breeds disaffection from politics.

    (Despite repeated allusions to them, Katz never tells us exactly which publications constitute this Net-bashing mainstream. He seems to have in mind those often silly articles that some publications occasionally run, dramatizing the "dark side of the Net." The tabloid-urge, unfortunately, is a part of mainstream journalism. But this fact is wholly compatible with the dominant reality Katz ignores. Where is the mainstream publication whose education pages are trying to brake the lemming-rush to wire our schools, or whose business pages do not urge the centrality of everything high-tech for our economic future, or whose feature pages do not lionize the latest, hot, high-tech start-ups along with their CEOs, or whose editorial pages fail to treat the "information superhighway" as a sacred cow that should be encouraged to wander unimpeded into every corner of our culture, or whose Christmas buyers' guides are not doing everything possible to stimulate consumer interest in the coolest high-tech gadgets?)

    Katz' primary mission is to let us know beyond any doubt that

        clearly, there is now evidence that technology promotes democracy, citizenship, knowledge, literacy, and community.

    I have three comments:

    First, until very recently the critiques of digital culture were necessarily directed at the culture's earlier, "purer" manifestations -- what we might call the John Perry Barlow phase of the Net, when a heady mixture of libertarianism, warmed-over counterculture, and technological optimism ruled the day. It was a time when the networked computer could be embraced wholesale as the redemptive substitute for rotten social institutions. This electronic culture was rooted in the research departments, computer engineering organizations, underground publications, university computer science programs, and bulletin board networks that incubated the modern Net and brought it to birth.

    It's hardly surprising that the various utopian disaffections, cultural distortions, and imbalances of that earlier phase have been diluted by the more recent arrival of the masses.

    Second, the poll results reflect the education, political power, economic strength, and faith in the existing order characteristic of a relatively privileged class -- namely, those who are able to get connected and are equipped to capitalize on their connections. And, again, it is hardly surprising to find that the more educated, better-off folks are also better-read, more politically engaged, and so on.

    I have little doubt that a poll taken during the first decade or two of the television era would likewise have shown a relatively well-off, better-educated, better-read, and politically engaged audience. There were, at the same time, widespread positive expectations about the future of television-influenced education, culture, and politics. So what?

    Third, Katz' answer to the "so what?" is missing. He doesn't supply a single sentence to support his contention that "technology promotes democracy, citizenship, knowledge, literacy, and community." A poll purporting to show what sort of user has connected during the build-up of the Net tells us nothing at all about the effects the Net will "promote" in these users. For that you would need to track these users over time.

    Meanwhile, he might have remembered television. Regardless of the well-intentioned involvements and expectations of those who took up television, it would not be easy today to argue that television strengthened community or encouraged democratic participation or redeemed education. But, in any case, my point is that the argument needs to be made, not just assumed. Until Katz offers at least a shred of evidence that the Net will indeed prove salutary, his claim remains suspended in mid-air, lacking all support.

    It is hard to believe that the editors of a major publication would offer such a massive non sequitur as a dramatic cover feature. The giddy sense of exultation and triumph with which Katz and Wired herald this poll of 1444 Americans is, for me, the most telling aspect of the article. Whatever may be the case with Net users as a whole, these folks apparently remain in the Barlow, wish-fulfillment phase of cyberspatial development.

    It's an especially unhealthy phase. Katz welcomes a poll result showing extraordinary confidence among the Connected in their ability to master and direct the forces of technological change. We who are connected do have good reason to believe we can ride these forces for personal advantage -- I am certainly making the attempt -- and that may be enough to sustain considerable enthusiasm for a time. But for any of us, at this point in history, to fancy ourselves masters of the self-driven, global, technological juggernaut is the sheerest fantasy.

    The sober effort to achieve such mastery is, of course, exactly what's needed. Unfortunately, there still aren't many signs of sobriety at Wired.

    The Threat of Computer Illiteracy among Babies

    In case you've been worrying about your child getting a head start on growing up, you can now buy software for toddlers. In fact, there's one package aimed at six-month-old tots; banging on the keyboard produces stars on the screen. It's an advanced training program, I imagine, to prepare children for Saturday morning cartoons.

    But, as U.S. News & World Report (November 24, 1997) points out, the products aren't really for the youngsters:

        Don't kid yourself by thinking that it's the changing needs of children that drive this kind of software to the shelves: It's scared parents.

    Which raises the interesting question: Who are the real Luddites? If (according to the modern, non-historical usage) Luddites are people who have failed to understand and master computerized technology, which therefore inspires fear in them, then we can hardly deny this cliched label to those overwrought parents who are stampeding us toward computer-based education.

    Additional pressure, of course, comes from business people without scruples:

    "Toddler software is an artifact of smart merchandising," says [children's software expert Cathy] Miranker. Knowledge Adventure's JumpStart series proved to the other software makers that they can make a lot of money playing on parents' fear that their kid won't succeed in life without a head start.
    Note, incidentally, that even this (apparent) criticism labels the merchandisers "smart". As I pointed out in my piece on the Invisible Hand (NF #60), the more-or-less explicit, more-or-less cynical, and always crass acceptance of selfish business practice as not only inevitable but also unworthy of sharp condemnation (because it is supposed to be socially beneficial in the end) has now become commonplace, having filtered down from savvy economists to would-be savvy commentators.

    There are exceptions, the public's low regard for tobacco executives and their companies being perhaps the most obvious one. But don't look for the peddlers of toddler software to be classed with tobacco pushers any time soon. The halo surrounding the high-tech industry remains untarnished as far as the buying public is concerned.

    If Big Brother Has Been Dismembered, Are We Safe?

    In "No Place To Hide" (Forbes, September 22, 1997) Ann Marsh offers a useful survey of the potentials and current realities of electronic surveillance. Among the items she ticks off:

    But Marsh also looks at the positive side: tracking lost kids and Alzheimer patients; nabbing criminals by installing tiny transmitters in cars and store merchandise, or in stacks of money given to bank robbers; automatically dialing 911 and reporting the location of car accidents; tracking salmon migrations; and using drones to sniff out airborne pollutants.

    Like it or not, Marsh concludes,

        The world is becoming smaller and smaller and ever more transparent. Your fenced-in yard won't be quite as private as it used to be. But neither will the dark alley near your bank teller machine. Tradeoff, tradeoff, tradeoff.

    Yes, but the notion of tradeoffs can lead us to stop thinking too soon. It's not simply a matter of looking at a particular piece of technology and saying, "Gee, it could be used this way (which is good) and it could be used that way (which is bad) -- and let's make the bad illegal." Given the complex interrelationship of all technologies, we have little choice but to seek larger and deeper patterns.

    Once we do that, we will not always find neatly offsetting facts on opposite sides of the balance. It is not as simple as seeing which set of facts outweighs the other. There may be some patterns that are reinforced by both sides of a supposed tradeoff.

    In particular, chip implants for tracking Alzheimer patients, while obviously helpful for some purposes, would not necessarily prove benign in the larger picture. The devices could easily encourage the continuing depersonalization of care for the elderly -- especially in a society already moving in that direction. The more easily you can electronically monitor a person, the more invisible he tends to become as a person -- and long-term social policy toward invisible people is rarely benign.

    Similarly, medical data stored on chip implants may save some lives. But as this data gets electronically transmitted from here to there for analysis, the risk is that the individual, flesh-and-blood patient in all his particularity will be replaced by the sum of his data. So the medical benefit, to the degree it entails disregard of the individual, is not unrelated to various abuses on the other side of the balance. For example, violations of data privacy become much more likely once the patient has been lost sight of.

    And, as I have pointed out before, the ability to nab criminals through high-tech tracking devices comes as part of a general trend toward the technical mediation of human relations. This in turn -- if we do not counter the trend with a strong consciousness of community -- leads to the weakening of the social matrix that is the most effective barrier to crime. So even the benefits of the tracking devices may be part of a distinctly unhealthy pattern that negates the benefits.

    In her article, Marsh mentions the Orwellian, dictatorial potentials of the new technologies, and sets them in the balance against the freeing potentials. But, as I argued in "Distributing Big Brother's Intelligence" (NF #59), we need to be alert to the possibilities of a "distributed tyranny" requiring no dictatorial center. Both sides of the balance in which we usually assess the issues may weigh in favor of such a tyranny.

    While Marsh mentions Saddam Hussein in passing, it is worth remarking that nearly all her examples of dangerous technical implementations come from the United States. It is hard to imagine that Americans will succumb to an old-style dictator in the foreseeable future. But it is quite imaginable that we will continue succumbing to the "necessities" of progress -- the very necessities that lock us in debate about what may in some cases be lose-lose tradeoffs.

    When hidden cameras in public places, electronic surveillance of employees, and the tracking of individual product purchases are on the increase, it is not because we are heading toward a centralized dictatorship, but rather because the entire fabric of corporate business, law enforcement, and public consumption seems to require these things. Here the immediate necessity is rooted in economics, there in personal health; here in law enforcement, there in the needs of the disabled.

    I am not sure to what degree we might avoid pursuing the various technical implementations that sustain the threats. Who, after all, would deny to the disabled the latest technical assist? But I am sure that we had better step back and take a hard look at the overall shape of the puzzle pieces slowly assembling themselves all around us.

    I can't help thinking of those coarsely scanned images that, from close up, yield nothing recognizable, but from a distance betray a human countenance. Let us hope that the face now taking shape in the pattern of our technologically transformed lives is not the dismembered and redistributed face of Big Brother. But if it is, our only hope for recognizing the fact is to gain some objective distance between ourselves and the "inevitable" progress of technology.

    (Quotations are from the online version of Marsh's article: http://www.forbes.com/forbes/97/0922/6006226a.htm.)

    Why I Do Not Use `Gender-neutral' Language

    I know that some of you have wondered about my general (although not quite exclusive) use of the generic "he". It is only recently that I reverted to this usage; in my 1995 book, for example, I switched between "he" and "she". ("The astronomer with her computer-generated charts....") I do not regret my earlier usage, but as time goes by it seems to me less and less satisfactory. I'm not fully sure why, but it has something to do with a certain self-conscious arbitrariness about the whole thing. Why, exactly, do I choose "she" here and "he" there? And why can't I shake the uncomfortable feeling that someone is looking over my shoulder to see whether my statistical distribution of these terms is somehow suspect?

    Arbitrariness may, for a time, play a useful role in jerking the reader (and the writer) awake regarding entrenched biases of society. But eventually, I think, we have to transform ourselves at the level where the problems first arise, and then the language issue becomes largely moot. If people were inherently incapable of gender bias, on what grounds would we complain about a language convention stipulating the use of the generic "he"?

    Anyway, I think there's a basic contradiction in the attitude that says, "We are still a biased society, language unavoidably reinforces our bias, and therefore you are irresponsible to refuse gender-neutral language." This argument tries to have it both ways: on the one hand, we are determined by our language; on the other hand, we are masters who can redesign the language in the interest of justice.

    Well, I think we must have it both ways insofar as we are partially determined and are only struggling toward freedom. But that doesn't save the argument. Putting it simply: the point at which we become free and aware enough to change our language in order to lift the downtrodden is necessarily also the point at which we are no longer so unfree as to be chained by the old meanings we now find objectionable. If we can change our words in clear consciousness of what we are doing, we can also change the meanings with which we use old words.

    Every writer must strive to let his own meanings shine through the words he uses, so that they burn away everything that does not fit his intention. Of course, this goal is never fully achieved. But the more consciously we take up our language, the closer we come to the goal.

    I think the whole movement toward gender-neutral language has been positive in this sense: it betokens a growing consciousness of our responsibility for language. I hope, in this spirit, that you will judge my words according to the meaning and intent you see me putting into them -- not automatically, based on the mere appearance of certain alphabetic sequences.

    Also, please don't take me to be proposing a "correct" stance. I may change again shortly! In any case, it's amazing how much of an author's meaning can shine through even the most limited vocabulary and the wackiest grammar. Always look for the author's intent -- exactly what we risk losing, incidentally, as words are detached from their author and dispersed through our electronic communication channels and databases.

    SLT



    *** About this newsletter

    Copyright 1997 by The Nature Institute. You may redistribute this newsletter for noncommercial purposes. You may also redistribute individual articles in their entirety, provided the NetFuture URL and this paragraph are attached.

    NetFuture is supported by freely given reader contributions, and could not survive without them. For details and special offers, see http://netfuture.org/support.html .

    Current and past issues of NetFuture are available on the Web:

    http://netfuture.org/

    To subscribe or unsubscribe to NetFuture:

    http://netfuture.org/subscribe.html