NETFUTURE
Technology and Human Responsibility
--------------------------------------------------------------------------
Issue #95 A Publication of The Nature Institute September 23, 1999
--------------------------------------------------------------------------
Editor: Stephen L. Talbott (stevet@netfuture.org)
On the Web: http://netfuture.org
You may redistribute this newsletter for noncommercial purposes.
NETFUTURE is a reader-supported publication.
CONTENTS
---------
Editor's Note
Quotes and Provocations
The Distorting Potentials of Technical Capability
Movements, Too, Must Be Allowed to Die
What Makes a Technology Inevitable?
The Fascination with Ubiquitous Control
DEPARTMENTS
Correspondence
Toward Appropriate Behavior by Objects (Alan Wexelblat)
Response to Alan Wexelblat (Langdon Winner)
Missing the `Old' Don Norman (Steve Baumgarten)
About this newsletter
==========================================================================
EDITOR'S NOTE
I'll be offline from September 27 - October 6 due to speaking commitments
and a little bit of associated vacation. On October 1 I'll give a public
lecture at the University of Moncton, New Brunswick, on "The Strange
Disappearance of Earth: Is Cyberspace a Black Hole?". And the next day
I'll deliver the keynote to the Professional Librarians of New Brunswick:
"Awakening to Ourselves in the Age of Intelligent Machines".
If you're within striking distance, stop by. It's always a special
pleasure when I meet NETFUTURE readers on the road -- which seems
inevitably to happen these days!
SLT
==========================================================================
QUOTES AND PROVOCATIONS
The Distorting Potentials of Technical Capability
-------------------------------------------------
I've pointed out on various occasions that no domain of human activity has
been more thoroughly founded upon the computer than finance and investment
-- and, not coincidentally, no domain has been more thoroughly drained of
its human content. Unbelievably massive capital flows course through the
global bitstreams, seeking nothing but their own abstract, mathematical
increase. The healthy or unhealthy social consequences of the particular
uses of a particular bit of capital -- my capital or your capital --
disappear from the picture.
But there's another arena where computerized technology plays an
increasingly prominent role -- an arena where you'd think it couldn't help
but serve human purposes. I mean the hospital's Intensive Care Unit.
Digital readouts and computer-controlled displays reveal the patient's
apparent status at a glance, high-tech alarms signal changes in condition,
and the amazing medical procedures that we so naturally refer to as
"technical" seem to have the crisp, well-defined, algorithmic character of
subroutines executing in silicon (and, in fact, are more and more likely to
be, at least in part, subroutines executing in silicon).
Yet, the consequences of all this have some physicians worried. They fear
that the technical ability to do things is eclipsing the human welfare
that the hospital originally set out to serve.
One of those physicians, Sandeep Jauhar, recently made a remarkable
statement in a New York Times opinion piece ("First, Do No Harm: When
Patients Suffer", Aug. 10, 1999). "Most of our patients would do better
on autopilot", he quotes a fellow I.C.U. physician as saying. "They'd be
better off if we brought them in here and just left them alone".
Admitting the exaggeration in this statement, Jauhar goes on to point out
that, even for those patients who benefit from active treatment, the safe
and necessary level of care "has not been defined". He passes along this
1980 judgment from Arnold Relman, the editor of the New England Journal of
Medicine, noting that nothing much has changed during the intervening two
decades:
The cost and psychological stress of I.C.U. treatment would be
justifiable if such units were known to reduce mortality and morbidity
from levels achievable with less costly and intensive modes of hospital
care. (But) there have been no prospective, randomized, controlled
trials to supply such data.
(This is worth bearing in mind when you hear the usual denunciations of
alternative medical practices as "unproven and unscientific".) More
recent studies, Jauhar reports, "are beginning to prove that conservative
management is often the right course for many patients":
For example, a study reported recently in the New England Journal of
Medicine found that many anemic patients do better when they
receive fewer blood transfusions. Fewer transfusions mean fewer
transfusion reactions, infections and so on. And recent studies also
show that pulmonary artery catheters, used to measure blood pressure in
the lungs and once a hallmark of the I.C.U., provide little benefit to
patients and may even increase mortality, even for conditions for which
they have been almost uniformly accepted.
All this is in response to alarming data about the incidence of iatrogenic
(doctor-caused) "adverse events" -- drug reactions, wound infections,
complications from catheter insertions, and so on. In intensive-care
situations, these events can provoke a rapid, downward spiral.
However, the public is often as fully implicated in the problems as the
medical profession, a fact brought out in one of Jauhar's stories:
I remember taking care of an elderly woman in the I.C.U. who was in a
coma for a month. She developed a blood infection, and kidney, heart
and respiratory failure, but lingered for weeks with antibiotics and
drugs to elevate her blood pressure.
We all knew she was going to die and that our efforts were futile.
Still, no one in her family was willing to give up, and so we persisted
in aggressively treating her, causing her obvious pain every time we
poked her with a new catheter or pushed on her belly to examine her.
What finally caused her death was a catheter placed through a hole in
her abdomen, which probably resulted in yet another infection. She did
not live any longer because of what we had done, just ended her life
more miserably.
This story is all too common. Somehow, the sense of unbounded technical
potential encourages us, if we let it, to lose sight of the immediate
terms of our existence. Dying is one of the most profound and significant
things we do in this life. And yet, surrounded by the false mystique of
technical devices, we readily yield up all possibility of doing it well.
Movements, Too, Must Be Allowed to Die
--------------------------------------
The summer, 1999 issue of Orion magazine is devoted to a striking
assessment of the state of the environmental movement. It's exactly the
sort of feet-on-the-ground analysis that may prevent the movement
from stagnating. Certainly the lead article, which is by farmer-scholar
Wendell Berry, lays out the risk of stagnation with uncommon force.
No matter how necessary and dear a movement may have seemed to us, Berry
warns, we must get out of it when it has "lapsed into self-righteousness
and self-betrayal, as movements seem almost inevitably to do":
People in movements too readily learn to deny to others the rights and
privileges they demand for themselves. They too easily become unable
to mean their own language, as when a "peace movement" becomes violent.
They often become too specialized, as if finally they cannot help
taking refuge in the pinhole vision of the institutional intellectuals.
They almost always fail to be radical enough, dealing finally in
effects rather than causes. Or they deal with single issues or single
solutions, as if to assure themselves that they will not be radical
enough.
You can't control effects while leaving their causes in place, Berry tells
us. Nor should we assume that other people are the causes, an
assumption that easily leads us to focus on policy changes rather than
changes in behavior (including our own).
Berry vividly underscores the interconnectedness of all environmental
challenges, which renders "utterly groundless" the assumption that "we can
subdivide our present great failure into a thousand separate problems that
can be fixed by a thousand task forces of academics and bureaucratic
specialists". If we have not already recognized it, "the present
scientific quest for odorless hog manure should give us sufficient proof
that the specialist is no longer with us".
Berry recognizes that all environmental issues finally come down to the
question, "Where are the people?" -- where are those with the knowledge,
skills, motives, and attitudes to tackle the problems? It is true, for
example, that many people visit rural landscapes and even develop an
appreciation of them, but "most people are available to those landscapes
only recreationally":
They do not, in Mary Austin's phrase, "summer and winter with the
land". They are unacquainted with the land's human and natural
economies....
I am not suggesting, of course, that everybody ought to be a farmer or
a forester. Heaven forbid! I am suggesting that most people now are
living on the far side of a broken connection, and that this is
potentially catastrophic. Most people are now fed, clothed, and
sheltered from sources toward which they feel no gratitude and exercise
no responsibility.
And, finally, he points to a "profound failure of imagination":
Most of us cannot imagine the wheat beyond the bread, or the farmer
beyond the wheat, or the farm beyond the farmer, or the history beyond
the farm. Most people cannot imagine the forest and the forest economy
that produced their houses and furniture and paper; or the landscapes,
the streams, and the weather that fill their pitchers and bathtubs and
swimming pools with water. Most people appear to assume that when they
have paid their money for these things they have entirely met their
obligations.
Berry, with his eloquent disdain for incumbent nonsense, does not
always leave himself room for constructive contact with the "enemy".
But I can strongly recommend this issue of Orion both for its realistic
look at the state of the environmental movement and for its suggestions
about positive ways to move forward. See in particular the article,
"Reinhabiting Environmentalism: Picking Up Where Leopold and Carson
Left Off", by Peter Sauer and also "Swinger Goes to Town: Why It's a
Good Thing Environmentalism is Dying", by Mike Connelly.
You can visit the Orion folks at http://www.orionsociety.com/ .
The Orion Society is one of the truly good things on the planet.
What Makes a Technology Inevitable?
-----------------------------------
Is there any idea more powerful, more debilitating, more drastic in its
consequences for our future than the idea of technological inevitability?
Nothing seems more certain than the march of technological progress, and
nothing more quixotic than resisting it. As a result, certain forces are
unleashed that carry society forward with all the subtlety of a tsunami,
which in turn reinforces the sense of inevitability.
But wait a minute. Let's look at a particular field of technology -- say,
biotech. In broad strokes, here's what we see:
First, the entire field is awash with money and commercial impulses. No
one will dispute that the links between "pure" research and business are
ubiquitous. Few university department chairmen fail to sit on the board
of at least one biotech company, and few if any major departments fail to
receive funding from industry. I recently heard a geneticist from a large
university tell how his department chairman had announced, "Fifty thousand
dollars goes to anyone who brings in a `special relationship' with a
pharmaceutical company".
Second, the biotech industry participates in a cycle of product
development and guaranteed obsolescence that fuels a kind of perpetual-
motion cash machine. For example, Monsanto Corporation produces "Roundup-
ready" corn seed, highly tolerant to its own Roundup herbicide. This
enables farmers to drench their soil with Roundup in order to fight weeds.
Roundup sales go through the roof. Meanwhile, the weeds start developing
resistance to the herbicide -- a process already well under way -- and
Monsanto is thereby assured a market for its next generation of herbicides
and engineered seeds, currently under development in the laboratory.
To take this particular example a little further: Monsanto is also a
prime mover in the current push to develop a so-called Terminator
technology, intended to prevent seed saving. When the technology is in
use, it will make the seeds produced by the farmer's crops sterile, so
that he must either go back to Monsanto to buy a new batch of seeds each
year, or else buy a Monsanto chemical that, when applied to the seeds,
"switches" them back on.
Third, all the resources of the federal government have been made
available to the high-tech industry in order to realize this cash machine.
As Steven Gorelick points out in the Sep./Oct., 1998 issue of The
Ecologist, the industry receives huge direct and indirect subsidies;
the Patent Office makes genetically altered life forms profitable; and the
regulatory agencies are shamefully dominated by industry interests.
(Gorelick mentions that, during the approval process for Monsanto's
recombinant bovine growth hormone, "the `revolving door' almost spun off
its hinges". You can read the horrific details in The Ecologist.)
Now, when you place particular technological developments into their
larger context like this, the question of inevitability takes on new
coloring. How inevitable is Terminator seed technology? Would it happen
without the patent laws, without the inordinate industry influence in the
regulatory agencies, without the industrialization of agriculture, and
without the subordination of research organizations to commercial
interests? It is hardly imaginable. None of this has to be; we've simply
made it a matter of social policy.
More generally, the entire commercial, industrial, scientific, and
governmental program embodies a stance toward the world that can be
summarized in the single word, `control'. That's where the perpetual-
motion cash machine comes in. What the great totalitarian regimes of the
twentieth century discovered in the realm of human affairs also holds true
in our relation to the natural world: the one-sided attempt to exercise
control leads to a ceaseless upping of the ante. The next technical fix
must be more draconian than the last. It will consume more resources and
require a more single-minded commitment from the controlling authorities.
Unfortunately, it will also set society more irreversibly upon a path that
eventually must lead to collapse.
So, in order to see the fallacy of inevitability, you need to look at the
context of the "inevitable" developments and ask yourself what patterns of
choice the context represents. It is certainly true -- barring the most
extreme cataclysm -- that we will become ever more technically capable.
But this still leaves open the directions in which we choose to
apply our capabilities. There are radically different ways of viewing and
relating to the world -- organic agriculture, for example, emphasizes
cooperating with nature rather than controlling and browbeating it -- and
the kinds of technology we spend our money on will vary according to our
vision.
A technology's "inevitability" represents nothing more than our
inattention to the various ways we participate in demanding the
technology.
(This essay was inspired by, and draws upon, a lecture on biotech given by
my Nature Institute colleague, Craig Holdrege, on Sep. 18, 1999.)
The Fascination with Ubiquitous Control
---------------------------------------
The previous article suggests (all too briefly) the nature of my concern
about Alan Wexelblat's letter-to-the-editor in this issue. My teapot
tells me when it's hot, so why, he wonders, shouldn't my house know when
I'm out of town and stop delivery of the newspaper?
There is no absolute answer to such a narrowly focused question, and never
can be. The narrow focus itself -- the sense that we can reasonably plot
a future based on such a constricted view of things -- is exactly the
problem. The implication is that technical developments almost carry
their own justification. There are no considerations that go beyond the
obvious desirability of specific capabilities. A "capability", merely
considered as such, will always seem better to have than not to have.
Without a doubt we will see more and more technical capabilities of the
sort Wexelblat describes, and I am not one simply to hold up my hands and
say, "No more, period". But it is crucial to recognize that where all
these nifty devices will take us depends on a much larger context than the
one we have in mind when we say, "Gee, that makes sense" or when we
exclaim about this or that cool invention.
This is the point Langdon Winner made (NF #94) when he spoke about the
overall harrying effect of many household labor-saving devices.  It is
also the point I made (NF #91) when I mentioned that the warm feeling of
the early automobile user as he drove across town to help a friend -- a
genuinely good deed -- did not tell a whole lot about how that trip (and
the billions of others added to it) would re-shape society.
The underlying effects always depend on the larger context. Among other
things, they depend on the kinds of problems we are trying to address.
What we find in our own society -- and this, I think, is reflected in
Wexelblat's letter -- is a fascination with technological capabilities as
such. This is the dangerous preoccupation with control and efficiency for
their own sake that I referred to above in connection with biotech. In a
place like the MIT Media Lab where Wexelblat works, you find the same
marriage of commerce and research that dominates biotech.
In this context, the urge for technical innovation is driven (as Winner
points out in his letter below), not by some assessment of the needs and
imbalances of society, but merely by an analysis of what is technically
do-able and commercially promising. "Since, with the right infrastructure
in place, my house could speak with the insistence of my teapot, obviously
that's where we should channel our technical wizardry and society's
resources."
There are other choices. But they will only become evident to us when we
rise above our obsession with power, control, and efficiency, and consider
instead what qualities of life are worth cultivating. One thing we will
discover, then, is that it is always an entire context we must cultivate.
It is all of society we must work on. And this is the sole context from
which worthy questions can arise. Mere technical possibility can never
tell us where to go, even if the distinctive fact about our society today
is that we act as if it could.
Stating the matter a little more concretely: Wexelblat says that "putting
a pager and cellphone into your Nikes seems silly until you see people
carrying these things while jogging". So behold the problem, and yes,
behold the obvious solution. But it requires only a slightly widened
attention to realize that there's another question we could ask: Is the
jogger's burden of luggage just an obvious problem to which there's an
obvious technical solution, or is it rather a symptom of a deeper problem
having to do with the entire structure of our lives? And is this deeper
problem linked in turn to an unbalanced proliferation of the very sorts of
devices now being proposed as a solution?
In asking this question I do not imply any sort of absolute judgment on
the individual who jogs with a pager and cellphone. And I am well aware
of the immediate -- and perfectly reasonable -- response that will come
from many readers: there are environments where the cellphone, as an aid
to personal safety, seems a prerequisite for the very possibility of
jogging.
Of course. Each of us is suspended, so to speak, between what is and what
might be. Everyone -- including those most radically driving toward
social change -- must accommodate in numerous ways to the existing social
realities. But it is as wrong-headed as could be to misread the
necessities of accommodation as positive guides for further social
development. Unfortunately, this mistake is all too easy to make when one
is merely looking around for "problems" that lend themselves to immediate
technical solutions.
What I've said here is hardly an adequate coverage of the issues, and I
would welcome further contributions to the discussion from Alan Wexelblat
or anyone else at the Media Lab -- or the MIT Laboratory for Computer
Science or the Xerox Palo Alto Research Center.
SLT
==========================================================================
CORRESPONDENCE
Toward Appropriate Behavior by Objects
--------------------------------------
Response to: "Tech Knowledge Revue 1.3" (NF-94)
From: Alan Wexelblat (wex@media.mit.edu)
Langdon Winner misses the point of ubiquitous computation. Putting a
pager and cellphone into your Nikes seems silly until you see people
carrying these things while jogging. Perhaps they're silly people, or
perhaps they're finding innovative uses for the technology.
Making your coffee maker aware of when you're on vacation makes sense --
more sense than letting it waste because it's ignorant of a simple fact.
Why don't pots "know" that their contents are burning and make that fact
available for people to act upon? We don't think it's strange for
teakettles to have whistles; why should it be strange for my house to
arrange to stop delivery of my newspaper when I'm out of town?
Seymour Papert called it an "appropriate" intelligence. My faucet doesn't
need access to my rolodex, but it probably ought to know that the person
putting his hands into the water stream is a five-year-old visitor to my
house who cannot tolerate the water as hot as I can. Soon airbags will
have the sensors necessary to determine that a child is occupying the
passenger seat and change their inflation method so as to reduce injuries.
Safety is less glamorous and receives commensurately less press than
entertainment and "Sharper Image" products. But both are natural end
results of a way of viewing material objects as embedded in the patterns
and processes of people, and giving objects the information to act
appropriately in these contexts.
I find it absurd, as Bill Buxton put it, that the toilet in the airport
"knows" when I'm no longer standing in front of it, but my computer does
not.
--Alan Wexelblat
MIT Media Lab - Intelligent Agents Group
moderator, rec.arts.sf.reviews
wex@media.mit.edu
Pager: 781-945-1842
http://wex.www.media.mit.edu/people/wex/
ICQ 3632609
finger(1) for PGP Key
Response to Alan Wexelblat
--------------------------
From: Langdon Winner (winner@rpi.edu)
Alan Wexelblat says I've missed the point about ubiquitous computing, but
he never quite says what that "point" is. His examples express the
enthusiasm today's technology developers show for cramming information
processing into every object under the sun. Yes, all of those things can
be made. Just add chips, programs and stir vigorously. But will any of
it improve the objects or the society they serve?
My doubts here stem from the utter disconnect between the supposed wants
and needs studied in places like the Media Lab and ones painfully evident
in the human community at large. As a rough measure, look at key
indicators of need summarized in the yearly United Nations Human
Development Report and those defined by high-tech wizards in Cambridge,
Palo Alto and elsewhere. Notice the enormous crevasse that suddenly opens
up?
Calling a temperature-sensing faucet an example of "appropriate"
intelligence merely highlights how thoroughly out of touch with most of
the world's people these research programs have become. Appropriate to
what, I wonder? The term "initial public offering" comes to mind.
Missing the `Old' Don Norman
----------------------------
Response to: "Tech Knowledge Revue 1.3" (NF-94)
From: Steve Baumgarten (steve_baumgarten@yahoo.com)
Hi Steve. I came across this in reading a Usenet newsgroup -- someone is
using it as part of his signature -- and thought you might get a smile out
of it. Seems appropriate given at least some of the subject matter of NF
#94:
I have always wished that my computer would be as easy to use as my
telephone. My wish has come true. I no longer know how to use my
telephone.
- Bjarne Stroustrup (author of the C++ programming language)
NF #94 also sent me to The Atlantic, where I read the article on
Waldorf schools -- thanks so much for the pointer.
This issue of NF made for interesting reading -- as always. I also noted
with sadness that Donald Norman has moved over to the side of the raving
technology boosters -- his most recent book, The Invisible Computer, is
easily his least interesting and least thoughtful, a huge letdown from the
author of The Psychology of Everyday Things. Only people who are totally
isolated from the real world would propose such inane "solutions" to
problems that simply don't exist -- and then, more tragically, waste their
time and the time of countless able engineers in the pursuit of these
"solutions".
He was one of us, and now I think he's one of them. Too bad we're living
in an age when we need the "old" Don Norman more than ever.
SBB
==========================================================================
ABOUT THIS NEWSLETTER
Copyright 1999 by The Nature Institute. You may redistribute this
newsletter for noncommercial purposes. You may also redistribute
individual articles in their entirety, provided the NetFuture url and this
paragraph are attached.
NetFuture is supported by freely given reader contributions, and could not
survive without them. For details and special offers, see
http://netfuture.org/support.html .
Current and past issues of NetFuture are available on the Web:
http://netfuture.org/
To subscribe or unsubscribe to NetFuture:
http://netfuture.org/subscribe.html.
Steve Talbott :: NetFuture #95 :: September 23, 1999