NETFUTURE
Technology and Human Responsibility
--------------------------------------------------------------------------
Issue #126 December 18, 2001
--------------------------------------------------------------------------
A Publication of The Nature Institute
Editor: Stephen L. Talbott (stevet@netfuture.org)
On the Web: http://www.netfuture.org/
You may redistribute this newsletter for noncommercial purposes.
Can we take responsibility for technology, or must we sleepwalk
in submission to its inevitabilities? NetFuture is a voice for
responsibility. It depends on the generosity of those who support
its goals. To make a contribution, see http://netfuture.org/support.html.
CONTENTS
---------
Editor's Note
Quotes and Provocations
How Can We Hold the Balance?
Wrangling over Odysseus: An Exchange with Kevin Kelly
DEPARTMENTS
Books Received
How to Teach in a Post-Modem World
Correspondence
Pen and Paper (Richard Smith)
The Illusory Self Is Indeed Nobody (Malcolm Dean)
Misconceptions About Knowledge Management (Michael Knowles)
Announcements and Resources
CPSR Conference: Shaping the Network Society
About this newsletter
==========================================================================
EDITOR'S NOTE
The last issue of NetFuture, with its essay about Odysseus and technology,
brought as much positive comment as anything I've ever published in the
newsletter. However, complaints are often more valuable to readers, so I
include below an exchange I've had with Kevin Kelly, who took issue with
my summary dismissal of a remark by Danny Hillis (or was it a summary
dismissal of Danny Hillis?).
The Fall issue of The Nature Institute's In Context newsletter is
now available online. In it you'll find discussion of the nature of
qualities, a remarkable comparison between the skulls of wild and captive
lions, and a look at some aspects of the current research on complexity
-- research that is widely regarded as a scientific revolution.
http://natureinstitute.org/pub/ic/ic6.
Response to the fund appeal of a few weeks ago has been gratifying, and
those of you who have contributed should soon be receiving acknowledgment
and thanks, if you haven't already. I am happy to report that we have
just now, as I write, gone over the top in our matching-grant challenge.
Of course, there is always next year's budget! If you need a little year-
end adjustment to your financial portfolio in the form of a tax-exempt
contribution, see http://www.netfuture.org/support.html.
SLT
==========================================================================
QUOTES AND PROVOCATIONS
How Can We Hold the Balance?
----------------------------
In response to the idea of disconnecting from email (NF #121, 122, 124),
Martin Raish sends this along from Anne Morrow Lindbergh's Gift from the
Sea:
Instead of planting our solitude with our own dream blossoms, we choke
the space with continuous music, chatter and companionship to which we
do not even listen. It is simply there to fill the vacuum. When the
noise stops there is no inner music to take its place.
We must relearn to be alone. It is a difficult lesson to learn today --
to leave one's friends and family and deliberately practice the
art of solitude for an hour, or a day, or a week. For me, the break is
the most difficult. Parting is inevitably painful, even for a short
time. It is like an amputation, I feel. A limb is being torn off,
without which I shall be unable to function.
If one sets aside time for a business appointment, a trip to the hair-
dresser, a social engagement, or a shopping expedition, that time is
accepted as inviolable. But if one says: I cannot come because that is
my hour to be alone, one is considered rude, egotistical or strange.
What a commentary on our civilization, when being alone is considered
suspect; when one has to apologize for it, make excuses, hide the fact
that one practices it -- like a secret vice. I find there is a
quality to being alone that is incredibly precious. Life rushes back
into the void richer, more vivid and fuller than before.
And Philippe Lewis, remarking on my comments about keeping a balance in
the face of email and other pressures, writes:
I wholeheartedly agree. I'd love to hear more of what other readers
are doing to improve their own lives with regard to technology. I for
one have taken to slowing down my walking pace and remembering to
breathe.
I'd be happy to hear from others about what you are doing to "hold the
balance". If the situation merits, I'll summarize for the rest of the
readership.
Wrangling over Odysseus: An Exchange with Kevin Kelly
-----------------------------------------------------
In "The Deceiving Virtues of Technology" (NF #125) I offered this
paragraph:
We hear much talk about transformation -- about the coming Great
Singularity, the Omega Point, the emergence of a new global
consciousness. But, to judge from this talk, we need only wire things
up and the transformation will occur automatically. Complexity
theorist Ralph Abraham says that "when you increase the connectivity,
new intelligence emerges". Our hope, he adds, is for "a global
increase in the collective intelligence of the human species .... a
quantum leap in consciousness". And computer designer Danny Hillis
tells us that "now evolution takes place in microseconds .... We're
taking off .... There's something coming after us, and I imagine it is
something wonderful". (Quotations drawn from "Flocking Together
through the Web" by Joel Garreau, Washington Post, May 9, 2001)
I then proceeded to disparage this kind of view as "Evolution for Dummies"
or "Plug-and-Play Evolution just add connections and presto!
a quantum leap in consciousness!"
On the occasion of delivering that talk, I was called to account by one
audience member who was offended by my "absurdly simplistic summary of
Abraham's work" (or something like that). Then, after the talk appeared
in NetFuture, Kevin Kelly, the founding editor of Wired magazine,
complained about my treatment of Hillis. This led to an exchange of
views, reproduced here.
---------------------
Steve,
I enjoyed your latest essay. One thing unsettled me. I happen to know
Danny Hillis fairly well. I know you are using him as a symbol; but that
is what unsettles me. He is not a symbol. He is a person. His
intelligence, his spirit, his soul are not captured and not represented by
the quote you use. If out of the blue you had asked me last week who is
the most spiritual technologist I know, I would have said Danny Hillis.
He grew up in India, the child of missionary parents. He is constantly thinking
about the other dimensions of technology. He has a spiritual life, too.
He is at least as rounded as you are. And there is no doubt in my mind
that if I had to choose between Odysseus and Danny as a model, I'd choose
Danny. He is far more inventive, far more original, far more witty, and
far more whole than Odysseus was.
I only mention this because this example is not an isolated case. If I
have any complaint about your essays in general it is that you tend to
look down upon techno-promoters as inferior people: in this view they are
broken, misguided people who thus generate broken, misguided ideas.
In my experience, however, they are some of the most rounded, deepest, and
most human people that I know. I think this resident bias harms your
arguments. And as I mentioned before, I think the stance you have taken
with technology and its adherents -- to view it as an enemy -- in
the long run cannot lead to full understanding of it. I think only
love leads to full understanding.
--kk
---------------------
Kevin --
I'm puzzled. I have never met Danny Hillis, I know absolutely nothing
about his character, and I would certainly not presume to speak about his
spiritual depth or roundedness as an individual. I was responding to a
statement he offered to the press. If this statement was in any way
misquoted, or is at all misrepresentative of his thought, then I would
like to know about it, and would be happy to modify my treatment of the
issues accordingly. There is no shortage of statements like the one
attributed to Hillis for me to draw on; they are "in the air" these days.
The fact is that a lot of people (who tend to be darlings of the press)
get a lot of mileage by issuing colorful statements of this sort. In
doing so, they are reaping the benefit of, and further reinforcing, a view
that is becoming deeply influential. One should be allowed to offer a
forceful response to this view without passing judgment on the moral
character of those who promote it.
It happens that the remarks by both Abraham and Hillis perfectly
illustrated the main point of my talk, which was that the intelligence and
techne of the self-aware deviser has been giving way before the
ubiquitous, automatic, frozen intelligence of our devices. As I put it in
my talk (which compared Odysseus, man of many devices, with Silicon
Valley's man of many gadgets):
The techne that devises is being co-opted by its own devices.
Odysseus was on his way to being a true contriver; we seem content to
be mere contrivances.
(Incidentally, while this comparison between Odysseus and wired moderns
was central to my argument, I in no way made Odysseus a general model for
today's human being. My treatment dealt with sweeping, irreversible
historical change, and acknowledged that Odysseus was in some ways quite
primitive.)
When Hillis talks about evolution taking place in microseconds, he is
clearly thinking of machines, in which he vests great hope for the
"something wonderful" that is coming after us. As for Abraham's ramifying
connections that bring a quantum leap in intelligence: am I mistaken in
thinking that, foremost in his mind, too, is the connectivity
afforded by the machinery of the digital age?
I am well aware that those quoted remarks arise from a complex body of
work -- partly, in fact (in Abraham's case, and I suspect also
Hillis'), from the body of work known as "complexity theory". Having
looked a good deal into this work, I remain thoroughly unconvinced that
simply multiplying the number of connections and the intricacy of
networks, or appealing to feedback mechanisms and the mutual interactions
of complex adaptive agents, or invoking semi-mystical notions of emergence
and self-organization at the edge of chaos, in any way changes the
mechanical and automatic nature of the intelligent systems so many people
are praying in aid of our future.
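[Editor's aside: to make "mechanical and automatic" concrete, here is a
deliberately trivial sketch in Python -- an illustration of my own, drawn
from none of the researchers mentioned -- of the kind of system routinely
celebrated for its emergent, self-organizing complexity: a one-dimensional
cellular automaton. Intricate and unpredictable-looking patterns do appear,
yet every step of their production is nothing but a table lookup.

    RULE = 110                           # update rule, encoded as eight bits
    TABLE = {(a, b, c): (RULE >> (a*4 + b*2 + c)) & 1
             for a in (0, 1) for b in (0, 1) for c in (0, 1)}

    def step(cells):
        # Each cell's new state depends only on itself and its two
        # neighbors, via a fixed table lookup (the edges wrap around).
        n = len(cells)
        return [TABLE[(cells[i-1], cells[i], cells[(i+1) % n])]
                for i in range(n)]

    cells = [0]*60 + [1]                 # a single "on" cell at the right edge
    for _ in range(30):                  # print thirty generations
        print("".join(".#"[c] for c in cells))
        cells = step(cells)

However striking the output, no step of the procedure ever stops being
automatic.]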
Of course, many will counter my remarks by noting how numerous
technologies, from pencil and paper to procedures for indexing and
accounting, from alphabets and number systems to printing press and
calculator, have enabled us to "leverage our intelligence". But it would
be truer to say "leverage certain aspects of our intelligence -- with
a vengeance". As I said in the talk, all this leveraging sets up a kind
of positive feedback loop that continually accentuates the prevailing one-
sidedness:
We have invested only certain automatic, mechanical, and computational
aspects of our intelligence in the equipment of the digital age, and it
is these aspects of ourselves that are in turn reinforced by the
external apparatus.
My own sense (partly expressed in "The Great Knowledge Implosion" in NF
#84) is that the profound shift from reliance on the sources of living
intelligence in ourselves to the "frozen" intelligence in our machines has
resulted in an unprecedented contraction of knowledge. The contraction is
distantly hinted at in the old phrase, "we're learning more and more about
less and less", and in the concern about deeper sorts of understanding
being shivered into informational bits, with no one able to put Humpty
Dumpty back together again.
Well, I doubt whether these few words will help much. In general, I agree
that brief dismissals of anyone's views don't often serve much of a
purpose -- and they tend to get people unnecessarily riled up. I
confess that I used the quotations primarily because they suited my
rhetorical purpose so well. But I also do think the pronouncements at
issue, which have colonized substantial portions of the intellectual
landscape, urgently need some good, healthy ridicule, if only to
counter the aura of esoteric authority and the Olympian heights from which
they are so often intoned.
I do not, however, venture my criticism casually. For what it's worth, I
point below to a recently published article of mine about complexity
theory. It's just the first half of a two-part essay, which in turn is
merely a sketchy extract from a monograph-length treatment due in several
months. Moreover, this first article does not contain my critical
assessment of the work on complexity. Rather, it attempts a brief summary
of some (not all) of the themes and methodological principles of the work.
Writing this summary seemed to me an essential public service, given how
little the public understands the often dramatic claims coming from the
Santa Fe Institute and elsewhere. These claims have a lot in common with
what we heard from Abraham and Hillis, and I hope my article will suggest
to you that I was not altogether unaware of the celebrated lineage of the
remarks I disparaged.
Finally, my references to technology as an "enemy" are nearly always
coupled with acknowledgment of it as an exalted gift -- and my
repeated point has been that we receive the gift precisely by accepting
the enmity for what it is. The paradox is intentional; it's the only way
to keep from killing the living truth of the matter between the either/or
pincers of a dead and mechanical logic -- a logic that asks us to be
"for" or "against" technology. Once we accept these terms, then whichever
way we decide, the machine wins, since it has taken over our thinking.
Steve
---------------------
Steve,
I have no problem with anyone criticizing the views of others, even views
represented by what that person said as captured by a simple and short
quote. But it wasn't just Danny's view; it was his whole mind you were
after:
When we try to create an artificial mind, are we trying to program an
Odysseus or a Danny Hillis? It makes a difference! -- and, if I
may say so, it is vastly easier to capture aspects of Hillis'
intelligence in a computer than it would be to capture much of
Odysseus' intelligence. We have, after all, spent the last several
hundred years learning to think computationally, to formulate and obey
rules, to crystallize our thoughts into evident structures of logic.
It was on this path that we felt compelled to develop computers in the
first place, and it is hardly surprising that these computers turn out
to be well designed for representing the kind of Hillisian thinking
embodied in their design.
In the above passage you suggest that it is vastly easier to capture
aspects of Hillis' intelligence in a computer than it would be to capture
Odysseus'. You've just admitted that you have never met Hillis and "know
nothing about his character." I know Hillis and I think you simply picked
the wrong guy. "Hillisian thinking" is not more machine-like than
Odysseusian thinking, unless you were not thinking of Danny Hillis but
were using Hillis as a symbol of something, some kind of brainy,
humorless, computational type of intelligence -- which is not what
Danny Hillis' mind is like. And if Hillis is merely a metaphor, an emblem
of that kind of withered human mind that complexity and computer experts
are supposed to have, then I ask that you get to know these folks better,
because that assertion does not match up with my experience.
You ended by saying:
Finally, my references to technology as an "enemy" are nearly always
coupled with acknowledgment of it as an exalted gift -- and my
repeated point has been that we receive the gift precisely by accepting
the enmity for what it is.
The reason I find this correspondence worthwhile is that among all the
critics of technology, Steve, I find that you have the most patience and
the most honesty in investigating it. My mild chastisements to unleash
your respect for technology and its adherents come from the fact that you
are so close already. Most critics so loathe technology, or so embrace
it, that they can't say anything truthful about it. I read and respond to
your wonderful essays precisely because you are trying to confront both
the gift and the curse. Because you care about what you say, I care about
what you say. You should check the record of what you say. Most of the
time in your essays you are pointing out the curse and the price of the
gift, rather than the joy and the liberation of the curse. And very
occasionally, perhaps unconsciously, you'll confuse the sin with the
sinner, the curse with the accursed, the price of technology with those
who love it. I am nudging you toward greater consideration of the
benefits of our marriage to technology. You may think these benefits are
well known, almost cliche, but I think they are still unexplored --
and make better sense in the wider context which you bring.
A fan, as always,
--kk
---------------------
Kevin --
Many thanks for your response, very effectively handled as always (and
characteristically gracious as well).
With the Odysseus/Hillis bit I was arguing that AI has ignored the
evolution of the mind. It was the modern mind I was contrasting with an
earlier form of the mind, not Hillis with other moderns. He was chosen
mostly because he had already been mentioned, but also because he excels
as a computer designer, and therefore excels in those aspects of human
mentality required for designing computers (which led to my point about
the computer naturally being well adapted for representing the kind of
thinking designed into it). But it is our entire culture that
"spent the last several hundred years learning to think computationally,
to formulate and obey rules, to crystallize our thoughts into evident
structures of logic". I happen to think that learning these things --
learning them as well as Hillis, if we are as fortunate as he -- is an
essential requirement for being a modern human being. Of course, my whole
point is that we need, as a culture, to work toward something else as
well, so as to escape the one-sidedness of this current evolutionary
thrust. It is, after all, a disaster if the mind ever ceases to transcend
itself.
I would be chagrined if anyone took me to be representing Hillis as
someone whose mind could more easily be programmed because it is inferior
to the minds of us more enlightened (not to mention, unbearably arrogant)
types. That would be to drop the whole historical point of the passage.
By the way, the rest of your response provides an excellent basis for us
to pursue the exchange at a really fundamental level, if we should find
the occasion -- and to do so without the distraction that can swirl
around dismissive pot shots fired off in the direction of particular
individuals. (Whoops! I guess I just now implicitly acknowledged how
easily such pot shots become counter-productive. So here occurs a brief
-- brief, I said! -- bow in your direction.)
Steve
---------------------
Steve,
As you put it here, I have no objection. In fact I agree with it totally,
except for the first sentence of your second paragraph. I don't think AI
research has ignored the greater dimensions of the mind, in the usual
sense that it overlooked them. Rather, AI to date has simply been so
primitive that it has been unable to represent or even seek to represent
much beyond cognitive computation. Calculation, like evolution, turns out
to be surprisingly easy to mechanize, since both are almost mathematical
processes. Who would have guessed that complex mathematics could be put
into software? Yet that is what the amazing program Mathematica is.
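[Editor's aside: for illustration only, here is a toy sketch in Python of
mathematics mechanized as rule-following -- symbolic differentiation
reduced to a handful of textbook rules applied blindly. It has nothing to
do with how Mathematica actually works.

    # Expressions are nested tuples such as ('+', a, b) or ('*', a, b).
    def d(expr, var):
        if expr == var:                  # d(x)/dx = 1
            return 1
        if not isinstance(expr, tuple):  # constants and other symbols
            return 0
        op, a, b = expr
        if op == '+':                    # sum rule
            return ('+', d(a, var), d(b, var))
        if op == '*':                    # product rule
            return ('+', ('*', d(a, var), b), ('*', a, d(b, var)))
        raise ValueError('unknown operator: ' + op)

    # d/dx of (x*x + 3*x)
    print(d(('+', ('*', 'x', 'x'), ('*', 3, 'x')), 'x'))

Every "insight" it produces is a recipe followed to the letter -- which
is, of course, the point at issue in this exchange.]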
Since Danny Hillis has been dragged into this discussion I should mention
his own theory. While many people think that emotional states are aspects
of humanity that might only be captured by silicon long after calculation
has been replicated, his belief is that emotions may turn out to be
necessary and fundamental to higher thinking itself, and in fact easier
to create in machines than sophisticated "thinking." He would
be the first to point out, and I would agree, that emotion and cognition
are in no way the sum of what makes human thought so interesting. We
first tackle these two varieties in part because the new concept of
computation is how we now conceive of ourselves as thinking, and in part
because these functions are easier to replicate.
In the spirit of giving, I offer this recycled bit of news I wrote a few
years ago on the invasion of the computational metaphor, which may fit
into your larger point.
The Computational Metaphor
The least noticed trends are usually the most subversive ones. First on
my list for an undetected upheaval is our collective journey toward the
belief that the universe is a computer.
Already the following views are widespread: thinking is a type of
computation, DNA is software, evolution is an algorithmic process. If we
keep going we will quietly arrive at the notion that all materials and all
processes are actually forms of computation. Our final destination is a
view that the atoms of the universe are fundamentally intangible bits. As
the legendary physicist John Wheeler sums up the idea: "Its are bits."
I first became aware of this increasingly common (but not yet
articulated) trend at the first Artificial Life Conference in 1987, where
biological reproduction and evolution were described by researchers in
wholly computer science terms. The surprise wasn't that such organic
things could be given mathematical notations, because scientists have been
doing that for hundreds of years. The surprise was that biological things
could be simulated by computers so well. Well enough that such
simulations displayed unexpected biological properties themselves. From
this work sprang such fashionable patterns as cellular automata, fractals,
and genetic algorithms.
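[Editor's aside: for readers who have never seen one, a genetic algorithm
can be startlingly small. The sketch below, in Python, is purely
illustrative -- it "evolves" random bit strings toward an arbitrary target
by nothing more than selection, crossover, and mutation.

    import random

    TARGET = [1] * 20                            # the "environment" to fit

    def fitness(genome):                         # how well a genome matches
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome, rate=0.05):               # random copying errors
        return [1 - g if random.random() < rate else g for g in genome]

    def crossover(a, b):                         # recombine two parents
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(30)]
    for generation in range(60):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                # the fittest reproduce
        population = [mutate(crossover(random.choice(parents),
                                       random.choice(parents)))
                      for _ in range(30)]
    print("best fitness:", max(fitness(g) for g in population))

Whether so mechanical a loop deserves the name "evolution" is, of course,
part of what is in dispute in this exchange.]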
The next step in this trend was to jettison the computer matrix and re-
imagine biological processes simply in terms of computer logic. But to do
this, first computation had to be stripped from computers as well.
Starting with the pioneering work of Von Neumann and Turing, a number of
mathematicians concluded that the essential process of computing was so
elementary and powerful that it could be understood to happen in all kinds
of systems. Or, in other words, the notion of computation was broadened
so widely that almost any process or thing could be described in
computational terms. Including galaxies, molecules, mathematics,
emotions, rain forests and genes.
Is this embrace just a trick of language? Yes, but that is the
unseen revolution. We are compiling a vocabulary and a syntax that is
able to describe in a single language all kinds of phenomena that have
escaped a common language until now. It is a new universal metaphor. It
has more juice in it than previous metaphors: Freud's dream state,
Darwin's variety, Marx's progress, or the Age of Aquarius. And it has
more power than anything else in science at the moment. In fact the
computational metaphor may eclipse mathematics as a form of universal
notation.
This quickening of the new metaphor was made crystal clear recently in the
work of mathematicians and physicists who have been dreaming up the next
great thing after silicon chips: quantum computers. Quantum computers
lie at the convergence of two "impossible" fields, the world of the
impossibly small (quantum states), and the world of the impossibly ghostly
(bits). Things get strange here very fast, but one thing is strangest of
all. In the effort to create mathematical theories of how matter works at
levels way below sub-atomic particles, and in the effort to actually build
computers that operate in this realm, some scientists have found that
using the language of bits best explains the behavior of matter. Their
conclusion: Its are bits. Young Einsteins such as
mathematician/theoretical physicist David Deutsch are now in the very
beginnings of a long process of re-describing all of physics in terms of
computer theory. Should they succeed, we would see the material universe
and all that it holds as a form of computation.
There will be many people who will resist this idea fiercely, for many
good reasons. They will point out that the universe isn't really a
computer, only that it may act as if it were one. But once the metaphor of
computation infiltrates physics and biology deeply, there is no difference
between those two statements. It's the metaphor that wins.
And as far as I can tell the computational metaphor is already halfway to
winning.
--kk
---------------------
Kevin --
I think you're right -- disastrously right -- about the metaphor
winning.
Well, if anything, we've got too much on the table now for a
focused conversation. We'll have to see what we can do to pick up, or
weave, a coherent, manageable thread. Meanwhile, thanks for contributing.
Steve
---------------------
For a continuation of this dialog, see:
http://www.netfuture.org/2002/Apr0202_130.html
Related articles:
** "The Lure of Complexity" in In Context #6 (newsletter of The
Nature Institute):
http://natureinstitute.org/pub/ic/ic6/complexity.htm
** "The Great Knowledge Implosion" in NF #84:
http://www.netfuture.org/1999/Feb0999_84.html#4
** "The Central Metaphor of Everything?" by Jaron Lanier (recommended by
Kevin Kelly):
http://www.edge.org/documents/day/day_lanier.html
SLT
==========================================================================
BOOKS RECEIVED
Over the past year or two I have been sent many books, an impressive
number of them written by NetFuture readers. In this issue I inaugurate a
"Books Received" section, in which I will try to start working through the
backlog. The notices here will generally be brief and will not be reviews
in any full sense. SLT
How to Teach in a Post-Modem World
----------------------------------
Notes concerning Breaking Down the Digital Walls: Learning to
Teach in a Post-Modem World, by R. W. Burniske and Lowell Monke
(Albany, N.Y.: State University of New York Press, 2001).
http://www.sunypress.edu/breaking.html.
Much about educational policy today seems aimed at deadening the heart of
the classroom exchange -- the living engagement between student and
teacher. Everything that makes this engagement vital -- the
unpredictability about where it will lead, the pursuit of an interest felt
in this very moment, the vitality of a genuine investigation (rather than
memory-dumping, whether by teacher or student), the teacher's openness to
being personally re-shaped by the classroom encounter -- in sum,
everything that goes into a true meeting of self and other or self and
world -- is all too easily lost in the attempt to transmit a known and
predeclared body of information. The problem arises, not only from
standardized testing, but from the general subordination of the teacher to
a bureaucracy-imposed curriculum. When teachers cease to be learners in
their own jobs, how can students discover what it means to become
learners?
What is so wonderful about Breaking Down the Digital Walls is that the
book itself embodies and demonstrates a learning exchange worthy of any
classroom. Burniske and Monke have constructed the book from the extended
dialog they carried out as they struggled (part of the time from opposite
sides of the globe) with the challenges, excitements, and discouragements
of their pioneering computer technology classes for high schoolers. The
result is no final declaration about the online learning projects that
figure heavily in the book. Rather, it is a mind-opening exploration of
the conundrums you will never hear articulated in the standard sales
pitch for wired classrooms.
There is just enough tension between the more optimistic Burniske and the
more pessimistic Monke to keep things interesting, while not so much
difference that the dialog loses its driving coherence. Given that the
book gains its force from the authors' sustained conversation, standalone
excerpts tend to lose a lot of their significance. Nevertheless, here is
one of countless passages that I am sure many NetFuture readers will
appreciate (Monke is speaking):
Ever since the drive for efficiency gravitated from the new assembly-
line factories into the schools at the beginning of the twentieth
century, the power of classroom teachers to shape the structure of
their students' education has gradually diminished almost to nil.
Michael Apple, among others, has pointed out that this trend continues
with reliance on prepackaged computer software programs, which "can
cause a decided loss of important skills and dispositions on the part
of teachers. When the skills of local curriculum planning, individual
evaluation, and so on are not used, they atrophy." He argues that the
use of predesigned computer programs contributes to this long-running
trend of deskilling and depowering teachers....
I find considerable personal irony in this critique. One of the
factors that attracted me to computers in the early 1980s was the
inability of anyone to tell me what to do with them. No one was able
to fit them into the curricular strait jacket. Even during the period
when schools were setting up separate computer literacy labs no one was
successfully prescribing the instruction that went with them. Those of
us who jumped into teaching with and about computers early on were able
to pretty much call our own shots. We ran our programs in the gaps
between the well-defined planks of the curriculum. Many of us were
ceded a large degree of autonomy by administrators who had no idea what
these machines might be good for (which is not to say that we did).
Perhaps this is why there is such a revolutionary atmosphere at
educational computing conferences. Many computer teachers have had the
same experience I have had, and their eyes have been opened to the need
for real change in the way we educate our children. Teaching in the
gaps demanded that I rethink almost everything I knew about education.
This was confusing but also liberating, for I began to understand how
restricted the role of the teacher had become in the traditional
classroom setting. I began to recognize the very thing Apple bemoans:
that most teachers have little authority to really shape learning
according to the individual and group needs of their students and
themselves.
Unfortunately, it seems that many computer teachers have concluded that
just because the computer allowed them to slip into their own gaps in
the curriculum, spreading the use of them all over the schools will
somehow result in fostering the needed revolution. They don't
recognize that what liberated them was not really the computer itself
but its newness to education, a feature that always causes problems for
rigid bureaucracies -- for a while. As I watch the way computers
are being deployed in schools today I see that the bureaucrats are
beginning to catch up to the computer .... The trend toward networking
classrooms and buying server-resident software so that each class has
access to the same district-selected material is another means of
standardizing instruction and further depowering the classroom teacher
(though often the teacher, in getting retrained as a technician, does
experience some rejuvenating sense of power, albeit over a machine, not
the curriculum). (pp. 61-62; references omitted)
Monke, as many of you know, has been an occasional columnist for
NetFuture. His "The Web and the Plow", now a chapter in Breaking Down
the Digital Walls, first appeared in NF #19. You will find it here:
http://www.netfuture.org/1996/May1696_19.html#4.
SLT
==========================================================================
CORRESPONDENCE
Pen and Paper
-------------
From: Richard Smith (smith@sfu.ca)
Dear Steve,
A few years ago I started a seminar class with my students with the
assignment that they were to write to me, on paper, with a pen, about
something real and significant that had happened to them directly (no
mediated experiences). They were also to choose the pen and paper
carefully. The result was wonderful and it set the tone for a great
seminar.
You do exactly the same thing for me every time you write one of your
newsletters -- make me take a step back from what I am doing, and
think about it all more carefully. Thank you.
Richard
The Illusory Self Is Indeed Nobody
----------------------------------
Response to: "The Deceiving Virtues of Technology" (NF #125)
From: Malcolm Dean (malcolmdean@earthlink.net)
Buddha showed logically that self is an illusion. There never was any
kind of split between Man and Nature because, Nature being all, there is
only a shifting relationship of aggregate forms. There is Nobody. What
is being challenged, increasingly and fundamentally, is the cultural meme
of a unique and separate self, a supposition which, if it can be supported
at all today, will certainly vanish in an intelligent,
interconnected noosphere of animal, vegetable and mineral devices. This
is the same meme at the center of Egyptian resurrectionism, which was
borrowed by subsequent religions, and in its extreme form is used as a
justification for murder and jihad. The cultural shockwaves now
travelling around the world are merely announcing the tsunami of its
demise.
Malcolm Dean
Writer, Editor, Journalist
1015 Gayley Av #1229, Los Angeles CA 90024-3424
Misconceptions About Knowledge Management
-----------------------------------------
Response to: "The Deceiving Virtues of Technology" (NF #125)
From: Michael Knowles (mike@mwknowles.com)
Hi, Steve,
An excellent, excellent article. I have been discussing the concept of
knowledge management at some length with colleagues and coworkers, and
particularly the problem of why knowledge management systems do not work.
The engineers keep looking for more powerful heuristic algorithms, faster
engines, deeper databases, and end up with nothing useful. They have yet
to understand that this thing they call knowledge management has nothing
to do with data per se.
Knowledge management is about the recording of human endeavor, creativity,
and choice, not about accumulated data. You are so right about the
lethargy of human thought in the high tech world today; there's an
expectation that some critical mass will be reached, after which point the
sum of all consciousness will be accessible to anyone with a hand-held
device or computer.
I feel sad when I hear people speak that way. Knowledge management is
nothing more than the recorded history of humans in an organization
rendered accessible to the inquiring mind. It is or, at least,
should be a record of human endeavor and interaction that can be
perused by people willing to observe relationships and draw their own
conclusions, and then interact with other people and draw larger, deeper
conclusions that machines can never attain. I know that it is trite to
say that we seem to be in a headlong rush into numbness, but there is a
fear that drives many people in the high tech world today. Fear and a
blind acceptance of clever words devised by minds of shallow thought.
I look forward to reading your columns, if only to know that there is
someone else out there who is not taken in.
Thanks,
Michael Knowles
Writer and Publisher
http://www.mwknowles.com/
==========================================================================
ANNOUNCEMENTS AND RESOURCES
CPSR Conference: Shaping the Network Society
--------------------------------------------
Computer Professionals for Social Responsibility (CPSR) and the National
Communication Task Force on the Digital Divide are sponsoring a May 16 -
19 conference called "Shaping the Network Society: Patterns for
Participation, Action and Change". It is the eighth biennial Directions
and Implications of Advanced Computing (DIAC) symposium and will be held
in Seattle.
An interesting feature of the conference is its focus on patterns:
To promote bridge-building, we are soliciting "patterns" instead of
abstracts, which will be developed into full papers for this symposium.
A "pattern" is a careful description of a solution or suggestion for
remedying an identified problem in a given context that can be used to
help develop and harness communication and information technology in
ways that affirm human values.
For more information, including how to structure a "pattern", go to
http://www.cpsr.org/conferences/diac02/.
==========================================================================
ABOUT THIS NEWSLETTER
Copyright 2001 by The Nature Institute. You may redistribute this
newsletter for noncommercial purposes. You may also redistribute
individual articles in their entirety, provided the NetFuture url and this
paragraph are attached.
NetFuture is supported by freely given reader contributions, and could not
survive without them. For details and special offers, see
http://netfuture.org/support.html .
Current and past issues of NetFuture are available on the Web:
http://netfuture.org/
To subscribe or unsubscribe to NetFuture:
http://netfuture.org/subscribe.html.
Steve Talbott :: NetFuture #126 :: December 18, 2001