Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence

From de_evolutionary_art_org


Clark, A. (2003). Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence. New York: Oxford University Press. ISBN-13: 978-0195177510.



Extended Abstract



Table of contents

Introduction

CHAPTER 1 Cyborgs Unplugged

CHAPTER 2 Technologies to Bond With

CHAPTER 3 Plastic Brains, Hybrid Minds

CHAPTER 4 Where Are We?

CHAPTER 5 What Are We?

CHAPTER 6 Global Swarming

CHAPTER 7 Bad Borgs?

CHAPTER 8 Conclusions: Post-Human, Moi?

Notes

Index

References Used (= Notes)


1. I first encountered this example in D. Rumelhart, P. Smolensky, J. McClelland, and G. Hinton, “Schemata and Sequential Thought Processes in PDP Models,” in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 2 (Cambridge, Mass.: MIT Press, 1986), 7–57.

2. For a brief sampling, see the essays in The Adapted Mind, ed. J. Barkow, L. Cosmides, and J. Tooby (New York: Oxford University Press, 1992).

3. See W. J. Holstein “Moving Beyond the PC,” US News and World Report 127:23 (December 13, 1999): 49–58. Jacob Mey (personal communication) tells me that the word “kanny” is an abbreviated version of a word used, when talking to small children, to mean (in a sweet way) “palm of the hand.” This is combined with an ending that signifies nounhood and conveys the status of an artifact. The best translation he can suggest is “palmie.”

Chapter 1

1. M. Clynes and N. Kline, “Cyborgs and Space,” Astronautics, September 1960; reprint, The Cyborg Handbook, ed. by C. Gray (London: Routledge, 1995), 29–34.

2. Ibid., 29.

3. See A. Turing, “On Computable Numbers, with an Application to the Entscheidungsproblem,” Proceedings of the London Mathematical Society, 2d ser., 42 (1936): 230–65; A. Turing, “Computing Machinery and Intelligence,” Mind 59 (1950): 433–60; J. Von Neumann, The Computer and the Brain (New Haven: Yale University Press, 1958); W. S. McCulloch and W. H. Pitts, “A Logical Calculus of the Ideas Immanent in Nervous Activity,” Bulletin of Mathematical Biophysics 5 (1943): 115–33; A. Newell, J. Shaw, and H. Simon, “Empirical Explorations with the Logic Theory Machine,” Proceedings of the Western Joint Computer Conference 15 (1957): 218–39.

4. See R. Ashby, Introduction to Cybernetics (New York: Wiley, 1956), and N. Wiener, Cybernetics, or Control and Communication in the Animal and in the Machine (New York: Wiley, 1948).

5. Manfred Clynes, “An Interview with Manfred Clynes,” interview by C. H. Gray, The Cyborg Handbook (London: Routledge, 1995), 43–55.

6. Clynes and Kline, “Cyborgs and Space.”

7. Donna Haraway, “Cyborgs and Symbionts,” The Cyborg Handbook, ed. C. Gray (London: Routledge, 1995), xv.

8. Simon LeVay, “Brain Invaders,” Scientific American 282:3 (March 2000): 27.

9. K. Warwick, “Cyborg 1.0,” Wired 8:2 (February 2000): 145.

10. The experiments and results are further detailed in Professor Kevin Warwick’s book I, Cyborg (London: Century Press, 2002).

11. Warwick, “Cyborg 1.0,” 146–47.

12. All these cases are reported by Kevin Warwick, Wired, 150. For fuller discussion, see chapter 5.

13. Ian Sample, “Push My Button,” reporting on work by Stuart Meloy, a surgeon at Piedmont Anesthesia and Pain Consultants in North Carolina. See New Scientist, February 10, 2001, 23.

14. See C. H. Gray ed., The Cyborg Handbook, “Pilot’s Associate,” 101–3, and “Science Fiction Becomes Military Fact,” 104–5.

15. For “adamantium skeleton,” see the Marvel Comics Character: Wolverine. For “skull guns” and “human brains directly jacked into Cyberspace,” see W. Gibson, Neuromancer (New York: Ace Books, 1984).

16. W. Ellis, Transmetropolitan, 3, Helix DC Comics (New York, 1997), 4. (Bar-coded dancer appears on page 4.)

17. For some nice discussions, see Kim Sterelny and Paul Griffiths, Sex and Death (Chicago: University of Chicago Press, 1999).

18. As Kevin Warwick once usefully remarked (personal conversation).

19. E. Hutchins, “How a Cockpit Remembers Its Speeds,” Cognitive Science 19 (1995): 265–88; E. Hutchins and T. Klausen, “Distributed Cognition in an Airline Cockpit,” in Cognition and Communication at Work, ed. Y. Engestrom and D. Middleton (Cambridge: Cambridge University Press, 1998); E. Hutchins, “Integrated Mode Management Interface,” Final Report to Contract NCC-2-591 NASA-Ames Research Center (Moffett Field, Calif., 1997).

20. L. Zuckerman, “Making Computers Relate to Their Human Partners,” New York Times, March 4, 2000. Available on the web at 03/04/news/arts/machines-humans.html.

21. Ibid.

22. Kevin Kelly, Out of Control (Reading, Mass.: Perseus Books, 1994), 331.

23. Clynes and Kline, “Cyborgs and Space.”

24. Nicola Jones, “Call from the Heart,” New Scientist 2277 (February 10, 2001): 20.

25. In Finland, a service is being pioneered (by Hewlett-Packard) that allows you to use your cell phone to monitor the actual (not the timetabled) approach of your bus to a designated bus stop, setting it up to alert you when it is time to leave your warm surroundings and venture into the cold Finnish night. For a variety of reasons, then, the cell phone is unusually well poised to act as a transition technology. In the wake of the September 11 tragedy, many Americans now view such items as necessities, highlighting its role in informing rescuers, exchanging vital last-minute information, and (sadly) conveying a final message of love from those about to die. See Olivia Baker, “Cellphones Hit Home,” USA Today, September 13, 2001, 12D.

26. Geoff Marsh, “Make a Connection Without the Clutter,” Sunday Express, October 15, 2000.

27. Mark Weiser, “The Computer for the 21st Century,” Scientific American, September 1991, 94–110.

28. See Don Norman, The Invisible Computer (Cambridge, Mass.: MIT Press, 1999).

29. See S. Quartz and T. Sejnowski, “The Neural Basis of Cognitive Development: A Constructivist Manifesto,” Behavioral and Brain Sciences 20 (1997): 537–96.

30. For a recent account, see D. Milner and M. Goodale, The Visual Brain in Action (Oxford: Oxford University Press, 1995). This work is further discussed in chapters 4 and 7, and in A. Clark “Visual Awareness and Visuomotor Action,” Journal of Consciousness Studies 6:11–12 (1999): 1–18.

31. Some names to conjure with (past and present) include Lev Vygotsky, Maurice Merleau-Ponty, Jerome Bruner, Bruno Latour, Daniel Dennett, Ed Hutchins, Don Norman, and (to a greater or lesser extent) all those currently working in the field of “situated and distributed cognition.”

Chapter 2

1. You can find a nice description on the web.

2. See Mark Weiser, “The Computer for the 21st Century,” Scientific American, September 1991, 94–110, and Donald Norman, The Invisible Computer (Cambridge, Mass.: MIT Press, 1999). Similar ideas are found in the philosophical works of Maurice Merleau-Ponty, such as The Phenomenology of Perception, trans. Colin Smith (1945; reprint, London: Routledge, 1962), and Martin Heidegger, Being and Time (1927; reprint, New York: Harper and Row, 1961). An early appearance of the idea of “transparent tools” in the context of human-computer interaction is found in Jacob Mey’s 1988 paper “CAIN and the Transparent Tool: Cognitive Science and the Human-Computer Interface,” presented at the 3rd Symposium on Human Interface in Osaka, Japan; in Journal of the Society of Instrument and Control Engineers 27:1 (1987): 247–52.

3. See Donald Norman, The Invisible Computer (Cambridge, Mass.: MIT Press, 1999).

4. There is some discussion of this case in Norman, but what follows is based upon David Landes’s wonderful tome Revolution in Time: Clocks and the Making of the Modern World (London: Viking Press, 2000).

5. Ibid., 92–93.

6. I came upon this example several years ago but cannot seem to recover the original source. To my best recollection, the example was then deployed in the service of a philosophical argument concerning the nature of implicit beliefs. If any reader can supply the full reference, I would be most grateful.

7. A tropism is a kind of automatic, hardwired response. I use the term, however, to mean a learned but now largely automatic response.

8. For a sustained philosophical defense of this position, see A. Clark and D. Chalmers, “The Extended Mind,” Analysis 58 (1998): 7–19.

9. This idea of “scaffolding” originates with the work of Soviet psychologist Lev Vygotsky, who stressed the way a child’s experience with external props (especially an adult’s helps and hints) could alter and inform the way the child solves a problem. Since then, the term “scaffolding” has come to mean any kind of external aid and support, whether provided by a notepad, a computer, or another human being. See L. Vygotsky, Thought and Language (Cambridge, Mass.: MIT Press translations, 1986).

10. Norman credits the term to Jeff Raskin and dates the coinage as 1978, in an internal Apple memo. See Norman, The Invisible Computer, 275, par.1.

11. This list is a distillation from Norman’s work. It is also influenced by conversations with Ed Hutchins at UCSD and Mike Scaife and Yvonne Rogers at the University of Sussex.

12. Norman, The Invisible Computer, 59.

13. Ibid., 67.

14. For this, see Bradley J. Rhodes, Nelson Minar, and Josh Weaver, “Wearable Computing Meets Ubiquitous Computing: Reaping the Best of Both Worlds” (Proceedings of the International Symposium on Wearable Computers, October 1999).

15. Bradley J. Rhodes, “The Wearable Remembrance Agent: A System for Augmented Memory,” Personal Technologies Journal Special Issue on Wearable Computing, Personal Technologies 1 (1997).

16. Ibid.

17. Known as a “Private Eye,” the unit is made by Reflection Technology and marketed (at last sighting) by Phoenix Group, N.Y.

18. See T. Starner, S. Mann, B. Rhodes, J. Levine, J. Healey, D. Kirsch, R. W. Picard, and A. P. Pentland, “Augmented Reality through Wearable Computing,” Presence, Special Issue on Augmented Reality (1997).

19. P. Scott, “Eye Spy,” Scientific American, September 2001, News Scan sec.

20. Rhodes, “The Wearable Remembrance Agent,” 219.

21. For much more on this issue, see chapter 7.

22. Paul Dourish, Where the Action Is: The Foundations of Embodied Interaction (Cambridge, Mass.: MIT Press, 2001), 141.

23. Ibid., 42–43.

24. Ibid., 42.

25. Thanks to Ron Chrisley for pointing this out.

26. J. Patten, H. Ishii, J. Hines, and G. Pangaro, “Sensetable: A Wireless Object Tracking Platform for Tangible User Interfaces” (Proceedings of the ACM CHI 2001).

27. B. Ullmer and H. Ishii, “The metaDESK: Models and Prototypes for Tangible User Interfaces” (Proceedings of the ACM UIST ’97 Symposium on User Interface Software and Technology, 1997), 223–32.

28. See especially David Small and Hiroshi Ishii, “Design of Spatially Aware Graspable Displays” (Short Paper, Proceedings of the ACM CHI ’97, Atlanta, March 22–27, 1997).

29. Thanks to Frank Biocca for helpful discussion on this topic.

30. For this information and the examples in the same paragraph, I am indebted to S. Feiner, “Augmented Reality: A New Way of Seeing,” Scientific American 286:4 (April 2002): 48–55.

31. The idea of using mixed reality play was introduced to me by Steve Benford and Tom Rodden in a plenary session at the Fourth International Conference of Cognitive Technology, Coventry, UK, 2001. There, they described a “traveling story tent,” using some of the technologies described in the text.

32. Yvonne Rogers, Mike Scaife, Eric Harris, Ted Phelps, Sara Price, Hilary Smith, Henk Muller, Cliff Randall, Andrew Moss, Ian Taylor, Danae Stanton, Claire O’Malley, Greta Corke, and Silvia Gabriella, “Things Aren’t What They Seem to Be: Innovation Through Technology Inspiration” (unpublished manuscript).

33. One highly funded European endeavor, the Equator Project, is entirely dedicated to the exploration of such interpenetration. The Equator Project presently spans eight academic institutions. A typical vision statement reads: “Instead of treating these [the digital and the physical] as two different worlds, our view is of these two media as being complementary halves of the same world. They are continually interwoven by the activity of people.” For more on this work, visit the Equator web site.

Chapter 3

1. V. S. Ramachandran and S. Blakeslee, Phantoms in the Brain: Probing the Mysteries of the Human Mind (New York: William Morrow, 1998), 58.

2. Ibid., 59.

3. Galvanic Skin Response (GSR) is basically a measure of arousal. Arousal (either positive or negative) is accompanied by increased blood flow, heart rate, and sweating. Experimenters measure changes in the electrical resistance of the skin, caused by the sweating, to yield an index of arousal.

4. Ramachandran and Blakeslee, Phantoms in the Brain, 52–54.

5. Ibid., 40–42. A genetic component is suggested by, for example, the presence of phantom arms and hands in a patient born without arms.

6. Ibid., 62.

7. Y. Iwamura, “Hierarchical Somatosensory Processing,” Current Opinion in Neurobiology 8 (1998): 522–28.

8. A. Yarbus, Eye Movements and Vision (New York: Plenum Press, 1967).

9. Feed “amazing card trick” to a search engine such as Google.

10. See also Daniel Dennett’s “many Marilyns” example, as described in D. Dennett, Consciousness Explained (Boston: Little, Brown, 1991).

11. G. W. McConkie and K. Rayner, “Asymmetry of the Perceptual Span in Reading,” Bulletin of the Psychonomic Society 8 (1976): 365–68.

12. J. K. O’Regan, “Solving the ‘Real’ Mysteries of Visual Perception: The World as an Outside Memory,” Canadian Journal of Psychology 46 (1992): 461–88.

13. D. J. Simons and D. T. Levin, “Change Blindness,” Trends in Cognitive Science 1 (1997): 261–67.

14. Ibid., 266.

15. Best just to search the web for “change blindness,” “flicker paradigm,” etc.

16. R. Brooks, “Intelligence Without Representation,” Artificial Intelligence 47 (1991): 139–59.

17. A. Clark, Being There: Putting Brain, Body and World Together Again (Cambridge, Mass.: MIT Press, 1997); D. Dennett, Consciousness Explained (Boston: Little, Brown, 1991); D. Ballard, “Animate Vision,” Artificial Intelligence 48 (1991): 57–86; P. S. Churchland, V. S. Ramachandran, and T. Sejnowski, “A Critique of Pure Vision,” in Large-Scale Neuronal Theories of the Brain, ed. C. Koch and J. Davis (Cambridge, Mass.: MIT Press, 1994).

18. With the important exception of the work of Alva Noe and J. Kevin O’Regan, “A Sensorimotor Account of Vision and Visual Consciousness,” Behavioral and Brain Sciences 24 (2001): 5.

19. For a fairly conservative view see S. Pinker, Words and Rules (New York: Basic Books, 1999). For radical views more in line with the one I develop here, see D. Dennett, Consciousness Explained (Boston: Little, Brown, 1991); R. Jackendoff, “How Language Helps Us Think,” Pragmatics and Cognition 4:1 (1996): 1–34; L. S. Vygotsky, Thought and Language (Cambridge, Mass.: MIT Press, 1986). I explore these topics more fully in A. Clark, “Magic Words: How Language Augments Human Computation,” in Thought and Language, ed. S. Boucher and P. Carruthers (Cambridge: Cambridge University Press, 1998), and A. Clark, Being There.

20. In a recent treatment Merlin Donald suggests that prior to the ability to acquire complex language and culture comes an ability to learn complex sequences of connected subskills (something like the automated subroutines of a computer program). This ability, he argues, is rooted in a change in the nature and functional role of consciousness itself. See Merlin Donald, A Mind So Rare (London: Routledge, 2001).

21. R. K. R. Thompson, D. L. Oden, and S. T. Boysen, “Language-Naive Chimpanzees (Pan troglodytes) Judge Relations Between Relations in a Conceptual Matching-to-Sample Task,” Journal of Experimental Psychology: Animal Behavior Processes 23 (1997): 31–43.

22. The power of labeling is celebrated in D. Dennett, “Learning and Labeling: Commentary on A. Clark and A. Karmiloff-Smith,” Mind & Language 8 (1994): 540–548.

23. Oliver Sacks, Seeing Voices (New York: Harper Perennial, 1989).

24. S. Dehaene, E. Spelke, P. Pinel, R. Stanescu, and S. Tsivkin, “Sources of Mathematical Thinking: Behavioral and Brain Imaging Evidence,” Science 284 (1999): 970–74. See also Dehaene’s superb book, The Number Sense (Oxford: Oxford University Press, 1997).

25. Dehaene, The Number Sense, 103.

26. Some newer kinds of computational models, so-called connectionist systems or Artificial Neural Networks, display a similar profile of strengths and weaknesses, making them a much better bet as machine models of some of the biological aspects of intelligence. These models have been the focus of much of my own past research. See A. Clark, Microcognition: Philosophy, Cognitive Science and Parallel Distributed Processing (Cambridge, Mass.: MIT Press, 1989). For good original sources, see Parallel Distributed Processing: Explorations in the Microstructure of Cognition, ed. J. McClelland, D. Rumelhart, and the PDP Research Group (Cambridge, Mass.: MIT Press/Bradford Books, 1986), vols. I & II. Most older A.I. (Artificial Intelligence) models, by contrast, seem much more like models of the larger systems (of humans-plus-technological props and aids) on which they were originally based. For a nice account, see E. Hutchins, Cognition in the Wild (Cambridge, Mass.: MIT Press, 1995).

27. C. Van Leeuwen, I. Verstijnen, and P. Hekkert, “Common Unconscious Dynamics Underlie Common Conscious Effects: A Case Study in the Interactive Nature of Perception and Creation,” in Modelling Consciousness Across the Disciplines, ed. J. S. Jordan (Lanham, Md.: University Press of America, 1999), 179–218.

28. D. Chambers and D. Reisberg, “Can Mental Images Be Ambiguous?” Journal of Experimental Psychology: Human Perception and Performance 2:3 (1985): 317–28.

29. Van Leeuwen et al. (See note 27.)

30. Hutchins, Cognition in the Wild, 155.

31. For more on all this, see Clark, “Magic Words” (refer to note 19); A. Clark and C. Thornton, “Trading Spaces: Connectionism and the Limits of Uninformed Learning,” Behavioral and Brain Sciences 20:1 (1997): 57–67; D. Dennett, Kinds of Minds (New York: Basic Books, 1996); M. Donald, Origins of the Modern Mind (Cambridge, Mass.: Harvard University Press, 1991).

32. Dennett, Kinds of Minds, 133.

33. Donald, Origins of the Modern Mind, 343.

34. See V. Landi, The Great American Countryside (New York: Collier Macmillan, 1982), 361–63.

35. J. Elman, “Learning and Development in Neural Networks: The Importance of Starting Small,” Cognition 48 (1994): 71–99.

36. S. Fahlman and C. Lebiere, “The Cascade-Correlation Learning Architecture,” in Advances in Neural Information Processing Systems 2, ed. D. Touretzky (San Francisco: Morgan Kaufmann, 1990); C. Thornton, Truth from Trash (Cambridge, Mass.: MIT Press, 2000).

37. Specifically, synaptic and dendritic growth. See S. Quartz and T. Sejnowski, “The Neural Basis of Cognitive Development: A Constructivist Manifesto,” Behavioral and Brain Sciences 20 (1997): 537–96.

38. Evidence for the view comes primarily from recent neuroscientific studies (especially work in developmental cognitive neuroscience). Key studies here include work involving cortical transplants, in which chunks of visual cortex were grafted onto other cortical locations (such as somatosensory or auditory cortex) and proved plastic enough to develop the response characteristics appropriate to the new location. See B. Schlaggar and D. O’Leary, “Potential of Visual Cortex to Develop an Array of Functional Units Unique to Somatosensory Cortex,” Science 252 (1991): 1556–60, as well as work showing the deep dependence of specific cortical response characteristics on developmental interactions between parts of the cortex and specific kinds of input signals; see A. Chenn et al., “Development of the Cerebral Cortex,” in Molecular and Cellular Approaches to Neural Development, ed. W. Cowan, T. Jessel, and S. Zipursky (Oxford: Oxford University Press, 1997), 440–73. There is also, as mentioned earlier, a growing body of constructivist work in Artificial Neural Networks, connectionist networks in which the architecture (number of units and layers, etc.) itself alters as learning progresses. The take-home message is that immature cortex is surprisingly homogeneous, and that it “requires afferent input, both intrinsically generated and environmentally determined, for its regional specialization.” See S. Quartz, “The Constructivist Brain,” Trends in Cognitive Sciences 3:2 (1999): 48–57.

39. R. Sireteanu, “Switching On the Infant Brain,” Science 286 (1999): 60.

40. P. Griffiths and K. Stotz, “How the Mind Grows: A Developmental Perspective on the Biology of Cognition,” Synthese 122:1–2 (2000): 29–52.

41. This notion was partially anticipated by Gregory Bateson’s notion of an “extra-regulator”: a creature that regulates its own states by changing and controlling its environment. Bateson’s concern, like that of Clynes and Kline (see chapter 1), was more with basic bodily functions than with mental processes. See Bateson, “The Role of Somatic Change in Evolution” (1972); reprint, in his Steps to an Ecology of Mind (Chicago: University of Chicago Press, 2000), 346–63.

42. For a powerful defense of such a view, see P. Griffiths and R. Gray, “Developmental Systems and Evolutionary Explanation,” Journal of Philosophy 91:6 (1994): 277–305.

Chapter 4

1. The friend was Brian Cantwell Smith, now a professor at Duke University, and former director of the XeroxPARC component of the Center for the Study of Language and Information at Palo Alto, California.

2. D. Dennett, “Where Am I?” in D. Dennett, Brainstorms (Sussex: Harvester Press, 1981).

3. Ibid., 317.

4. See Johan Wessberg, Christopher R. Stambaugh, Jerald D. Kralik, Pamela D. Beck, Mark Laubach, John K. Chapin, Jung Kim, S. James Biggs, Mandayam A. Srinivasan, and Miguel A. L. Nicolelis, “Real-time Prediction of Hand Trajectory by Ensembles of Cortical Neurons in Primates,” Nature 408 (November 16, 2000): 305–6.

5. The official name is the Laboratory for Human and Machine Haptics.

6. Quoted in MIT Tech Talk (Cambridge, Mass.: MIT news office, December 6, 2000).

7. This point is nicely made in David Sanford’s follow-up to Dennett’s piece, “Where Was I?” in The Mind’s I, ed. D. Dennett and D. Hofstadter (Sussex: Harvester Press, 1981), 232–41.

8. In an article published in Omni (May 1980): 45–52, Minsky credits Pat Gunkel with the original coinage.

9. I was led to this site by Thomas Campanella’s excellent piece “Eden by Wire,” in The Robot in the Garden, ed. K. Goldberg (Cambridge, Mass.: MIT Press, 2000).

10. Marvin Minsky, Omni (May 1980): 45–52. The passage is cited in Dennett and Hofstadter, The Mind’s I.

11. A. Hein, “The Development of Visually-Guided Behavior,” in Visual Coding and Adaptability, ed. C. S. Harris (Hillsdale, N.J.: Erlbaum, 1980), 52. For further discussion and some caveats, see S. Hurley, Consciousness in Action (Cambridge, Mass.: Harvard University Press, 1998), chaps. 9, 10.

12. See J. G. Taylor, The Behavioral Basis of Perception (New Haven, Conn.: Yale University Press, 1962), 205, and Hurley, Consciousness in Action, 387.

13. These experiments were performed by some of my ex-colleagues at the Washington University Medical School. See W. Thach, H. Goodkin, and J. Keating, “The Cerebellum and the Adaptive Coordination of Movement,” Annual Review of Neuroscience 15 (1992): 403–42.

14. V. S. Ramachandran and S. Blakeslee, Phantoms in the Brain: Probing the Mysteries of the Human Mind (New York: William Morrow, 1998), 59.

15. This list is based on K. Goldberg, ed., “Introduction: The Unique Phenomenon of a Distance,” The Robot in the Garden (Cambridge, Mass.: MIT Press, 2000).

16. Ibid. See all the papers therein, especially Machiko Kusahara’s survey, “Presence, Absence and Knowledge in Telerobotic Art.”

17. E. Kac, “Dialogical Telepresence and Net Ecology,” in Goldberg, The Robot in the Garden, 188.

18. The next few paragraphs draw mainly on Blake Hannaford’s “Feeling Is Believing: A History of Telerobotics,” in Goldberg, The Robot in the Garden, 247–74.

19. R. C. Goertz, “Fundamentals of General-Purpose Remote Manipulators,” Nucleonics 10:11 (November 1952): 36–45.

20. A. Bejczy and K. Salisbury, “Kinesthetic Coupling for Remote Manipulators,” Computers in Mechanical Engineering 2:1 (1983): 48–62.

21. Hannaford “Feeling Is Believing,” in Goldberg, The Robot in the Garden, 251.

22. This work is described in A. Milner and M. Goodale, The Visual Brain in Action (Oxford: Oxford University Press, 1995), 167–70, and in M. Gazzaniga, The Mind’s Past (Berkeley: University of California Press, 1998), 106–10.

23. A large literature, pro and con, has grown around this dramatic demonstration. E. Brenner and J. Smeets, “Size Illusions Influence How We Lift but Not How We Grasp an Object,” Experimental Brain Research 111 (1996): 473–76; R. Ellis, J. Flanagan, and S. Lederman, “The Influence of Visual Illusions on Grasp Position,” Experimental Brain Research 125 (1999): 109–14; Gazzaniga, The Mind’s Past, chap. 5; Ramachandran and Blakeslee, Phantoms in the Brain, chap. 4; S. Glover, “Visual Illusions Affect Planning but Not Control,” Trends in Cognitive Sciences 6:7 (2002): 288–92.

24. In fact, the two visual “brains” are sufficiently distinct as to be independently vulnerable to damage. DF, a patient who suffered severe damage (due to carbon monoxide poisoning) to the ventral stream, claims she is unable to see the shape or orientation of visually presented objects. Yet, if you ask her to drop a letter through a mail slot (which she says she cannot see), she will do so accurately. Optic ataxics, by contrast, have damage to the dorsal stream and are unable to perform fluent motor actions despite seeing the scene perfectly well and suffering no gross physical impediments to fluent action. See Milner and Goodale, The Visual Brain.

25. T. Sheridan, Telerobotics, Automation and Human Supervisory Control (Cambridge, Mass.: MIT Press, 1992).

26. M. Goodale, “Where Does Vision End and Action Begin?” Current Biology 8 (1998): R489–R491, at 491.

27. Gazzaniga, The Mind’s Past, 106.

28. Hannaford, “Feeling Is Believing,” in Goldberg, The Robot in the Garden, 255.

29. Perhaps there is a sense in which it was, but the signals ran through the helper whose concealed tappings completed the circuit.

30. Ramachandran and Blakeslee, Phantoms in the Brain, 61.

31. Ibid.

32. F. Biocca and J. Rolland, “Virtual Eyes Can Rearrange Your Body: Adaptation to Visual Displacement in See-through, Head-mounted Displays,” Presence: Teleoperators and Virtual Environments 7:3 (1998): 262–77.

33. See Antonio Damasio, Descartes’ Error (New York: Grosset Putnam, 1994), 62–66.

34. See M. Kawato et al., “A Hierarchical Neural Network Model for the Control and Learning of Voluntary Movement,” Biological Cybernetics 57 (1987): 169–85; P. Dean, J. Mayhew, and P. Langdon, “Learning and Maintaining Saccadic Accuracy,” Journal of Cognitive Neuroscience 6 (1994): 117–38. I first learned about this work from Rick Grush. See R. Grush, “The Architecture of Representation,” Philosophical Psychology 10:1 (1997): 5–25.

35. See Thach et al., “The Cerebellum and the Adaptive Coordination of Movement,” Annual Review of Neuroscience 15 (1992): 403–42.

36. W. Kim and A. Bejczy, “Demonstration of a High-Fidelity Predictive/Preview Display Technique for Telerobotic Servicing in Space,” IEEE Transactions on Robotics and Automation 9:5 (1993): 698–702.

37. For an excellent survey, upon which much of the previous discussion is based, see Hannaford, “Feeling Is Believing,” in Goldberg, The Robot in the Garden.

38. Ibid., 274.

39. Jim Hollan and Scott Stornetta, “Beyond Being There,” Proceedings of the ACM CHI ’92 Conference on Human Factors in Computing Systems (1992): 119–25.

40. Ibid., 120.

41. Ibid., 125.

42. This term was coined by Howard Rheingold in his classic Virtual Reality (London: Secker and Warburg, 1991).

43. Hubert Dreyfus, “Telepistemology: Descartes’s Last Stand,” in Goldberg, The Robot in the Garden, offers a balanced and sophisticated treatment of such worries.

44. If your partner was very familiar to you, an emulation circuit might help here, but that seems a little dramatic, even for my liberal tastes.

45. Albert Borgman, “Information, Nearness and Farness,” in Goldberg, The Robot in the Garden, calls this property of endless richness “repleteness.”

46. Dreyfus “Telepistemology,” in Goldberg, The Robot in the Garden, 62.

47. A. Chang, B. Resner, B. Koerner, X. Wang, and H. Ishii, “LumiTouch: An Emotional Communication Device” (short paper), in Extended Abstracts of the Conference on Human Factors in Computing Systems (CHI ’01) (Seattle, Washington, USA, March 31–April 5, 2001) (New York: ACM Press), 313–14.

48. See S. Brave, A. Dahley, P. Frei, V. Su, and H. Ishii, “inTouch,” in SIGGRAPH [Special Interest Group on Computer Graphics] 1998: Conference Abstracts and Applications of Enhanced Realities (New York: ACM Press, 1998).

49. J. Canny and E. Paulos, “Tele-Embodiment and Shattered Presence: Reconstructing the Body for Online Interaction,” in Goldberg, The Robot in the Garden, 280–81.

50. This theme is also explored by M. Indinopulos “Telepistemology, Mediation and the Design of Transparent Interfaces,” in Goldberg, The Robot in the Garden.

51. Canny and Paulos, in Goldberg, The Robot in the Garden.

52. N. K. Hayles, How We Became Posthuman (Chicago: University of Chicago Press, 1999), 291.

Chapter 5

1. EMG here refers to surface electromyography, which involves the use of electrodes, positioned on the body surface, to record information about muscular and nervous activity.

2. From the Stelarc web site.

3. Ibid.

4. Thanks to Blay Whitby, who saw the combined performance, for suggest- ing this example.

5. See D. Norman, The Invisible Computer (Cambridge, Mass.: MIT Press, 1999), 7.

6. See D. Norman, Things That Make Us Smart (Cambridge, Mass.: Perseus Books, 1993), 191.

7. Ibid., 190.

8. See Johan Wessberg, Christopher R. Stambaugh, Jerald D. Kralik, Pamela D. Beck, Mark Laubach, John K. Chapin, Jung Kim, S. James Biggs, Mandayam A. Srinivasan, and Miguel A. L. Nicolelis, “Real-time Prediction of Hand Trajectory by Ensembles of Cortical Neurons in Primates,” Nature 408 (November 16, 2000): 305–6.

9. Also called the Laboratory for Human and Machine Haptics, directed by Mandayam Srinivasan.

10. See Bernard D. Reger, Karen M. Fleming, Vittorio Sanguineti, Simon Alford, and Ferdinando A. Mussa-Ivaldi, “Connecting Brains to Robots: The Development of a Hybrid System for the Study of Learning in Neural Tissues,” Artificial Life 6 (2000): 307–24.

11. T. B. DeMarse, D. A. Wagenaar, A. W. Blau, and S. M. Potter, “The Neurally Controlled Animal: Biological Brains Acting with Simulated Bodies,” Autonomous Robots 11 (2001): 305–10.

12. N. Birbaumer, N. Ghanayim, T. Hinterberger, I. Iversen, B. Kotchoubey, A. Kübler, J. Perelmouter, E. Taub, and H. Flor, “A Spelling Device for the Paralysed,” Nature 398 (1999): 297–98.

13. Duncan Graham-Rowe, “Think and It’s Done,” New Scientist, October 17, 1998.

14. In fact, our best current developmental stories suggest that this is precisely the task that confronts the infant. See E. Thelen and L. Smith, A Dynamic Systems Approach to the Development of Cognition and Action (Cambridge, Mass.: MIT Press, 1994).

15. William H. Dobelle, “Artificial Vision for the Blind by Connecting a Television Camera to the Visual Cortex,” ASAIO [American Society of Artificial Internal Organs] Journal 46 (2000): 3–9.

16. Commercial concerns, such as Cyberonics Inc., Medtronic Corp., AllHear, and Optobionics Corporation, are already helping to make it happen. This short list was drawn from a helpful article titled “Real-World Cyborgs,” which appeared on June 13, 2000.

17. Paul Bach-y-Rita, Brain Mechanisms in Sensory Substitution (New York: Academic Press, 1972).

18. Paul Bach-y-Rita, Kurt A. Kaczmarek, Mitchell E. Tyler, and Jorge Garcia-Lara, “Form Perception with a 49-point Electrotactile Stimulus Array on the Tongue: A Technical Note,” Journal of Rehabilitation Research and Development 35:4 (October 1998): 427–30.

19. For more information, see the Epistemics Ltd. web site currently at

20. For a nice introductory account, see Michael Gazzaniga, The Mind’s Past (Berkeley: University of California Press, 1998).

21. D. Norman, Things That Make Us Smart (Reading, Mass.: Addison-Wesley, 1993).

22. Ibid., 212–14.

23. The diary entries are based on ideas gathered from many sources, most of which we have already met. The images of information appliances and wearable computers are due to Mark Weiser and Donald Norman. Donald Norman also imagined the neurophone, Stelarc imagined the linked-dancers application, and Patti Maes wrote about brain-like software agents. The thought-controlled prosthesis is based on the work of Nicolelis, Mussa-Ivaldi, Birbaumer, and others described earlier. The “prosthetic car” is inspired by DERA’s “Cognitive cockpit.”

24. Named after that rather wonderful treatment by Ed Regis, Great Mambo Chicken and the Transhuman Condition: Science Slightly Over the Edge (New York: Addison-Wesley, 1990).

25. D. Dennett, Elbow Room (Oxford: Oxford University Press, 1984), 82.

26. J. Glover, I: The Philosophy and Psychology of Personal Identity (London: Penguin, 1988), 74.

27. On the “narrative self,” see D. Dennett, Consciousness Explained (Boston: Little, Brown, 1991), chap. 13; A. Damasio, The Feeling of What Happens (New York: Harcourt, Brace, 1999), chap. 7.

28. This is not to suppose that some neural circuits have some magic property that makes their goings-on consciously available and others not. It is simply to notice that we are not always conscious of all the processing we are doing, and that some of it is never consciously visible at all. The story I shall develop is thus compatible with both a Dennett-style rejection of any “magic dust” story about consciousness, and with the possibility that conscious experience is nothing but a certain kind of interplay or relationship between multiple nonconscious elements, or a certain kind of informational poise.

29. See A. Clark, “Leadership and Influence: The Manager as Coach, Nanny and Artificial DNA,” in The Biology of Business, ed. J. Clippinger III (San Francisco: Jossey-Bass, 1999). Also Philip Anderson, “Seven Levers for Guiding the Evolving Enterprise,” in Clippinger, The Biology of Business.

30. Kevin Kelly, Out of Control (Reading, Mass.: Perseus Books, 1994).

31. For this argument, see K. Butler, Internal Affairs: A Critique of Externalism in the Philosophy of Mind (Dordrecht: Kluwer, 1998). For some related considerations, see F. Adams and K. Aizawa, “The Bounds of Cognition,” Philosophical Psychology 14:1 (2001): 43–64.

32. V. S. Ramachandran and S. Blakeslee, Phantoms in the Brain: Probing the Mysteries of the Human Mind (New York: William Morrow, 1998). The authors discuss these issues, isolating the anterior cingulate gyrus as one key neural structure.

33. Ibn Sina Avicenna, De Anima, vol. 7. Avicenna was a Persian philosopher, scientist, and physician, who lived from 980 to 1037 A.D. See Avicenna Latinus, Liber de anima seu sextus de naturalibus. Edition critique de la traduction latine médiévale. Introduction sur la doctrine psychologique d’Avicenne par G. Verbeke, Partes IV–V. The quote is from an unpublished translation by R. Martin.

34. See especially Dennett, Elbow Room and Consciousness Explained.

35. See A. Clark, “That Special Something: Dennett on the Making of Minds and Selves,” forthcoming in Dennett Beyond Philosophy, ed. A. Brook and D. Ross (Cambridge: Cambridge University Press). For some of Dennett’s original treatments see Dennett, Consciousness Explained and Kinds of Minds.

36. In making my case, I have, however, helped myself to a distinction that Dennett would treat with great caution: the distinction between the contents of my current conscious awareness and the manifold other goings-on inside my brain (and perhaps, at times, elsewhere). Dennett repeatedly rejects any such clean and crisp separation. He rejects “the assumption that consciousness is a special all-or-nothing property that sunders the universe into two vastly different categories,” and adds “we cannot draw the line separating our conscious mental states from our unconscious mental states” (both quotes from Dennett, Consciousness Explained, 447). But just what is Dennett rejecting here? Not, I think, the idea that many neural processes operate at what Dennett himself originally dubbed the “subpersonal level.” In spinning my narrative web, or reporting my experiences, I simply have no access, except of an indirect, scientific kind, to many facts about, for instance, the specifics of my own low-level visual processing, or of my postural adjustment systems, or of many of the processes involved in creative thought. Dennett’s point is that within the somewhat smaller space of goings-on of which I have some degree of explicit awareness—where I could, if asked, offer some kind of judgment or report—there is no neat line between what is really, here-and-now conscious and what is not. Moreover, between all such potentially reportable goings-on and all the rest, there is no difference so profound as to resist explanation in terms of the flow of information and availability for control of intentional action and verbal report. If this reading is correct, then my use of a conscious/nonconscious distinction is fully compatible with Dennett’s own position. For a short treatment by Dennett, which strongly suggests this interpretation, see Dennett, “The Path Not Taken,” in The Nature of Consciousness, ed. N. Block, O. Flanagan, and G. Guzeldere (Cambridge, Mass.: MIT Press, 1997). Thanks to Susan Blackmore for drawing this apparent conflict to my attention.

37. For an especially interesting attempt to dismantle such a picture, and for some fascinating reflections on the relation between Buddhist ideas and the “disappearing self,” see S. Blackmore, The Meme Machine (Oxford: Oxford University Press, 1999), chaps. 17, 18.

38. See D. Edwards, C. Baum, and N. Morrow-Howell, “Home Environments of Inner City Elderly with Dementia: Do They Facilitate or Inhibit Function?” Gerontologist 34:1 (1994): 64. In an ongoing project at Washington University, Baum and her colleagues are using neuro-imaging techniques to identify exactly what kinds of neural degeneration respond best to environmental restructuring. In this protracted longitudinal study, the group combines imaging and drug therapy with the study and manipulation of the physical home environments, and analysis of the social environment (family, friends, etc.). The goal is to take the social and environmental factors every bit as seriously as the biological ones and try to track the complex interactions between them.

39. See The Adapted Mind, ed. J. Barkow, L. Cosmides, and J. Tooby (New York: Oxford University Press, 1992). This is not to suggest, of course, that all evolutionary psychologists speak with one voice. There are as many shades of EP as there are of socialism, for instance. My brief comments address only what seems to be a central tendency in many of the more popular treatments.

40. For a powerful defense of such a view, see P. Griffiths and R. Gray, “Developmental Systems and Evolutionary Explanation,” Journal of Philosophy 91:6 (1994): 277–305. Unlike Griffiths and Gray, however, I am not yet convinced that the genes do not play a distinctive role in this complex matrix. My belief is simply that the genes are, nevertheless, just one element in a kind of culturally modulated cascade, and that our cognitive natures are best seen as products of this much more complex cascade. For some thoughts on the distinctive role of the genes, see A. Clark and M. Wheeler, “Genic Representation: Reconciling Content and Causal Complexity,” British Journal for the Philosophy of Science 50:1 (1999): 103–35.

Chapter 6

1. See, for example, M. Davies, S. Hawkins, and H. Jones, “Mucus Production and Physiological Energetics in Patella vulgata,” Journal of Molluscan Studies 56 (1990): 499–503.

2. Special Report, New Scientist (March 11, 2000): 26–45.

3. For this example and a general introduction to the notion of swarm intelligence and its technological applications, see E. Bonabeau and G. Théraulaz, “Swarm Smarts,” Scientific American 282:3 (March 2000): 72–79.

4. Michael Brooks, “Global Brain,” New Scientist (June 24, 2000).

5. E. Bonabeau and G. Théraulaz, “Swarm Smarts,” Scientific American 282:3 (March 2000): 76–77. Also, (site devoted to ant-based optimization techniques).

6. Michael Brooks, “Global Brain,” New Scientist (June 24, 2000).

7. The search engine Google uses a related but slightly less intensive procedure. For a comparison, see S. Chakrabarti, B. Dom, D. Gibson, J. Kleinberg, S. R. Kumar, P. Raghavan, S. Rajagopalan, and A. Tomkins, “Hypersearching the Web,” Scientific American, June 1999.

8. J. Kleinberg, “Authoritative Sources in a Hyperlinked Environment,” Proceedings of the 9th ACM-SIAM Symposium on Discrete Algorithms (1998). An extended version is in Journal of the ACM 46 (1999). It also appears as IBM Research Report RJ 10076, May 1997. A pdf version is available on Kleinberg’s homepage. References are to the page numbers of the 1997 research report. The quoted passage is from page 2.

9. One page in the root set R is allowed to bring only some set number of new pages into view, and links between pages with the same domain name (“the first level in the URL string associated with a page”) are ignored, since these typically serve merely “navigational” functions within one larger document. The only links that are of interest are thus what Kleinberg calls “transverse links”: links between pages with different domain names. These and a few additional heuristics are described in Kleinberg, “Authoritative Sources in a Hyperlinked Environment,” 6–7.
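The expansion heuristic this note describes can be sketched in a few lines of Python. This is an illustrative reconstruction, not Kleinberg’s actual code: the function and parameter names (`expand_root_set`, `per_page_limit`, the `links` mapping) are assumptions, and the per-page limit is left as a parameter rather than any particular value from the paper.

```python
from urllib.parse import urlparse

def domain(url):
    """The domain name: the first level of the URL string for a page."""
    return urlparse(url).netloc

def expand_root_set(root_pages, links, per_page_limit=50):
    """Grow a root set R into a base set using only "transverse" links
    (links between pages with different domain names), letting each root
    page bring at most `per_page_limit` new pages into view.

    `links` maps a page URL to the list of URLs it points to.
    A simplified sketch of the heuristic described in note 9.
    """
    base = set(root_pages)
    for page in root_pages:
        added = 0
        for target in links.get(page, []):
            # Ignore intra-domain links: these are typically merely
            # "navigational" links within one larger document.
            if domain(target) == domain(page):
                continue
            if target not in base:
                base.add(target)
                added += 1
                if added >= per_page_limit:
                    break
    return base
```

On this sketch, a root page’s links to its own site never enter the base set, while its outward-pointing (transverse) links do, up to the cap; hub/authority scores would then be computed over the expanded set.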

10. Ibid., 11–12.

11. New Scientist 2276 (February, 2001): 17.

12. E. Thelen and L. Smith, A Dynamic Systems Approach to the Development of Cognition and Action (Cambridge, Mass.: MIT Press, 1994), 60, 311.

13. For lots more on this, see A. Clark, Being There: Putting Brain, Body and World Together (Cambridge, Mass.: MIT Press, 1997).

14. Luis Mateus Rocha, “Adaptive Recommendation and Open-Ended Symbiosis,” International Journal of Human-Computer Studies. Also available at

15. James O’Donnell, Avatars of the Word (Cambridge, Mass.: Harvard University Press, 1998), 61.

16. For example, my editor at Oxford University Press, Kirk Jensen, points out that under such conditions funding for many journals would be threatened, and this would have the undesired consequence of making publishing into an underfunded amateur activity. For an opposing viewpoint, see O’Donnell, Avatars of the Word.

17. See also Susan Blackmore, The Meme Machine (Oxford: Oxford University Press, 1999).

18. Kevin Kelly, Out of Control (Reading, Mass.: Perseus Books, 1994), 176.

19. M. Resnick, Turtles, Termites and Traffic Jams (Cambridge, Mass.: MIT Press, 1994).

20. For a compelling meditation on these themes, see Steven Johnson, Emergence: The Connected Lives of Ants, Brains, Cities and Software (London: Penguin, 2001). Also Kevin Kelly’s classic, Out of Control.

21. Neil Gershenfeld, When Things Start to Think (London: Hodder and Stoughton, 1999), 18.

22. Ibid., 18–31.

23. “El movimiento Linux cumple diez años animado por la acogida de la industria” [“The Linux movement turns ten, buoyed by industry acceptance”], El Pais (Ciberpais) 181 (August 23, 2001): 1.

24. Amy Harmon, “For Sale: Free Software,” New York Times, September 28, 1998, sec. C1, 1–2.

25. Gershenfeld, When Things Start to Think, 235.

26. Ibid., 240–41.

27. David Prosser, “Black Box Could Lower Premiums,” Daily Express, February 20, 2002, 33.

Chapter 7

1. El Pais (Ciberpais), July 20, 2000, 2.

2. This figure is given by N. Katherine Hayles in How We Became Post-Human (Chicago: University of Chicago Press, 1999), 20.

3. Neil Gershenfeld, When Things Start to Think (London: Hodder and Stoughton, 1999), 242.

4. Ibid., 243.

5. For full details, see the MIT OpenCourseWare factsheet, published on the web by the MIT News Office. A Google search for MIT OpenCourseWare will hit it first time.

6. Acting together with the British Medical Association and the Soros Foundation. See “Read All About It,” New Scientist 2299 (July 14, 2001): 5.

7. This and several of the following stories are drawn from a chilling article by Jeffrey Rosen, “The Eroded Self,” New York Times Magazine, April 30, 2000.

8. Word 97 and Powerpoint 97 are cited by Jeffrey Rosen (see note 7 above).

9. See Bradley J. Rhodes, Nelson Minar, and Josh Weaver, “Wearable Computing Meets Ubiquitous Computing: Reaping the Best of Both Worlds,” Proceedings of the International Symposium on Wearable Computers (ISWC ’99) (October 1999); rhodes/Papers/wearhive.html.

10. Said in a news conference concerning the release of Jini, a new interactive technology hailed as part of the fully networked home in which consumer appliances will communicate with each other and with outside networks (as per the vision of Ubiquitous Computing).

11. Bradley Rhodes et al., “Wearable Computing Meets Ubiquitous Computing.”

12. For a lovely account, see Kevin Kelly, Out of Control (Reading, Mass.: Perseus Books, 1994), chap. 12.

13. Gershenfeld, When Things Start to Think, 94.

14. Rosen, “The Eroded Self.”

15. Ibid. Other security measures might include the use of Atguard, a program that watches your online activity, alerts you to monitors and open doors, and sends cookies packing.

16. J. G. Ballard, Super-Cannes (London: HarperCollins, 2001), p. 95.

17. For an extended meditation on this theme, see Kelly, Out of Control.

18. Gershenfeld, When Things Start to Think, 121–22.

19. Both quotes drawn from Donald Norman’s discussion titled “No Moments of Silence,” in D. Norman, The Invisible Computer (Cambridge, Mass.: MIT Press, 1999), 129.

20. For example, an ad for a men’s fashion magazine consisting of a black-and-white commercial bar code warped open in the center so as to resemble a vagina and accompanied by the worrying slogan “What every man wants.” This image might be laid alongside the bar-coded breasts displayed in chapter 1, as a reminder of what we don’t want to win a place in (what Donna Haraway called) “man’s Family Album.”

21. John Pickering, “Human Identity in the Age of Software Agents,” in Cognitive Technology: Instruments of Mind: Proceedings of the 4th International Conference on Cognitive Technology, ed. M. Beynon, C. Nehaniv, and K. Dautenhahn (Berlin: Springer, 2001), 450.

22. Ibid., 446.

23. Hayles, How We Became Post-Human, 115.

24. Pickering, “Human Identity in the Age of Software Agents,” 445.

25. See report by Eugenie Samuel, “Gimme, It’s Mine,” New Scientist 2301 (July 28, 2001): 23.

26. Both quotes from Netfuture 124 (October 30, 2001).

27. N. Sheppard Jr., “Trashing the Information Highway: White Supremacy Goes Hi-tech,” Emerge (July–August 1996): 34–40. For some further discussion of related themes, see Thomas Foster, “Trapped by the Body? Telepresence Technologies and Transgendered Performances in Feminist and Lesbian Rewritings of Cyberpunk Fiction,” in The Cybercultures Reader, ed. D. Bell and B. Kennedy (London: Routledge, 2000), 439–59.

28. I have chosen my words carefully here. Where I might have written of “men posing as women,” I have written instead “(biological) men presenting as women.” This is because I think (and shall argue) that it is often unclear which of the many identities and aspects available to an individual should be privileged as his/her/its “true self.” Better, perhaps, to see the self as the shifting sum of multiple personas adapted to different contexts, constraints, and expectations.

29. It would take me too far afield, and too deep into the philosophy of personhood, to pursue this much further here. But the interested reader might look at some recent treatments such as D. Dennett and N. Humphrey, “Speaking for Our Selves” in D. Dennett, Brainchildren (Cambridge, Mass.: MIT Press, 1998), 31–55; Carol Rovane, The Bounds of Agency (Princeton, N. J.: Princeton University Press, 1998); Gareth Branwyn, “Compu-Sex: Erotica for Cybernauts,” in The Cybercultures Reader, 396–402.

30. The phrase, and the account of the Carnegie-Mellon counterattack, is drawn from “Robots Help Humans Defeat Robots,” which was a short news piece in the “In Brief” section of Trends in Cognitive Sciences 5:12 (2001): 512. The articles for that section were written by Heidi Johansen-Berg and Mark Wrexler.

31. A. Turing, “Computing Machinery and Intelligence,” Mind 59 (1950): 433–60; reprinted in The Philosophy of Artificial Intelligence, ed. M. Boden (Oxford: Oxford University Press, 1990), 40–66. The original Turing Test was named after Alan Turing, who believed that a sufficient degree of behavioral success should be allowed to establish that a candidate system, be it a human or a machine, is a genuine thinker. Turing proposed a test (today known as the Turing Test) that involved a human interrogator trying to spot—from verbal responses—whether a hidden conversant was a human or a machine. Any system capable of fooling the interrogator, Turing proposed, should be counted as a genuinely intelligent agent. Sustained, top-level verbal behavior, if Turing is right, is a sufficient test for the presence of real intelligence.

32. See L. von Ahn, M. Blum, and J. Langford, “Telling Humans and Computers Apart (Automatically),” Carnegie-Mellon University research paper CMU-CS-02-117. See also

33. This account of the message’s routing is based on Laura Miller’s article “One E-Mail Message Can Change the World,” New York Times Magazine, December 9, 2001.

34. The Slashdot story is based on Steven Johnson’s excellent study, Emergence: The Connected Lives of Ants, Brains, Cities and Software (London: Allen Lane, Penguin Press, 2001), 152–62.

35. Ibid., 153–54.

36. Reported on BBC News, November 24, 2001.

37. FurryMUCK and a number of furry newsgroups are online. For a fairly detailed account, see Julene Snyder’s article “Animal Magnetism,” which appeared in Life, August 26, 1998 (and can be found on the web).

38. H. Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, Mass.: Harvard University Press, 1988), 17.

39. N. K. Hayles, How We Became Post-Human, 291.

Conclusions
1. William Burroughs, Dead City Radio (recording available on Island Records, 1990).

2. This debate is especially lively in the area of feminist and literature studies. For a wonderful and ultimately positive window on some of these debates, see N. Katherine Hayles, How We Became Post-Human (Chicago: University of Chicago Press, 1999).


Full Text,%20Technologies,%20and%20the%20Future%20of%20Human%20Intelligence.pdf
