Cochlear Implants after Fifty Years: A History and an Interview with Charles Graser
Abstract and Keywords
This chapter analyzes the cochlear implant as a new acoustic medium. An introductory essay places cochlear implants in a longer tradition of experimentation with the electrical stimulation of hearing. This historical survey is paired with an interview Mills conducted with Charles Graser, an early research subject who field-tested several implant models beginning in the 1970s. Excerpts from Graser’s journals and field notes are included, which detail his novel acoustic experiences and his attempts to domesticate the technology.
The epigraph to Rebuilt, Michael Chorost’s 2005 memoir of learning to hear through a cochlear implant (CI), is taken from the voice-over to the opening sequence of The Six Million Dollar Man: “Gentlemen, we can rebuild him.” The 1970s television series, based on the novel Cyborg by Martin Caidin, introduced bionic implants to the general public. Although Air Force surgeon Jack Steele had coined “bionics” in 1958 to describe the engineering of machines based on biological principles, the term soon connoted superhuman, cybernetic prosthesis. With remarkable circularity between fact and fiction, early cochlear implants were marketed as “bionic ears” (Blume 1997). Chorost’s own device was built by Advanced Bionics, one of the first companies to trademark the term. Yet after decades of research in speech processing and electroacoustics, enhancement to—rather than beyond—perceptual norms continues to seem like an extraordinary feat. Today, the six million dollar man can be found pitching the Lee Majors Bionic Rechargeable Hearing Aid: “With my bionic ear you’ll know when the phone’s ringing.”1
Enhancement logic has tended to limit biological electronics to the “extension” of normative bodily and media continua. Taking a different approach, Chorost understands his cochlear implant to be a new medium, “a kind of relationship with technology that has not existed before. Mediating a person’s perception of reality by computationally controlling nerve endings inside the body is most definitely new. Glasses don’t do that; cell phones don’t do that; even pacemakers and artificial hips don’t do that. But a cochlear implant does” (2005:43). Chorost calls himself a cyborg, yet he trades the biomedical rhetorics of rehabilitation and enhancement for what might be described as a media-theoretical stance on altered, always partial, perceptual experience. With the development of cochlear implants—the first neural prostheses—in the 1950s and (p. 262) 1960s, electrical machines began to communicate directly with the nervous system, bypassing certain aspects of the human sense organ and, sometimes, even the medium of air or external generators of mechanical vibration.2 Cochlear implants afford novel forms of auditory perception. They increasingly simulate as well as stimulate hearing—these electroacoustic devices replicate particular functions of the inner ear (transduction, filtering), but at the same time they “process” or reshape incoming sounds. Implants also produce their own electronic tones—detectable only by the user—for control-setting.
If this “new medium” is defined broadly as electrical hearing—or even more broadly as electrical-neural communication—its technical infrastructure is continuously changing.3 Most cochlear implants presently consist of a fully implanted receiver and electrode array; an external microphone and speech processor, worn at the ear; and an external transmitter, which attaches magnetically to the receiver beneath the skin. Cochlear implant systems are generally programmable and amenable to upgrades. Extracting and coding speech—especially in noisy environments—remains a research priority; however, substantial work is also being done to expand these “speech processors” to handle music and other sounds. I have argued elsewhere that implant hardware and software embody cultural values related to communication: the privileging of speech over music, direct speech over telecommunication, non-tonal over tonal languages, phoneme over emotional prosody recognition, quiet over noisy environments, single rather than multiple software options (for reasons of economy, namely lack of clinical time for installation and training), and black-boxed over user-customizable technology (software settings must be set in the clinic; users can adjust volume and microphone sensitivity, and select among a few preset programs based on generic “listening situations”).4 If traditional hearing aids can be considered the first mobile communication technologies, implants raise additional questions about media wearability and the contraction of hearing and communication to the neural realm (Mills 2009:140–46).
Cochlear implants have exhibited a prolonged period of “interpretive flexibility”—the multiple and often divergent perspectives on design and use that proliferate during the development of an artifact, before its eventual “stabilization.”5 Fervent debate surrounding the significance of implants suggests that they remain new media even after fifty years. At one extreme, Ray Kurzweil has described cochlear implants as a first step toward “the singularity,” when synthetic biology and intelligent machines meet each other in the culmination of universal communication (2005:195). He projects, in The Age of Spiritual Machines: When Computers Exceed Human Intelligence, that by 2029 “cochlear implants, originally used just for the hearing impaired, are now ubiquitous. These implants provide auditory communication in both directions between the human user and the worldwide computing network” (1999:221).6 Others have extrapolated from these trends in sensory prosthesis and signal transmission to possibilities for an expanded range of human audition, or an impending era of “synthetic telepathy,” “downloadable intelligence,” and automatic language translation. In such predictions, the cochlear implant is often figured as the endpoint of body-machine fusion—the mobile medium that will end the need for human mobility.
(p. 263) At the other extreme, first-hand accounts from cochlear implant users overwhelmingly emphasize constrained psychotechnical experience and ongoing negotiations with body-machine compatibility. One of the first implant test subjects, Charles Graser, was described by friends and observers as “the bionic man” in the 1970s (Figure 12.1). Graser’s use of experimental and often provisional devices resulted in singular acoustic experiences. “It really is like dealing with a new medium,” he noted in the fall of 1973. “It is difficult to compare what you hear one time with what you heard one month or a year ago. It’s sure too bad that what I hear can’t be recorded.”
Even so, Graser, the primary test subject at the House Ear Institute in the 1970s, contributed his individual physiology and psychology directly to the design of the single-channel 3M/House device—the first cochlear implant to receive FDA approval (in 1984). From a medical engineering perspective, Graser and other early users were essential for establishing surgical technique, number and placement of electrodes, biocompatibility of implant materials, design of external hardware, and strategies for signal modulation. As Graser reflected in his journal from 1973, self-reports of perceptual experience by implant users were the imprecise but unavoidable independent variables for processor assembly: “There are no standard or objective evaluation points or standards in this experiment. The patient merely attempts to describe what he hears and the engineers design equipment or adjust it accordingly.” Graser was an ideal subject for these trials: he was an expert listener—a saxophonist and singer—and he had an uncommon lay expertise in electroacoustics through prior military radio work.
(p. 264) He recorded hundreds of pages of field notes and diaries during the 1970s. From a sound studies perspective, this writing documents the first moments of reckoning with a new electroacoustic medium and, in turn, with the re-definition of human hearing. It indicates: a range of hardware and software possibilities for cochlear implants, before economic and cultural imperatives began to assert themselves; a definition of hearing that exceeds ear and event, and necessarily encompasses mind and media; an understanding of hearing (and not just “listening”) as communicative and directional—yet marked by technical, bodily, and commercial filtering; a sensitivity to the exchanges and interferences that occur between electrical technologies in an increasingly dense media ecology. Most important, Graser’s notes denaturalize the finished products of ergonomics and design, revealing the trials and errors by which implant technology became mobile, wearable, and “personal.”
Graser’s involvement with the House Ear Institute in the 1970s occurred in the midst of a shift in the long history of experimentation with electrical stimulation of the ear—a turn away from the traditions of auto-experiment by individual scientists, and isolated treatments or tests of patients in clinical and pseudo-clinical (i.e., “quack”) settings. In the postwar period, “medical electronics” emerged as a field alongside the development of transistors, integrated circuits, and miniaturized devices. This variety of interdisciplinary work was institutionalized through new biomedical engineering departments at universities. At the same time, university-industry collaborations (legally formalized by the Bayh-Dole Act of 1980, which allowed academics to patent the results of research that had received government funding) led to increasingly large-scale medical and engineering collaborations. Beginning in the 1960s, Graser volunteered to test cochlear implants in the relatively informal clinical setting of the House Ear Institute, which had partnered with a small, private engineering firm owned by Jack Urban.
This was also a pivotal moment for human subjects research: Graser’s trials occurred just before the 1975 revision to the Declaration of Helsinki and its instantiation in US law (1981), which required Institutional Review Board (IRB) evaluation and approval of such research on ethical grounds. Graser’s experiments preceded “informed consent” as well as health information privacy laws (i.e., Health Insurance Portability and Accountability Act, or HIPAA, of 1996). For this reason, his “voice” as an experimental subject was not formally suppressed—physicians at the House Institute often publicly cited his field notes. Graser further testified in lectures and on film about his experiences; these testimonies initiated a final turn—the slow clinical acceptance of cochlear implants, which had otherwise been greeted with intense criticism in the medical, pedagogical, and popular spheres.
Long History of a New Medium
Even internal histories of the cochlear implant, such as the often-cited 1966 literature review by Blair Simmons of Stanford (an early CI innovator), place the technology in the (p. 265) long tradition of electrical stimulation of the ear. Most of these histories begin with the experiments of Alessandro Volta, who in 1790 directed a current into his own ear canals and heard something:
I introduced, deep into my two ears, two types of probes or metallic rods, with round tips; and I made them communicate directly to the two extremities of the device. Just as the circle was thus completed, I received a shock in my head; and, a few moments later, (the communication continuing without any interruption,) I began to sense a sound, or rather a noise, in my ears, that I could not easily define; it was a type of jarring crackle, or sizzling, as if some dough or tenacious material was bubbling. This noise continued without diminishing or increasing, the whole time the circle was complete, &c. The disagreeable sensation—which I believed to be dangerous—of the shock in my brain ensured that I have not often repeated this experiment. (1800:427; translation by author)
Johann Wilhelm Ritter and Guillaume-Benjamin Duchenne, among others, eventually attempted the same auto-experiment. Duchenne likened his sensation, in 1855, to “a little dry parchment-like sound, a crackling…when the intermissions were very rapid the sound resembled a crepitation, or the noise produced by the wings of a fly flying between a window-pane and the blind” (1883:370).7 The unprecedented sounds seemed either to derive from the shaking of the ossicles and the auditory nerve (the result of electrical vibration), or to be produced by the electrical equipment itself.
Duchenne subsequently turned to what he called a “gratuitous” population—deaf patients—to continue his research:
My own ears and those of my friends not being sufficient for my experiments, I borrowed the ears of deaf gratuitous patients, to whom I held out the inducement of possible improvement. By chance many of these patients were cured or bettered. This is how I was led to be a curer through my physiological curiosity. These cures were quickly known, and then deaf and deaf-mutes were sent to me in numbers, and whether I would or no, I was obliged to continue these empirical experiments. (369)
Rudolf Brenner of Leipzig later undertook the first systematic studies of this therapeutic phenomenon, applying electricity to the tympanic membranes of numerous individuals and establishing a diagnostic “formula for the reaction in the normal ear” (Roosa 1874:492). He took the perception of “hissing, roaring, ringing” to indicate an auditory nerve that might be accessed via electricity, despite other otological impairments. Hearing began to be defined according to nervous function, without reference to what had previously been thought of as “the ear itself.” These experiments were part of the emerging field of “electro-otiatrics,” which has understandably been the subject of much critique within Deaf studies (Figure 12.2). In the nineteenth century, electrical therapies were offered by lay aurists as often as by licensed physicians; these treatments were not only provided to deafened therapy-seekers in clinical settings, they were sometimes forced upon children in deaf oral schools (Scheppegrell 1898:318).8 In 1898, allergist (p. 266) and otolaryngologist William Scheppegrell published one of the first monographs on the application of electricity for the diagnosis and treatment of ear diseases. In terms of therapy, Scheppegrell analogized electricity to a pharmaceutical agent; he noted that it could be “applied” to a patient’s ear for several different purposes: to relieve pain, increase absorption, stimulate muscular action, staunch hemorrhages, destroy tumors, or heal ulcerations. Although the sensation of sound might be a side effect of electrical stimulation, nowhere was it suggested that electricity could be used to transmit complex messages into the ear or brain.
Gordon Flottorp, formerly of the Harvard Psycho-Acoustic Laboratory, dates the “modern” period of electrical hearing to 1925: in several countries that year, “radio enthusiasts” reported hearing sounds when they held electrodes near their ears. What was “new” about these experiments, Flottorp explains, was their use of alternating current through the relatively precise source of vacuum tubes (1953:236). Unlike Scheppegrell’s (galvanic) electro-therapy, in which a direct current of electricity was used to provide vibratory massage or cauterization to the middle ear, in these cases an electrical waveform carried audio information (via variations in frequency and amplitude) through the skull to the auditory nerve. This generation of experimenters described electrical stimulation as “a new medium” or “a new mode of hearing,” situating it alongside other “new” communication technologies, like radio itself.
One of these radio enthusiasts, Gustav Eichhorn, obtained patents in Switzerland and the United States in 1926 and 1927 for his “radiophone,” in which “an exciting element…is held against the head or near the ear, so that…the skin or supple part of the face in the immediate neighborhood of the ear are forced by an electrostatic effect to set up oscillations which are not transferred to the ear drum but direct to the inner organs of hearing.” Eichhorn pitched his invention to people with impaired ear drums, and to radio listeners interested in a “physiological and acoustic” mode of perception—one in which “the body of the user is included in the anode circuit” of the radio system (1927: 1–2).
Another enthusiast was Hugo Gernsback, the founder of Modern Electronics, Electrical Experimenter, and Radio News magazines, as well as Amazing Stories, the first (p. 267) science fiction journal. Gernsback, who purportedly coined the phrase “science fiction” and the word “television,” is now known in the electronic music world for constructing some of the first electrical keyboards—the staccatone in 1923 and the pianorad in 1926. Among his 80 patents is one from 1923 for an Osophone or “acoustic appliance,” which employed a portable microphone to amplify and transmit speech vibrations to a listener’s teeth. The March 1934 cover of Radio-Craft, edited by Gernsback, featured his Phonosone, a bone-conduction radio-listening device “for the near deaf.” Like Eichhorn’s radiophone, the Osophone and Phonosone were similar in principle to nonelectric bone conduction hearing aids. These new inventions showed that electrically-generated vibrations could carry speech information indirectly to the auditory nerve, although there was much debate about how this occurred.
By the 1930s, physicians and physiologists began to collaborate on these experiments with radio and telephone engineers. In one widely-reported case, Stephan Jellinek (a Viennese doctor) and Theodore Scheiber (an engineer) used a telephone transmitter connected to an electrode to send an electrical current into a patient’s ear. They claimed that the patient experienced “a new method of hearing,” “not dependent on the functions of the outer or middle ear.” In a notice titled “Earless Hearing,” Time Magazine reported in April 1930 that “should the Jellinek device succeed, humans could hear an infinite range, would not have their orchestra limits the piccolo flute and the double bass” (1930:40).
The auditory nerve began to be directly incorporated within electroacoustic circuits around the same time. By the end of the 1920s, Robert Wegel and Charles Lane of Bell Laboratories had loaned Princeton psychologist Ernest Glen Wever, upon his request, an audiometer, a loudspeaker, an oscillator, and other electrical devices (Vernon 1997). Wever and Charles Bray—both former students of Edwin Boring—set out to test Boring’s “Telephone Theory” of hearing (Boring 1926). According to this theory, the inner ear behaved like a (contemporary) telephone transmitter, converting sound waves into analog electric currents, which were carried to the brain rather than being manipulated or reordered in the cochlea.9
Wever and Bray decerebrated a cat and attached an electrode to its exposed auditory nerve. The cat became the transmitting end of a modified telephone system; electrical signals from the nerve were conducted to a receiver in a separate, soundproof room.10 Bray listened at the end of this telephone as Wever made noises into the cat’s ear. Whispers, whistles, tones from organ pipes, room noises—these all could be easily identified. (Transmission declined as the cat slowly died; the system continued to function for up to twenty minutes after death.) The telephone theory thus seemed valid—in fact, Wever and Bray compared the cat-transmitter favorably to that of an “actual” telephone, set up in the same laboratory rooms (Wever and Bray 1930b:377–78).
Within the next two years, Harvard physiologist Hallowell Davis and his former mentor, Edgar Adrian, independently replicated the procedure, demonstrating that the Wever-Bray effect was actually the result of a second electrical impulse produced by the ear (Adrian 1931; Davis 1935, 1991:17–18).11 When Davis tried placing an electrode near a cat’s brainstem, the circuit ceased functioning. Only electrodes placed near the hair (p. 268) cells transmitted intelligible speech. There, what turned out to be a small, analog electrical response—the “cochlear microphonic”—amplified entering sound waves. Speech transmitted through electrode pickups near the hair cells had “a characteristic metallic quality, suggestive of a telephone or radio broadcast” (Davis et al. 1934:313). The ear thus transduced mechanical input in more than one way: filtering sound waves at the cochlea to trigger a response from the auditory nerve; generating electrical analogs at the outer hair cells.
The questions raised by the Wever-Bray experiment galvanized an enormous amount of ear research. Otologists quickly appreciated that this technique provided an opportunity to hear through someone else’s ears. With the assistance of Harvey Fletcher and Robert Wegel of Bell Laboratories, Samuel Crowe and Walter Hughson began to study ear pathology at Johns Hopkins (Crowe and Hughson 1932).12 In 1931, they imposed various artificial lesions on cats—damaging their ossicles, puncturing eardrums, initiating infections. Using an audiogram chart, they then plotted their own reception of sounds through the cats’ ears.13 Some “defects” seemed in fact to improve hearing.
In the late 1930s, S. S. Stevens—a psychologist at Harvard and a colleague of Flottorp and Davis—wired an electrode to a radio set and placed it in the middle ears of graduate student test subjects, who described hearing “tin-pan” music and “occasional words.” Stevens concluded, at first, that electrical stimulation “does not promise much as an alternative means of hearing as long as so much distortion is present” (1937:194).14 Collaborating with physics graduate student Robert Clark Jones (who went on to work at Bell Labs), Stevens was eventually able to correct some of this distortion; trying the experiment on himself, he reported that “a reasonably pleasant half-hour was spent listening to Fred Allen’s program” (Stevens and Jones 1939:264). He named this mode of electrically stimulated acoustic perception “electrophonic hearing.”15
Soon after, working with Jones and otologist Moses Lurie, Stevens tested a set of patients without eardrums at the Massachusetts Eye and Ear Infirmary. Believing the middle ear to play an essential role in electromechanical transduction, Stevens predicted that these individuals would not respond to stimulation. Yet of twenty ears, sixteen perceived sounds like buzzes and “cricket chirps.” And although the side effects were uncomfortable—dizziness, nausea, facial twitching—Lurie reported an (unwanted) eruption of public interest:
Human beings are very peculiar people. If you promise them or you even hint that you might help their hearing, they will let you do anything in the world. I remember when Dr. Davis and I gave the first demonstration at the International meeting of the Physiologists. The newspapers obtained a report of the presentation. The first thing I knew I received letters from individuals all over the world asking when could they come and have their hearing restored and that is the great danger of this work appearing in the newspapers and the ensuing publicity. (1973:515–16)16
Direct electrical stimulation of the human auditory nerve was first attempted in 1957, when Drs. Charles Eyriès and André Djourno implanted wire electrodes (connected to (p. 269) induction coils) at the 8th nerves of two deafened patients in Paris. These individuals were able to perceive sounds of various frequencies, delivered through a microphone and external coil in the laboratory. Both experiments, however, lasted no more than several months. In one case, the device ceased working; in the other, the patient strongly disliked the experiment and ceased participation.17
Dr. William House of the House Ear Institute in Los Angeles made a few similar—and similarly brief—tests in 1961 after a patient brought him a French news clipping about Djourno and Eyriès. According to Marc Eisen, “The work of Djourno and Eyriès was published only in French and never gained momentum in further pursuing human implants. Their work would likely have remained in obscurity for many years if a patient of William House had not brought the work to his attention” (2009:91). With the help of a neurosurgeon and an electrical engineer, House began by temporarily stimulating the auditory nerve of a patient during surgery. Soon thereafter, he placed an implant at the man’s cochlea, followed by a similar trial with a female patient; he was forced to remove their devices almost immediately due to the instability of the materials. Only at the end of the decade, with improvements to transistor technology and surgical plastics—aided by the precedent of artificial pacemakers—would House resume such surgeries.
Robin Michelson, a physician in private practice in Northern California, also began to investigate cochlear implantation in the 1960s. He was first motivated by T. I. Moseley, a patient with tinnitus who was the founder of Dalmo-Victor, a firm that built radar antennae and other signaling apparatus for the military. One of Moseley’s engineers constructed a device for Michelson that amplified the inner ear’s electricity, so he could listen to the cochlear microphonic in patients with tinnitus. Through experiments with this device, it occurred to Michelson that the inner ear might also be stimulated with electricity. In the mid-1960s, he began conducting laboratory trials on cats in collaboration with an engineer from Zenith Radio. This research was funded by the Department of Defense, which was interested in the possibility of implanting microphones in the heads of cats and sending them into the field as roving acoustic spies.18
Michelson eventually moved to the University of California–San Francisco (UCSF), and in 1970 he implanted four adults with single-channel devices designed by an engineer from Beckman Instruments.19 Although he had initially believed that cochlear implants would only be able to transmit simple sound cues, by this point he held the “primary goal of usable speech recognition” (Michelson 1971:323). Working with neuroscientist Michael Merzenich and otolaryngologist Robert Schindler—grandson of the founder of Jensen Radio (now Jensen Mobile Electronics)—he began to focus on the development of a multichannel device. This research, which led to the Storz and later Clarion implants, was anchored in animal studies conducted throughout the 1970s to establish the safety of electrical stimulation, especially over the long term.20 In collaboration with the Research Triangle Institute, the UCSF team began in the mid-1980s to focus on the problem of signal processing for speech, based on feedback from human implant users. At Stanford, F. Blair Simmons had implanted a patient with a multiple electrode device in 1964, with discouraging results. He, too, turned to animal research, (p. 270) partnering with Robert White of the electrical engineering department to investigate multichannel cochlear stimulation into the 1980s.21
Working in a clinical as opposed to academic context, House did not carry out basic research. In 1968, he performed an exploratory surgery to test electrical stimulation of the auditory nerve on Charles Graser, a patient at the House Ear Institute who had been deafened by high doses of streptomycin prescribed after an accident in 1959. In 1970, he gave Graser a percutaneous multielectrode “button” implant. Together with electronics engineer Jack Urban, House and Graser spent the next two years testing numerous modulation schemes in the laboratory before Graser was released for equally exploratory “field tests” with an early portable speech processor.22 House compared Graser to Charles Lindbergh, “who flew over the Atlantic at great risk to open this possibility for others. Mr. Graser also faced unknown risks and has made great and lasting progress” (House 1974). Graser’s physical experiences contributed to surgical procedure and knowledge about the effects of long-term stimulation. According to Marc Eisen, his perceptual experiences also set the standards for signal modulation in the first commercial implants:
Postimplantation, he worked intensely as a research subject, and continued to do so enthusiastically for many years. Many of the observations and modifications that House and Urban reported in the 1960s were based on testing only of Graser. For instance, one of the surprising findings from work with Graser was that a 16,000-Hz carrier signal helped him appreciate higher frequencies, and amplitude-modulating the carrier with the acoustic signal generally sounded the best. This signal processor strategy became standard on the House/3M (St. Paul, MN) cochlear implant. Reporting of these early results was primarily by testimonial experiences of the individual subjects rather than systematic study. Another important outcome of these early studies was abandoning the multiple electrode systems for the single-wire electrode. (2006:6)23
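The signal-processing strategy Eisen describes—using the incoming acoustic signal to amplitude-modulate a 16,000-Hz carrier—can be sketched in a few lines of code. This is only an illustrative approximation of classic amplitude modulation, not a reconstruction of the House/Urban circuit; the sample rate and the 440-Hz test tone standing in for the acoustic signal are assumptions made for the sketch.

```python
import numpy as np

# Assumed parameters for illustration only.
fs = 64_000                    # sample rate in Hz (hypothetical; must exceed 2 x 16 kHz)
t = np.arange(0, 0.01, 1 / fs) # 10 ms of signal

carrier = np.sin(2 * np.pi * 16_000 * t)   # the 16,000-Hz carrier reported by Eisen
audio = 0.5 * np.sin(2 * np.pi * 440 * t)  # stand-in acoustic signal: a 440-Hz tone

# Classic amplitude modulation: the carrier's envelope follows the audio waveform,
# so the electrode receives a high-frequency current shaped by the acoustic input.
modulated = (1 + audio) * carrier
```

The point of the strategy, as the text suggests, is that the high-frequency carrier reaches the auditory nerve while its slowly varying envelope conveys the acoustic information—here, the 440-Hz tone imprinted on the 16-kHz oscillation.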
Stuart Blume, a sociologist of science, further suggests that Graser’s performance helped recruit other researchers internationally: “For the implant pioneers, in the 1970s, conviction grew out of personal experience. It was not aggregate data, which did not yet exist, that convinced them of the promise of the implant. It was their work with one or two experimental subjects that convinced them, or the visits they paid to William House in Los Angeles” (2010:175).
In 1984, the FDA approved the 3M/House single-channel cochlear implant for adults (Figure 12.3).24 The next year, approval followed for the Australian (Nucleus) twenty-two-channel implant designed by Graeme Clark. With the approval in 1990 of cochlear implant use by children, the longstanding conflict between mainstream medicine and the Deaf community was exacerbated—because pediatric implantation potentially decreases the population of native signers.25 Less well known, cochlear implants remained an intense medical controversy throughout the 1970s, even when applied to deafened adults. William House, in particular, was criticized by many university otologists for proceeding without animal research, an ethics review board, or valid clinical trials. Dr. Nelson Kiang of the Massachusetts Institute of Technology cautioned, (p. 271) “Enthusiastic testimonials from patients cannot take the place of objective measures of performance capabilities” (1973:512).26 A 3M employee recalled that criticism of pediatric implantation was nearly ubiquitous in the 1970s and 1980s, coming from “every conceivable quarter: New York League of Hearing, Central Institute for the Deaf, the public press, academic institutions.”27
Nevertheless, the testimonials of individuals like Charles Graser succeeded in establishing the credibility of cochlear implants—even as those individuals labored with makeshift technology to shape the design of commercial devices. In the 1970s, Graser’s “personal evidence” appeared in various formats: short marketing films produced at the House Ear Institute; live demonstrations before other physicians; positive excerpts from his field notes, published in scientific articles; even a single-run comic strip. Graser’s statements were often selected for evidence of speech recognition, although the field notes describe his engagement with a much wider range of environmental “audio clues”—and, in fact, the earliest implants were not generally successful at transmitting the complex waves of speech.
In the interview and journal entries that follow—half a century after the first implant, and forty years after his own—Graser’s disappointments as well as his enthusiasms become evident. Despite the “bypassing” of his ear, he details the myriad ways that living (p. 272) with a prosthesis is an embodied experience. These notes rupture the seamless conjunction of biology and electronics presumed by “bionics.” Graser discusses his ongoing difficulties with the neural-machine interface: trials of various metals and coatings for the electrodes; the transition from a percutaneous (Figure 12.5) to a transcutaneous implant; experiments with glasses, magnets, and adhesive means for attaching the external transmitter. He details his efforts to domesticate this new technology through his own altered habits of sleeping, showering, and self-care, as well as by technical adjustments to facilitate walking, bicycling, and other “mobile activities.”
In part due to the total and prolonged nature of hearing through cochlear implants, Graser’s field notes represent an unusually extensive pilot analysis of an audio technology. He documents the transformation of his everyday soundscape, meticulously attempting to correlate the novel sounds from the device to objects and people in his environment. He marvels over hearing electronic sounds, some of which never existed as airborne waves but were delivered from laboratory equipment directly to his auditory nerve. If sound conventionally indicates motion (of vocal tracts, musical instruments, loudspeaker diaphragms, other objects in the environment), electrical hearing is here also produced by interference: in these notes, Graser deciphers sonic experiences of energy from television sets, electrical wires, and radar traps.28 His journals are thus a record of introducing a new technology into an existing media ecology; his cochlear implants become actors in a network of electronic devices—blocking some, while allowing Graser to interact with others in novel ways. In a related manner, he reflects upon the suddenly apparent presumptions and protocols of modern orality, as well as his alternative points of entry into audition.
With Graser’s assistance and approval, I have chosen the following selections from his field notes from a sound studies perspective, as opposed to a marketing or therapeutic one. Graser writes as an exceptional listener and, at once, a person with an audiological impairment. In the fifty years since the first implant surgery—and the two centuries since the first experiments with electrically stimulated hearing—200,000 people have adopted cochlear implants. The devices themselves now tend to be multichannel, with computerized processors and the capacity for software upgrades, to accommodate ongoing research into the processing of speech (and, to a lesser extent, music and other sounds). Through this technology, hearing has increasingly been defined as communication: the calculated detection of speech and environmental sounds, the machinic segregation of desired sounds from “background” noise, the possibility for on/off control (or “earlids”). Whether this new medium will gravitate toward more normative models of hearing or toward radical forms of communication—and whether its use will become common or rather remain always minor—are part of the technology’s continued interpretive flexibility.29 Although the “experimental future” of implantable audio electronics has indeed materialized incrementally, many of the emergent socialities and acoustic phenomena predicted by Graser’s field notes have proven to be short-lived. Indeed, after decades of implant use—with the risks of neural damage inherent to his work as a test subject—Graser can no longer hear through these devices (Figure 12.4). (p. 273)
(p. 274) “Customizing My Own Sound”
Mills conducted this interview with Graser via email between February 2010 and April 2011:
How did you lose your hearing?
While I was a high school teacher with a young family in Colton, California, I also worked part-time as a tank truck driver for Richfield. We appreciated the extra income, and I enjoyed a job where I got instant results, compared to social studies teaching.
While I was working one Sunday in June 1959, fuel spilled and ignited on me. I managed to put out the fire by using an industrial wheeled fire extinguisher before I collapsed. This was in a Tank Farm where a great deal of fuel was stored. Then, 17 pieces of emergency equipment arrived, and I was rushed to the hospital for a 3 1/2 month stay. I was so seriously burned on my legs and body that I wasn’t expected to live, but my good physical condition, wife, private nurse, and medical personnel barely kept me alive. Due to infection from the burns, I was given streptomycin, an anti-bacterial medicine. This was a very painful period. At first, it appeared that I was dying, but then the doctors realized that I had lost my hearing and sight. It was still touch and go to survive, but my sight returned, and I learned to walk again before I was released from the hospital to convalesce at home. After a few months, I was fortunate to be able to return to school and do some class scheduling and other tasks. Finally, the school offered me the job of school librarian to replace the retiring librarian. So, the following year, I was a deaf, busy new librarian in the high school of approximately 2,000 students.
How did deafness affect your daily activities?
Before the fire, I taught social studies to junior and senior high school students. I also had a busy social and part-time work life. After losing my hearing, I resigned my position on the city planning commission, quit singing in the church choir, driving the tank truck, square dancing, and many other things. It was a real blow to not hear our three children’s young voices, and have people communicate by writing to me. People are not very eager to do that and you begin to lose contact. You don’t get into many bull sessions and so forth after you lose your hearing.
I always enjoyed our family dinners and parties, but I started feeling lonely when I was not in the conversation. The same held true for get-togethers with friends. You always feel somewhat of a dunce when everyone laughs, and you don’t even know what was funny. Watching most TV programs was also useless until captioning began.
(p. 275) Before long, I had an appointment at the House Ear Clinic, and I began using two powerful hearing aids that gave me some feeling of sound. I also began to study speech reading. Fortunately, my wife managed to help me understand her, and she became my “interpreter.” I managed to get along, but deafness is an extremely difficult situation, and maybe even more so for a person who is very verbal and musically inclined. I had been playing the sax, singing, and listening since I was very young, and I had taken courses in speech. When I was a part-time tank truck driver, I rigged radios so that I could listen to music in those long hours.
Fairly soon, I began to communicate with Dr. William House, and he mentioned that research was being done on an artificial hearing system using electrical energy to stimulate the cochlea and hearing nerve. Deafness can also cause desperation, and I felt cochlear implant development was sort of a Calling.
Why did you decide to participate in the early trials at the House Ear Institute? What outcomes did you expect?
As I look back on it, I was really naïve about agreeing to participate in the cochlear implant development, but I was desperate to hear again. In fact, I began to communicate with Dr. William House about progress in research. We traded letters every six months. This correspondence lasted for 10 years. Then, Dr. Bill introduced me to Jack Urban at his Burbank engineering business. Jack had worked in the space program and the Lincoln exhibit at Disneyland in Anaheim, California. Jack was so enthusiastic and supportive that you wanted to join the team. Within a short time I entered St. Vincent’s Hospital in Los Angeles for an experiment attaching electrodes in my inner ear.
The first surgery via my left eardrum was not too painful, but before anesthesia [had] improved, you had to sort of fight to regain complete consciousness. The object was to find out if I could hear sound from electrical stimulation. I was really sedated, but I wanted to hear the words, “Dr. Watson, do you hear me?” I didn’t hear words, but I said that it sounded like someone tapping on a microphone with a pencil. I was disappointed, but the next time that I saw Jack, he was excited because I did hear him tap the microphone. This led him to develop an electrical “box” with controls for changing various sound components. The electronics were in a metal case about the size of a normal book. It reminded me somewhat of my radio shack duty with the 11th Airborne Division in Japan after World War II.
Tell me about the field tests you performed in the 1970s and 1980s.
I enjoyed the chance to “play around” with the switches and dials to create better sound. Jack Urban then attached the mike to the device, and I watched TV, and my wife read to me to find out how much help I was getting from the sound. This was encouraging (p. 276) enough for Jack to build a single channel portable device that I could use outside his lab.30 This instrument was approximately the size of a Blackberry, two cell phones thick, small enough to wear in a shirt pocket or holster that I wore around my neck. At first I was slightly disappointed about the sound, but when my wife and I reached our home, I held the device in my hand, and the sound drastically improved. I just wanted to keep talking, but my wife was tired and went on to bed. My young son and I stayed up much longer talking, and the sound kept improving as the moisture on my hand improved the electrical system’s ground. This quickly led Urban to attach a wire and a silver piece that I wore at my waist.
Not long after this, I was asked to demonstrate my sound comprehension to a number of visiting doctors in Los Angeles who seemed to think I was using mental telepathy, or cheating. The most skeptical doctor asked me to take a simple verbal test. He said, “Close your eyes and repeat to me a number from one to ten that I say.” When I said “two,” I noticed doctors around me smiling. I was then told to try it again, and I told the doctor twenty-two. The doctors almost applauded, and the doubting doctor left without further comment. So, I didn’t get fooled after all. This was one of many interesting experiences I had during development times.
I started doing my own adjustments to the instruments because we lived 60 miles away from Jack Urban’s lab in Burbank. Sometimes, he would telephone and tell my wife what I should do. All of this was very informal. It was also fun to customize my own sound, and I even carried a small screwdriver with me to fine tune the “box” to provide the best sound for various circumstances. I tried changing Modulation, Symmetry, Carrier Wave strength, Microphone Sensitivity and other sound values.31 This resulted in modification of the portable instrument. Jack Urban would sometimes ask me why I made changes, and I would just reply that sound fidelity was better. He was disturbed because his oscilloscope showed sort of a “dog leg” on the sine wave. I was afraid that if he corrected it, I would lose some fidelity. As changes were later made to design an instrument that could be produced for multiple people, both my wife and I were aware that the sound of a commercial device might not work as well for me. It also bothered me when more advanced devices were simplified, and I couldn’t customize sound. My concern was that the more the processor was changed, the less noise and fidelity it would produce.
All of this development was done on a part-time basis. I would leave my school librarian job as soon as school let out and head for Urban’s lab in Burbank, more than an hour’s drive from school. Dr. House was at the House Ear Institute or other pursuits all day, and Jack Urban had ongoing, busy engineering design and production work to do. Virtually the whole early development relied on verbal input. I would twist the dials and change the switch positions as I counted. I tried to give examples of what speech sounded like (p. 277) by sort of mumbling words with my hand covering my mouth. I was always optimistic about improvement.
Almost all of this experimentation was done with a single channel. When we experimented with multiple channels, no improvement was noticed, so Dr. House decided to stick with the single channel. The early testing and use was through a hard-wired 6-electrode array. I had an exterior connection on the left side of my head. The whole experience was so exciting that I didn’t mind looking a little odd. Friends even kidded me by calling me the Bionic Man. People seemed more interested in the new technology than bothered by a really wired person.
All of this was done without signing a lot of legal documents or anything comparable to current requirements. It involved just trusting each other, or ignoring the dangers involved. Dr. House even commented that he checked up on me by phoning my wife daily until many hours of successful stimulation had occurred. By that time, I was using the implant sound [during] all my waking hours with no negative experience or (p. 278) weakening of the stimulation. One of the things that made this interesting was what it did to my self-confidence and ego. I enjoyed the attention and was invited to give speeches about the development. It was fun to be in a leading-edge design to help deaf people hear.
I remember a conversation that I had with a deaf person I encountered after I started using the CI. He said he wasn’t interested in the CI. At that time I didn’t understand his attitude, but I do now. Changing communication cultures is not easy, and you can end up with none. I knew one CI recipient who learned sign language, and his total communication became worse.
Tell me about your experience of sound with the implant.
The implant initiated remarkable experiences with various sounds. I was eager to use the telephone, so my wife and I decided that I could just ask her a question, and she would say the one word, yes, or no-no to reply, sort of a code to tell me if I was correct or not. When I phoned her from work and asked what we were having for dinner, she said “spaghetti,” and I repeated it back to her after she said it several times to me. We started “playing word games.” She would say the name of a city that would be familiar to me, and I would repeat it back to her. I really enjoyed hearing the city, Albuquerque, and other interesting names.
Even though this was a single-channel cochlear implant device, I was amazed at all I could hear. Hearing the school bells and other environmental sounds was really helpful. People’s voices were the best of all. Conversations became more enjoyable because of the improvement in my lip reading. My wife and I had dinner at a noisy restaurant, and she told me it was too noisy for me to use the cochlear implant. However, I told her I could hear the voices of the people nearest to us. I wanted to hear music, but the fidelity wasn’t very good. When I joined the church congregation singing a hymn, I was told that I was OK but somewhat tone deaf. In fact, my wife, who was a good singer, said that there were plenty of other singers who seemed to be tone deaf. Hearing the turn indicator sound was helpful, as well as hearing other car noises. I even sorted out a squeak in the car’s steering and applied some lubricant to stop the annoying sound. I avoided a collision one day because I heard the sound of loud acceleration behind me. Riding my bike under high tension electrical lines produced an uncomfortable buzz, but Jack Urban was able to eliminate that problem.
My wife could get tired of me asking her to say the names of States, Presidents, and practice words, even when we were out for an evening walk, but I was persistent. She also got accustomed to me asking her what a sound was that I heard. On a family camping trip, the noise of a nearby stream, pans being moved, voices of my family, and even a squirrel stealing peanuts out of a bag near me were exciting moments. Ocean waves crashing, helicopter blades beating, or small planes were easy to identify. (p. 279) I even began to listen to tapes of familiar music, which was interesting but much more difficult to comprehend.
The development team soon began to make changes in the system that required me to have additional surgeries. The percutaneous connection on the left side of my head was just temporary, and it had begun to deteriorate and needed to be removed. So, an induction coil was implanted on the right side. Electrical energy passed through the skin to this implanted induction coil. However, the problem with the exterior coil was that it needed to be held correctly against the head. Attachment to eyeglasses didn’t work very well, so adhesive was tried.32 This led to the next surgery, where a magnet and coil replaced the older induction coil. This system was later replaced by a ring-magnet coil system that was much better.
Other events occurred which caused additional surgeries. At one point, the CI electrode broke, requiring a re-designed wire. The electrode break was quite a surprise. We were on a camping trip in Canada, and when I ducked my head to exit the tent in the morning, the implant stopped working. Another time, the insulation on the electrode leaked, so a better coating was developed. As time went on, revised sound processors appeared. All of these systems required a cable from the transmitter to the head coil.
The implant I liked best came later. It was small, and the magnet was part of the instrument, so a cable was no longer needed—it fit right on your head above your ear. It was small enough that it used a regular small hearing aid battery. This on-the-head single-channel device was developed years before more advanced systems were improved enough so that they didn’t need to be carried on the body with a cable to the head coil.
How many different implant systems have you tried?
All in all, I have had nineteen cochlear implant surgeries. The first implant had a plug in my skull where exterior electrodes could be connected. All the additional implants were single channel with Dr. House, until the 1990s. Later, multi-electrode implants were done at UCSF.33 After the initial implant, the experiment involved both my ears so that I could use an existing implant while an “upgrade” was done on the other side. The original plug was just pulled out of my head in the office…
[Regarding the need for these upgrades:] It took a while before the magnet attachment system was devised, and later there were changes in the type of magnet. As I have noted, earlier implant electrodes broke, and then moisture seepage ruined another.
When the CI quit working [entirely], I received an auditory brainstem implant. The ABI had some promise, but it didn’t work out.34
(p. 280) Passages from Charles Graser’s Cochlear Implant Field Test Notes, 1972–1978
Deafness really weakens one’s sense of security. There is just too much going on around you that you do not understand. In December 1959, as part of my convalescence from serious burns suffered the previous June and resulting deafness, we loaded up the family station wagon with borrowed camping gear and, with Barbara driving, headed for Death Valley. We got there after dark and did not have too easy a time pitching the tent and setting up camp. Finally, we did get to bed, but I had not slept too long before Barbara shook me awake. A heavy wind was blowing, and it worried her, so she wanted me to share the situation. She took my hand in the dark and guided it to the center pole of the old umbrella tent. Then, I could feel that pole really moving and began to share her feeling that we would be blown clear out of the campground. However, in the morning we were no worse for wear and tear except to our nervous systems…
Barbara had kept telling me on the trip that she thought she heard thunder in the distance. I felt it must be blasting because I couldn’t see a cloud in the sky. However, by the time we left Rae Lake, I began to be concerned about a storm buildup. Finally, two days later it hit. We still had two days to get back to our car. That morning as we were walking away from our overnight campsite it began to rain. And then I saw lightning. The family’s pace really quickened, and I realized that we would be walking clear out in that one day. As we hiked along the family would signal me that there was heavy thunder by raising an arm. I had already seen the flash and would better appreciate how close the bolt was to us. When I saw the streak and immediately saw a raised head or two, it gave me a better appreciation of the strain the rest of the family felt…
In June 1970 just before my first implant operation, I had felt I needed to get away to the outdoors before the hospital experience, so we just loaded up the family car and headed up highway 395 to Tioga Pass and Yosemite.… Just about daylight, too dark for lip-reading, Barbara shook me awake. She sat up in bed and mimicked a bear rearing up, with her arms out and claws extended. Then I realized that a bear was in our kitchen. I stole a look out the door flap, and there the bear sat squeezing the ice chest between his legs and attempting to pry open the top with his claws.
May 1, 1972 Electronic Cochlea Field Test Comments
I was very anxious about field testing the new electronic cochlea. This was a combination of anticipation and some doubts about its effectiveness. However, fear was quickly replaced by excitement when all sorts of new sounds were experienced. From Friday (p. 281) through Sunday the aid was used 14 to 17 hours a day with no unpleasant sensation or effects. A sample of the great variety of sounds follows…
When I was setting my electric alarm clock, the movement was so loud that I decided to see if I could hear my electric watch, and I could. I notice that with concentration, I can hear quiet or distant sounds that I thought were impossible to catch. My pen on paper, for instance…
Friends tell me I speak quieter and with a normal voice…
This has been a tremendous, exciting time talking to many people and continuously asking family and friends, “What sound do I hear now?”…
The aid has to be tuned very carefully to achieve high quality and good reception. Once this is done, mike distance does not seem to be a critical factor.
The carrier frequency has to be set just right. If it is too low, sounds are too low and scratchy or garbled. If it is too high, reception is too weak and sounds are high-pitched.
The volume control has about a ½ inch range where it works well. There is not enough power on lower settings, and as it approaches maximum, it blanks out and then comes back slightly.
The instrument picks up interference from fluorescent lights, power lines, and other items. If the instrument is turned 90 degrees, it usually stops the interference. Passing slowly under a high voltage line will cause one to turn the system down very quickly. I have already learned not to use my electric razor up close to my right ear with the sound on. Placing the razor by the mike is not at all uncomfortable.
Although the mike does a great job pulling in sound, it is extremely sensitive to movement of clothing, a nearby newspaper moving, or even a hand holding it with very slight movement. It will even pick up the ridges of a fingerprint as a finger moves across its holding bar.
The wind also is very noisy. This causes me to put the mike under my collar at the back of my neck when I ride my bike or walk in a breeze.
The mike wire and the implant wire are twice as long as I would choose for them to be for easy wearing. I feel that the instrument should be slung from the neck with the mike nearby (if shielded from movement) or up on or under the collar. There seems to be no advantage to having someone speak directly into the mike. The aid puts out a stronger signal if I hold it firmly in my hand. The aid also has some sound when turned off and held in the hand.
(p. 282) The potential for this instrument helping the deaf seems to be great. One receives continuous audio clues to surroundings. The speech clues seem very helpful, although it is hard to account for greater difficulty understanding my wife. There is definitely a pitch difference in peoples’ voices. Too fast speech is slurred, but more deliberate speech has individual word and sentence patterns. A multi-syllable word is not just a longer sound but has a multi-syllable pattern. However, at the present stage, I doubt if individual words could ever be learned. There is still not enough fidelity. I hope that I am wrong and that the brain can adapt to the out of range, off frequency type of sound that it transmits. Even music reminds me of poor radio reception with a wavering tone that may be interesting but not definable.
June 6, 1972
I am eager as always to speed the development of the hearing instrument.… I am convinced I have a connection problem behind my right ear. A switching noise occurs frequently when I am wearing the aid. It is also noticeable at night without the aid when the right side of my head is on the pillow and my head moves slightly. I can induce the same sounds by touching the connection with my hand. It seems like the same sound with or without the device being connected…
If a choice were available, I would rather have the microphone directly attached to the next model of the instrument. It would be easier to wear and possibly get rid of some mike noises caused by movement of the mike and holding bar on clothing.… The biggest problem with the present model, other than not being able to keep it constantly operating at its best, is electrical interference that ranges from hardly noticeable to uncomfortable. It even seems to pick up sound from wiring running through our house attic over the family room.
The original electronic cochlea and the model that quickly followed were quite a rig to wear. At first there was a foot-long wire semi-permanently connected to the implant outlet behind the ear. The instrument was worn in the shirt pocket with another foot-long wire that was connected to the ear wire for transmission. However, I soon found that I could hear much better if I squeezed the instrument between my fingers, so we decided that a ground wire should be connected to my body instead of only in the inner ear. At first I taped a circular electrode of copper with a three-inch radius to my armpit. But due to the resultant skin irritation, I moved the electrode to a position under the elastic band of my undershorts. We then replaced the copper electrode with a silver wafer six inches long by an inch and one-half wide. A long wire from this electrode is then connected to the instrument.
When I get up in the morning I place the body electrode in position, moisten the contact area with saliva for a good chemical ground, then connect the ear wire to the (p. 283) instrument. When I first turn it on I hear a moderately strong high pitch sound (carrier wave, I presume) that fades as other sound is received. The high-pitched whistle then becomes very faint or fades completely according to how well the instrument is adjusted and working. There is some trouble with the ground electrode. It may slip, cutting off transmission. It also pinches one too often and when you bend over, it would like to cut you in two. Maybe I should complain more, because I bet if the doctor or engineer had to wear it, an easier more comfortable system would quickly be devised. The electrode becomes increasingly uncomfortable as my body movement continually bends the metal and causes it to crack. The cracks cause extra pinching until they are mended.
My most recent headwire is now much longer. Instead of a foot long, it now reaches from behind my ear to below my waist. It makes laboratory testing easier when I am hooked into test instrument circuitry. It also makes getting ready for bed an interesting exercise. First, you have to take a coil of wire on your shoulder; if you are unlucky, you will pull on the wire during the night and find yourself awake. Just for laughs I would like to attempt to seduce some unlucky female and see her expression when I disrobe and display my three feet of attached wire.
During the past summer, I had so much trouble with perspiration behind my ear shorting out the system that we unhooked the internal ground and began operating with only the body ground. This means that I no longer have to worry about perspiring and having the system go haywire due to shorting between the hot wire and ground wire behind-the-ear connection. It also improved the sound quality slightly.
One of the interesting and painful experiences of this experiment was testing to see if the interior ground could be disconnected with complete reliance on the ground outside the inner ear. A needle, an electrode, was inserted under the skin in a muscle by the ear and the ground wire was attached at that point. The only trouble was that if I changed my expression in the slightest, there was a resultant pain. I was pretty expressionless during the hour or so of that experiment, and Barbara left the room because she couldn’t stand the sight of that needle in my head and the look on my face. I was elated to know that the experiment didn’t need to be repeated.
Many exciting experiences occurred while wearing the new electronic cochlea. The first afternoon when I began to wear it, I was both disappointed by the quality of hearing I lacked and amazed at all the sounds I could hear. The next morning I was up early and out in the backyard when I became a little disturbed because I kept hearing a high pitched calling sound. I called to my son, Davey, who had just come out of the house and asked him what it was. He said, “Look, it’s that mockingbird there—up in the tree.” Later in the morning we went on a long bicycle ride with our community’s Young Life group. There was so much sound emanating from this group of fifty-some young people that I found it hard to bear the noise. Later, as we rode off on our bicycles, I was disturbed by an even worse sound. A real heavy noise was occurring and really dominating me, but I couldn’t think of what it could be. If it were a car or airplane it would pass by. About the time I was getting ready to switch off my new electronic gadget, a slow-moving truck pulled up and then passed me. For the rest of that long (p. 284) bicycle ride I was continually aware of cars approaching me from the rear—the first time in thirteen years that I had any audio clue to traffic.
Another discovery that day was that the hearing instrument also would pick up energy from high-tension wires and other electrical lines. It only took one set of wires to make me look and be prepared to turn off my power before I was close enough to get snapped by the surrounding electrical field. It wasn’t really any physical pain, but it was too uncomfortable a sensation and uncanny. I was not hearing noise, I was hearing the radiation of electrical energy.
When we got to Glen Helen Park, I lay resting on the grass and was happily aware of the kids playing basketball near by. I could not understand their words, but I could at least hear their calls. At this picnic, I conversed with two friends and fellow teachers. Previous to that day, they had stumped my lip reading. It still wasn’t easy, but it was possible with some repetition. Then I realized that I was coming back into society—that I had a chance to communicate again.
Just what can you really hear?
The cat meows. It could be some other high-pitched sound, so it is best if you are looking at the cat…
The crowd claps during or at the end of a performance.
The engine on the car runs, but sounds electronic.
The radio and TV have fairly continuous sound.
There are voice sounds:
- Higher pitched sounds are usually women.
- Lower pitched sounds are usually men.
- Sometimes you can be fooled if you are not watching TV.
- Clapping could almost be static, but this is the scratchiness of the experimental sound system.
- Music is noise, probably in some cases sound you could confuse with clapping.
- A guitar is pretty good with higher pitched and simpler chords.
- A piano is tinny and you might confuse it with a guitar.
- Flutes are the best instrument of all, and you can always hear drums—too much.
- You could even march in a parade and keep easy step with the drum.
When out walking in the forest, you can hear a twig snap under your foot.
You can hear the telephone ring or someone rap hard on the door.
(p. 285) You can hear a waterfall, waves, or a stream rushing, but it may not sound much different than your car running or clapping, etc.
People can get your attention by calling, but Hey! or your name probably won’t be distinguishable.
You can dial for time on your phone and hear the recording including the time signal tone, although you won’t quite be able to understand what time it really is.
You can pick up a phone, dial for yourself, hear the answering party and communicate to them, basically by them listening to you and replying with either YES or NO-NO. If you do real well you may pick up a common word, phrase or sentence.
You can hear a car’s brakes squeak, the horn blow, or an emergency siren, but you won’t always know which one it may be…
You hear too much dinnerware during meals.
The folding of a paper bag is annoyingly noisy.
A group of people laughing at a joke or clapping is too noisy.
You can hear the teapot whistle.
You can hear the rain drum on the roof of your car.
You can meet people on the street and hear them greet you, although you won’t know whether they said hello or morning.
I can hear the bells at school signal the beginning and end of class periods, call the custodian, and give the fire drill.
I can hear a familiar song and pick up the words and tune somewhere in the middle of the piece and join in pretty naturally.
You can hear a typewriter, even in the next room, and you know exactly what it is.
When announcements are made over a PA system you will hear them but not quite understand them.
You can hear a fire crackle.
You can hear bacon sizzle.
You can hear a squeak or rattle in your car.
You are annoyed by squeaky doors that you have ignored for years.
I can tell which of my two sons has the lowest voice.
You can hear the cork pop when you open your champagne.
You can hear the glass break when the baseball flies through the window.
You can hear water run into the sink. Almost too noisy.
Bathroom fixtures are very noisy with running water.
You CAN’T hear the shower when you are in it, because you can’t risk shorting out the system.
(p. 286) You CAN’T hear people talking quietly like they do in a parlor, library, or small office. They talk just too softly.
You can hear an airplane fly overhead, or a helicopter. A big jet will really disturb you.
A train whistle and the sound of the locomotives are distressingly loud.
If you are walking at an intersection, you will hear the traffic.
If you are walking down the street, you will hear a car come up behind you.
If you are driving, you often can hear a car passing you or one you are passing.
Someone whistling will annoy you with the high-pitched sound.
You can hear yourself sneeze or blow your nose, or smack your lips, or your joint pop.
You can hear a zipper going up and down.
You can’t hear yourself snore because you don’t want to wear the instrument and miss your undisturbed sleep.
You can’t hear whispered sweet nothings.
January 3, 1973: Afterschool Visit to Urban Engineering in Burbank
Jack Urban changed head wire to new shorter one. Instead of reaching to my waist, it is only 8 inches long.… In the meantime we had lost my safety pin that is used to attach the transmitter wire to my shirt to support the wire. The piece that holds the screw for the back came off, and since we were all in a hurry, we just used masking tape to put the cover back on.
As soon as we got back on the freeway, I could tell that there was a big improvement. The car noise didn’t bother me, and I could hear my own voice easily when I talked to Barbara, even though the radio was on, also.
I did get one bit of energy static when we passed under large high-tension wires just on the edge of Los Angeles.
I am afraid we have encountered a real problem with the left ear implant. I had had misgivings about ear noises, etc., and when we tested out the sending unit and coil, I received nothing. Jack and Max checked the unit out thoroughly, but it seemed to be working normally, so something must have happened to the implanted coil and electrodes.
Tomorrow afternoon we will return to Burbank. Supposedly we are going in to help Dr. William House and Jack Urban plan a script for a medical film about the electronic cochlea that is to be produced. We are going to attempt to describe the laboratory testing that we did last winter.35
My guess is that there will also be a discussion about why I have lost reception with my left ear and what to do next. I also assume that I will be scheduled for an X-ray to attempt to evaluate the condition of the coil and electrodes. My fear is that one of the wires is faulty and that I am facing corrective surgery even before I really get to use the (p. 287) left ear system for more than a brief testing. It also scares me to have to repeat surgery. You always face the problem of repeated failures or problems. Or how do you know that the next implant will not quickly become faulty?
It is strange because the implant operation was October 28. The Sunday after Christmas I began to get switching sounds in the ear, and I had the feeling that it was turning off and on. I was also getting louder than normal ear noises following a brief experiment with the new transmitter a few days previous. Several days later, we had a successful experiment using the lab transmitter and coil pickup for the left ear. Two days later we had another even more successful experience with the instrument and implant. There had been some apprehension that I might pick up outside energy with the implanted coil and electrodes, but testing it in the lab with a soldering iron transformer seemed to indicate that I needn’t worry. The following day I was getting a haircut and was bothered so much by the electric clippers when they approached the area of the coil implant that the barber had to finish that side with scissors.
Before the latest trip to Urban Engineering, I stopped to borrow the barber’s electric clippers for lab analysis. When the barber tried them out by my ear, they no longer affected me. This caused much concern, and then this afternoon the bad news was confirmed. The ear implant no longer worked. Jack Urban, Max, Barbara, and I were all depressed by the situation.
Tomorrow I am going to cheer us all up by explaining how the transmitter repairs and new head wire have improved the system again. I have been complaining since last July that the sound had deteriorated. Then after arriving home this evening and adjusting the carrier wave and modulation, sound drastically improved. I could zip right through the 10 words (baby, sailboat, cowboy, highway, earthquake, candy, airplane). Then I had exciting success with random comments from Barbara such as, “I have to go to work tomorrow. It’s nine o’clock, I need to balance my checkbook.”
January 4, 1973 [Handwritten Notes]
Got quite a bit of stray energy just walking through the building.… Was shown X-ray by Jack Urban. Bare tip of ground wire broke and moved forward in muscle. Surgery scheduled for Monday afternoon. Dr. House will re-open incision by ear, lift out end of ground electrode, strip insulation from end, and move it to a better location where muscle action won’t affect it.
Talked to another deaf patient about becoming involved in experiment. She seemed reluctant. I wasn’t a very good salesman as a result of facing my fourth surgery which will make it twice on each ear.… Talked quite a bit with Nurse Healy about ear experiments. I asked about a central clearing house for ear development information for doctors. She said there was none.36… Also had an electrode test on left ear… the test is not too comfortable because a wire electrode is inserted through the ear drum into the inner ear. Another electrode is attached into the muscle by the ear. Then a signal is sent that the patient hears if the nerve is alive.
(p. 288) January 29, 1973: Electronic Instrument Report: Comments Regarding Use after Installation of New Mike
Electrical interference picked up from a tape recorder 12 inches from mike, but without the power turned on. Interference at same distance when recorder operating.… Telephone receiver makes a whee-whoo sound when moved in area 12 inches in front of mike—without any power connections or source.… No dial tone can be heard when using the telephone. Time signal can be heard on telephone, but operator’s voice very faint…
Keys and coins jingle instead of clack…
Water running is a better, non-irritating sound.
Paper crinkling is a much more bearable sound.
Walking on the floor in the house is not just padding sounds, but is more of a hammering with an echo for each step. Sound is too sputtery.
When driving with the car radio on, you cannot discriminate the difference between ear noise and radio sounds.
Changes occurring. Son, David, was whistling for my attention in September, calling Dad, Dad in January. Wife, Barbara, was flashing a kitchen light for attention in August, calling my name in January.
Saving money for long distance calls to make in a year or so when further improvements on the system are made.
Can you teach school, hear your own children’s voices as they grow and develop, listen to your car radio as you drive at night, hear sweet nothings in your ear, phone for assistance, order a meal?
Began to stay up an hour or two later each night to listen to TV.
In a school situation, it is also amazing to finally hear the bells that signal the beginning or end of a class period.
You would probably describe my current progress as changing from profoundly deaf to just hard-of-hearing, but difficulty hearing and comprehending is in a completely (p. 289) different league from silence. For instance, tonight I can finally hear the bell that indicates that I am at the right hand margin, as I type this letter.
Although I wouldn’t miss this experience with the new electronic cochlea for anything, it is a tremendously frustrating experience. There are changes in the instrument that affect the quality of the sound I hear, and some days I feel I am listening to one language in the morning, another in the afternoon, a completely different one in the evening.
At the school today, sound quality and volume were bad. I didn’t hear the bells as usual. My voice was very harsh and incomplete. It is very disconcerting to speak and only hear the high points of the words you say. Your voice is just not complete and yet you are getting almost too loud noises from parts of your words. It’s enough to make you stutter.
After coming home from work, I took off the aid in order to change shirts. When I put it back on, the sound had improved markedly. It would seem that turning the amplitude switch off and on caused the change. Several times in the past, when the sound suddenly deteriorated because of a blow to the instrument or something, turning the switch off and then on fixed it. At other times, it also needed to be rapped with a finger to settle down and transmit smoothly again.
When the instrument is working properly, there is a smooth flow of sound, and although it is low fidelity, it is close to understandable fidelity—but not quite. Sometimes I think of sound as a radio station that is too distant and barely off frequency. Possibly a better description would be phonography with a bad needle. There almost seems to be an element of good fidelity underneath or masked by a harshness or scratchiness. I feel that the volume needs to be higher but the harshness suppressed. I am convinced that the problem is in the microphone, amplitude switch, and/or in the transmission wire, possibly a leakage of current (very minute) between the hot wire and the ground wire. I keep wondering if there is a defect in the jack that allows leakage where the body ground wire and hot wire are connected to a two-wire cable into the instrument.
The instrument has deteriorated since it was new. The sound was smoother and more complete and would pick up weaker signals. At first this model was superior to the old, but not now. It will not pick up quiet conversation. It does pick up random energy from TV sets, electric shavers, power lines, etc. I have recently moved to another chair when reading in the front room because of the discomforting energy signal from the TV 6–10 feet away. The instrument has become as sensitive to clothing noise as the older model, maybe more so. The case has become very sensitive, as if it was the microphone. If I touch the mike, it may weaken the signal, and other times it will strengthen the signal. When it weakens it, the sound becomes harsher. The signal is improved if I moisten a thumb or finger with saliva and press the amplitude switch or back of the case. I get no change in signal from the side of the case. When I press my finger on the case, a light squealing occurs as the contact on my finger changes, almost as if the fingerprint lines (p. 290) affected the quality of the connection. Sometimes when I touch the microphone there is a squeal, and the sound transmission improves while I am touching the mike.
One of the most frustrating aspects of this whole experience is hearing changes in the quality of the sound I receive and not being able to communicate those changes to the electronic engineers so that they can understand and cope with the loss in fidelity and/or hope for better quality. The engineer may believe that the sound quality changed, but if he cannot see it on his test instruments, he can’t understand what the change is. Furthermore, since he is trained to read instruments, not listen to human reactions, he is likely to think it is a figment of the patient’s mind or merely a hope of the patient to hear still better.
There are no standard or objective evaluation points or standards in this experiment. The patient merely attempts to describe what he hears and the engineers design equipment or adjust it accordingly. If the patient and engineer use the same words but interpret them differently, they could be going in opposite directions. It also becomes difficult for the test patient to firmly remember with conviction how well he hears at one time compared to an earlier time. Does he hear better or worse? What did the sound really sound like? He is really making subjective statements about what he hears. No one else can hear this sound because it is the patient’s own hearing nerve that is converting the energy transmission to sound. When you hook the instrument to an amplifier and speaker system, you can only hope that you are hearing a similar sound.
This has been done. I listened to a transistor radio and described what it sounded like to me. I was then disconnected and my electronic instrument was hooked up to the amplifier and speaker system. I described the sound as close to intelligible but too harsh and scratchy. The background noise was too much for the better quality sound masked underneath it. Barbara and Max said the sound was scratchy and poor fidelity, but that they could understand some of it. This may be similar to what I hear over the telephone, which is one of the best sounds to me. I get the full flow of sounds and once in a while I get an intelligible word, phrase or sentence. However, no one will ever be positive what the patient hears, because they can’t hear it with his auditory system. Hopefully, the system will improve enough that the patient who remembers sound will eventually say, “Sound is just the same as before I became deaf.” But how long from now?
The best objective test for the system so far is for someone to repeat the words baby and sailboat in random patterns for me to repeat—without looking. The words usually are almost impossible to differentiate. When I can consistently tell these words apart, the system is working well. When the set is not working properly, instead of getting baby, you get BA-, and for sailboat, you get a bad sail without an “s.”… Birthday and earthquake would confuse you. You just don’t get each sound in the word. Sometimes someone dropping his fork on the table can make you think he may have called your name. Or the cat may meow and you at first think the phone may have rung.
At first I described the sound as Mickey Mouse, what he looks like he is saying, not what the voice impersonator says. In other words, it is like attempting to lip-read a cartoon character instead of human features. Or in other words low fidelity sound—or low, low.
(p. 291) I had one of the strangest experiences with the electronic instrument previously (Dec. 12, 1972). I was disconnecting a long extension cord for an adaptor plug. When I began to pull the prongs out of the female coupler, I got a transmission of energy in my ear. I had to pull the adaptor prongs a certain amount to cause this effect. No instrument or appliance was connected to the cord, so I must have been drawing energy through my fingers and transmitting it to my ear. How this could be is beyond me.
However, energy transmissions are not a new experience. The first instrument was sensitive enough to make me leery of walking or riding under high-tension wires. I was uncomfortable in my own garage because of power lines that cross over the rear of the building. It picked up enough electrical interference when I sat at the dining table that I had a habit of turning the instrument in my pocket to cancel or weaken the signal.
The new instrument has deteriorated in this respect. I now hear energy transmissions when I ride my bike the ¾ mile to work. It is just very quiet, momentary static, about three or four times. When I arrive at school I get a zip of energy and realize that I have just crossed an underground cable. An experience like this really scared me at Love Field in Dallas, Texas. As I was exiting the aircraft through the boarding tunnel, the instrument momentarily went crazy. Just as I was telling my wife that the instrument was out, it calmed down, and I realized I had walked over a cable or by a cable or metal detector, or some other energy source.
I have now noticed that this instrument has started picking up interference when I am sitting at the dining table, just like the old model.
The most uncomfortable sensation I get at home is from the TV. From the beginning, I did not like to stay close when turned in the wrong direction. But lately, I have been picking up the energy transmission farther and farther away. Tonight, I gave up sitting in my regular chair 6–8 feet from the TV set and next to the antenna cable. I definitely would not go near the set if I had a choice. If I get close, the energy warns me to keep away. It is just buzz or static, but it is not regular sound and is a disturbing sensation. It is not painful, but it is not right. Partially this is because you can combat an increase in regular sound, but you may not have control over an energy transmission. However, in most instances, if I turn a certain way, it reduces or eliminates the buzz.
October 13, 1973: 1974 Model Year Warranty Report
I try not to be impatient, or at least not let it show too much, but I am really eager for a better method of holding the sending coil in place and some type of circuit change to bring fidelity up to the level that it was in pre-production tests.… I am continually, always evaluating what I hear, counting to five, and all the other routines to test and intelligently cite my observations. Barbara is my sounding board.… Other than hoping to hear from you, the reason I haven’t contacted you was that I wanted to give you time to analyze my previous reports and also hopefully for you to get some feedback from other recent recipients of the device. As much as I like being involved, I don’t want to get in the way…
(p. 292) I am really not getting practical use out of the coil implant. As I mentioned previously, it is still too difficult to hold it in place correctly. It is annoying if it moves around much, because you pick up the primary carrier wave without giving it a chance to settle down to a useful transmission. Thus, you just have a varying carrier wave sound. I sometimes wear a “hippie” headband to hold the coil, but it does not hold the coil quite well enough, due to its flexibility, and it’s not very comfortable.… If I were to design a method to hold the coil, it would be as follows. The coil would be attached to a glasses frame…
I used to pick up sounds with the first two models at distances of 20–100 feet. Range has now increased to ½ mile if feedback and modulation are set properly, maybe excessively. However, I doubt if voice reception across a room has improved much. Fidelity of that voice is certainly lower than previously…
I feel that improvements in fidelity must come from modifications in circuitry. I can adjust range and power of transmission, but I can’t really improve fidelity. Adjustments of the instrument seem too much on and off as if there really was not a range of adjustment. I think that lower frequency sounds such as auto exhaust, suspension drumming, air conditioning compressor and fan, electronic organ bass tones, etc. block other sound.… What makes this testing doubly frustrating is that I still have the sensation that higher fidelity is just below the surface.… You may get a bare undertone of quality, but it is dominated by much unusable sound. It is strange, because I feel that I need more modulation for good sound pickup and fullness of sound. Yet, I have to keep modulation at a low level to avoid too strong and harsh sound from cars, air conditioners, group voices, radio, wind, etc. If this sounds a little confusing, it is really difficult to think this out and state it clearly. It really is like dealing with a new medium. It is difficult to compare what you hear one time with what you heard one month or a year ago. It’s sure too bad that what I hear can’t be recorded.
April 26, 1977: Letter to Dr. Bill House
Presentations about animal experimentation or waiting for the perfect prosthesis always leave me skeptical about whether they will ever reach fruition. Nerve damage is a good question, but I would hate to wait fifteen more years to find that electrical stimulation is safe enough.… I have just returned from a school musical. The piano, strings, and voices were interesting. Can you imagine me giving up the last six years of ‘real’ communication to wait for the perfect, computerized, multi-channel system? Will they wait to help the blind until they think they have a system that will enable the blind to read the printed page? There must be thousands and thousands of other deaf people who want help during this lifetime, not in some distant future. I am thankful that you have the imagination and the courage to help them now. (I do hope that there will be enough engineering and production capacity so that people who use stimulators can obtain repairs and refinements.)
Anon. 1930. “Earless Hearing.” Time Magazine 15(15):40.
Adrian, E. D. 1931. “The Microphonic Action of the Cochlea: An Interpretation of Wever and Bray’s Experiments.” The Journal of Physiology 71:xxviii–xxix.
Bárány, Ernst. 1937. “Electrical Stimulation of the Cochlea.” Nature 139:633.
Blume, Stuart. 1997. “The Rhetoric and Counter-Rhetoric of a ‘Bionic’ Technology.” Science, Technology, and Human Values 22(1):31–56.
——. 2010. The Artificial Ear: Cochlear Implants and the Culture of Deafness. New Brunswick: Rutgers University Press.
Boring, Edwin G. 1926. “Auditory Theory With Special Reference to Intensity, Volume and Localization.” The American Journal of Psychology 37(2):157–88.
(p. 296) Chorost, Michael. 2005. Rebuilt: How Becoming Part Computer Made Me More Human. Boston: Houghton Mifflin.
Crowe, S. J., and Walter Hughson. 1932. “Experimental Investigation of the Physiology of the Ear Using the Method of Wever and Bray.” Transactions of the American Otological Society 22:125–36.
Davis, H., A. J. Derbyshire, M. H. Lurie, and L. J. Saul. 1934. “The Electrical Response of the Cochlea.” American Journal of Physiology 107:311–32.
Davis, Hallowell. 1935. “The Electrical Phenomena of the Cochlea and the Auditory Nerve.” The Journal of the Acoustical Society of America 6:205–15.
Davis, Hallowell. 1991. The Professional Memoirs of Hallowell Davis. Saint Louis: Central Institute for the Deaf.
Duchenne, Guillaume-Benjamin. 1883. Selections from the Clinical Works of Dr. Duchenne (de Boulogne). Trans. and ed. G. V. Poore. London: The New Sydenham Society.
Eichhorn, Gustav. 1927. Apparatus for Amplifying Low Frequency Speech Currents of Radio Receivers. US Patent 1,735,267, filed January 6, 1927, and issued November 12, 1929.
Eisen, Marc D. 2009. “The History of Cochlear Implants.” In Cochlear Implants: Principles & Practices, ed. John K. Niparko, 89–94. Philadelphia: Lippincott, Williams & Wilkins.
——. 2003. “Djourno, Eyries and the First Implanted Electrical Neural Stimulator to Restore Hearing.” Otology & Neurotology 2:500–6.
——. 2006. “History of the Cochlear Implant.” In Cochlear Implants, 2nd ed., ed. Susan B. Waltzman and J. Thomas Roland, Jr. New York: Thieme Medical Publishers.
Flottorp, Gordon. 1953. “Effect of Different Types of Electrodes in Electrophonic Hearing.” The Journal of the Acoustical Society of America 25(2):236–45.
Gernsback, Hugo. 1923. Acoustic Apparatus. US Patent 1,521,287, filed May 19, 1923, and issued December 30, 1924.
Helmholtz, Hermann von. 1873. Popular Lectures on Scientific Subjects. Trans. E. Atkinson and H. W. Eve. New York: D. Appleton and Company.
House, W. F., M. D., dir. 1974. So All May Hear. VHS. Los Angeles: House Ear Institute.
Johnston, Trevor. 2004. “W(h)ither the Deaf Community? Population, Genetics, and the Future of Australian Sign Language.” American Annals of the Deaf 148:358–75.
Jones, R. Clark, S. S. Stevens, and M. H. Lurie. 1940. “Three Mechanisms of Hearing by Electrical Stimulation.” The Journal of the Acoustical Society of America 12:281–90.
Kiang, Nelson. 1973. “Discussion.” Annals of Otology 82:512.
Kurzweil, Ray. 1999. The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Penguin Books.
——. 2005. The Singularity is Near: When Humans Transcend Biology. New York: Penguin.
Kyle, John Johnson. 1903. Compend of Diseases of the Ear, Nose and Throat. Philadelphia: P. Blakiston’s Son.
Lane, Harlan. 1993. “Cochlear Implants: Their Cultural and Historical Meaning.” In Deaf History Unveiled: Interpretations from the New Scholarship, ed. John Vickrey Van Cleve, 272–91. Washington, DC: Gallaudet University Press.
Lang, Harry G. 2002. “Book Review: Cochlear Implants in Children: Ethics and Choices by John B. Christiansen and Irene W. Leigh.” Sign Language Studies 3:90–93.
Lenoir, Timothy. 1986. “Models and Instruments in the Development of Electrophysiology, 1845–1912.” Historical Studies in the Physical and Biological Sciences 17(1):1–54.
Lurie, Moses H. 1973. “Discussion.” Annals of Otology 82(4):515–16.
(p. 297) Michelson, Robin P. 1971. “Electrical Stimulation of the Human Cochlea: A Preliminary Report.” Archives of Otolaryngology 93:317–23.
Mills, Mara. 2009. “When Mobile Communication Technologies Were New.” Endeavour 33:140–46.
——. 2011. “Do Signals Have Politics? Inscribing Abilities in Cochlear Implants.” In The Oxford Handbook of Sound Studies, ed. Trevor Pinch and Karin Bijsterveld, 320–46. Oxford: Oxford University Press.
Nissenbaum, Helen. 2001. “How Computer Systems Embody Values.” Computer 34:118–20.
Otis, Laura. 2004. Networking: Communicating with Bodies and Machines in the Nineteenth Century. Ann Arbor: University of Michigan Press.
Pinch, Trevor E., and Wiebe Bijker. 1989. “The Social Construction of Facts and Artifacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other.” In The Social Construction of Technological Systems, ed. Wiebe Bijker, Thomas P. Hughes, and Trevor E. Pinch, 17–50. Cambridge, MA: MIT Press.
Roosa, Daniel Bennett St. John. 1874. A Practical Treatise on the Diseases of the Ear. New York: William Wood & Co.
Scheppegrell, William. 1898. Electricity in the Diagnosis and Treatment of Diseases of the Nose, Throat and Ear. New York: G.P. Putnam’s Sons.
Simmons, F. Blair, M.D. 1966. “Electrical Stimulation of the Auditory Nerve in Man.” Archives of Otolaryngology 84(1):2–54.
Stevens, S. S. 1937. “On Hearing by Electrical Stimulation.” The Journal of the Acoustical Society of America 8:208–9.
Stevens, S. S., and R. Clark Jones. 1939. “The Mechanism of Hearing by Electrical Stimulation.” The Journal of the Acoustical Society of America 10:261–69.
Vernon, Jack. 1997. “Ernest Glen Wever: October 16, 1902–September 4, 1991.” National Academy of Sciences Biographical Memoirs. Vol. 71. Washington, DC: National Academies Press.
Volta, Alessandro. 1800. “On the Electricity Excited by the Mere Contact of Conducting Substances of Different Kinds.” Philosophical Transactions of the Royal Society of London 90:403–30.
Wever, Ernest Glen. 1939. “The Electrical Responses of the Ear.” Psychological Bulletin 36(3):143–87.
——. 1949. Theory of Hearing. New York: John Wiley and Sons.
Wever, Ernest Glen, and Charles Bray. 1930a. “Action Currents in the Auditory Nerve in Response to Acoustical Stimulation.” Proceedings of the National Academy of Sciences 16(5):344–50.
——. 1930b. “The Nature of Acoustic Response: The Relation Between Sound Frequency and Frequency of Impulses in the Auditory Nerve.” Journal of Experimental Psychology 13(5):373–87.
(2) . Implanted artificial pacemakers, which electrically stimulate the muscles of the heart (as opposed to transmitting information to the nerves), were developed in the 1950s, the same decade that research began into cochlear implants. The new transistors were essential in both cases, and cochlear implants benefitted from the prior success—and public acceptance—of implanted electronics in the case of pacemakers.
(3) . These media-perceptual formulations carry different overtones than the medicalized language of “neuroprosthetics.”
(4) . Parents and teachers can use a remote control to set these programs at a distance for children wearing implants. On the embodiment of values by electronic devices see Nissenbaum (2001:120); Mills (2011).
(6) . The current state of research includes efforts to develop implants that will not damage hair cells, in the interest of implanting individuals with “residual” hearing. See for instance the work of Bruce Gantz at the University of Iowa.
(7) . Accompanied by numbness and a metallic taste in the mouth. Today, Duchenne is perhaps best known for faradizing the facial muscles to provoke “emotional” expressions.
(8) . For more on the history of electro-otiatrics, the medicalization of deafness, the varied educational prohibitions of sign language, and the perspective of Deaf culture, see Lane (1993); Lang (2002).
(11) . And through this impulse, the ear sometimes produced its own noises. There was some rivalry between Wever and Davis, and accounts of the Wever-Bray experiment vary greatly in terms of the error they ascribe to the initial experiment. In subsequent attempts to transcribe the irregular discharges of the auditory nerve with an oscilloscope, Wever himself became more critical of the telephone theory. See Wever (1939).
(12) . Discussants of this publication included Wever, Fowler, and Wegel.
(13) . They recognized that sounds transmitted through cats were not as clear as those transmitted through a typical microphone (used as a control).
(14) . Using a group of hearing subjects—himself included—Stevens also mapped the range of current required for minimum audibility and electric shock. With very low electrical frequencies, two subjects described “a combined auditory, tactual and pressure sensation in the ear. Both observers reported it as a strange experience” (Stevens 1937:193).
(15) . He ultimately theorized that three different modes of electrophonic hearing existed (Jones et al. 1940). Marc Eisen explains, “We now know that electrophonic hearing results from the mechanical oscillation of the basilar membrane in response to voltage changes” (2006:2).
(16) . Stevens had previously demonstrated an interest in rehabilitation, designing a vacuum tube hearing aid for a friend in 1936 (currently on display in the History of Harvard Psychology exhibit in William James Hall).
(18) . Phillip Seitz, “Interview with Robin P. Michelson, M.D., 11/7/1995,” Box 612-OH, Cochlear Implants 1961–1995 Collection, John Q. Adams Center. At the request of the Stanford Research Institute (SRI), Michelson later implanted a sea lion.
(19) . Around that time, he also traveled to France and met with Djourno and Eyriès.
(20) . The biocompatibility of electrodes and other materials, as well as the best technique for insertion and placement of the electrode array, was established through early surgical procedures on humans in conjunction with animal research.
(21) . Although today there is some controversy regarding credit for “the first cochlear implant,” in the 1970s these researchers visited each other’s labs, attended the same conferences, and were otherwise in contact with one another.
(22) . House implanted two other individuals around the same time, but in one case the device failed, and in the other the patient moved away from Los Angeles and later died of complications from syphilis. For extended discussions of the early implant experiments in France, the United States, and elsewhere, see Blume (2010); Mills (2011).
(23) . Unlike UCSF and Stanford, House thus decided to move ahead with single-channel implants, despite their imperfections, because they seemed more viable in the short term.
(24) . 3M voluntarily removed this implant from the market in 1985, due to various technical problems, and the House Ear Institute subsequently released their AllHear model, which was miniaturized and transmitted a broader range of frequencies (up to 6 kHz as opposed to 3 kHz). For more details comparing the two early House implants, see http://www.allhear.com/art_sys_chap1.php.
(25) . In 1990, implantation was approved for children over the age of two; in 2000, for twelve-month-olds. Today, although cochlear implants remain controversial—and have even been linked to a decrease in sign language use in certain national contexts—they are increasingly integrated into Deaf culture in the United States. On the waning of sign language in Australia, see Johnston (2004).
(26) . Although he was working under the auspices of UCSF, Robin Michelson similarly describes the overwhelming skepticism his research initially received from the medical community: “‘It’s impossible.’ ‘It won’t work.’ And one guy said, ‘I don’t understand why you don’t get sued for malpractice.’ Later I met him when I was on my tour as president of the Triologic, I met him at the Waldorf. And he said, ‘You know, we’ve done our first implant’” (Phillip Seitz, “Interview with Robin P. Michelson, M.D., 11/7/1995,” Box 612-OH, Cochlear Implants 1961–1995 Collection, John Q. Adams Center).
(27) . Phillip Seitz, “Interview with Ralph Fravel, 9/30/1993,” Box 612-OH, Cochlear Implants 1961–1995 Collection, John Q. Adams Center, 11. Due to this early criticism of pediatric implantation, 3M abandoned the cochlear implant field before FDA approval was granted in 1990. This chronology complicates Stuart Blume’s theory that the rejection of cochlear implants by the Deaf community led implant manufacturers to market the device to the hearing parents of deaf children (2010:116–17, 144).
(28) . The December 1985 issue of Car and Driver magazine reported on the hum picked up by Graser’s implant when he passed through radar traps: “Will this be the next generation in countersurveillance for bona fide speed freaks?” (From a clipping in Graser’s personal collection.)
(29) . Some otolaryngologists predict that improvements to hearing aids and to genetic or tissue engineering will decrease the demand for cochlear implants.
(30) . To experiment with the electroacoustic signal delivered through his implant, Graser used a “black box test device” in the lab and a “portable silver box” at home. Graser explains, “The lab was just Jack Urban’s workshop in Burbank.”
(31) . Graser recalls using frequency modulation with an early model, although he later found that amplitude modulation of the carrier wave produced better results. He regrets that the design of commercial cochlear implants has placed limits on user control and customization. The integrated circuitry of later models was less amenable to tinkering, and—for technical and economic reasons—software settings on computerized models must be set by a clinician. At the same time, physicians such as Robin Michelson explicitly wanted to limit user access to the processor, for fear of damage to the device: “These [device] failures, I think, were mostly with the external package, things like connectors, and patient wear, monkey sees/monkey dos, and that kind of thing. Some of them couldn’t resist looking inside the package and that kind of stuff” (Phillip Seitz, “Interview with Robin P. Michelson, M.D., 11/7/1995,” Box 612-OH, Cochlear Implants 1961–1995 Collection, John Q. Adams Center, 27). Even the basic controls (i.e., volume) and program settings (i.e., “listening situations”) on today’s processors have been reduced to a few variables, implying a lack of technical facility among implant users.
(32) . On this adhesive system, Graser’s field notes from Summer 1978 record: “Coil slipped slightly off position Friday night—one shower and 1 ½ days later…Problem gluing the coil back on head with weight of coil and wire that makes it hard to position exactly and slides off place eventually. First re-gluing lasted 1½ hours. Second re-gluing lasted only overnight. The glued coil moves around less when chewing, etc., than coil on glasses.”
(33) . Dr. Robert Schindler performed this surgery. “His help led to about 10 years of exciting sound. His multi-channel implant replaced a non-functioning single channel implanted by Dr. Bill.”
(34) . Eventually, Graser was unable to obtain any sound through his cochlear implants, possibly due to neural damage from his many experimental surgeries. For unknown reasons, his ABI is now mostly nonfunctional. He noted in December 2010, “Every few months or so, I try my ABI. I was amazed the other day to get some hearing response. It is not high fidelity, but it is going to make lip reading easier.” He was able to hear “the click from light switches, rustling paper, exhaust fan, water running, and other simple sounds.” Several days later: “It was an interesting false alarm. Very interesting 4 days, and then silence.”
(35) . So All May Hear: The Cochlear Implant (1974), House Ear Institute, http://www.youtube.com/user/HouseEar#p/u/12/jqEvOgjmIhU.
(36) . In the early 1980s, when cochlear implant users still numbered fewer than 200, Graser became the founding president of a “Cochlear Implant Patients’ Association,” which published a newsletter and met monthly in Los Angeles for social support and information-sharing.