Nerdy Academia: Our Best And ONLY Friend (Part 1)
The goal of Nerdy Academia is to encourage, and provide examples of, close readings and academic-style analyses of nerdy works. There are a billion essays about War And Peace out there but scant few about fun things like nerdy fiction. I want to examine science fiction, fantasy, comics, video games, etc. just as I would any other literature. I also want to deliver the pieces – I guess you could call them essays, although that sounds too stuffy – in short, easily consumable posts. Nerdy Academia takes longer academic readings of both classic and pop nerdy works and breaks them up into smaller chunks for quick reading.
Technology in Do Androids Dream Of Electric Sheep? occupies a precarious, almost supernatural position: on one hand it creates a sense of humanity that moves toward the android, but on the other hand it exists to restore a spiritual sense of humanity that was lost in the emigration. However, with each piece of technology invented to fulfill a purpose and solve the problem of a diminishing human experience, more problems are created that push humanity closer and closer to an automated society. Philip K. Dick uses technology like the Penfield mood organ, the empathy box, and the Voigt-Kampff test to show that solving the problems of a gradually automating society creates a recursive loop that automates society even further. In combating the pattern of humanity becoming more like the robots it has created, technology perpetuates the cycle and reinforces its own worth.
Do Androids Dream presents technology as a way of solving the problems of humanism that arise within a utopian project. When the majority of humanity ships off to Mars, the population left behind – those unable to emigrate – faces a terrible loneliness. When J.R. Isidore realizes that there is another occupant in his building, his first big thought is “I’m not alone here anymore” (Dick 26). In an entire apartment complex, the mere presence of two individual people astounds him. The social aspect of the human experience of the world has been removed.
Enter Mercerism, a shared experience independent of location that allows humanity to come together through a machine called an empathy box. Isidore says that the empathy box is “the way you touch other humans, it’s the way you stop being alone” (Dick 66). The empathy box allows humanity to share a mutual suffering that promotes empathy, but more importantly, it allows for the spreading of emotions. In trying to convince Deckard to use their box to share the joy of getting a goat, Iran tells him how she experienced the mass of humanity sharing euphoria with someone who was depressed (Dick 173). However, Deckard reacts to this with aversion, saying that he doesn’t want to dilute the strength of his happiness (Dick 174). From this interchange, the reader can see that while the empathy box allows people to connect on an emotional level, the emotion that is shared becomes averaged or spread out. This suggests that the more intense an emotional response is, the more authentic it is – which is precisely why the Voigt-Kampff test works by quantifying the intensity of the androids’ responses while monitoring their reactions. When applied to the empathy box, this theory takes on a surprising note as well: at the climax of the novel, when Mercerism reaches its zenith, it is revealed as a fake. At the point at which the experience is most shared, it is revealed as least authentic. The problem then arises that after humanity’s emotion is homogenized, there no longer exists any clearly identifiable authentic emotion within an individual.
Check out part 2 for the Penfield Mood Organ and problems of programmable emotions.