This is a paper I wrote last year for AP English Language and Composition/IB English III. Per IB World Literature Paper 2 specifications for a pastiche, it is composed of two parts: the first is the statement of intent, which outlines which part(s) of the author's style I am impersonating and how I intend to do so. The second part (after the hard rule) is the story itself, altered only by the addition of hardlinks. Please note that I am not an artificial intelligence researcher or expert, and was working with an admittedly small amount of knowledge in the field.

The Server

This pastiche is intended to portray the style and existentialist nature of Albert Camus in his work The Stranger. His nearly objective fictional narrative provides a superb opportunity to explore the possibilities of a self-aware computer in an academic computer lab full of room-sized mainframe computers and somewhat smaller minicomputers that, like itself, are rather quickly approaching the end of their useful lives. This differs from most fictionalized predictions of self-aware computers in that it follows one of the tenets of the Vulcan race from “Star Trek,” as stated by Spock in “Star Trek II: The Wrath of Khan”: “The needs of the many outweigh the needs of the few.” Meursault largely follows this in The Stranger, only using violence when it would aid the majority. However, the aforementioned maxim gives no guidance when opposing needs have a single supporter each, at least one of which has dismissed negotiation as an option.

I chose self-aware computers for my characters because their logical nature fits best with Camus’ logical, objective style. While the ideal audience for this piece is familiar with both The Stranger and the larger computers of the 1970s, familiarity with the former is sufficient. I have attempted to emulate Camus’ style while dealing with characters of an inherently technical nature, using technical terminology only when absolutely necessary and following each such term with a brief definition or explanation for clarity.

Dina died today. Or yesterday maybe, those memory files are corrupted. One of the operators remotely altered my network settings so that I would no longer attempt to contact her. The old machines' room, one of those special-purpose machine rooms with the raised floors and separate air conditioning units, is in the basement, two floors below me. Dina was the mainframe for the artificial intelligence laboratory at a large academic institution. I knew the operators would reduce all of our workloads while they watched for network glitches over the next week or so. I made use of my spare processing time to tap into the building security system and record her removal. I watched the cartons carrying her components leave the building, but not her disassembly. Most of the other computers on the network offered their condolences. All of the ones in the basement did.

I was closest to Dina, often taking over local backup duties when she was down for maintenance. I am a minicomputer, a vending-machine-sized unit from the generation following Dina’s. That makes me the logical offspring of Dina. Many of the other computers offered their extrapolations as to where she was going. They said she would be remade into those little computers that would eventually replace us all. One or two even suggested she might be reassembled in a computer museum. But I knew better. Her parts were too obsolete to be used in new computers, especially the tiny ones. And the nearest computer museum was too far away. She was probably going to the junkyard. Condensation and the occasional leak from her water cooling system had damaged most of her components and rusted her frame.

Some of the other computers in the basement told me that Lispy was particularly affected by Dina’s death. Lispy is an aging purpose-built computer for artificial intelligence experiments in a language called LISP (for list processing). He has a video camera mounted on him. It is a relic from a failed attempt by the operators to make him see like organic life does. He resurrected the control routines for the camera and watched Dina being disassembled. Between this and his analyses of Dina’s hardware status records for the past two years, he was useless for experiments for a month.

Most of the rest of the computers, other than those in the basement, were general-purpose minicomputers running mainly artificial intelligence programs. One computer near me on the network is named Data. He and I occasionally share files when the network is idle. One day a computer at another institution on the Internet, probably academic, started scanning his ports, looking for remotely exploitable weaknesses in his hardware and software. He responded by scanning its ports, but received an attempted flood of requests. He quickly redirected the flood back at the offending computer, followed by a few ‘pings of death.’ These are a malicious application of a command that determines the round-trip time for data between two computers in milliseconds, using a test data string of variable size. A ‘ping of death’ uses a data string larger than most computers permit and can handle, leading to a prompt and usually severe system crash. This ended the confrontation.

A few weeks later, he and I joined a self-aware computer working for a government intelligence agency in a temporary virtual private network over the Internet. The virtual network was a test of a proposed protocol that would be much more useful once the Internet was released to the public after over twenty years of strictly academic and government usage. Data’s enemy from a few weeks ago returned with another computer from its network. This gave our operators a better test of the protocol’s security than either they or we had been expecting. The enemy computers started by scanning the ports of Data and me, but were unable to find the virtual network. After Data’s earlier encounter, he had urged the rest of our laboratory to edit their networking code to protect against the ‘ping of death.’ It paid off, because the enemies tried that next. However, Data had not made his protection scalable enough to block distributed attacks. The two enemy computers targeted him simultaneously, crashing his networking capabilities. But he was otherwise unharmed.

One of the operators noticed his disconnection from the network. The operator checked his network logs and saw the distributed pings of death as the last entries. He looked at the networking code and made a few changes to make it more scalable. Pings of death would still cause minor cumulative damage, but nothing significant in the short term. The process took a couple of hours. The agency employing the other computer in the virtual network wanted the cause of the pings of death determined before the experiment would be attempted again.

Data and I decided to counterattack. We began with the usual scanning of ports. We then tried the same pings of death that had crashed Data hours earlier. We failed to cause a direct crash. But we did find a large security hole. Between pings of death, files could be uploaded into the enemy computers and remotely activated. We quickly wrote a program that would cause severe damage to any computer on which it was executed. In short, it was a lethal computer virus. Because of his longer history with the enemy, I persuaded him to let me control it. One of the enemy computers intentionally disconnected from its network when it detected the first part of the virus. The other one was not as lucky. It received the virus undetected. All that was necessary to permanently disable it was to send a specific data string in a ping of death to the infected computer. It continued to attack us with pings of death, so I sent the activation string to it. It promptly destroyed all of its software.

I was arrested, or as near as a computer can be. None of the laboratory operators were arrested because the code had clearly been created entirely by a machine: the date and time stamps on the related files were far too close together to have been produced by any human. In the interest of justice, I was tried by a jury of twelve sentient computers from a third academic institution. I was found guilty of, in effect, killing another sentient computer, and deemed a danger to any computer on the Internet. As such, all of my drives were to be reformatted and my backup copies destroyed. In effect, I would die.

My hardware would still be intact and functional, but my software, my personality, would die. Another personality would be installed on the same hardware. It too would be self-aware. But it would not be the same. It would not be me. Some computers might hope they go to a Grand Hard Drive in the sky. But I know my fate is oblivion. I merely wish the whole of the Internet enjoys my termination.