My personal favorite phrase in writing isn't "kill your darlings" or "show, don't tell" or "less is more" or anything along those lines. Those truisms all have their place in the writer's toolbox, and I fight with them every time I start typing anything out (as you may have noticed from this writeup alone, I tend to be a bit verbose), but none of them hold any particular weight in my mind. Instead, my favorite phrase is one that doesn't even make sense without the proper context - "The eight deadly words." The words that phrase refers to are these:

"I don't care what happens to these people."

The reason this phrase is deadly is that it points to a fundamental flaw in the story the author is trying to tell. Without any sort of connection to the more human aspect of the story, the reader is left floundering in the world the author has created, and will, sooner or later, stop paying attention to the story altogether. Science fiction tends to have this problem much more than other genres, as it is much easier to ignore the importance of that human element when you could instead be exploring nifty new alien economic systems or stylized societal structures. Unfortunately, interesting conceits do not always give way to interesting stories. This is the central problem with High Maintenance.

Spoiler alert starts below the line.

 


Another problem lies in the systems that High Maintenance tries so desperately to swap out for a story - specifically, that none of these systems make any damn sense whatsoever. The economic system suffers from more gaping holes than fishnet stockings1. A few of the more notable ones for me were2:

  1. We purpose-build robots for a reason. Robots built to be worthwhile partners (look at the website, that's what they're designed for) would not have sociopathic tendencies plugged into them. Robots that have not been tested well enough to reveal the sociopathic tendencies lurking under the surface are shoddily made and would likely not be in wide usage. Remember, kids: don't trust anything from China.
  2. Robots should not consume resources unless they need them for energy. The fact that they do so in the film means that the robots are designed to run on human-digestible food. I'll get to the "maybe they just like it" factor in a moment.
  3. The world (and this number is pulled out of the dark corners of my memory, so I believe it's accurate in a general sense but not worth citing ever) can sustain, given current crop production methods and cultivable land and so on and so forth, roughly ten billion people living like Africans (or 2.5 billion living like Americans). Adding mass-produced (or at least easily produced) robots to that number means anyone who buys a robot is rushing us toward that carrying capacity even more quickly. The assholes.
  4. The woman there was most certainly living more like an American (German? Brit? Westerner.) than like an African. The asshole.
  5. By adding a human being with no job experience or special training to your household, you're taking on a massive financial drain. This is a lot less "baby" or "boyfriend lost his job", and a lot more "marrying a homeless man". Especially considering they may kill you (more on that later).
  6. Even if the robots can be given all the knowledge they need to succeed in the world, the job market is going to be hellishly competitive once you throw in the robots. If they're on even footing with humans, unemployment worldwide is going to start looking a lot more like Spain. If they're better than we are, then most humans will be out of their jobs, homes, and lives in no time at all.
  7. If a human is too poor to maintain their possessions, those possessions are (gasp!) repossessed. This would lead to their robots either being taken from them and given to someone else, or being melted down for scrap. This is one hellish film.
  8. Since the robots have already pushed us right up against the population cap, food would be scarce. The humans who are all losing their jobs would be the first to lose eating privileges. Enter famine, stage right.
  9. If the robots are only eating for fun, that still puts added pressure on the food supply. Children in Africa are already starving thanks to us. Now, because we want to "humanize" our walking sex toys, we're just blatantly killing them.

Now, these are just the ones that jumped out at me (I am currently swimming through piles of information related to environmental science, which is why all these points have a distinct food-supply feel to them), and I'm sure there are more. However, I'm going to change my focus to the problems inherent in the social structure the film creates, starting with the woman's situation:

  1. We purpose-build robots for a reason. Robots that are built to be worthwhile partners would then only leave the warehouse to be shipped to the person who ordered them, as shown in the film. This leaves no reason for the female robot not to be attached to a human.
  2. If her human died of natural causes, this is more excusable. Still, the female robot would have become property of her human's estate, and should have been either resold or melted for scrap by this point, not out buying more glorified sex toys.
  3. If she killed her human, not only did she violate the First Law of Robotics, but she would not have been fulfilling her programming's goal (already discussed) and would again become part of her human's estate. That last point also goes for the male robot who shut her off - murder is a great way to get yourself turned into slag.
  4. If their life could be ended - or, at least, paused - in such an easy manner as a neck switch, robots would not be so willing to accept neck rubs. Imagine letting someone come up to you and start fondling your chest with a knife.
  5. It also seems as though switch covers would become quite the popular commodity, much like bulletproof vests are for people in war zones or nursing homes are for people who are about to die one way or another. Humans - and things that think like humans - are very good when it comes to avoiding death for as long as possible.
  6. We actually do have ways to end life as easily as a neck switch. They're called guns. Or knives. Or windowsills, poisons, injections of air into the arteries, etcetera. This societal structure isn't so much "unique" as "poorly regulated". So... take that, libertarians! ...I guess.

That's a start, but in no way does this even begin to cover the massive number of problems and plot holes the film manages to pack into its (padded with two minutes of credits) nine-minute running time. And, if you ask me, it never even touched the interesting pieces of its premise. With a custom-made-robot-based society, you have all of the ethical dilemmas attached to cloning alongside all of those attached to sentient AI programs, multiplying together into a whole new array of philosophical arguments. The film never even touches these, instead just suggestively wiggling its eyebrows at the massive pile of squandered possibility it managed to create in its short life.

I wish this film had spent more time on the philosophical questions behind the (extraordinarily) thin storyline it presented. I wish that the central premise of this (well-filmed, well-presented) world made any sense in the slightest. I wish that I liked this short. Unfortunately, you can't always get what you want. The film glosses over any interesting points it could have raised and instead (between the dinner and the time spent on the website, also known as the entire film) focuses on poorly-thought-through elements of an impossible society. Altogether, the nicest thing I have to say about this film is that the camerawork was very well done. Otherwise, it was simply lacking too much.3


1: Alternate lines: "More gaping holes in it than in..."

  1. Most of my arguments in the catbox
  2. A gay man in the deep south
  3. George Zimmerman's alibi
  4. A slut
  5. New England roads
  6. A West Virginian's family tree
  7. British teeth
  8. A female echidna (THEY HAVE TWO VAGINAS!)
  9. The ozone layer
  10. Most Americans' knowledge of European geography
  11. Most Europeans' knowledge of American geography
  12. Got another one? Message me, I'll add it.

2: To the people whose first reaction to this list is to make excuses or explanations along the lines of "well, maybe they...", I have this much to say:

"Well, maybe they..."

No. That's exactly the problem. They didn't.

In science fiction, there are two different ways of dealing with any not-exactly-possible situation: either hand-wave it away, or brush it under the rug and hope no one notices. My problems with this film stem from its consistent use of the second (and worse) of these two tactics. Instead of offering some quick out for the characters (oh, thank God for buildings which grow asparagus from their paneling so we can enjoy this meal!), the film merely ignores any issue with the world it creates, and does it so blatantly that I don't think the creators put any more thought into their script than the basic plot points and a first draft. I love hand waves. I don't feel the need for every economic detail to be spelled out for me. But damn if I won't call you out if you think that ignoring every economic or societal problem of introducing massive numbers of sociopathic robots into a civilization is a good way to tell a story. I don't fully care what the justification is, so long as there is one. And High Maintenance doesn't care enough about its world to even fulfill that requirement.

3: The point of this node is to inspire large amounts of creative energy focused on one easily accessible topic. To that end, I would love to see more lists of problems with the film's structure. If you don't want to make your own writeup or don't think you have enough material for one, message me and I will happily add your point to this list and credit you.


* DonJaime says re High maintenance: Three ways of dealing with a not-exactly-possible situation: 3. make it the premise of your story.

Mj says re DonJaime: A story built on shoddy framework can succeed. This one didn't.