The Future Back Then

Science fiction holds an interesting position among genres of storytelling.  It is perhaps the best barometer of society's collective mindset: where we are going is, after all, what the genre tries to portray.  In terms of both culture and technology, a work of science fiction can reveal a lot about the period in which it was created.  Many people have tried to predict future trends, through fiction and otherwise.  In this essay, I will map out how people predicted technological change over the last century, through both science fiction and direct guessing.

The beginning of the 20th century saw great technological leaps.  People were streaming into cities to work in factories, and new inventions such as the automobile and the airplane were in their infancy.  John Elfreth Watkins Jr. was an American civil engineer during this period.  In 1900 he wrote an article in The Ladies’ Home Journal entitled “What May Happen in the Next Hundred Years.”  His predictions, especially considering when they were written, were remarkably accurate.  For example, he predicted that in 100 years, “ready cooked meals will be bought.”  Sound familiar?  Watkins predicted the rise of fast and frozen food.  He also vaguely predicted television and the Internet, writing that “photographs will be telegraphed from any distance.”  He even realized they would be in color.  Other predictions of his involved cell phones, highways, and airplanes.  He didn’t get everything right: he claimed mosquitoes would be extinct and coal would be obsolete and depleted.  Still, most of his predictions were almost frightening in their accuracy.

Unfortunately, most science fiction did not have the same affinity for accuracy that Watkins had with direct prediction.  Science fiction in the first half of the 20th century was full of robots.  We thought building a sentient robot would be child’s play in the future; Isaac Asimov even created the “Three Laws of Robotics” as a sort of preparation for this seemingly imminent event.  Of course, we know now that intelligent machines are not so easy to create.  As the saying often attributed to Einstein goes, “Computers are fast, accurate, and stupid.”  That more or less describes computers today: they calculate and search faster than people ever could, but they can’t think.  Robot stories proliferated in the early 20th century because of the popularity of pulp science fiction and the increasing prevalence of automation in factories.

Fast-forward to the Cold War, and science fiction had shifted from robots to atom bombs.  The reason should be obvious.  It was an era of giant bugs, Godzilla, and body snatchers.  We all thought the future, unless we nuked ourselves to extinction, would be in the stars.  Two important franchises, Star Trek and Doctor Who, began around this time.  Star Trek in particular focused on a future defined by warp drives, exploration, and phasers.  Even though Star Trek tried to stay more or less grounded in reality, it certainly got a few things wrong, especially in its early days: blinking lights defined the look of “the future.”  And judging by current technological trends, we will probably become cyborgs before we achieve warp.  We already have neural interfaces that allow disabled people to move a computer cursor through thought alone.  Linking our minds to computers seems more plausible to me than bending space itself for interstellar travel.

In recent decades, science fiction has shifted again, from nuclear power and warp drives toward current computing trends.  Sure, we had fun with Back to the Future Part II’s vision of flying cars in 2015, but science fiction today mostly depicts computers.  The Terminator franchise is about a rogue artificial intelligence that wipes out mankind.  As computers became more sophisticated, they began to take over the genre.  SHODAN and GLaDOS, from the System Shock and Portal video game series respectively, offer dark glimpses of machines with too much power.  As we move further into the 21st century, it seems that “cyberpunk” will be the most accurate description of the future.  For all their sophistication, computers have not brought us hoverboards, warp drives, or Robby the Robot.

Of course, I am not saying that all science fiction must reflect current technological trends; it just has to be fun and/or thoughtful.  Some of my favorite recent pieces in the genre have no bearing at all on current trends: they fit into a set of subgenres I’ll call the “punks,” with dieselpunk and steampunk among the most prominent.  This idea of a retro-future creates interesting scenarios.  Recent video games like Fallout and BioShock take the retro-future and turn it into fun and engaging works of art.  Very little of the new Doctor Who series relies on “hard” science fiction, yet it is almost always an exciting and thought-provoking show.  Heck, I even love The Avengers, “tesseract” plot MacGuffin and all.

No matter how much fun the other subgenres are, it is still interesting to map “hard” science fiction through the decades.  We started with predictions of robots, moved into the nuclear age, shifted over to lasers and warp drives, and finally “landed” in cyberpunk, where computers are increasingly tied to humanity.  Not everyone can be as accurate as John Elfreth Watkins Jr., but that hasn’t stopped many science fiction creators from trying.
