2018 Revival: Con Survivor

Finishing out a year in which a lot has happened, but there’s been almost nothing happening on this site — mostly because a lot has happened. But I’m using this place as digital storage by including some writing for unusual places from the last few months. First up, this is a piece written for the launch party of Oni’s The Long Con at Portland’s very own Books with Pictures, which ended up being read aloud by the wonderful Ben Coleman.

Based on the questions I’ve been asked over the years, there are a few preconceptions about being a journalist at Comic-Con that I feel the need to try to clear up. Firstly, no; it doesn’t mean that you automatically get into all the popular panels and hang out with movie stars and eat free food, although I did once accidentally leave Hall H in San Diego through the wrong door and ended up in the celebrity waiting room, which had a spread like you wouldn’t believe, and was filled with the cast of some big blockbuster I can’t even remember, all staring at me while clearly thinking “You don’t belong here.” I was quickly escorted out by security.

And, no, being press doesn’t mean that you automatically know where all the good parties are, and it definitely doesn’t mean that you get invites and can sneak everyone in. I mean, yes, there was that time I got into a party where the band was Josie and the Pussycats from Riverdale and they were actually performing live, and everyone lost their minds, but that happens, like, once or twice a convention, tops.

Most of all, despite what I’ve just said, it isn’t glamorous. It’s glamor-adjacent, and that’s fun and strange and great, sure, but it’s also weird and uncomfortable and occasionally just very… awkward. Here’s the best example of what I’m talking about. It’s about eight or nine years ago, and through some unlikely happenstance, I’m working for a well-known weekly news magazine that I won’t mention the name of. I mean, technically, I’m working for the website of a well-known weekly news magazine, but the distinction is meaningless to anyone I tell about the job. Honestly, it was pretty meaningless for me, too; I was firmly under the impression that I had arrived in the big leagues, and that everything was going to be great from then on.

This was before I arrived in San Diego to discover that I would be sharing a room with five strangers for the next four nights. And that the room had two single beds, and we could maybe get an extra cot if we were lucky. On the one hand, everyone seemed very nice and there were only a couple of people whose work I recognized and felt embarrassed to be sharing a bed with because, really, they deserved better. On the other, I can’t emphasize this enough: We were all working for a well-known weekly news magazine — like, one of the ones that’s actually a name — and they definitely could’ve afforded at least another room or two. This was just cheap.

It also made it difficult to do work. It isn’t unusual to end up working late into the night to meet deadlines at Con, and when you’re sharing a room with five people trying to sleep, it’s not so easy to stay up, typing away, without making people mad at you. All of which explains why I ended up sitting in the foyer of the hotel, trying to write a couple of stories at ten o’clock one night.

So, I’m sitting there with my laptop and headphones on, listening and listening and listening to this interview, trying to transcribe it and write whatever I was writing, and I kind of half-noticed that it was getting pretty busy. I didn’t really think that much about it, because it’s Comic-Con and everywhere is busy at Comic-Con, especially hotels. And it keeps getting busier, and busier, and at one point I look up and realize, wait, everyone looks really fancy. This is odd.

It took me about another hour or so, and by this point it’s maybe 2am and there’s really loud music and the foyer is just packed, to realize that there was actually a party going on all around me and I hadn’t realized. And it’s a big party; there’s a DJ, there’s people dancing and drinking and making out and all kinds of things happening all around me and I somehow just hadn’t noticed for hours. I didn’t know what to do, because I couldn’t go back to the room, everyone was asleep and I hadn’t finished work, so I just…stayed there. And pretended none of it was going on while I sat on a couch, with various things happening literally right beside me that were very distracting. Eyes fixed on the screen. Writing. Just writing.

And then, at one point, with no warning, the music just stopped. The crowd groaned en masse, but fell quiet when it became clear what was going on: Everyone shuffled aside to let an ambulance crew pull a stretcher towards the elevators, and then they disappeared. No-one said a word; everyone just stared at the elevators for minutes until the ambulance crew re-appeared, with someone strapped into the stretcher.

This sounds like a downer, I know, and you could tell at the time that the ambulance crew was clearly thinking the same thing. They didn’t look anyone in the eye as they moved towards the door of the hotel, and then they paused, before one of them said in this wonderfully embarrassed voice, “He’s going to be fine!” As if on cue, the music immediately started back up, and everyone got back to partying, like the whole thing had been planned.

That is what Comic-Con is like as a journalist. Being exhausted, under deadline, surrounded by people having more fun than you, probably, and unsure whether or not you just saw something actually tragic, or if it was some weird performance art piece in the middle of a party. And, you know, also getting to see Josie and the Pussycats perform live on a hotel rooftop standing next to the cast of Arrow as they lose their minds.

What can I say? It’s really large. It contains a lot of multitudes.

April 29

“A week or so,” I said I’d return in; that turned out to be optimistic at best, if not downright foolhardy. April proved to be an overwhelmingly busy month for a number of reasons (and is continuing to be, right up until its final day), and even on days when I had time to write here, I’m not sure I would have written more than simply “I’m so tired, I’m so, so tired” over and over again.

When I started writing here daily, I had visions of doing so every day for a year, some kind of grand plan that would also let me write for myself again, even if it were simply pointless meanderings of little worth. I started 2015 feeling as if I was risking becoming an automaton in terms of output; that the pressures of work meant that I had nothing left to give in terms of brainspace for anything else, and I needed something that was my own. (Wait What? is that to some degree, and I love it very much for that as well as for the chance to talk to Jeff on an almost weekly basis.) Hence, writing here.

And yet, the first three months kind of proved to me that I did have little left to give in terms of brainspace, for the most part; I was writing the random, stream-of-consciousness material that I’d hoped for, but it was emptier than I would have liked, and I think the hope that I’d… I don’t know, sharpen mental muscles as I went along or something, didn’t happen. When I was done, I was done; it was clear to see.

None of this should be construed as real complaints, as much as disappointment in myself and the result of a slow realization that I need to recognize my limits better (and, maybe, factor in some more downtime for myself. We’ll see if that latter one happens anytime soon, though). Will I be doing daily posts here again…? I’m unsure, to be honest; I’ll try to do them when I feel like I can do them, and they feel like something I have time and brainspace for, instead of a promise I made to myself that I have a responsibility to fulfill, if that makes sense. So, if anyone’s reading, hello again.

The Wrong Lessons

A final post, from now, from the “Things I Wrote for WIRED That Were Never Published” vault. This is from August last year.

As the Comic-Con announcements last month demonstrated, 2015 is shaping up to be a pretty big year for genre cinema, with Superman/Batman and [World of] WarCraft being added to a summer line-up for that year that already included Avengers: Age of Ultron, Star Wars Episode VII, Independence Day 2, the reborn Terminator, Jurassic Park IV, the next James Bond movie, Fox’s Fantastic Four reboot, Finding Dory, Pirates of the Caribbean 5 and the final Hunger Games movie. Pretty impressive, perhaps, but also more than a little derivative.

The swathe of anticipated big budget sequels doesn’t stop there for the year; also anticipated at some point in 2015 are follow-ups to Mission Impossible, Prometheus, Avatar, Snow White and The Huntsman and, somewhat unexpectedly, Pitch Perfect (On the plus side, we might finally have the “Cups” song out of our heads by that point). When it comes to “original” genre productions, we seem to be limited to Assassin’s Creed, Ratchet and Clank and Marvel’s Ant-Man, all of which are, of course, adaptations of properties in other media.

This might seem like a sad state of affairs — perhaps one that can only lead to cannibalization given the sheer number of big budget movies lined up to face off against each other — but you can’t blame movie executives alone for it. The genre movie slate of 2015 is very much a result of what happened this summer at the box office.

Let’s start with the reliance on familiar material. As the 2013 box office to date demonstrates, it just doesn’t pay to create something new in the genre space — or, at least, it doesn’t pay as well as recreating something old. Seven of the top 10 movies of the year in the U.S. have genre trappings, whether they feature superheroes, science fiction or monsters (Yes, monsters in university still count; I’m not counting Fast and Furious 6, even though that is essentially Car Avengers at this point in the series), and each of those movies is based upon an existing property, which isn’t entirely surprising. After all, the majority of genre movies released each year are remakes, sequels or adaptations of stories that have already created a fanbase elsewhere.

From a business point of view, this makes a certain amount of sense: Genre movies tend to be more expensive than non-genre movies — because of the special effects and visual trickery necessary to make the audience believe in something that, for the most part, couldn’t exist in the real world — so the prospect of investing that increased cost in a known quantity with a relatively established fanbase at least appears to be less of a financial risk than putting the same amount of money into something new, unknown and unproven.

The problem arises when the same issue is approached from an aesthetic direction. Simply put, familiarity breeds contempt, and there are only so many times we can see the same stories being told, or the same characters in action, before it gets boring. Entertainment has to be about novelty to some degree, which — by definition — requires something that we haven’t seen before. This is an area where non-genre movies — comedies, dramas and other features which tend towards realism in ways that require less money to conjure onscreen — have the edge on genre: It’s less of an investment, or business risk, to come up with something new, meaning that the ratio of “new” versus franchise outside of the genre space is far greater than it is for genre output.

As much as many would like that ratio to change — and for genre movies to become less dominated by all-too-literal attempts to recapture what has worked before — it can be difficult to argue against the business math responsible for the way things are, especially when those non-franchised movies that do get released end up falling short of the success enjoyed by the alternative at the box office. To wit: 2013 had a handful of “new,” non-franchise genre movies, each with some level of draw to mainstream movie audiences, and none made more than $100 million at the U.S. box office.

After Earth and Oblivion, both of which told similar “Earth is screwed, so we moved on but then we came back and it totally wasn’t what we expected” stories with different massive movie stars attached (Will Smith and Tom Cruise, in that order), stalled out at $60 million and $89 million, respectively. The much-hyped, fiercely-defended Pacific Rim is sitting around $86 million. The most successful of this year’s All-New Apocalypse fiction, This Is The End, is also the cheapest (It cost $32 million to make, compared with Pacific Rim‘s $190 million, After Earth‘s $130 million and Oblivion‘s $120 million); it’s managed to rake in $95 million to date.

If there’s a second lesson to learn from this summer for movie executives besides “stick to what you know,” it’s “when you choose to gamble, make your gambles as cheaply as possible.” Besides Pacific Rim, Disney’s wildly expensive Lone Ranger movie earned just $85 million, despite costing the studio $215 million to produce, never mind market. In comparison, “smaller” — and, tellingly, non-genre — movies like Now You See Me (which cost $75 million) and Identity Thief (which cost $35 million) not only recouped their investment but went on to move into pure profit.

So what, exactly, is going on here? Is there a cap on genre movies that don’t have a nostalgic or recognizable “in” to get audiences past the speed bump of traditionally niche-oriented material where suspension of disbelief in the unfamiliar is required? Are there only so many people who are interested in paying to watch robots, monsters and superheroes when they’re not accompanied by some level of childhood nostalgia?

It’s been said often enough that audiences need to vote with their dollars when it comes to demanding a certain kind of entertainment. That idea is complicated when what’s being offered up is so limited. If the audience wants to see all-new original material, but doesn’t want to see the kind of material that’s on offer in Pacific Rim, Oblivion or After Earth, should they “vote” for it or not? If the kinds of movies they want to see are only available in existing franchises, is paying for a ticket voting for that particular kind of movie in terms of tone and plot, or for “franchise movies” as some kind of invincible monolith?

Based on the box office results of the year so far, it’s no surprise that movie studios are focusing on the franchises as much as possible for summer 2015; they’re as much of a sure thing as is possible for the industry these days, even if the amazing, worrying pile up of Must-See Movies listed above suggests that some will inevitably fall by the wayside.

If, however, the lack of new, original genre movies is the result of the performance of this year’s batch of end of the world flicks, that’s unfortunate, and the result of a skewed test sample. Offer the audience some new ideas with the variety, optimism and invention — not to mention, please, some sense of frugality; no more $190 million budgets — missing from this year’s examples, Hollywood, and see whether or not the mainstream audience is ready to watch a genre movie that they don’t already know the story of. The alternative is simply surrendering to the law of diminishing returns.

Is Mad Men The Ghost of Television Past?

Another oldie, from May last year, written for WIRED.

On the slowly-unfolding AMC period drama Mad Men, character arcs and plots can take several episodes or even seasons to come into focus. As the show’s sixth season slowly unfolds, it’s tempting to suggest that the show is at risk of becoming as much a part of the past as the era it portrays. Is television even interested in this kind of programming anymore?

When Mad Men debuted in 2007 — setting a new ratings record for AMC in the process with 900,000 viewers — the landscape of television was different. The Sopranos had just finished on HBO, and The Wire was still on the air; Lost was still in the middle of its run, and the idea of television as the home of long-form, complex, quality drama was something still on the minds of many. Mad Men was simply more evidence of the future of the format.

Cut to 2013, and it’s a very different story. The first episode of the new season had 3.4 million viewers tuning in — down from last year’s season premiere high of 3.54 million — and successive episodes have dropped to around the mid 2 million mark, below the level for the same time last season. More importantly, the show’s importance to AMC has shifted, if not outright shrunk, in light of the phenomenal success of the channel’s The Walking Dead.

It’s not just that the most recent episode of that comic book adaptation brought in almost four times as many viewers as Mad Men‘s peak, with 12.42 million people watching (It was, after all, a season finale); consider, as well, that the accompanying episode of The Talking Dead — Chris Hardwick’s talk-show companion to the zombie drama — had a series high of 4.3 million viewers; almost a million more viewers than Mad Men for a show that is far cheaper, and far simpler, to produce. No wonder that the channel has announced plans for Talking Bad, a similar show to accompany the final season of Breaking Bad this August.

It’s not only AMC where attention and focus has shifted from quality drama to genre fare. Instead of The Sopranos or The Wire, HBO’s most-discussed series these days is George R.R. Martin’s Game of Thrones, and its most-watched show is trashy vampire soap True Blood. Attempts at more low-key fare like Luck and Treme meet a considerably more muted response and, as a result, have shorter lifespans. The audience clearly knows what it wants, and what it wants is apparently sexy genre fare over the chance to see middle-aged men struggle with the complexities of life as we know it.

A similar thing is happening in broadcast television; a cursory glance at the shows networks are developing for the 2013/2014 television season reveals that the hour-long drama format continues to be dominated by unchallenging procedurals, crime dramas or fantasy fare, for the most part. For all the excitement offered by Lost‘s ambitious scope or complicated narrative structure, the post-Lost television landscape has suggested that the show succeeded despite those elements, not because of them. Instead of demonstrating that broadcast dramas can challenge the viewer without scaring them off, the lesson Lost taught broadcast television was apparently that flashbacks can be a legitimate form of long-form exposition (See: Once Upon a Time, The Following).

The slowly shrinking Mad Men audience makes it somewhat confusing that AMC reportedly cut budgets for both Breaking Bad and The Walking Dead in order to pay for Mad Men‘s most recent seasons. Admittedly, for Breaking Bad, there is some level of logic from a purely business perspective — The show brings in fewer viewers, and therefore less advertising revenue than Mad Men — but the idea that AMC would undercut its most visible, valuable show for something that is watched by a fraction of its audience is counter-intuitive at best.

Or is it? While the show brings in fewer viewers and less advertising revenue per dollar spent than The Walking Dead (and certainly than the considerably cheaper The Talking Dead), Mad Men arguably brings AMC far more critical prestige than Robert Kirkman’s horror series.

The same is true of Breaking Bad; even though ratings for both shows may be a fraction of The Walking Dead‘s audience, having two of — if not the two — most highly-regarded television dramas today on its network gives AMC an overall reputation that makes the network brand more attractive to program-makers and advertisers alike. “AMC,” it suggests, “is where the forward thinkers, the early adopters, the smart buyers go for shows. Sure, less people might watch overall, but the ones that do watch are the tastemakers you want.” Add that prestige and attention to the middling ratings, and Mad Men earns its keep.

We’ve seen this before, with NBC Universal’s Syfy and Battlestar Galactica; Ron Moore and David Eick’s series was never the highest-rated show on the cable network, but unlike the more popular Stargate: Atlantis or Warehouse 13, it did snag the network a Peabody Award and prompt a discussion of human rights at the United Nations. To be blunt, you genuinely can’t buy publicity — or affirmation — like that. When a show starts connecting with people in such a way, you keep that show on the air as long as you can before it starts to really hurt financially.

The problem is that, eventually, it will start to hurt financially, and at that point you have to start to say goodbye. Television is a business, after all, and there comes a point where leaving money on the table in the name of critical plaudits starts to seem foolish; you can’t use acclaim to put food on the table, after all. Goodwill only goes so far, and with every single episode, more people are leaving Don Draper for other shows.

Battlestar Galactica lasted four seasons (Five, if you include the original mini-series that launched the reboot); Breaking Bad will last five. Mad Men, by the time it’s finished, will have lasted seven seasons. All things considered, that’s an impressively long run, especially considering the alternative programming AMC could have opted for at any point that would have brought more people watching. It may simply be that Mad Men the show has a parallel existence to Don Draper himself: Slowly becoming outdated without anyone realizing it at the time.

This doesn’t bode well for the future of television drama, though. If broadcast networks are going to play it safe in terms of selecting new shows, and the previously-reliable cable and premium cable channels have discovered that genre is far more successful than “straight” drama when it comes to return on investment and eyeballs-on-shows, that’s a problem for any new show that wants to play things slowly and subtly, and lavishly enough that its budgets may make executives nervous. Given the choice between something with the potential to become a breakout hit and something with the potential to break even but maybe garner critical acclaim, it’s more of a risk to go with the latter option, and with the television industry in seeming flux (Ad spending was down in 2011, back up in 2012, an election year), now might be the time to play it safe. So where will we see the next Mad Men?

The answer may be online. We don’t yet know — and may never know, considering just how closely guarded viewing numbers at Netflix tend to be — how many people have streamed House of Cards so far, but let’s do some creative math for a second: Mad Men averages between 2.5 and 3 million viewers an episode, as does Breaking Bad, so let’s say that that means there’s a three million-strong audience for those shows in the U.S. at least (Bear in mind, DVRs, DVD and streaming audiences alike aren’t factored into those numbers; Sunday’s Mad Men often tops Apple’s iTunes TV chart on Monday, so there’s a second audience right there that’s already digital to consider).

Admittedly, a new drama in that vein wouldn’t have the name recognition or the critical acclaim that would drive people to tune in, so the math may be somewhat skewed upwards. Perhaps not, once you factor in the audience that wanted the show in another format than live-viewing and the additional audience who might be interested in the show but stays away because it’s already five years in (Or, for that matter, the audience who might watch just for the novelty of something new).

Nonetheless, it’s safe to assume that the metrics for “success” for a streaming-first show are somewhat different than that for a traditional television show, if only due to the newness of the format and the smaller scale of the audience. Is it possible that a Mad Men-style 3 million people audience would be enough to be considered a smash hit for streaming? Could the future of prestige television drama be somewhere that isn’t technically television at all?

Who is the Doctor?

From the never-published final installment of the on-again, off-again recaps on WIRED’s Underwire for the last season of Doctor Who. Funny to revisit in light of subsequent episodes.

With “The Name of The Doctor,” this latest season of Doctor Who came to an end with something that was neither a bang nor a whimper — in large part because the final few moments of the episode turned it from a revelatory finale into a confusing, frustrating glimpse of things to come.

Ignoring for a second the final scene of the episode, “The Name of The Doctor” oddly crystalized a lot of the problems this seventh season has suffered through. Like so many episodes this run, Saturday’s final episode was good enough as opposed to particularly strong, and found itself relying on familiar characters, ideas and audience goodwill to distract from writing that was surprisingly messy given the series’ recent history, and filled with plot holes and unexplored ideas that could upset the story’s movement with just a minute’s exploration.

And what distractions the episode provided! We saw Clara with each of the previous Doctors in scenes that demonstrated a seeming lack of convincing green screen technology (The second and fifth Doctors, in particular, appeared in scenes with a Clara obviously shot elsewhere and elsewhen. By comparison, the scenes with the first and third Doctors seemed to give her a graininess that matched the original shots), as well as henchmen that were reminiscent of both the popular Silence from the show’s sixth season and Buffy The Vampire Slayer‘s Gentlemen, from way back when, and a third appearance this year from the increasingly popular Madame Vastra, Jenny and Strax, the alien detectives from the Victorian era. Underneath all of this, however, was a script that ultimately failed to convince.

The basic plot of “The Name of The Doctor” was, at heart, very straightforward. Our heroes were lured into a trap by a former enemy out for revenge, which they only survived due to self-sacrifice on both of their parts. It was the meat on those bones where things got somewhat convoluted: The Doctor and Clara found themselves on Trenzalore, the site of the Doctor’s grave at some unspecified time in the character’s future in order to save Vastra, Strax and Jenny from the Great Intelligence — the villain from a storyline from the series’ original run, as well as the most recent Christmas Special and the first episode from this most recent run. After death, all that remained of the Doctor in the tomb wasn’t a body, but his personal timestream, which was less an abstract concept than a quasi-physical lightshow that could be “entered” by first the Great Intelligence seeking to undo all of the Doctor’s good works, and then Clara — attempting to stop the Great Intelligence — and the Doctor himself.

That Clara was successful was hardly a surprise; the show could hardly let the Doctor die with episodes left on the clock (and anyway, we dealt with the faux threat of the Doctor dying last year). Instead, the interest in Clara’s attempt came from the fact that, by entering the Doctor’s timestream, she became scattered across his life as multiple people with no recollection of who she had been — the multiple Claras we’d encountered up to this point, and the “impossible girl” who had captured the Doctor’s attention in the first place, leading to his meeting the “main” Clara for the first time. Well, that and the other character Clara and the Doctor met inside the Doctor’s timestream, but we’ll get to him soon enough.

For every smart idea in the episode — The explanation for what made Clara the “impossible girl” after all, her remembering events that had been wiped from history because the Tardis was leaking time, the post-Doctor’s death slow revision of the universe’s history, and how that altered character relationships — there were moments that just seemed unfinished or needlessly rushed. The Doctor warned about crossing over with his own timeline and later collapses from having done so, but just two season finales ago, “The Big Bang” relied entirely on his doing just that without any ill-effects, for example; similarly, the surprisingly speedy and easy discovery of Clara within the Doctor’s timestream felt unearned, undercutting the drama of her having seemingly sacrificed herself doing so just minutes earlier.

But see, we’re already at the final sequence I mentioned earlier. Up until that point, “The Name of The Doctor,” for all its flaws, felt like an ending (albeit a disappointing one). Then, in the midst of the Doctor’s personal timestream, Clara and the Doctor met a shadowy figure with his back to the camera; he was someone the Doctor was seemingly afraid of — or afraid of Clara discovering, perhaps — describing the figure as, essentially, the incarnation he’d like to forget, the Doctor who doesn’t save the day.

That this new Doctor — A future incarnation that “our” Doctor knows about because he, too, has entered his timestream? A past one? — is played by John Hurt is important only for the BBC, who’ll doubtless like to boast of an actor of such popularity and credibility taking on the role (How else to explain the hilarious “Introducing JOHN HURT as THE DOCTOR” credit once he turned around?); for fans of the show’s larger mythology, what is more important is that this brings the number of incarnations of the Doctor to twelve, leaving the character with just one more regeneration to go before his death, according to rules set up in the original run of the show. In recent years, it’s been teased that the rule no longer applies, but never definitively stated within the series itself.

With just one scene at the end of the episode, “The Name of The Doctor” went from disappointing closure to a shameless tease for November’s 50th anniversary episode: What has this new Doctor done that is so terrible (Being responsible for the death of every other Time Lord, an established part of the character’s backstory since the show’s 2005 revival, would be the most obvious guess)? Does the thirteen-incarnation rule still exist, and if so, is the Doctor close to his final life or is there another incarnation that we don’t know about? And, more subtly, but arguably more importantly, will the Doctor be able to reconcile his actions in that incarnation with his self-image, and stop repressing an entire period of his life?

The scale of the final scene of the episode ultimately overwhelmed what had come before; it left the audience feeling energized and excited, but it was a cheap thrill in many ways. Despite the title of the episode, the name of the Doctor wasn’t revealed on Saturday, and the sleight of hand that managed to make that disappointment (or relief, perhaps) disappear from fans’ minds was a sign that — perhaps, if we’re lucky — the Who that lies ahead will be as bold and fun as the one they fell in love with. It may have been a sign of better things ahead, but that doesn’t change the fact that what came before was underwhelming at best, and a sign that, when it comes to this series, familiarity may be breeding contempt after all. In more ways than originally intended, perhaps, a lot depends on the 50th anniversary episode coming up in November.

Truth, Justice and the American Psyche

This one, written for WIRED, goes waaaay back — it was written to accompany the release of Man of Steel last summer. I seem to remember something similar eventually saw the light of day in connection with Superman’s 75th anniversary later in the year.

This week’s Man of Steel gives us the latest in a long line of “new” Supermen, another attempt to retool the long-lived character for contemporary audiences. In this case, it’s a Superman who is neither entirely comfortable with himself nor the world he lives in, and is arguably more concerned with a need for secrecy than he is in doing the right thing when the opportunity presents itself. In other words, a Superman that’s worryingly in step with contemporary America.

It’s just coincidence, of course, that our newly paranoid cinematic Kal-El debuts at a time when we as a nation are reading news reports that the government is tapping into people’s Internet use and phone calls, but hardly a surprising one given the contemporary themes that the moviemakers were clearly attempting to touch upon.

Much of the movie revolves around Kal-El’s paranoia about being revealed as something other than a regular guy — a literal alien, in fact. It’s a nod, perhaps, to the increasing xenophobia present in a modern-day America that is, in part, obsessed with where its leaders were actually born; and in a strange way, that paranoia may make him more relatable to American audiences increasingly convinced that they are under constant surveillance.

The source of Kal-El’s paranoia comes from Jonathan “Pa” Kent who, in a break from tradition — and to the dismay of many hardcore Superman fans — essentially tells his adopted son to never use his powers in public for fear of being discovered. On the face of it, this sounds like sacrilege; Jonathan Kent is historically the one who teaches Superman about the need to do the right thing no matter what, as many have pointed out. And yet, in today’s world, it makes a depressing amount of sense.

Consider the world into which Superman would be born today. Unlike in days of yore, there is no realistic way that Superman — or a younger Clark, for that matter — could operate publicly without being discovered by the world at large. Think about the number of camera-phone videos that would appear on YouTube, or the ease with which satellite imagery would catch a super-speed blur lifting a bus out of a river, for example.

Pa’s advice to his son may have been overly cautious — really, kids should die just to keep your son’s secret? — but it’s hardly surprising or unrealistic in a country that is warned of increasing surveillance by the authorities (and other sources).

Add to that the fact that the movie then goes on to prove Pa correct to some extent, when Lois uses her Mad Google Skillz to find Clark Kent with an ease that seems unusual in movies, and yet feels oddly realistic for the world we actually live in today. This is, perhaps, the first true Superman for the digital age.

Man of Steel, then, demonstrates one of the benefits of Superman as a character; he is just filled with subtext and potential metaphor, and is so versatile that he can be — and has been — reinvented for each new generation as an icon oddly in touch with the zeitgeist despite being invented three quarters of a century earlier.

This flexibility is, most likely, a lucky side-effect of his longevity. When Superman was created by Jerry Siegel and Joe Shuster in 1938, they may have given some thought as to what their contemporary audience wanted — someone to defend them from an uncaring authority and “stick up for the little guy” — but both men’s eyes seemed to be on the prize of something that would be successful in the moment, as opposed to for all time.

(It’s no mistake that Superman is an immigrant, either; in that, he not only personifies the American ideal at a time when the country, emerging from the Great Depression, needed to believe in that ideal more than ever, but ties in with much of the literature of the time, with books like James T. Farrell’s Studs Lonigan series displaying the downside of the world that Superman — and Siegel and Shuster — strove to escape.)

Instead, it was the various ways in which various writers and artists added to Superman’s mythos, supporting cast and surrounding environment, trying to find more reasons for readers to come back month after month, that gave us an icon so filled with potential that he could easily say something about any period — and answer any need — that his adopted home country demanded of him.

Because of attempts to appeal to patriotic fervor in the 1940s, Superman went from being a believer in social justice to being a proud upholder of the status quo, with National Periodicals — the company that would later become DC Comics — going so far as to create special editions of Superman comics specifically for the U.S. Army in order to entertain (and, at times, educate) the troops, even as the regular editions promoted war bonds and offered propaganda back home.

As post-war America focused on rebuilding the family unit — The term “nuclear family” dates back to 1947, according to the Merriam-Webster Dictionary — Superman did his part, gaining a cousin, a pet and an increasing number of women who dreamt of being his wife, as the writers, artists and editors sought out new ways to keep their character relevant for the contemporary audience.

Less obviously, Superman stories of the 1950s took a different route to reflecting what was happening in the U.S., as a twist on the already-tired trope of deadly kryptonite offered a chance for Superman to pave the way for the burgeoning counter-culture to go mainstream. “Red K[ryptonite] was LSD for superheroes,” Grant Morrison wrote in Supergods, his 2011 book about the history of the superhero genre, pointing out the similarity between the fictional radioactive meteor and the surreal, body-paranoiac work of writers like William S. Burroughs emerging at the same time. While most of America hadn’t read The Naked Lunch or The Soft Machine, Superman was unwittingly preparing them for the altered states of being and ideas of fluid identity that would finally break through to the mainstream a decade later thanks to the hippies and psychedelia.

By the 1970s, Superman — who, by this point, could seemingly manifest new powers as needed, and had quietly slipped into the role of benevolent face of authority — seemed out of touch with the world around him. Not only was he seemingly safe from any and all dangers that were thrown at him, but even Clark Kent, still a mild-mannered reporter for the Daily Planet, felt too distant from the common man in an era where television news was replacing print for most people.

The solution to this came in the form of a number of small course corrections for the character. The most dramatic of these — a drastic scaling back of his super-powers that coincided with a seeming end to his vulnerability to Kryptonite — proved short-lived (Kids, it turned out, preferred their hero to have powers seemingly limited only by imagination), but both Clark Kent’s career shift to television news anchorman and a renewed focus on stories exploring more relatable, human-scale events — including a series called “The Private Life of Clark Kent” — were more successful, remaking the Man of Steel as someone more likely to, in the words of a then-contemporary commercial, “reach out and touch someone” instead of punch something to solve his problems. You would believe Alan Alda could fly, perhaps.

And then came the 1980s.

Of all the makeovers that Superman has undergone throughout the years, writer/artist John Byrne’s 1986 comic The Man of Steel is arguably the most dramatic. Amidst the rise of Reaganomics, yuppies and “Greed is Good” as a lifestyle choice, Byrne removed many of the more tragic elements of Superman’s origin — Ma and Pa Kent’s deaths were undone, and Clark went from nebbish, clumsy loser to successful novelist, Pulitzer Prize-winning journalist and high-school football hero — transforming the character’s dual-identity dynamic so that he was now a winner in both guises, as the audience demanded of its heroes at the time.

(The 1990s Lois and Clark TV show drew a lot from Byrne’s take on the Superman mythos, which makes a lot of sense; in a lot of ways, Byrne’s re-imagining of the character drew from the glossy, success-worshipping soap operas of the 1980s as much as previous versions of Superman, easing a translation into that format.)

There have even been times, as with Man of Steel’s combination of the U.S. military and the fear of loss of privacy, when random coincidence put Superman in step with the American hive mind. The 2001 storyline Our Worlds At War told of an intergalactic battle that left Earth in mourning, and Superman taking it harder than most — He even adopted a black armband in memory of those who had died. The armband debuted in Superman Vol. 2 #174, which was released less than a week before the attacks on the World Trade Center and Pentagon on September 11. Even more unfortunately, the day after the 9/11 attacks, The Adventures of Superman #596 was released, a comic which opened with the sight of twin towers partially destroyed as a result of the intergalactic war. Needless to say, Superman’s mourning period had more resonance in the real world than originally intended.

Recently, Superman’s status as avatar of the American psyche has become a murkier proposition, as creators became overly aware of the character’s potential as a stand-in for the country’s soul and attempted to use it to “say something” about the country as a whole. We’ve seen Superman renounce his U.S. citizenship in response to his perceived connection to American foreign policy, and undertake a year-long walk across the country to get back in touch with the American people for reasons that, to be blunt, still make little sense three years later. Instead of reflecting what America was feeling or thinking, the character was being used to lecture the country.

Man of Steel, then, may not be entirely in line with what many people expect from a Superman story — or Superman himself, for that matter — but it does manage to return the character to his rightful place at the heart of the American psyche, even if what we find isn’t what we’d hope. We may not get the Superman we want, but as history has proven countless times, we almost always get the Superman we deserve.

Marvel’s Agents of the Status Quo

Written for WIRED, and I honestly can’t remember why this didn’t run. It’s from October last year, and events in the show have outdated this since to some extent.

For those watching ABC’s Marvel’s Agents of S.H.I.E.L.D., last week’s fifth episode of the series, “Girl in the Flower Dress,” was the one in which the show’s true enemy was revealed — and it turned out to be anyone who might suspect that secret government organizations are up to no good, or believe that information should be free. And you thought I was talking about the titular villain of the week.

To be fair, the show’s concept — its very title — suggests that this wouldn’t be a series for those who had problems with authority figures. This is a series for those who believe in the Men in Black Suits whom we’re more used to seeing as untrustworthy or, at best, a necessary evil. In many ways, it’s a 180-degree spin on the traditional media dynamic of solitary heroes standing up against corrupt authority figures; in Agents of S.H.I.E.L.D., we’re literally told, the authority figures are doing what they’re doing for the good of everyone and we should just back off and quit our whining.

When I write “literally told,” I mean it; in last week’s episode, reformed hacker Skye had the following conversation with her (arrested and illegally detained) ex-boyfriend Miles:

Miles: So I guess “due process” isn’t really S.H.I.E.L.D. protocol.
Skye: They don’t have time for it.
Miles: Are you defending them? These people are denying us our basic rights.
Skye: This isn’t about us. They’re trying to save someone’s life.
Miles: Listen to yourself. That’s what they always say to justify invading someone’s privacy, Skye. These people stand for everything we despise: Secrets, censorship —
Skye (interrupting): Enough with the manifesto, Miles!

Yeah, you tell him, Skye! Who cares about due process or privacy when someone’s life is in danger? That’s just some kind of manifesto and not, like, real life! In case the viewer wasn’t convinced enough that Miles doesn’t “get it,” the very next scene has S.H.I.E.L.D. agent Grant Ward tell us that Miles is “hiding behind platitudes,” and before too long, we find out that Miles has not only sold classified information — “I believe in all those things [about ‘information should be free’],” he says, “I just don’t know why they have to go hand in hand with barely scraping by” — but did so to a bad guy who “seemed harmless,” because — of course — he’s not only greedy, he’s also not as smart as our heroes at recognizing what the real dangers in the world are.

“Girl in the Flower Dress” was the most blatant attempt so far in the Marvel Cinematic Universe to act as propaganda for the military-industrial complex. Considering that the heroes to date are an arms manufacturer, a soldier, a god and the assorted heroes who hang out with them because they’re awesome — oh, and three agents working for the same secret government agency as the TV show, of course — that’s really saying something (By the end of last week’s episode, of course, Miles had come around to S.H.I.E.L.D., saying that they did seem pretty cool after all, except he’s not as hot as Skye, so he didn’t get to join the team like she did following her very similar about-face).

At first, I felt some disappointment at Agents of S.H.I.E.L.D.’s boosterism for, you know, scary stuff that’s happening all around us, because, in some nostalgic way, I still considered Joss Whedon — whose name remains linked to the show, despite a lack of direct involvement past the pilot — someone who stands up for the little guy. Consider his previous shows: Buffy the Vampire Slayer, Angel, Firefly and even Dollhouse were all thematically about the power of the individual standing up to whatever that show’s version of The Man happened to be. Ultimately, though, that’s an unfair comparison, because this isn’t really a Whedon show — it’s very much a Marvel show, and a Marvel idea.

In some way, though, that just makes it worse. There was a time, in the earliest days of Marvel Comics, when the appeal of the publisher was that it was filled with underdogs who were misunderstood and often, in the words of the X-Men’s tagline, “feared and hated” by the authorities despite trying to do the right thing. On some level, central to Marvel’s appeal in the beginning was the idea of an outsider standing up for what’s right, even if — especially if, perhaps? — it went against the status quo (Even Captain America, the straightest man in Marvel’s library, found himself at odds with the comic book S.H.I.E.L.D. on a regular basis).

Marvel’s Agents of S.H.I.E.L.D.’s eager embrace of authority — and attempts to denigrate those who question it, whether by painting them as evil, greedy or just plain dumb — isn’t just offensive to those who might find themselves thinking that maybe NSA spying is something to be concerned about; it also feels out of step with the Marvel legacy. Maybe there’s a swerve coming at some point in the future, when Agent Coulson et al. will realize that there’s a downside to their mission — certainly, the tone of the trailer for Captain America: The Winter Soldier suggests that there is some re-evaluation of S.H.I.E.L.D.’s tactics in the future — but until then, it’d be nice if the show could be a little more subtle in trying to convince viewers that anything goes as long as the men in the suits tell us it’s okay. As Skye would say: enough with the manifesto, Marvel.

Enough Nostalgia Already, Star Wars

Another Star Wars piece written for Wired, and another one that didn’t run for reasons best left undiscussed. It was actually given back to me to offer elsewhere; I would have run it at the Hollywood Reporter, but I felt that it was too similar to an Indiana Jones piece I’d written for them a month or so earlier (especially with the Indy mentions).

As the adage goes, those who forget history are doomed to repeat it. The same, it seems, is true for those who forget Indiana Jones and the Kingdom of the Crystal Skull, with reports claiming that J.J. Abrams’ first Star Wars movie will focus on the cast of the original trilogy in order to give fans “one more chance to enjoy them.”

There is only one sensible response to this idea: Please, no.

On paper, it’s something that makes a strange kind of sense: Using Han Solo, Luke Skywalker and Leia Organa in Star Wars: Episode VII gives the new movie some legitimacy, while also shamelessly zeroing in on whatever affection existing fans of the original movies have left in their hearts after the prequel trilogy.

There’s an argument to be made in favor on a purely story level, as well, with the familiar faces acting as an “in” for the audience to whatever the new status quo of the Star Wars universe is. Simply by showing us how they react to the new world, we as an audience will know whether we’re in favor or not, because we already identify so much with them. On that level alone, it’s a shorthand that has to be very tempting to producers — but that still doesn’t make it an idea that’s good enough to make it all the way through to the final movie.

One reason to ignore the appeal of the idea is to look at the bigger picture of what such a move would do to Star Wars as a franchise. Episode VII is already the most high-profile Star Wars project since 1999’s Episode I: The Phantom Menace, and for many people will act as a reintroduction — or, perhaps, an introduction for the first time, depending on age — to Star Wars as a contemporary movie series.

Centering that movie on characters from a series of movies that ended more than three decades earlier seems contrarian to the point of insanity, in that case: a statement that the franchise isn’t forward-looking or brand new at all, but an exercise in nostalgia that’s targeted at pre-existing fans who’ve seen all of the movies to date. Despite the title, Episode VII should be treated like a new beginning, not “the next installment of something you really should’ve jumped onto earlier” (For those thinking that J.J. Abrams is too good a filmmaker to make this mistake, I present Star Trek Into Darkness as the perfect example of a movie that was tripped up by nostalgia at entirely the wrong time).

Worse still, there’s the fact that fans don’t want to accept: as a contemporary action movie — which Star Wars really has to be in order to jumpstart the franchise the way that Disney inevitably wants — Mark Hamill, Carrie Fisher and Harrison Ford, at 62, 57 and 71 years of age respectively, are too old to take the lead roles. As much as we may wish otherwise, for fear of our own age and growing mortality, there’s a limit to what audiences are likely to accept from their action heroes in terms of age, and the lead trinity from the original movies are at least a decade beyond that limit these days.

The mention of Indiana Jones and the Kingdom of the Crystal Skull above was intentional; remember the last time Ford returned to a fan-favorite franchise and moviemakers tried to adapt for his age by giving much of the stunt-heavy action to his newly introduced son? Instead of making us excited for Shia LaBeouf’s Mutt, it simply made Indy seem old and somewhat lesser, in some strange, indefinable manner. Imagine that happening again, but for all three leads of the original trilogy, and ask yourself: why would anyone want to do that?

When the possibility of Episode VII was first rumored, there was much talk about having Ford, Fisher and Hamill appear in cameos in the film, passing the torch to whatever characters the new series would eventually center on. If, as reported, that idea has been shelved in favor of more focus on the original characters, I hope it’s a decision that gets reversed sooner rather than later. It’s not that Luke, Leia and Han should be missing from the new movie — there really is a lot of benefit to their making an appearance, albeit a brief one — but they shouldn’t dominate it. For Star Wars to survive, it needs to be about a new hope, and not stories that happened a long, long time ago.

Work Not In Progress: Fantastic Four

An unfinished first draft of this piece for the Hollywood Reporter, abandoned midway through because it wasn’t working. I started over from another angle, and it just felt better, somehow.

Now that it looks like we may finally have Fox’s new Fantastic Four cast, perhaps we can turn our attention to more important topics — like the question of just what is going to happen in the movie, due for release next summer.

As is by now traditional with super hero movie franchises, it’s almost inevitable that the first movie in the series will be the origin story of the team. Director Josh Trank has already denied reports that the movie will follow Marvel’s own revisionist Ultimate Fantastic Four in terms of origin story — but, of course, he also denied that Miles Teller was up for a role in the movie, and we’ve seen how that turned out.

Given the vintage of the original Fantastic Four origin story — in which the characters get their powers as the result of an attempt to beat the Russians into space, with Sue Storm pleading “Ben, we’ve got to take that chance… unless we want the Commies to beat us to it!” at one point — it’s all but inevitable that the movie will deviate from the classic version of events in some way, whether it goes the Ultimate route (a teleportation accident), follows the earlier Fox movie’s “accident aboard a space station” take on events, or finds its own way to expose the characters to those all-important cosmic rays.

The earlier film also played with the origin by introducing Doctor Doom into events far earlier than he appeared in the comic book. That was a good move for a number of reasons, not least of which being that the original origin lacks any villain whatsoever, with the bad guy for the issue appearing later and having the ridiculous motivation of wanting to destroy New York because he was so ugly no-one would date him (Sadly, I’m not joking). Substituting Doom for the poor Mole Man felt as much like an act of mercy as it did a smart storytelling choice.

Using Doom makes sense, in that he’s undoubtedly the most famous — and most interesting — villain in the Fantastic Four mythos. Luckily for Fox, however, he’s far from the only interesting threat in there. The first hundred or so issues of Fantastic Four, by Stan Lee and Jack Kirby, are some of the most restlessly inventive super hero comics ever made, creating all manner of characters that could easily be spun off into their own movies should Fox desire it.

Whereas Sony’s plan to build out the Spider-Man movie series into a multi-franchise property relies on that character’s villains, a Fantastic Four family could include the Silver Surfer, Galactus, the Inhumans, entire alien races like the Kree and the Skrulls, and arguably more characters whose rights situation may be slightly more complicated (The Black Panther, for example, appears to be under the control of Marvel Studios, although he first appeared in Fantastic Four. Similarly, quite who owns the movie rights to Adam Warlock, who first appeared in Fantastic Four under the name “HIM!”, is unclear).

Internet Cynicism

Written for Wired, and left unpublished for reasons I can’t explain — there was excitement behind the scenes for it, but it never ran. Who knows?

We’re still some distance away from Star Wars: Episode VII — two years, in fact — with production on the movie still months away from getting started over in London. You might assume that would mean there’s little to say about the movie at this point, but you’d be wrong; for months now, we’ve seen wave after wave of “exclusive” reports announcing the involvement of one actor or another, of some plot development that will almost certainly be happening in the movie, and so on. Let’s be honest: It’s gotten more than a little exhausting.

With that exhaustion — and your limited schedule, dear reader — in mind, we’ve decided to offer the all-purpose Star Wars: Episode VII casting rumor report. Simply delete and fill in the blanks as applicable. You can thank us later.