2018 Revival: OMAC Essay

This only went up on the Shelfdust site a week or so back, despite my having written it in… October? November? I can’t even remember at this point. Internet deadlines can be weird. Here’s an essay I wrote to accompany the Shelfdust Top 100 Comic Issues list. It’s kind of a mess — I was in a very strange frame of mind as I wrote it — but I like it anyway.

The very first page of 1974’s OMAC #1 tells the reader exactly what to expect; the opening narrative caption explains the set-up for the entire series as Jack Kirby starts the book in medias res: “OMAC One Man Army Corps is the story of a young man in The World That’s Coming!!” it begins. “In that strange place, the common objects of today… may become the terrors that we never bargained for… like the one below!”

Kirby gets a lot of shit for his writing tics, all the weird emphasis and “random” “quote” “marks” where it doesn’t really seem to make sense from today’s point of view, not to mention the irrepressible momentum of it all; it’s a million miles away from the stylized, self-conscious thing that passes for naturalism in today’s mainstream comics, and for that reason alone it’s often criticized or targeted as a guilty pleasure. But it’s genuinely amazing stuff, as immediate as the best pop music and featuring turns of phrase or ideas that are wonderfully memorable and memetic decades before anyone knew what that word meant. OMAC is filled with so many examples of this kind of thing, from “The World That’s Coming!!” to Lila the Build-A-Friend, who pleads “Put me together… I will be your friend…” prompting OMAC to respond, “Where does humanity stop and technology begin? We no longer know, Lila…”

The techno-suspicion of the first issue is wonderful, and wonderfully prescient; Buddy Blank’s discovery that the one person in the world who was kind to him was just an artificial intelligence — although, again, this was decades before that term would enter popular usage — feels like a prediction of the relationships formed through social media and the ways in which they can turn out to be not as real as some hoped for, or believed. But Buddy, the nebbish alter ego of the One Man Army Corps who essentially disappears from the series midway through this first issue, is what makes it feel like Kirby knew what The World That’s Coming!! would be like more than most.

There’s a scene in the issue where Buddy is wandering aimlessly through the halls of “Pseudo-People, Inc.,” the dehumanizing corporation he works for, having been bullied. What initially seems like a Marvel-esque origin story — is he the loser that no-one understands? — gets turned on its head by a subtlety and ambivalence that Stan Lee would’ve jumped away from in fear. “Maybe Fox is right,” Buddy thinks to himself. “I’m angry enough to flip out!” A page later, he says to himself, “I’m not angry at anybody… I just feel depressed, that’s all…”

OMAC #1 has all the hallmarks of a Kirby comic that people would expect from reading his Marvel work, and arguably even the majority of his Fourth World material — it’s visually bombastic, it’s fast-paced and dynamic and filled with astounding concepts that are at once ridiculous and utterly perfect. But at the heart of it is a character who feels honest and true and recognizable to so many people today: A character who is somehow more real than the milquetoast nerd stereotype of a million other comics by that point, who feels alienated and abandoned by a world around him that’s hypnotized by the toys and the technology at its fingertips, and who — most importantly, perhaps — doesn’t get a last-minute vengeance or score-evening moment of redemption.

Instead, Buddy is swallowed up by that same technology, against his will. He isn’t changed into OMAC by choice, or even by accident; he’s chosen by an authority he isn’t even aware of, and once “Omactivated,” is essentially a different person altogether: He’s more violent, more confident; a version of the cliched alpha male. Buddy is murdered by the state so that OMAC can live, if you like.

OMAC as a series is great; it’s got everything you could want from 1970s Jack Kirby, who is undoubtedly my favorite Jack Kirby. But OMAC #1, taken on its own, is something far greater than what followed; it’s a sneaky, but perfect, horror story about the world that we live in today, and the ways in which the everyman — “Buddy Blank” is a poetically perfect name for someone who could be all of us — is powerless to resist against its lure of techno-distraction and authoritarian control. 44 years after it was published, it just continues to feel more and more timely with each new reading.

And The Morning Seems So Grey

Something that no-one seemed to consider about this whole “living through history” thing is how utterly exhausting it is. We have, for the past two years or so, been in a political moment as dramatic and important as any since Watergate (at least; I’m sure there are those who feel what’s happening right now is more, somehow), and as thrilling as that may be — admittedly, alternate terms may include “horrifying” and “anxiety-inducing”; pick whichever feels applicable — it also feels as if it’s an endurance challenge intended to destroy us.

The lives and livelihoods of friends and strangers have been constantly under threat during all this time, the moral thread of this country feels at times almost permanently lost, and reality often seems to be folding in upon itself as things which feel like paranoid conspiracy cliches turn out again and again to be true. (As I write, there are yet more stories suggesting with worrying legitimacy that the President’s loyalties lie with Russia, not the US, something that feels as if it really shouldn’t be true, as if that were too unoriginal and hacky.)

The upshot of this is a fraying of the nerves, and a growing weariness towards… Well, everything. There was a lot of don’t normalize this at the start of the Trump presidency from well-intentioned scolds, but how could we not? The alternative was to constantly live in this heightened sense of alarm and disoriented shock, which is an easy way to lose perspective on everything. And yet… isn’t that kind of what happened, anyway? I know that my good humor feels strained past breaking point, at times, now, and 2018 as a whole was a year that broke me — and legitimately broke parts of my life-as-was for good.

I was talking about this to a friend, recently. (Hi, Jeff.) He said that things feel different now, somehow, better in some inexplicable way that felt dangerous to try and identify for fear of simply tempting fate. I feel that, too, and the mixture of excitement, optimism and, to be honest, this beaten-down fear that, no, things don’t get better anymore, they just get weirder and worse is difficult to describe, beyond simply saying that it’s tiring. There was a time, once, when I didn’t feel so tired all the time, and I want to get back there soon.

Or perhaps that’s just age, for all I know.

2018 Revival: THR Newsletter Logos

Something unusual — I do the header logos for the THR Heat Vision newsletter every week. I fell into it by accident, because I started by tweaking the logos someone else had done, and then somehow I was just doing them myself every week. It’s a surprisingly fun part of my week, even if I know my logos are far below the standards of people who, you know, do this for a living. Here are some of my favorites from the first few weeks.

2018 Revival: My Personal Top 10 Comics Issues List

This one wasn’t written for publication or performance; it was the notes I made to accompany my Top 10 submission to Shelfdust’s Top 100 Comics List. (To clarify: It was specifically top 10 comics single issues, not storylines/collections/graphic novels, and it was by any definition I wanted — I went for something between what they meant for me personally and how good I thought they were.) I didn’t know at the time of writing whether it would be published, because I didn’t know whether we were supposed to write notes to share or not, but that just made sure that I wrote more, which is always good.

#10: The New Guardians #1 (1988, DC Comics)

— I loved Millennium, the crossover this came from, so much that I subscribed to this (for an exceptionally large amount of money; I was in the UK, after all) before it launched. The series was a disaster, with Steve Englehart leaving midway through the second issue, but even today, there’s something special about the launch issue: A vision of socially inclusive and diverse comics that I was looking for but hadn’t found yet.

#9: The Invisibles #12 (1995, DC/Vertigo)

— The Invisibles was a (the?) seminal series for me, and this is arguably the most important issue in it; the one that introduces the true hero of the whole thing, and also explains how bad guys become bad guys. It’s very much in the whole pulp tradition, but also something that asks and expects a little kindness from those reading.

#8: Uncanny X-Men #185 (1984, Marvel Comics)

— The comic where I decided that I was going to collect comics. What was it about this? Claremont arguably in his prime, Romita Jr. and Dan Green at their 1980s best, but also the sense of it being this expansive fictional universe that went far beyond the superhero comics I’d read as a kid. This felt “other,” and it was amazingly exciting.

#7: Or Else #2 (2004, Drawn & Quarterly)

— Kevin Huizenga has the honesty of an Eddie Campbell, but the formal curiosity of a Chris Ware and the heart of a Jaime Hernandez. This was the first thing I read from him, back when it was a mini comic called Supermonster #14. The reprint (that was, I think, also redrawn and/or expanded?) just cemented how wonderful he, it, and comics in general, are.

#6: Deadline #5 (1989, Deadline)

— The first issue of Deadline I bought, and the place where I discovered comics that weren’t superheroes or 2000AD. My first taste of Philip Bond, Jamie Hewlett, Nick Abadzis and Shaky Kane. This was unspeakably important to me at the time; it really felt like the world was opening up and comics were a place to explore all these things in a language I’d understand.

#5: Mister Miracle #10 (2018, DC Comics)

— No comic has ever felt like a more perfect expression of a relationship than this one, to me.

#4: Flex Mentallo #4 (1996, DC/Vertigo)

— “Being clever’s a fine thing, but sometimes a boy needs to get out of the house and meet some girls.”

#3: OMAC #1 (1974, DC Comics)

— One of the most perfect first issues ever made in comics, and also one of the most prescient pieces of 20th Century science fiction. Oddly, also released in the same month I was born, apparently.

#2: Dork #7 (1999, Slave Labor Graphics)

— Evan Dorkin writing about his nervous breakdown was (and, in many ways, still is) a shock considering this had previously been his humor anthology, but he does it with such honesty, anger and wit that it’s undoubtedly one of the best comics I’ve ever read.

#1: Graffiti Kitchen #1 (1993, Tundra)

— Simply one of the best one-shot issues ever, one of the best autobiographical comics ever — sure, he’s pretending to be Alec McGarry, but still — and one of the most honest pieces of writing about how complicated and dumb and hopeful we get when it comes to relationships.

2018 Revival: Who Is The Best Supervillain?

Another thing written for an unexpected outlet this year, and an unexpected revival — this was for io9, which asked me for a brief submission about the best supervillain. It was my first piece there for… eight years or so…? I also went to a get-together of io9 writers past and present at NYCC this year, so perhaps I’m over my weird grudge finally.

There’s a tradition in superhero comics for truly powerful beings to be beyond human morality — so, you get characters like Marvel’s Galactus, who eats planets but is somehow not evil because, hey, who are we to judge? Similarly, Marvel also has characters like the Beyonder or Michael Korvac, both of whom are omnipotent and definitely antagonists, but could they really be considered supervillains…? There’s an argument to be made against it, seeing as neither is really trying to do much more than survive and learn, even if that process threatens the free will of everyone around them. Surely intent figures into deciding whether or not someone is a villain, super or otherwise…?

I really want to say it’s Darkseid, because Darkseid is obviously the best supervillain. He wants to eradicate free will, and he’s got no problem doing whatever it takes to achieve that aim, even though he’s bound by his own weird sense of honor. He’s complex, contradictory and fascinating, and he’s also been able to kill Batman and beat up Superman and screw with the entire Justice League, so he’s clearly pretty powerful. But, really, he’s not the most powerful supervillain. We’ve seen far stronger. (Nekron, for example; he could bring all the dead guys back to life as evil zombies!)

Instead, I’ll nominate the Anti-Monitor, the awkwardly-named villain of 1985’s Crisis on Infinite Earths. While his motivation and, really, personality, were somewhat unclear in that series, it couldn’t be denied that he was powerful: He was literally destroying entire universes to further his agenda of destroying all positive matter — he’s the Anti-Monitor, after all. Even without ultimately succeeding, he killed countless versions of DC’s biggest name characters and, thanks to the cosmic laws of DC mythology, his being from the Anti-Matter universe automatically means that he’s evil. Most powerful supervillain? Almost certainly. That costume alone should earn him a place on the list, let’s be real.

2018 Revival: Con Survivor

Finishing out a year in which a lot has happened, but there’s been almost nothing happening on this site — mostly because a lot has happened. But I’m using this place as digital storage by including some writing for unusual places from the last few months. First up, this is a piece written for the launch party of Oni’s The Long Con at Portland’s very own Books with Pictures, which ended up being read aloud by the wonderful Ben Coleman.

Based on the questions I’ve been asked over the years, there are a few preconceptions about being a journalist at Comic-Con that I feel the need to try to clear up. Firstly, no; it doesn’t mean that you automatically get into all the popular panels and hang out with movie stars and eat free food, although I did once accidentally leave Hall H in San Diego through the wrong door and ended up in the celebrity waiting room, which had a spread like you wouldn’t believe, and was filled with the cast of some big blockbuster I can’t even remember, all staring at me while clearly thinking “You don’t belong here.” I was quickly escorted out by security.

And, no, being press doesn’t mean that you automatically know where all the good parties are, and it definitely doesn’t mean that you get invites and can sneak everyone in. I mean, yes, there was that time I got into a party where the band was Josie and the Pussycats from Riverdale and they were actually performing live, and everyone lost their minds, but that happens, like, once or twice a convention, tops.

Most of all, despite what I’ve just said, it isn’t glamorous. It’s glamor-adjacent, and that’s fun and strange and great, sure, but it’s also weird and uncomfortable and occasionally just very… awkward. Here’s the best example of what I’m talking about. It’s about eight or nine years ago, and through some unlikely happenstance, I’m working for a well-known weekly news magazine that I won’t mention the name of. I mean, technically, I’m working for the website of a well-known weekly news magazine, but the distinction is meaningless to anyone I tell about the job. Honestly, it was pretty meaningless for me, too; I was firmly under the impression that I had arrived in the big leagues, and that everything was going to be great from then on.

This was before I arrived in San Diego to discover that I would be sharing a room with five strangers for the next four nights. And that the room had two single beds, and we could maybe get an extra cot if we were lucky. On the one hand, everyone seemed very nice and there were only a couple of people whose work I recognized and felt embarrassed to be sharing a bed with because, really, they deserved better. On the other, I can’t emphasize this enough: We were all working for a well-known weekly news magazine — like, one of the ones that’s actually a name — and they definitely could’ve afforded at least another room or two. This was just cheap.

It also made it difficult to do work. It isn’t unusual to end up working late into the night to meet deadlines at Con, and when you’re sharing a room with five people trying to sleep, it’s not so easy to stay up, typing away, without making people mad at you. All of which explains why I ended up sitting in the foyer of the hotel, trying to write a couple of stories at ten o’clock one night.

So, I’m sitting there with my laptop and headphones on, listening and listening and listening to this interview, trying to transcribe it and write whatever I was writing, and I kind of half-noticed that it was getting pretty busy. I didn’t really think that much about it, because it’s Comic-Con and everywhere is busy at Comic-Con, especially hotels. And it keeps getting busier, and busier, and at one point I look up and realize, wait, everyone looks really fancy. This is odd.

It took me about another hour or so, and by this point it’s maybe 2am and there’s really loud music and the foyer is just packed, to realize that there was actually a party going on all around me and I hadn’t realized. And it’s a big party; there’s a DJ, there’s people dancing and drinking and making out and all kinds of things happening all around me, and I somehow just hadn’t noticed for hours. I didn’t know what to do, because I couldn’t go back to the room, everyone was asleep and I hadn’t finished work, so I just…stayed there. And pretended none of it was going on while I sat on a couch, with various things happening literally right beside me that were very distracting. Eyes fixed on the screen. Writing. Just writing.

And then, at one point, with no warning, the music just stopped suddenly. The crowd groaned en masse, but stopped when it became clear what was going on: Everyone shuffled aside to let an ambulance crew pull a stretcher towards the elevators, and then they disappeared. No-one said a word, everyone just staring at the elevators for minutes until the ambulance crew re-appeared, with someone strapped into the stretcher.

This sounds like a downer, I know, and you could tell at the time that the ambulance crew was clearly thinking the same thing. They didn’t look anyone in the eye as they moved towards the door of the hotel, and then they paused, before one of them said in this wonderfully embarrassed voice, “He’s going to be fine!” As if on cue, the music immediately started back up, and everyone got back to partying, like the whole thing had been planned.

That is what Comic-Con is like as a journalist. Being exhausted, under deadline, surrounded by people having more fun than you, probably, and unsure whether or not you just saw something actually tragic, or if it was some weird performance art piece in the middle of a party. And, you know, also getting to see Josie and the Pussycats perform live on a hotel rooftop standing next to the cast of Arrow as they lose their minds.

What can I say? It’s really large. It contains a lot of multitudes.

April 29

“A week or so,” I said I’d return in; that turned out to be optimistic at best, if not downright foolhardy. April proved to be an overwhelmingly busy month for a number of reasons (and is continuing to be, right up until its final day), and even on days when I had time to write here, I’m not sure I would have written more than simply “I’m so tired, I’m so, so tired” over and over again.

When I started writing here daily, I had visions of doing so every day for a year, some kind of grand plan that would also let me write for myself again, even if it were simply pointless meanderings of little worth. I started 2015 feeling as if I was risking becoming an automaton in terms of output; that the pressures of work meant that I had nothing left to give in terms of brainspace for anything else, and I needed something that was my own. (Wait What? is that to some degree, and I love it very much for that as well as for the chance to talk to Jeff on an almost weekly basis.) Hence, writing here.

And yet, the first three months kind of proved to me that I did have little left to give in terms of brainspace, for the most part; I was writing the random, stream-of-consciousness material that I’d hoped for, but it was emptier than I would have liked, and I think the hope that I’d… I don’t know, sharpen mental muscles as I went along or something, didn’t happen. When I was done, I was done; it was clear to see.

None of this should be construed as real complaints, as much as disappointment in myself and the result of a slow realization that I need to recognize my limits better (and, maybe, factor in some more downtime for myself. We’ll see if that latter one happens anytime soon, though). Will I be doing daily posts here again…? I’m unsure, to be honest; I’ll try to do them when I feel like I can do them, and they feel like something I have time and brainspace for, instead of a promise I made to myself that I have a responsibility to fulfill, if that makes sense. So, if anyone’s reading, hello again.

The Wrong Lessons

A final post, for now, from the “Things I Wrote for WIRED That Were Never Published” vault. This is from August last year.

As the Comic-Con announcements last month demonstrated, 2015 is shaping up to be a pretty big year for genre cinema, with Superman/Batman and [World of] WarCraft being added to a summer line-up for that year that already included Avengers: Age of Ultron, Star Wars Episode VII, Independence Day 2, the reborn Terminator, Jurassic Park IV, the next James Bond movie, Fox’s Fantastic Four reboot, Finding Dory, Pirates of the Caribbean 5 and the final Hunger Games movie. Pretty impressive, perhaps, but also more than a little derivative.

The swathe of anticipated big budget sequels doesn’t stop there for the year; also due at some point in 2015 are follow-ups to Mission Impossible, Prometheus, Avatar, Snow White and The Huntsman and, somewhat unexpectedly, Pitch Perfect (On the plus side, we might finally have the “Cups” song out of our heads by that point). When it comes to “original” genre productions, we seem to be limited to Assassin’s Creed, Ratchet and Clank and Marvel’s Ant-Man, all of which are, of course, adaptations of properties in other media.

This might seem like a sad state of affairs — perhaps one that can only lead to cannibalization given the sheer number of big budget movies lined up to face off against each other — but you can’t blame movie executives alone for it. The genre movie slate of 2015 is very much a result of what happened this summer at the box office.

Let’s start with the reliance on familiar material. As the 2013 box office to date demonstrates, it just doesn’t pay to create something new in the genre space — or, at least, it doesn’t pay as well as recreating something old. Seven of the top 10 movies of the year in the U.S. have genre trappings, whether they feature superheroes, science fiction or monsters (Yes, monsters in university still count; I’m not counting Fast and Furious 6, even though that is essentially Car Avengers at this point in the series), and each of those movies is based upon an existing property, which isn’t entirely surprising. After all, the majority of genre movies released each year are remakes, sequels or adaptations of stories that have already created a fanbase elsewhere.

From a business point of view, this makes a certain amount of sense: Genre movies tend to be more expensive than non-genre movies — because of the cost of the special effects and visual trickery necessary to make the audience believe in something that, for the most part, couldn’t exist in the real world — so the prospect of investing that increased cost in a known quantity with a relatively established fanbase at least appears to be less of a financial risk than putting the same amount of money into something new, unknown and unproven.

The problem arises when the same issue is approached from an aesthetic direction. Simply put, familiarity breeds contempt, and there are only so many times we can see the same stories being told, or the same characters in action, before it gets boring. Entertainment has to be about novelty to some degree, which — by definition — requires something that we haven’t seen before. This is an area where non-genre movies — comedies, dramas and other features which tend towards realism in ways that require less money to conjure onscreen — have the edge on genre: It’s less of an investment, or business risk, to come up with something new, meaning that the ratio of “new” versus franchise outside of the genre space is far greater than it is for genre output.

As much as many would like that ratio to change — and for genre movies to become less dominated by all-too-literal attempts to recapture what has worked before — it can be difficult to argue against the business math responsible for the way things are, especially when those non-franchised movies that do get released end up falling short of the success enjoyed by the alternative at the box office. To wit: 2013 had a handful of “new,” non-franchise genre movies, each with some level of draw to mainstream movie audiences, and none made more than $100 million at the U.S. box office.

After Earth and Oblivion, both of which told similar “Earth is screwed, so we moved on but then we came back and it totally wasn’t what we expected” stories with different massive movie stars attached (Will Smith and Tom Cruise, in that order), stalled out at $60 million and $89 million, respectively. The much-hyped, fiercely-defended Pacific Rim is sitting around $86 million. The most successful of this year’s All-New Apocalypse fiction, This Is The End, is also the cheapest (It cost $32 million to make, compared with Pacific Rim’s $190 million, After Earth’s $130 million and Oblivion’s $120 million); it’s managed to rake in $95 million to date.

If there’s a second lesson to learn from this summer for movie executives besides “stick to what you know,” it’s “when you choose to gamble, make your gambles as cheaply as possible.” Besides Pacific Rim, Disney’s wildly expensive Lone Ranger movie earned just $85 million, despite costing the studio $215 million to produce, never mind market. In comparison, “smaller” — and, tellingly, non-genre — movies like Now You See Me (which cost $75 million) and Identity Thief (which cost $35 million) not only recouped their investment but went on to move into pure profit.

So what, exactly, is going on here? Is there a cap on genre movies that don’t have a nostalgic or recognizable “in” to get audiences past the speed bump of traditionally niche-oriented material where suspension of disbelief in the unfamiliar is required? Are there only so many people who are interested in paying to watch robots, monsters and superheroes when they’re not accompanied by some level of childhood nostalgia?

It’s been said often enough that audiences need to vote with their dollars when it comes to demanding a certain kind of entertainment. That idea is complicated when what’s being offered up is so limited. If the audience wants to see all-new original material, but doesn’t want to see the kind of material that’s on offer in Pacific Rim, Oblivion or After Earth, should they “vote” for it or not? If the kinds of movies they want to see are only available in existing franchises, is paying for a ticket voting for that particular kind of movie in terms of tone and plot, or for “franchise movies” as some kind of invincible monolith?

Based on the box office results of the year so far, it’s no surprise that movie studios are focusing on the franchises as much as possible for summer 2015; they’re as much of a sure thing as is possible for the industry these days, even if the amazing, worrying pile-up of Must-See Movies listed above suggests that some will inevitably fall by the wayside.

If, however, the lack of new, original genre movies is the result of the performance of this year’s batch of end of the world flicks, that’s unfortunate, and the result of a skewed test sample. Offer the audience some new ideas with the variety, optimism and invention — not to mention, please, some sense of frugality; no more $190 million budgets — missing from this year’s examples, Hollywood, and see whether or not the mainstream audience is ready to watch a genre movie that they don’t already know the story of. The alternative is simply surrendering to the law of diminishing returns.

Is Mad Men The Ghost of Television Past?

Another oldie, from May last year, written for WIRED.

On the slowly-unfolding AMC period drama Mad Men, character arcs and plots can take several episodes or even seasons to come into focus. As the show’s sixth season progresses, it’s tempting to suggest that the show is at risk of becoming as much a part of the past as the era it portrays. Is television even interested in this kind of programming anymore?

When Mad Men debuted in 2007 — setting a new ratings record for AMC in the process with 900,000 viewers — the landscape of television was different. The Sopranos had just finished on HBO, and The Wire was still on the air; Lost was still in the middle of its run, and the idea of television as the home of long-form, complex, quality drama was something still on the minds of many. Mad Men was simply more evidence of the future of the format.

Cut to 2013, and it’s a very different story. The first episode of the new season had 3.4 million viewers tuning in — down from last year’s season premiere high of 3.54 million — and successive episodes have dropped to around the mid 2 million mark, below the level for the same time last season. More importantly, the show’s importance to AMC has shifted, if not outright shrunk, in light of the phenomenal success of the channel’s The Walking Dead.

It’s not just that the most recent episode of that comic book adaptation brought in almost four times as many viewers as Mad Men’s peak, with 12.42 million people watching (It was, after all, a season finale); consider, as well, that the accompanying episode of The Talking Dead — Chris Hardwick’s talk-show companion to the zombie drama — had a series high of 4.3 million viewers; almost a million more viewers than Mad Men for a show that is far cheaper, and far simpler, to produce. No wonder that the channel has announced plans for Talking Bad, a similar show to accompany the final season of Breaking Bad this August.

It’s not only AMC where attention and focus have shifted from quality drama to genre fare. Instead of The Sopranos or The Wire, HBO’s most-discussed series these days is George R.R. Martin’s Game of Thrones, and its most-watched show is trashy vampire soap True Blood. Attempts at more low-key fare like Luck and Treme meet a considerably more muted response and, as a result, have shorter lifespans. The audience clearly knows what it wants, and what it wants is apparently sexy genre fare over the chance to see middle-aged men struggle with the complexities of life as we know it.

A similar thing is happening in broadcast television; a cursory glance at the shows networks are developing for the 2013/2014 television season reveals that the hour-long drama format continues to be dominated by unchallenging procedurals, crime dramas or fantasy fare, for the most part. For all the excitement offered by Lost’s ambitious scope or complicated narrative structure, the post-Lost television landscape has suggested that the show succeeded despite those elements, not because of them. Instead of demonstrating that broadcast dramas can challenge the viewer without scaring them off, the lesson Lost taught broadcast television was apparently that flashbacks can be a legitimate form of long-form exposition (See: Once Upon a Time, The Following).

The slowly shrinking Mad Men audience makes the fact that AMC reportedly cut budgets for both Breaking Bad and The Walking Dead in order to pay for Mad Men’s most recent seasons somewhat confusing. Admittedly, for Breaking Bad, there is some level of logic from a purely business perspective — The show brings in fewer viewers, and therefore less advertising revenue than Mad Men — but the idea that AMC would undercut its most visible, valuable show for something that is watched by a fraction of its audience is counter-intuitive at best.

Or is it? While the show brings in fewer viewers and less advertising revenue per dollar spent than The Walking Dead (and certainly than the considerably cheaper The Talking Dead), Mad Men arguably brings AMC far more critical prestige than Robert Kirkman’s horror series.

The same is true of Breaking Bad; even though ratings for both shows may be a fraction of The Walking Dead’s audience, having two of — if not the two — most highly-regarded television dramas today on its network gives AMC an overall reputation that makes the network brand more attractive to program-makers and advertisers alike. “AMC,” it suggests, “is where the forward thinkers, the early adopters, the smart buyers go for shows. Sure, less people might watch overall, but the ones that do watch are the tastemakers you want.” Add that prestige and attention to the middling ratings, and Mad Men earns its keep.

We’ve seen this before, with NBC Universal’s Syfy and Battlestar Galactica; Ron Moore and David Eick’s series was never the highest-rated show on the cable network, but unlike the more popular Stargate: Atlantis or Warehouse 13, it did snag the network a Peabody Award and prompt a discussion of human rights at the United Nations. To be blunt, you genuinely can’t buy publicity — or affirmation — like that. When a show starts connecting with people in such a way, you keep that show on the air as long as you can before it starts to really hurt financially.

The problem is that, eventually, it will start to hurt financially, and at that point you have to start to say goodbye. Television is a business, after all, and there comes a point where leaving money on the table in the name of critical plaudits starts to seem foolish; you can’t use acclaim to put food on the table, after all. Goodwill only goes so far, and with every single episode, more people are leaving Don Draper for other shows.

Battlestar Galactica lasted four seasons (Five, if you include the original mini-series that launched the reboot); Breaking Bad will last five. Mad Men, by the time it’s finished, will have lasted seven seasons. All things considered, that’s an impressively long run, especially considering the alternative programming AMC could have opted for at any point that would have brought in more viewers. It may simply be that Mad Men the show has a parallel existence to Don Draper himself: Slowly becoming outdated without anyone realizing it at the time.

This doesn’t bode well for the future of television drama, though. If broadcast networks are going to play it safe in terms of selecting new shows, and the previously-reliable cable and premium cable channels have discovered that genre is far more successful than “straight” drama when it comes to return on investment and eyeballs-on-shows, that’s a problem for any new show that wants to play things slow, subtle and lavish enough that its budget may make executives nervous. Given the choice between something with the potential to become a breakout hit and something with the potential to break even but maybe garner critical acclaim, it’s more of a risk to go with the latter option, and with the television industry in seeming flux (Ad spending was down in 2011, back up in 2012, an election year), now might be the time to play it safe. So where will we see the next Mad Men?

The answer may be online. We don’t yet know — and may never know, considering just how closely guarded viewing numbers at Netflix tend to be — how many people have streamed House of Cards so far, but let’s do some creative math for a second: Mad Men averages between 2.5 and 3 million viewers an episode, as does Breaking Bad, so let’s say that that means there’s a three million-strong audience for those shows in the U.S. at least (Bear in mind, DVRs, DVD and streaming audiences alike aren’t factored into those numbers; Sunday’s Mad Men often tops Apple’s iTunes TV chart on Monday, so there’s a second audience right there that’s already digital to consider).

Admittedly, a new drama in that vein wouldn’t have the name recognition or the critical acclaim that would drive people to tune in, so the math may be somewhat skewed upwards. Perhaps not, once you factor in the audience that wanted the show in a format other than live viewing and the additional audience who might be interested in the show but stays away because it’s already five years in (Or, for that matter, the audience who might watch just for the novelty of something new).

Nonetheless, it’s safe to assume that the metrics for “success” for a streaming-first show are somewhat different from those for a traditional television show, if only due to the newness of the format and the smaller scale of the audience. Is it possible that a Mad Men-style audience of 3 million people would be enough to be considered a smash hit for streaming? Could the future of prestige television drama be somewhere that isn’t technically television at all?

Who is the Doctor?

From the never-published final installment of the on-again, off-again recaps on WIRED’s Underwire for the last season of Doctor Who. Funny to revisit in light of subsequent episodes.

With “The Name of The Doctor,” this latest season of Doctor Who came to an end with something that was neither a bang nor a whimper — in large part because the final few moments of the episode turned it from a revelatory finale into a confusing, frustrating glimpse of things to come.

Ignoring for a second the final scene of the episode, “The Name of The Doctor” oddly crystallized a lot of the problems this seventh season has suffered through. Like so many episodes this run, Saturday’s final episode was merely good enough rather than particularly strong, and found itself relying on familiar characters, ideas and audience goodwill to distract from writing that was surprisingly messy given the series’ recent history, and filled with plot holes and unexplored ideas that could upset the story’s momentum with just a minute’s scrutiny.

And what distractions the episode provided! We saw Clara with each of the previous Doctors in scenes that demonstrated a seeming lack of convincing green screen technology (The second and fifth Doctors, in particular, appeared in scenes with a Clara obviously shot elsewhere and elsewhen. By comparison, the scenes with the first and third Doctors seemed to give her a graininess that matched the original shots), as well as henchmen that were reminiscent of both the popular Silence from the show’s sixth season and Buffy The Vampire Slayer’s Gentlemen, from way back when, and a third appearance this year from the increasingly popular Madame Vastra, Jenny and Strax, the alien detectives from the Victorian era. Underneath all of this, however, was a script that ultimately failed to convince.

The basic plot of “The Name of The Doctor” was, at heart, very straightforward. Our heroes were lured into a trap by a former enemy out for revenge, which they only survived due to self-sacrifice on both of their parts. It was the meat on those bones where things got somewhat convoluted: The Doctor and Clara found themselves on Trenzalore, the site of the Doctor’s grave at some unspecified time in the character’s future in order to save Vastra, Strax and Jenny from the Great Intelligence — the villain from a storyline from the series’ original run, as well as the most recent Christmas Special and the first episode from this most recent run. After death, all that remained of the Doctor in the tomb wasn’t a body, but his personal timestream, which was less an abstract concept than a quasi-physical lightshow that could be “entered” by first the Great Intelligence seeking to undo all of the Doctor’s good works, and then Clara — attempting to stop the Great Intelligence — and the Doctor himself.

That Clara was successful was hardly a surprise; the show could scarcely let the Doctor die with episodes left on the clock (and anyway, we dealt with the faux threat of the Doctor dying last year). Instead, the interest in Clara’s attempt came from the fact that, by entering the Doctor’s timestream, she became scattered across his life as multiple people with no recollection of who she had been — the multiple Claras we’d encountered up to this point, and the “impossible girl” who had captured the Doctor’s attention in the first place, leading to his meeting the “main” Clara for the first time. Well, that and the other character Clara and the Doctor met inside the Doctor’s timestream, but we’ll get to him soon enough.

For every smart idea in the episode — The explanation for what made Clara the “impossible girl” after all, her remembering events that had been wiped from history because the Tardis was leaking time, the slow revision of the universe’s history after the Doctor’s death, and how that altered character relationships — there were moments that just seemed unfinished or needlessly rushed. The Doctor warned about crossing over with his own timeline and later collapsed from having done so, but just two season finales ago, “The Big Bang” relied entirely on his doing just that without any ill-effects, for example; similarly, the surprisingly speedy and easy discovery of Clara within the Doctor’s timestream felt unearned, undercutting the drama of her having seemingly sacrificed herself doing so just minutes earlier.

But see, we’re already at the final sequence I mentioned earlier. Up until that point, “The Name of The Doctor,” for all its flaws, felt like an ending (albeit a disappointing one). Then, in the midst of the Doctor’s personal timestream, Clara and the Doctor met a shadowy figure with his back to the camera; he was someone the Doctor was seemingly afraid of — or afraid of Clara discovering, perhaps — and whom he described as, essentially, the incarnation he’d like to forget, the Doctor who doesn’t save the day.

That this new Doctor — A future incarnation that “our” Doctor knows about because he, too, has entered his timestream? A past one? — is played by John Hurt is important only for the BBC, who’ll doubtlessly like to boast of an actor of such popularity and credibility taking on the role (How else to explain the hilarious “Introducing JOHN HURT as THE DOCTOR” credit once he turned around?); for fans of the show’s larger mythology, what is more important is that this brings the number of incarnations of the Doctor to twelve, leaving the character with just one more regeneration to go before his death, according to rules set up in the original run of the show. In recent years, it’s been teased that the rule no longer applies, but never definitively stated within the series itself.

With just one scene at the end of the episode, “The Name of The Doctor” went from disappointing closure to a shameless tease for November’s 50th anniversary episode: What has this new Doctor done that is so terrible (Being responsible for the death of every other Time Lord, an established part of the character’s backstory since the show’s 2005 revival, would be the most obvious guess)? Does the thirteen-incarnation rule still exist, and if so, is the Doctor close to his final life or is there another incarnation that we don’t know about? And, more subtly, but arguably more importantly, will the Doctor be able to reconcile his actions in that incarnation with his self-image, and stop repressing an entire period of his life?

The scale of the final scene of the episode ultimately overwhelmed what had come before; it left the audience feeling energized and excited, but it was a cheap thrill in many ways. Despite the title of the episode, the name of the Doctor wasn’t revealed on Saturday, and the sleight of hand that managed to make that disappointment (or relief, perhaps) disappear from fans’ minds was a sign that — perhaps, if we’re lucky — the Who that lies ahead will be as bold and fun as the one they fell in love with. It may have been a sign of better things ahead, but that doesn’t change the fact that what came before was underwhelming at best, and a sign that, when it comes to this series, familiarity may be breeding contempt after all. In more ways than originally intended, perhaps, a lot depends on the 50th anniversary episode coming up in November.