More TV Movies   (2015Apr01)

Wednesday, April 01, 2015                                                1:09 PM

I love Tuesdays—that’s when Optimum adds newly released movies to their VOD menu. Yesterday was “The Imitation Game” and “Interstellar”. Both were excellent movies, although back-to-back blockbusters can be a strain on these old bones—and what a headache, too, after staring at my big screen for almost six hours straight. Were I a more considered sort of guy, I would have spaced them out and waited another day to watch one of them.

“The Imitation Game” was an excellent movie. I want to say that right at the beginning, because I have some caveats that have nothing to do with cinema, but I don’t want that to give the impression that I didn’t enjoy myself.

imitation-game

This movie is a perfect example of why it is so important to read the book before watching a movie based on a book. One can read a book afterwards, but it’s rather like smoking a cigarette before having sex—it puts the cart before the horse. A two-hour movie cannot possibly cover the amount of information to be found in an almost-eight-hundred-page, carefully researched biography—nor should it even try. “Alan Turing: The Enigma” covers Alan Turing’s childhood, his academic career, his social and family life, his sexuality, and his multi-faceted, almost unbelievable career.

turing-centenary

Turing wrote “On Computable Numbers”, which introduced the concept of using symbols for both numbers and characters, amounts and instructions—and for many years, only a handful of people could understand what he wrote. Even fewer saw the grand implications of the “Turing Machine”. He then used those ideas to help England puzzle out the Nazis’ Enigma code-machine, which shortened, perhaps even won, the war and saved millions of lives. But he (and everyone else involved) was sworn to secrecy about both his scientific achievements and his heroic contribution to the war effort.
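That "symbols for both amounts and instructions" idea is easier to see in code than in prose. Here is a minimal sketch, in Python, of the kind of machine Turing described: the rule table is just data, so the same loop can run any "program" you feed it. Everything in it (the function name, the blank-tape symbol, the little "invert" rule table) is my own invention for illustration, not anything drawn from Turing's paper or from the biography.

```python
# A minimal, illustrative Turing machine.  All names here are my own
# inventions for this sketch, not anything from Turing's paper.

TAPE_BLANK = "_"

def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    """Run a one-tape machine.

    tape  : list of symbols (the machine's memory)
    rules : dict mapping (state, symbol) -> (new_state, new_symbol, move),
            where move is +1 (right) or -1 (left)
    The machine halts when no rule matches the current (state, symbol).
    This sketch only grows the tape to the right.
    """
    for _ in range(max_steps):
        symbol = tape[head] if 0 <= head < len(tape) else TAPE_BLANK
        if (state, symbol) not in rules:
            break                       # no matching rule: halt
        state, new_symbol, move = rules[(state, symbol)]
        if head == len(tape):
            tape.append(TAPE_BLANK)     # extend the tape on demand
        tape[head] = new_symbol
        head += move
    return "".join(tape)

# The 'program' is just a table of symbols: scan right, flipping 0s and 1s.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

print(run_turing_machine(list("010110"), invert))   # prints 101001
```

The point of the sketch is only that the rules and the data are made of the same kind of symbols, which is exactly the insight the paragraph above describes.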

After the war, he began to work on a universal machine—a machine that would not only do a specific job of controlled calculation, as at Bletchley Park, but would be capable of doing any such job, whether it be the calculation of orbits in space, the half-lives of radioactive materials, or the guidance of a rocket-propelled missile. The strangest thing about the early history of computers is that very few people saw the point. But, once they got on board, his government took the work out of Turing’s hands. So he started working on the chemical processes of morphogenesis—the mechanism by which cells create articulated creatures, rather than a featureless sludge.

turing-machine

Everything he turned his mind and hand to, every idea he highlighted for the rest of us—was amazing, unbelievable, mind-blowing. Think about it. First he said, ‘In algebra, we use letters to represent numbers—why can’t we use numbers to represent letters?’ Then he said, ‘I can break the unbreakable Nazi code and win WWII.’ Then he said, ‘War’s over—I’m going to build a machine that can think.’ Then he said, ‘Now I have a computer—I’m going to figure out how life began.’ Then he turned forty. Then, at forty-one, he ate a poisoned apple and killed himself.

The film says nothing of all this. The film doesn’t even mention his mother, who was a big influence on his life in the book. It says nothing of his visits to America, before and during the war. It reduces the crowds of people he interacted with to a handful of on-screen characters—and it makes far too much of his relationship with Joan, simply because movies have to have that sort of thing in them, even when the leading man is a recognized homosexual.

Movies have had a lot of practice at this. There’s nothing terribly untrue about what was in the movie—it is simply missing so much that it tells a story quite different from the story told in the book. I don’t blame the movie-makers—this is in the nature of filmmaking, particularly adaptations from books. It is an accepted fact that the reactions of a movie audience are more important than the details of the story being told. This gives books a tremendous advantage. However, as I said, it was an excellent film.

Interstellar

“Interstellar” was likewise excellent, but equally limited by virtue of its being a movie. The physics of space-time are conveniently ignored or, more likely, misrepresented by beautiful CGI effects. For a movie so focused on the scientific aspects of modern life, it is notable for its lack of realism and its tendency to resemble a dream-state more than scientific research.

pulp-amazing-stories

But science fiction has always trodden carefully on the borderline between fact and fantasy, using the suggestion of science to make an allegory about the human condition—quite similar to fantasy, which explains why the two are usually considered a single genre, sci-fi/fantasy. “Interstellar”, with its spaceships, scientists, and robots, presents itself as hard science fiction, a sub-genre that usually deals with sub-atomic physics or cosmology in a futuristic setting. But the story being told is one of wish-fulfillment and easy shortcuts—the opposite of hard science fiction.

pulp_sci_fi

We get only the most fundamental features of science fiction in this sort of story—we get to be awed by the vastness of space, by the mystery of time, by the power and reach of technology, and by the inexorable terror of Mother Nature. But we don’t learn any actual science, as we would when reading Arthur C. Clarke or Isaac Asimov.

Asimov is a telling figure in the world of science fiction—one of the most popular and prolific writers in the genre, but where are his movies? There are “I, Robot” and “Bicentennial Man”—but both of those are very loosely based on the original short stories, retaining little of Asimov’s genius beyond the “Three Laws of Robotics”. What about the Foundation Series novels, or the Robot Detective Series novels? Movies, while lots of fun, are simply too stupid to encompass an Asimov story—he deals in ideas, not images. He is trapped in literature.

asimov-nine-tomorrows

Or look at Clarke’s works—essentially one movie, and that one movie is based on one of his short stories, “The Sentinel”. Stanley Kubrick, possibly the greatest movie director that ever lived, spent more than two hours on screen with “2001: A Space Odyssey” trying to tell one short story from a hard sci-fi author. Where is “Rendezvous with Rama”, or “The Fountains of Paradise”, or “The Lion of Comarre”? Hence the glut of comic-book adaptations—only science fiction intended for children is easily adapted to the screen.

clarke-childhoods-end

But the relationship between science fiction and childhood rates a closer look, as well. Early science-fiction in the pulps was considered childish reading matter—strictly for kids. It wasn’t until we landed on the moon in reality that science fiction was able to show its face among adults. But I don’t believe this was due to children being the only ones stupid enough to be interested—it was due to children being the only ones open-minded enough to see the value of it.

jetsons

Even today, the value of science fiction is considered mostly monetary—between Star Trek and Star Wars, sci-fi has become big business. But the real good stuff remains locked away in books, too concerned with science and ideas to be adaptable into stories and images. Still, “Interstellar” was fun to watch, and it had a happy ending. I do love a happy ending. And I’d rather watch Matthew McConaughey drive a spaceship than a Lincoln….

Popular Science Sucks—I Have a Pie-Chart to Prove It (2015Feb07)

io

Saturday, February 07, 2015                            12:37 PM

The world was once a garden. Before the industrial age, everything was organic—the houses, the roads, the toilets, the farms, the furniture. We were once all-natural. When I say ‘garden’, I’m not implying any Garden of Eden—like all gardens, there was plenty of manure and rotting organic matter. If you caught that old garden in the wrong breeze, it stunk to high heaven—but it was a non-toxic stink.

duomo

 

Then the steam engine led to the combustion engine, which led to the jet engine, then the rocket engine. Edison had his time in the sun, as did Ford, Einstein, Turing, Gates, and Jobs. Now the garden is gone and what’s left is not so pretty.

To sustain our first-world population requires mining, cutting, energy production, chemical processing, and manufacturing—all in mind-blowing, humongous quantities. (Did you know the world produces well over a billion tons of steel every year?) We know that Earth’s infinite abundance is an illusion—that its amazing powers of recuperation can only be pushed so far. But we ignore that. And we keep ourselves so very, very busy trying to scam each other and distract each other that it is easy to ignore even such obvious facts.

allegory

Between our old people, who are too ignorant to turn on a computer, and our young people, who are too ignorant to understand how unimportant computers are to the big picture, it’s obvious that our world is changing too fast for our society to keep up with. Meanwhile computers become ever more ingrained in our everyday lives, while computer experts baldly admit (as they always have) that the Internet can never be totally secure from malware. It’s kind of like accepting Politics, even while knowing that a bad politician can be humanity’s greatest threat—oh, wait—we do that, too.

There was no nerd happier than I when the Digital Era elevated ‘smarts’ to a sexy asset. But just as Star Wars popularized science fiction, and ended up diluting it into something sub-intellectual, so now science, math, and logic have been popularized, with the attendant dilution of these virtues into weapons of commerce and gamesmanship.

danae

There is no more popular meme than a pie-chart—but how many of today’s pie-charts illustrate hard data, and how many are printed in USA Today in an attempt to manipulate the uninformed? Back when they were too boring for anyone but us nerds, no one would have bothered to make a pie-chart of bad data—what would be the point, miscommunication? Yes, as it turns out, that’s a very good use for a mathematical tool. Because people love, love, love the appearance of reason—it’s the methodical application of reason that leaves us cold.

And words. Aren’t we all a little bit tired of words? If words had true meanings, arguments would end. If words had justice, they’d refuse to issue themselves from the mouths of many of the people on the TV news. Every word is a two-bladed sword—without good intentions, words are nothing but cudgels and self-appointed crowns. I’m so sick of the neat little bundles of words that spew from the faces of cold-blooded opportunists and greedy bastards—pretending that a logical algorithm of honest-sounding terms can erase horrible injustices that even three-year-olds would know in their hearts. A good argument is no substitute for a good person—and you can talk all day without changing that.

20120930XD-GooglImages-WllmBlake-DeathOnAPaleHorse

But let’s return for a moment to pie-charts. I witnessed the early days of computing and I can attest to the fact that spreadsheet software was a big player. Descartes’ invention of a chart using an x-axis and a y-axis proved so useful that it pervaded mathematics and remains a part of it today. Just so did business leaders find in the mighty spreadsheet a powerful tool for business analysis, sales, and forecasting. Breaking down business activity into rows and columns of numbers gives people great clarity—if you’re into that sort of thing. But we’re not all math geeks—some of us prefer a simpler challenge to the mind. Presto, bar-graphs, pie-charts, etc.—graphic representations of numerical values—so simple even a child could use (or misuse) them.
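For what it's worth, the arithmetic a pie chart performs is trivial, which is part of why it is so easy to abuse: each slice is just one value's share of the column total. A quick Python sketch, with invented regions and sales figures, shows everything a pie chart actually encodes:

```python
# The arithmetic behind a pie chart: share-of-total percentages computed
# from one 'column' of a spreadsheet.  The regions and sales figures
# below are made up for illustration.

sales_by_region = {
    "North": 120_000,
    "South":  80_000,
    "East":   45_000,
    "West":   55_000,
}

total = sum(sales_by_region.values())

for region, amount in sales_by_region.items():
    share = 100 * amount / total
    # Each pie slice spans this percentage of the full 360 degrees.
    print(f"{region:>5}: {amount:>8,} ({share:4.1f}% of total)")
```

Whether those input numbers are hard data or nonsense, the chart comes out looking equally authoritative, which is the whole complaint.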

20130106XD-Googl_Imag-CircutBord04

And way back then, I had a problem with the whole GUI, WYSIWYG, object-oriented, ‘visual’ dumbing down of computer science. It seemed to me that if you couldn’t understand computer code, it wouldn’t help having everything be point-and-click. But the world has long over-ruled me on this point, and it’s only getting worse. What is the point of having scientists conduct a study—and then have a government official decide whether the study should be released? What is the point of a laboratory that conducts studies at the behest of large industrial sponsors—don’t they know that such circumstances taint the report before it’s even issued? Who do they expect to believe them? What is the point of classifying proprietary data from pharmaceutical studies—are they afraid the competition will steal their dangerous, toxic drug ideas while they’re being sued by their ‘patients’?

20130111XD-GooglImag-Screens05

We like that the world is getting more confusing—or, at least, some of us do—it makes it easier to lie and cheat and steal. And just to super-charge the confusion, we have a mass-media machine that craves excitement and ignores substance, like a spoiled child. Somewhere between the ‘yellow journalism’ at the turn of the last century, and this century’s Fox News, we used to enjoy a historical ‘sweet-spot’, where Journalism was respected and professional—it even became available as a college major. TV news started out as a mandatory, public-service requirement for public broadcasters! They still have Journalism majors in colleges—but the classes are usually titled something like “Communicating In Media”, or some other name that lets you know you’re not dealing with ‘reporting’ anymore, you’re ‘communicating’. More dilution of something great into something ‘meh’.

20130111XD-GooglImag-Screens06

And that’s where the whole world is heading. Where once was sweet air and crystal-clear water, flush with fish and game, free of toxins—we will now enjoy ‘meh’. Where once dumb people could remain comfortably dumb, and scientists were trusted to think, we will now enjoy a free-for-all of debate points and well-turned phrases made out of pure bullshit—until reality pulls the plug. I once had hope that we would control ourselves in some way—I was so stupid. I guess I was misled by my intense desire for us to survive as a species, maybe even live as good people. Ha. We all have to grow up sometime.

20120821XD-NASA-hubble_sparkles_30DoradusNebula

The Oscars in the Era of Digital Entertainment

20130226XD-Googl-RPO_001

“Ready Player One” by Ernest Cline—an excellent read in its way, a real page-turner—I just finished reading it at 3am earlier this morning—I’ve slept most of the intervening time, but my eyes won’t focus today. See—that’s the difference between age and fatigue—fatigue fades quickly, whereas the limitations of age are more holistic—don’t read an entire book in one day (I was surprised I still could) if you want to use your eyes for something the next day, and maybe the day after.

20130226XD-Googl-RPO_002(ECline)

Also, the book is set in the near future, but concerns the nineteen-eighties in an OCD-‘Best of the 80s’-treasure-hunt that is central to the tale. I started in the mid-nineteen-seventies (pre-PC, pre-Windows, pre-WWW) with mini-computers—new sensations in the small-business world, particularly the easily computerized industries—insurance, real estate, mailing lists (yes, this was before e-mail and its evil twin, spam, too). But they still took up an entire room—an air-conditioned room, too.

20130226XD-Googl-RPO_012(matt-groening_Homer)

The micro-computers that started showing up a few years later are now known as PCs—and the first way to hook them together was a Local-Area-Network, or LAN. The first modems had misshapen foam cradles which held the old phones’ receivers and worked by analog audio beeps and chirps. My first PC had a two-megabyte internal hard-drive—it couldn’t hold a single hi-res JPEG by today’s standards.
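A quick back-of-the-envelope check bears that out (the resolution and compression ratio below are assumptions, picked only to illustrate the scale):

```python
# Rough size of one modern high-resolution JPEG, versus a 2 MB drive.
# The numbers are assumptions for illustration: a 12-megapixel image,
# 3 bytes per pixel uncompressed, roughly 10:1 JPEG compression.

megapixels        = 12
bytes_per_pixel   = 3      # 24-bit color
compression_ratio = 10     # typical ballpark for JPEG

raw_bytes  = megapixels * 1_000_000 * bytes_per_pixel
jpeg_bytes = raw_bytes / compression_ratio

print(f"Approximate JPEG size: {jpeg_bytes / 1_000_000:.1f} MB")  # ~3.6 MB
print("Fits on a 2 MB drive?", jpeg_bytes <= 2_000_000)           # False
```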

20130226XD-Googl-RPO_013(Simpsons)

Back then, everything was B&W, just letters and numbers, logic and calculations. When I first saw Windows 2.0 I asked what the point of it was—I was told it made it easier for people to use a computer. I replied that people who didn’t understand how to use a computer weren’t going to have any more luck with a GUI (Graphical User Interface—aka ‘Windows’—except for Macs). What I failed to realize was the pressure digital-era literacy would force on us all—suddenly typists needed to learn WordPerfect and bookkeepers had to learn Lotus 1-2-3 (early spreadsheet software).

20130226XD-Googl-RPO_008(TreyParkerMattStone)

I spent my late teens and early twenties learning computer-literacy and computer-maintenance systems that vanished practically overnight, sometime around 1985, and were replaced by home-video games that killed the arcade industry, the WWW, which killed the LAN and WAN industries, and MS Office Suites, CorelDraw Graphics Suites, and Roxio Audio-Visual Suites (and their Mac equivalents)—all of which killed the individual programmer-maven job market. Hot-shot coders were supplanted by Nintendo, Microsoft, Google, YouTube, Facebook, iPhones and other industrial-sized app- and mega-app-creators.

20130226XD-Googl-RPO_009(SouthPark)

So the 1980s digital watershed as experienced by the writer (I’m assuming) came around the time I was losing the ability to indulge in childish things without embarrassment. For instance, Matthew Broderick, a central figure in the book, is much younger than I am—and I won’t get into how depressing it is to see him graying with age in the present day. Yes, boys and girls, if you live long enough, even the sci-fi makes you feel old.

20130226XD-Googl-RPO_010(SouthPark_last-supper)

By 1980, I was in my mid-twenties—this made me a generation older than the oldest man in the book. So, I’m reading a sci-fi thriller set in the near future and all I feel is ‘old’—that’s just so wrong. But enough of my whining… let’s discuss.

Society used to imply a fixed point of geography—but no more. The way I see it, any place or time that has fixed morals applicable only to that place or time, is a ‘society’. For instance, Commuter Traffic is its own society—indeed, commuting has at least three societies—the drivers, the bus and train-takers, and the walkers.

20130226XD-Googl-RPO_011(Book_of_Mormon)

Walking the sidewalks of mid-town Manhattan during the morning rush seems very cattle-like, especially to the people in its grip. But it actually requires a very heads-up approach—you need to watch the whole 360 degrees around you, your pace should be brisk but not breakneck, and the only real crime is to behave as if it weren’t rush hour, as if one’s personal stopping, starting, and distraction wouldn’t impact the entire flow of the press of people around such an out-of-place fool.

Walking is usually the last step in the journey. And there are many who go by subway—but in my relative inexperience, I leave its description to someone more inured to its ways. Nevertheless I have spent years on both of the other circuits, ‘driving in’—and ‘taking the train’.

20130226XD-Googl-RPO_007(ThCleveShow)

Taking the Saw Mill River Parkway into Manhattan’s West Side Highway is not for the faint of heart. Its lanes were designed for the days when it was truly a scenic parkway—and for cars which topped out at, maybe, 30 mph. Its modern reality is a cross between Disney World’s Space Mountain and the Grand Prix—hurtling cubes of steel, inches apart, doing 60, 65—and some of them are in a bigger hurry than the rest—these restless souls try to pass other cars as they go and will push their driving skills to the limit. This forces anyone in the lane beside them to be just as razor-sharp in controlling a vehicle that may not have the road-hugging quality of a BMW.

Taking the Harlem-Hudson line into Grand Central has changed a lot since my day—the locomotives were diesel, there was always at least one smoking car, and the night-time commuter trains had a bar car, which was an automatic smoker. The seats were upholstered but badly sprung—and larger. But some things remain the same—the etiquette of boarding as a group, of sitting beside a stranger (don’t read their paper—get your own!).

And the strange race for pole-position when debarking at Grand Central. This took planning. Firstly, one had to rise as the train neared its platform and move towards the doorway. If you weren’t first in the doorway, there was no way you would have a chance to sprint towards the exit ramp with the other contenders. The choice of when to rise was a personal one—some rose quite early and simply stood in the doorway for a good ten minutes, others waited until the last minute and relied strongly on line-cutting bravado. Once the train stopped, there were maybe fifty yards of empty platform which the prepared passenger sprinted across, hoping to avoid the human condensation that made that exit a twenty-minute delay for those who took their time getting off the train.

20130226XD-Googl-RPO_006(american-dad)

This was the most cattle-car moment of any commute—people actually touched each other while we crowd-shuffled towards the open terminal beyond the platform gate. This was a world-class pot-luck situation—the people who crushed against one could be very attractive or quite repellent, even odiferous. There was no logic to the Brownian motion of the crowd—you couldn’t position yourself to mash against someone of your own choosing.

Eye contact, personal space, split-second go/no-go choices made at traffic-lit corners or when spotting an unmarked traffic cop car in the work-ward rally—all these and more were self-imposed by the natural human reactions to the different intimacies of rush-hour mass motion. And, not surprisingly, all these societies have a night-time, complementary society, with different rules respecting the fact that everyone is in an even bigger hurry to get home than they were to get to work that morning, but with the luxury that no one got fired for getting home late.

These societies have a geographical ‘location’ (if an unsupervised racetrack can be called a location) but they come into being for a few hours in the morning and again at night, each time fading away almost as soon as it peaks, barring delays and bad weather. The ‘train stuck in a blizzard’ has a society, too—which only comes sporadically and can skip whole years at times.

familyguy_seth7

Talking on the phone is a society—or, again, several societies, based on context. A phone conference, a sales call, a relative calling to gab, a friend calling with an invitation—each one has its own little head-dance and body language. And we could hardly leave out Facebook or the internet in general, when cataloguing the many sub-societies we join and quit all through our days.

These were my musings on Society this morning after I read the New York Times Art Section article reviewing the Oscars and the reviews others gave it, particularly PC groups that disapproved of the irreverence and insensitivity of the jokes and songs—and of Seth McFarlane, personally. The Times article pointed out the discrepancy between the Academy’s need to bring in ratings, especially from the younger demographic (call it the “Family Guy”-factor) and to appear sensitive to the community-watchdog groups that have been attacking “Family Guy” since its premiere in 1999.

Seth MacFarlane is a media juggernaut with three (yes, 3!) TV series now in operation: “Family Guy” (1999–2002, 2005–now), “American Dad!” (2005–now), and “The Cleveland Show” (2009–now). His ‘tastelessness’ finds favor with a younger audience because it embraces (as far as a TV show can) the new Internet society—which has few editors and even fewer censors. This younger entertainment society accepts the crassness as ‘bold honesty’ of a sort (which dawned, IMHO, with the Seinfeld episode in which Jerry et al. repeat the phrase “Not that there’s anything wrong with that” until the defensiveness of PC-speak becomes its own post-modern joke/attitude).

familyguy_seth7

PC-abandonment is the new humor in this society—if it makes old people like me wince, it’s funny. And television, in many ways, is still bound hand and foot by wincing old people. These dinosaur-people miss the point—we joke because we love—and we love ourselves—even our bigoted, foul-mouthed selves. And we won’t pussy-foot around about it anymore. Any old geezer that can’t let go of the militancy that served human rights so well in the twentieth century can’t help but be scandalized by our new-minted idols, like Seth, who are comfortable making a joke about Lincoln being shot in the head without being suspected of hidden racism or some twisted fundamentalism.

I would like to join in—but I’m too old and set in my ways to reinvent myself as an aging hipster—besides, comedy was never my strong suit… But my point is this: we have two major societal paradigms that are at something of a disconnect—Network TV and the World Wide Web. I can’t get in the spirit of it—for me, half the fun of a show is watching it when it’s aired. The feel of live TV—even scheduled, recorded, first-run TV shows—is lost for me whenever I have to find the show on the cable-box’s VOD menu—but my son watches all his ‘TV’ online, using our Netflix account. And I grew up admiring martyrs to the cause of civil and gender rights—I’ll never be able to speak lightly of those momentous changes that informed my lifespan.

20130226XD-Googl-RPO_004(SMcFarlane)

I can handle Seth MacFarlane, Matt Stone, Trey Parker, Matt Groening—all the new-wave, internet-capable entertainers, but my laughs are a little repressed by the sheer effrontery of their attitudes. When I was a boy I wondered why it was so hard for my parents to see my point—now I understand—by their standards, I didn’t have a point. I wasn’t seeing everything through their experiences, I was seeing everything as new and without emotional context. And now I’m trapped in my memories of what our children see as ‘history’, if they notice it at all. Paperless, wireless, unconventional families, uncensored entertainment, the disintegration of traditional religious institutions’ power to shape people and events, access to everything—information, encyclopedias, maps and navigators, definitions, language translations, 24-hour news cycles—all the things that have remade what was once my stable little spot on the Earth—our children take them as givens—the same way we took drinking from our lawn hoses for granted (back when people still felt safe drinking from ground wells).

So, in the end, Seth MacFarlane did a great job hosting the Oscars—he also did a terrible job—it depends on your age.

Ben Affleck