From Ritual to Romance   (2015Nov08)

Sunday, November 08, 2015                                            6:21 PM

“From Ritual to Romance” was written by Jessie L. Weston in 1920. It is mentioned by T. S. Eliot in the notes to his poem, ‘The Waste Land’: “Not only the title, but the plan and a good deal of the incidental symbolism of the poem were suggested by Miss Jessie L. Weston’s book.” Weston’s book and Sir James George Frazer’s “The Golden Bough: A Study in Magic and Religion”, first published in 1890, were hot topics in Eliot’s day. Frazer’s ‘Golden Bough’ did for anthropology what Darwin’s “On The Origin Of Species” did for biology in 1859—it presented academic research indicating that the Christianity of the day was evolved, in many ways, from more-ancient rituals and earlier gods. Further, it showed that religion changes with the times, while it re-tasks older beliefs and traditions. Simple examples include the importance of mistletoe in Christmas tradition—a holdover from Druidic beliefs and rituals—and Christmas itself, a pre-Christian mid-winter festival re-assigned as the day of Christ’s birth, whereas the historical Jesus was most likely born in the spring.

Just as Darwin’s work slowly percolated for decades after its initial publication (the Scopes trial wasn’t until 1925), so too Frazer’s research would not bear the fruit of Weston’s and other writers’ works until well into the beginning of the twentieth century—and this affected T. S. Eliot, scion of a famous Unitarian family and a student of Ancient Greek, Latin, and even Sanskrit (he familiarized himself somewhat with Eastern philosophy—the final ‘shantih’ in The Waste Land is Sanskrit for ‘peace’), but an intellectual who considered himself an atheist early in his writing career. That he would join the Church of England in his later years was, he admitted, in large part due to his desire for ritual and the focused meditation of prayer.

In his essays on Christianity, culture, and society, Eliot worried that the ending of borders in Europe would lead to an overly homogenous culture, losing the variety of differences between the many nations. His concerns were misplaced, as the United States would handily blanket the globe with Pepsi and Quarter-Pounders soon after the next World War. But the foundation of his concern for cultural diversity, as well as his eventual decision to rejoin a religious community, was at heart a concern for meaning in one’s life and indeed in the lives of everyone.

His masterpiece, “The Waste Land”, was to some extent a gigantic howl at a universe that was losing its old meanings—and having trouble replacing them with modern equivalents. Industrialization, science, and technology were erasing many of the givens—people of different countries were no longer separated by mere physical distance—the secrets of life, of matter, of the universe—all of which had been the province of faith—were now being revealed by scientific inquiry—‘God’ himself had been dethroned.

And Eliot raises a valid point—I spent many years being agnostic, being unsure if my rejection of all religion was based on valid reasoning—but once I decided absolutely on atheism, I’ve spent every moment since trying to find a way to give life meaning without reverting to any magical improvisations that would simply be religion in another guise. And it’s not easy.

As I watched a PBS documentary on Johnny Carson today, this issue of rituals again raised itself in my mind. In my youth, TVs were made from tubes. This required a TV to be big and boxy—the bigger the screen, the bigger the whole box had to be. So—a very substantial piece of furniture sat in the center of virtually every home—and, at dinner-time, virtually every American turned it on, like a national campfire, and watched either Walter Cronkite or Chet Huntley and David Brinkley tell them the news of the day. Later, at bed-time, Johnny Carson would come on and clue us all in on what was going on, what to care about, what was ‘cool’, and what to laugh off.

The real importance of this was in the following day—our conversations with each other would always have a common context—we all referenced the same ‘source material’. Equally important was our unanimous acceptance of whatever information was received—we talked about how we felt about current events—we never discussed whether we believed what Cronkite or Carson had told us. That’s where the cliché of ‘water-cooler conversation’ comes from—although presently even water-coolers are a thing of the past—now most office workers show up to work with their own individual caffeine drinks from Starbucks or Dunkin Donuts.

Older times saw technology enabling us to be tribal on a larger scale—first radio, then television, gave us a sense that the entire nation, from coast to coast, was all ‘on the same page’. Automobiles allowed us to congregate in public places in larger numbers—and from a larger overall area. The limitations of corded, rotary landlines—almost always just one to a household—retained the sense that real communication could only be accomplished face-to-face.

And while we are tempted to blame laptops and iPhones for the insularity of modern communication, we should remember that earlier electronics began the change—the advent of touch-tone dialing, call-waiting, multi-party calls, caller-ID, etc.—all made telephony simpler and more akin to an actual conversation. It was around this time that phone cords of exaggerated length became popular—phoning had become easier, and we began to feel a restlessness from still being pinned to one spot in the home.

The differences today are many: we all have our own phones now; we can take them wherever we go now; we don’t have to worry about missing a call—not only do we know who tried to call us, but they can leave a recorded message for us to hear later. Point-of-contact used to be the family kitchen—now each wandering individual is a point-of-contact. Telephone contact is so universal today that we are confronted by situations, as when driving a car, where talking on the phone can actually kill us.

Similar conveniences have stripped away the trials of scholarship—fifty years ago one would inevitably find oneself in need of a public library—specifically the reference section. ‘Mini-reference-sections’, called encyclopedias, were sold door-to-door—mostly to minimize the number of trips to the library. We got to know our librarians; we got to know each other—if we were the kind of people who spent a lot of time reading or studying or researching. Today, I have no need for the reference section of my local library—I don’t even have to cross the room to use my own encyclopedia (yes, I still have a set)—I can just do a Google-search, or check Wikipedia, or find the e-text of a classic tome on the Project Gutenberg website.

Don’t get me wrong—there’s tremendous power there. Not only do I have access to the equivalent of a library reference section—I have access, from right here where I’m sitting, to every university, laboratory, professional association, research society—hell, with the right access codes, I could rifle through the files of DARPA, NASA, or CERN. But my point today is not concerned with the wonders of the Internet—I’m focusing on the fact that I don’t need to break my solitude—I don’t need to open my front door—and I still have access to virtually every bit of information known to mankind.

Convenience in communication, and in scholarship, was welcome progress—but we still needed to get together to have ‘something to do’. Increasing the number of TV channels from three to 300 made it possible to watch a lot more TV—and cable TV made it possible to watch movies without attending a movie theater—but still, there is a limit to how much TV a person can watch. Likewise, there is only so much time that can be spent talking on the phone or studying. In my day, a person always reached a point where he or she simply had to go outside, to mingle with the throng—or simply hang with one’s friends.

Eventually, one way of ‘hanging with friends’ became playing video games—a group of kids would congregate around a TV hooked up to a video game system and take turns using the controllers. And this is where everything came off the rails, in a sense. The advent of multiplayer online gaming, combined with the use of laptops and cellphones, made it possible to both play with friends and socialize with friends—all without leaving the privacy of one’s room. Additionally, one could leave one’s room—could in fact go anywhere—and still remain essentially within that gaming social gathering. This leads, of course, to the phenomenon whereby your kids could be in the room with you, but not really ‘be’ there at all—they’re texting, or IM-ing, or gaming with unseen other kids while their bodies, devoid of conscious awareness, sit in the same room you’re in.

We call this new generation ‘digital natives’—people who grow up with digital, online technology as a given. To digital natives, being physically present is of less importance than online connection—they pay attention to their screens, not to the people in their environment—hence all the car-crashes caused by cellphones. There was once a time when a rainy day was bad news for kids—it meant we couldn’t go outside to play—and that was a major tragedy in our young lives. Nowadays, when parents force their kids to go outside, it is more likely to cut them off from their friends and their playtime.

In a culture that shops online, plays online, watches online entertainment, communicates online, and learns online, we find that something is lost. In Eliot’s time, they felt the loss of religion as an absolute—but they also lost the comfortable patterns of a life where God was central to everyday activities. In our time, we are experiencing the loss of an even more elemental aspect of our daily lives—shared physical presence. And the list of rituals being lost in this new ‘normal’ is even greater.

Consider laundry—there are still parts of the world where we could witness the weekly washing of clothes by a riverbank—those people gather and mingle and chat as they do their laundry ‘community-style’—and for centuries, all mankind did their laundry in this way. When washing machines came along, people hung up their wash on clotheslines—often socializing with their neighbors over the back fence—a smaller social group, but still partially a community activity. Then came electric dryers—and homemakers found themselves, at least as far as laundry was concerned, acting in solitude, shut up each in their own homes.

Why are rituals important? Look at it this way—we can strive for success, for achievement, for goals of many types—we can chase after lovers, mates, and romance—we can eat, sleep, and work—but all of it is empty without a context, a continuum, that is the cycle of our daily lives. Humans are a social species—we need the comforting presence of others, we need interaction with our peers. But we are raising children in an environment of solitude—where are they supposed to find meaning and fulfillment in their lives? How can they build a comforting pattern of social rhythms to give their lives continuity?

And make no mistake—we have need of these things. Take the Sabbath day as an example—with the decline of religion, one might ask why bother with a day of interruption? But we need rest as much as we need sleep—however we came up with the idea of a ‘day of rest and prayer’, it fits our biological rhythms—even without feeling obligated to pray to God once a week, we still benefit from the rhythm of taking every seventh day off. Or take another example—the taboos on certain foods, like pork or shellfish, were once considered religious observances—but they were useful in that such foods are health risks if not carefully cooked. Further, in modern America, where a person can eat anything—and as much of it as they please—we find that eating without limits presents greater health risks than any one type of food could ever pose.

Boundaries, rituals, democracy, all the inconveniences of being part of a group, rather than a free, solitary agent—these things have a value to our mental and physical health, to our sense of having a rich, fulfilling life. We may be able to get along without our imaginary friend, God, but we are finding out that life can be even more empty and angst-ridden if we try to live without each other, without community and society. There may come a day when we no longer have prisons—we may come to recognize that everyone is already in a prison, that criminals can be punished and isolated from society by the simple expedient of taking away their online connection.

This may seem rambling and generalizing, but I’m trying to make the point that the rhythms and patterns of community provide a substrate for the discrete pursuits of life—earning a living, raising a family, the arts, the sciences, politics, etc. We focus on these ‘goals’ of life and overlook the fact that life has a context within which all this goal-seeking behavior occurs—that there are moments between these activities—that our consciousness goes in and out of these discrete pursuits, but our awareness is confronted by an unbroken continuum of existence—and that overall ‘existence’, without substance, becomes a void that we fall into whenever we are not consciously busy with a particular aspect of our interest. No matter what our individual interests may be, we still need our overall lives to have texture and substance. Without experience outside of our online connections, life becomes disjointed, disconnected, and begins to lose value or meaning.

The human animal can adapt to many changes—but not to emptiness. It has been noted that a person left in a sensory-deprivation chamber will quickly be driven mad by a nervous system bereft of input. We are in danger of finding our global village trapped in an electronic isolation that will drive the whole world mad—we may find that civilization will ultimately be destroyed, not by fire or ice, but by our lust for convenience.

Life on a Go Board


I don’t like it when words are used as stones on a Go board, or statements used as chess-pieces—those are combat simulations—since when did communication become combat? For that matter, when did words become the only form of communication? Actions speak louder than words, but words, or perhaps videos’ scripts, are considered a life-connection from you or me to someone halfway round the world. Am I really connected to those people? Funny story (you know I accept friend-requests from anyone): this new Facebook-friend of mine only posts in Arabic—it’s beautiful stuff, but I don’t even know the basic phonemes of that written language—and I had to ask him to tell me his name (or equivalent sound) in Roman script.

I don’t want to get into a debate here about argument. Formal argument, or debate, is certainly useful and productive—as is regular old arguing, when it’s done with restraint or when its goal is an elusive solution or resolution. The Scientific Method, itself, is an implied debate—a conflict between prior theories and the new theories that overthrow them—or that are overthrown thereby—no, I’m not saying that communication isn’t rife with conflict—my purpose here is to discuss other forms of communication and sharing. So, please, let’s not argue (—jk).


I finally realized that all these unsolicited friend requests from the Mid-East were because I was using a photo of Malala Yousafzai as my Profile Pic! I’m glad—now I know they’re not shadowy extremists trying to cultivate an American connection—they are instead the liberals of their geographic zone.

Such international friends frustrate me—the words that I fail to type could be just as offensive as any thoughtless words I post—and there are plenty of those. I wish I knew what they were. Whenever someone wants to Facebook-friend me as their American friend, I start right in on criticizing all their grammar faux pas and misunderstood colloquialisms—they love it—that’s what they want from their American friend. I’m afraid geek-dom knows no borders—only my fellow geeks from faraway lands appreciate criticism—I’m sure people with the Cool gene flock together across the datasphere as well (but then, I’ll never know—will I?)


But communication, as a means of sharing ideas and organizing cooperative efforts, is far more than a battle of witty words. Political cartoons, cartoon cartoons, obscene gestures, and ‘making out’ come first to mind—although there are plenty more examples. The Media (a term I use to denote People magazine, other newspapers and periodicals, radio, cable-TV, VOD, cable-news, talk shows, private CC security footage, YouTube, and the omnipresent Internet)… I say the Media is looking for trouble.

They aren’t broadcasting cloudless summer skies or a happy family sitting around the dinner table or the smoothly proceeding commuter traffic half a mile from the traffic accident. And I don’t blame them. Their job is to entertain—that’s what pays their bills. And I don’t blame us, either. We are happier watching dramatic thrills than watching paint dry. There’s no getting around that.

And I won’t play the reactionary and suggest that we go back to a time when entertainment was a brief treat enjoyed, at most, once a week. Even the idle rich (and this is where that ‘idle’ part comes from) just sat around socializing when they weren’t at a fox-hunt or a ball. To be entertained was almost scandalous—think of it—in a deeply religious society, such escapism went against the morality of the times—and even as a once-a-week diversion, it was frowned upon not only to be a stage player, but to attend the performance, as well.


But entertainment, like a gas, expands to fit the size of its civilization—those old scruples took a few centuries to kick over, but once the digital age had dawned it seemed quite natural that everyone should have access to twenty-four-hour-a-day entertainment (call it ‘news and current events’, if it helps). And now we have people walking into walls and driving their cars into walls while they stare fixedly at their entertainment devices.

So, trite as the word may seem, Media is a mandatory entity to include in any discussion of the human condition. And more importantly, it must be a part of the Communication topic, as well—most especially with a view towards a formulation of culture that does not make conflict our primary means of sharing and informing our minds. So let’s recap—Entertainment equals drama equals conflict equals fighting (See ‘Arnold Schwarzenegger’). Information equals scientific method equals discussion equals human rights (See ‘Bruce Willis’—jk).


To begin, there is one thing that needs to be acknowledged—learning is NOT fun. I’d love it if it were—I know fun can be used to teach some things—it’s a lovely thought—but, no. Learning is a process of inserting information into the mind. People talk a lot about transcendental meditation but, for real focus, learning is the king. To learn, one must be patient enough to listen; to absorb an idea, one must be willing to admit that one doesn’t know everything; to completely grasp a new teaching, one may even have to close one’s eyes and just concentrate—nothing more, no diversions, no ringtone, no chat, no TV, no nothing—just thinking about something that one is unfamiliar with—and familiarizing oneself.

We forget all that afterward—the proof of that is that none of us graduate from an educational institution with the ability to ‘sub’ for all the teachers we’ve studied under. We have learned, but only a part of what was taught—its implications, ramifications, uses, and basic truths may have eluded us while we ‘learned to pass the class’. Contrariwise, our teachers may have bitten their tongues—eager to share some little gem of Mother Nature’s caprice implicit in the lesson plan—and instead put the ‘teaching of the class to pass’ before the ‘teaching of the class’.


And that is no indictment of teaching, that’s just a fact—it doesn’t prevent me from admiring great teachers. But I couldn’t help noticing that great teachers always color outside the lines in some few ways. Teaching people to learn for themselves, with that vital lesson neatly tucked into the course-plan of the material subject of a course—it takes effort, discipline, and way more patience than that possessed by most of the rest of us—but it also requires an allegiance to the Truth of Plurality, that incubator of eccentricity.


But we forget our Learning. It becomes something we simply ‘know’, something that we just ‘know how to do’. Part of good parenting is learning to teach well—young people have the luxury of just understanding something, while parents must struggle to figure out how to explain it, or teach it, to their children. And then we forget about that learning—and must scratch our heads again, struggling to explain ‘explaining’ to our grown-up, new-parent offspring. It’s a light comedy as much as anything else.


So learning is not fun. There is a thrill involved, however, that is almost always worth the ticket price. The internet and the TV blare words at us in their millions, info to keep us up-to-date—just a quick update—and now there’s more on that—and we’ll be hearing a statement from the chief of police…. Also, we are seduced by lush orchestrations or driven musical beats, by the gloss and beauty and steel and flesh of literal eye-candy, and that dash of soft-core porn that is the engine under the hood of so many TV series.

We see breaking YouTube uploads of rioting in a faraway land—we believe that our quiet little lives are nothing, that all our sympathy and concern should be spread across the globe to billions of strangers in distress. We are flooded with information by the Media—but because it’s the Media, only conflicts and crises are shown—the peaceful, happy billions of people that pass by those crowd scenes, that seek refuge across the border, that have families and generate love to whoever gets near enough—we don’t need to see them.


But that isn’t true—it’s true for the Media, but it isn’t true for us. The Media can’t change—but we can be aware of its bias. We can take note of the fact that the Media should not be the major part of our dialogues with one another. Best of all, we can become aware of how much the Internet can teach us—if we can stop IM-ing and web-surfing and MMORPG-ing long enough to notice that the Internet is a hell of a reference book.

No, I’m not saying we should trust the Internet. I’m saying that the real information is there, and finding it and using it will be the road into the future that our best and brightest will walk along. They will pull their eyes away from Mario Kart, the YouTube uploads of kittens and car-crashing Russians, and George Takei’s Facebook page—and they will throw off the chains of Media and make it their bitch again, back where it belongs.


In WWII, fighter-group captains and flight-team leaders were always saying ‘Cut the chatter, guys—heads up!’ I think we need the same thing—everyone should have a little devil on their shoulder that says the same thing—“Hey! So the Internet connects you to the entire civilized world—that doesn’t mean you have to say anything—it just means you can.”

Our high-tech communications infrastructure is no small part of the problem—the digital magic that flings words and pics and music all over the world bestows an importance and a dignity to our messages that many messages don’t deserve. Posting to the Internet is kind of like being on TV—it grants a kind of immortality to the most banal of text-exchanges—it can even be used against you in court—now, that’s very special and important—and now, so am I, just for posting!


So, yearning for the perennial bloodlust of Law & Order: SVU, our self-importance inflated, and our eyes off the road, we speed towards tomorrow. I hate being a cynic.

[PLEASE NOTE: All graphics courtesy of the Quebec National Gallery]

The Oscars in the Era of Digital Entertainment


“Ready Player One” by Ernest Cline—an excellent read in its way, a real page-turner—I just finished reading it at 3am earlier this morning—I’ve slept most of the intervening time, but my eyes won’t focus today. See—that’s the difference between age and fatigue—fatigue is something that fades quickly, whereas the limitations of age are more holistic—don’t read an entire book in one day (I was surprised I still could) if you want to use your eyes for something the next day, and maybe the day after.


Also, the book is set in the near future, but concerns the nineteen-eighties in an OCD-‘Best of the 80s’-treasure-hunt that is central to the tale. I started in the mid-nineteen-seventies (pre-PC, pre-Windows, pre-WWW) with mini-computers—new sensations in the small-business world, particularly in the easily computerized industries—insurance, real estate, mailing lists (yes, this was before e-mail and its evil twin, spam, too). But they still took up an entire room—an air-conditioned room, at that.


The micro-computers that started showing up a few years later are now known as PCs—and the first way to hook them together was a Local-Area-Network, or LAN. The first modems had misshapen foam cradles which held the old phones’ receivers and worked by analog audio beeps and chirps. My first PC had a two-megabyte internal hard-drive—it couldn’t hold a single hi-res JPEG by today’s standards.


Back then, everything was B&W, just letters and numbers, logic and calculations. When I first saw Windows 2.0 I asked what the point of it was—I was told it made it easier for people to use a computer. I replied that people who didn’t understand how to use a computer weren’t going to have any more luck with a GUI (Graphical User Interface—aka ‘Windows’—except for Macs). What I failed to realize was the pressure digital-era literacy would force on us all—suddenly typists needed to learn WordPerfect and bookkeepers had to learn Lotus 1-2-3 (early spreadsheet software).


I spent my late teens and early twenties learning computer-literacy and computer maintenance systems that vanished practically overnight, sometime around 1985, and were replaced with home-video games that killed the arcade industry, the WWW, which killed the LAN and WAN industries, and MS Office Suites, CorelDraw Graphics Suites, and Roxio Audio-Visual Suites (and their Mac equivalents)—all of which killed the individual programmer-maven job market. Hot-shot coders were supplanted by Nintendo, Microsoft, Google, YouTube, Facebook, iPhones and other industrial-sized app- and mega-app-creators.


So the 1980s digital watershed as experienced by the writer (I’m assuming) came around the time I was losing the ability to indulge in childish things without embarrassment. For instance, Matthew Broderick, a central figure in the book, is much younger than I am—and I won’t get into how depressing it is to see him graying with age in the present day. Yes, boys and girls, if you live long enough, even the sci-fi makes you feel old.


By 1980, I was in my mid-twenties—this made me a generation older than the oldest man in the book. So, I’m reading a sci-fi thriller set in the near future and all I feel is ‘old’—that’s just so wrong. But enough of my whining… let’s discuss.

Society used to imply a fixed point of geography—but no more. The way I see it, any place or time that has fixed morals applicable only to that place or time is a ‘society’. For instance, Commuter Traffic is its own society—indeed, commuting has at least three societies—the drivers, the bus- and train-takers, and the walkers.


Walking the sidewalks of mid-town Manhattan during the morning rush seems very cattle-like, especially to the people in its grip. But it actually requires a very heads-up approach—you need to watch the whole 360 degrees around you, your pace should be brisk but not breakneck, and the only real crime is to behave as if it weren’t rush hour—as if personal stopping and going and distraction wouldn’t impact the entire flow of the press of people all around such an out-of-place fool.

Walking is usually the last step in the journey. And there are many who go by subway—but in my relative inexperience, I leave its description to someone more inured to its ways. Nevertheless I have spent years on both of the other circuits, ‘driving in’—and ‘taking the train’.


Taking the Saw Mill River Parkway into Manhattan’s West Side Highway is not for the faint of heart. Its lanes were designed for the days when it was truly a scenic parkway—and for cars which topped out at, maybe, 30 mph. Its modern reality is a cross between Disney World’s Space Mountain and the Grand Prix—hurtling cubes of steel, inches apart, doing 60, 65—and some of them are in a bigger hurry than the rest—these restless souls try to pass other cars as they go and will push their driving skills to the limit. This forces anyone in the lane beside them to be just as razor-sharp in controlling a vehicle that may not have the road-hugging quality of a BMW.

Taking the Harlem-Hudson line into Grand Central has seen many changes since my day—the locomotives were diesel, there was always at least one smoking car, and the night-time commuter trains had a bar car, which was an automatic smoker. The seats were upholstered but badly sprung—and larger. But some things remain the same—the etiquette of boarding as a group, of sitting beside a stranger (don’t read their paper—get your own!).

And the strange race for pole-position when debarking at Grand Central. This took planning. First, one had to rise as the train neared its platform and move towards the doorway. If you weren’t first in the doorway, there was no way you would have a chance to sprint towards the exit ramp with the other contenders. The choice of when to rise was a personal one—some rose quite early and simply stood in the doorway for a good ten minutes, others waited until the last minute and relied strongly on line-cutting bravado. Once the train stopped, there were maybe fifty yards of empty platform which the prepared passenger sprinted across, hoping to avoid the human condensation that made that exit a twenty-minute delay for those who took their time getting off the train.


This was the most cattle-car moment of any commute—people actually touched each other while we crowd-shuffled towards the open terminal beyond the platform gate. This was a world-class pot-luck situation—the people who crushed against one could be very attractive or quite repellent, even odiferous. There was no logic to the Brownian motion of the crowd—you couldn’t position yourself to mash against someone of your own choosing.

Eye contact, personal space, split-second go/no-go choices made at traffic-lit corners or when spotting an unmarked traffic cop car in the work-ward rally—all these and more were self-imposed by the natural human reactions to the different intimacies of rush-hour mass motion. And, not surprisingly, all these societies have a night-time, complementary society, with different rules respecting the fact that everyone is in an even bigger hurry to get home than they were to get to work that morning, but with the luxury that no one got fired for getting home late.

These societies have a geographical ‘location’ (if an unsupervised racetrack can be called a location) but they come into being for a few hours in the morning and again at night, each time fading away almost as soon as it peaks, barring delays and bad weather. The ‘train stuck in a blizzard’ has a society, too—which only comes sporadically and can skip whole years at times.


Talking on the phone is a society—or, again, several societies, based on context. A phone conference, a sales call, a relative calling to gab, a friend calling with an invitation—each one has its own little head-dance and body language. And we could hardly leave out Facebook or the internet in general, when cataloguing the many sub-societies we join and quit all through our days.

These were my musings on Society this morning after I read the New York Times Art Section article reviewing the Oscars and the reviews others gave it, particularly PC groups that disapproved of the irreverence and insensitivity of the jokes and songs—and of Seth MacFarlane, personally. The Times article pointed out the discrepancy between the Academy’s need to bring in ratings, especially from the younger demographic (call it the “Family Guy”-factor), and its need to appear sensitive to the community-watchdog groups that have been attacking “Family Guy” since its premiere in 1999.

Seth MacFarlane is a media juggernaut with three (yes, 3!) TV series now in operation: “Family Guy” (1999–2002, 2005–now), “American Dad!” (2005–now), and “The Cleveland Show” (2009–now). His ‘tastelessness’ finds favor with a younger audience because it embraces (as far as a TV show can) the new Internet society—which has few editors and even fewer censors. This younger entertainment society accepts the crassness as ‘bold honesty’ of a sort (which dawned, IMHO, with the Seinfeld episode in which Jerry et al. repeat the phrase “Not that there’s anything wrong with that.” until the defensiveness of PC-speak becomes its own post-modern joke/attitude).


PC-abandonment is the new humor in this society—if it makes old people like me wince, it’s funny. And television, in many ways, is still bound hand and foot by wincing old people. These dinosaur-people miss the point—we joke because we love—and we love ourselves—even our bigoted, foul-mouthed selves. And we won’t pussy-foot around about it anymore. Any old geezer that can’t let go of the militancy that served human rights so well in the twentieth century can’t help but be scandalized by our new-minted idols, like Seth, who are comfortable making a joke about Lincoln being shot in the head without being suspected of hidden racism or some twisted fundamentalism.

I would like to join in—but I’m too old and set in my ways to reinvent myself as an aging hipster—besides, comedy was never my strong suit… But my point is this: we have two major societal paradigms that are at something of a disconnect—Network TV and the World Wide Web. I can’t get in the spirit of it—for me, half the fun of a show is watching it when it’s aired. The feel of live TV—even scheduled, recorded, first-run TV shows—is lost for me whenever I have to find the show on the cable-box’s VOD menu—but my son watches all his ‘TV’ online, using our Netflix account. And I grew up admiring martyrs to the cause of civil and gender rights—I’ll never be able to speak lightly of those momentous changes that informed my lifespan.

I can handle Seth MacFarlane, Matt Stone, Trey Parker, Matt Groening—all the new-wave, internet-capable entertainers, but my laughs are a little repressed by the sheer effrontery of their attitudes. When I was a boy I wondered why it was so hard for my parents to see my point—now I understand—by their standards, I didn’t have a point. I wasn’t seeing everything through their experiences, I was seeing everything as new and without emotional context. And now I’m trapped in my memories of what our children see as ‘history’, if they notice it at all. Paperless, wireless, unconventional families, uncensored entertainment, the disintegration of traditional religious institutions’ power to shape people and events, access to everything—information, encyclopediae, maps and navigators, definitions, language translations, 24-hour news cycles—all the things that have remade what was once my stable little spot on the Earth—our children take them as givens—the same way we took drinking from our lawn hoses for granted (back when people still felt safe drinking from ground wells).

So, in the end, Seth MacFarlane did a great job hosting the Oscars—he also did a terrible job—it depends on your age.


“The Big Book of Movie Annotations”

I’m gonna write a book about all the historical details of all the movies, just like those annotated Shakespeare books that explain what ‘wherefore’ actually means—and why pouring poison into someone’s ear was a normal method of assassination in the context of “Hamlet”, etc.—I’m gonna include all the details I notice when I watch old movies, such as a modern closed-captioning transcriber’s mistranslation of a certain slang phrase from the thirties because it can be mistaken for something similar, if only phonetically, in the present day.


Future generations may need it spelled out for them. They may not appreciate the difference between Bill “Bojangles” Robinson dancing down the stairs with Shirley Temple in “The Little Colonel” (1935), say, and the heartbreaking montage of ‘blackface’ film-clips in Spike Lee’s “Bamboozled” (2000). They may miss the tragedy of Bill Robinson appearing, near the end of his life and far past his prime, in one of his very few film appearances—a world-famous dancer excluded, because white Americans perceived him as ‘inferior’, from what is sometimes called ‘Hollywood’s Golden Era’—the ‘studio system’ movie industry that monopolized filmmaking until the 1950s.


They may not understand the mournful soundtrack behind Lee’s montage of examples from popular culture of the Jim Crow era’s easygoing dismissiveness of African-Americans’ humanity—the TV executive character may live in more modern times, but his self-regard and his own experience of life have been just as marginalizing, if less overt.

So much of history is subtle. The Looney Tunes of the thirties had blatantly bigoted caricatures of non-whites—absorbed, unnoticed, by most audience-members of that time—that have since been aired (and that rarely) with a warning message of introduction that specifies the thoughtless racial profiling as an evil that was part and parcel of the creative culture of its day. As late as 1946, the syndicated comic strip “Walt Disney presents Uncle Remus and Tales of the South” was the basis for the Disney film, “Song of the South” (1946)—the NAACP disapproved of the African-American portrayals in the film even before “Song of the South” was released. This was the first time a Walt Disney movie was criticized for its ethical content (with the exception of Fantasia, criticized for animated ‘nudity’ six years earlier).


It’s amazing, really, the glacial change in racial attitudes, from slavery, to Jim Crow, to the Civil Rights Movement. The NAACP was founded back in 1909 and the Civil Rights Movement began just after WWII, but racism was still a source of rioting and conflict in the Sixties—and of isolated media spikes, like the Rodney King beating, caught live on tape yet still ending with the LAPD’s brutality exonerated—right up to the present day (that vigilante shooting of an unarmed teenager in Florida was less than a year ago).

Our first ‘black’ President was so ahead of schedule that no one my age or older could watch his 2008 acceptance speech without tears in their eyes. We may be forgiven if we mistake that for an end of prejudice in America—it is so certainly the end of any public ambivalence about racial equality that it’s almost as good. Racism has been reduced to marginal personalities and inbred cultural pockets—which, like domestic abuse, religious extremism, and misogyny—can only be changed by the law and time.


But that is only one of the many threads of history that are woven into our films—not the vicarious world of the movie itself but the techniques, language, artistry, science, and craft of all moviemakers, from starlet to soundstage doorman. The events of their day created mind-sets that varied as the world went on, from Edison’s early forays into cinema theaters to the CGI FX of the now.

Even deeper down, we can see the differences in attitudes towards the shared past—from Sergei Eisenstein’s “Alexander Nevsky” (1938), to Richard Thorpe’s “Ivanhoe” (1952), to Ridley Scott’s “Kingdom of Heaven” (2005)—we see the era of the Crusades, but through three different cultures’ interpretations! It gives a parallax effect to the movies, particularly those with historical settings.

Similar to Shakespeare, who requires translation due to the archaic language which old William was both using and inventing as he went along; similar to Dickens, whose early-Industrial-Era British-isms are as often a search into history as they are dialogue or narration; the movies of the twentieth century include a panoply of annotation-worthy dialogue, motivation, slang, and perceptions, both of their time and their view of past times.

To begin with, there are, of course, the Stars—and they offer so much of interest that, while writing my book, I shall have to be careful not to lose sight of my subject and get lost among the fanatical discourse (so-called ‘news’ of celebrities who are the objects of the ‘Fan’-public’s obsession). Then, there are the producers, the directors, the hundreds of others listed on today’s film credits (which is odd, if you consider that more people probably worked on the old films, when the studio only allowed about twenty or thirty names to be on the credits).


All those people had family (and/or love-lives) so there are ‘dynastic’ threads, as well, that could be linked chronologically to the shooting schedules of certain films. The same goes for their health—accidents, illnesses, dissolution, stress, mania—all these things are part of the scheduling, the tone, and the final team of filmmakers for any film.

Then there is music—and the films are not shy about the importance of music—biopics of musicians are a significant percentage of all movies made:

There’s “Amadeus” about Mozart, “Shine” about David Helfgott, “La Vie En Rose” about Édith Piaf, “I’m Not There” about Bob Dylan, “Nowhere Boy” about John Lennon, “La Bamba” about Ritchie Valens, “Ray” about Ray Charles, “Coal Miner’s Daughter” about Loretta Lynn, “Walk the Line” about Johnny Cash, “The Benny Goodman Story” (1956), “Rhapsody in Blue” (1945) about George and Ira Gershwin, “Till the Clouds Roll By” (1946) about Jerome Kern, “Immortal Beloved” (1994) about Ludwig van Beethoven, “Impromptu” (1991) about Frederic Chopin….


— And movies don’t stop at the life-stories—see this link for IMDb’s list of every Chopin piece included in every movie (hundreds!): http://www.imdb.com/name/nm0006004/#Soundtrack .

This is the reason I think movies must have hyperlinks—my “Big Book” of cinematic ‘anatomy’ may be a thing too large to exist as a single book. And movies (and thus their ‘annotation-logs’) are still being made, faster and faster so as to keep pace with the public maw—upturned and opened, like a baby bird’s beak, through the theatres, IMAXs, DVDs, VODs, Premium Cable, Basic Cable, and Network TV media.

And we approach a singularity, as well—the line that distinguishes a film from a television program erodes further with every ‘Sopranos’-style premium cable, cinema-quality series and every independent film that is released the same day both in select theatres and on VOD.

 

An online ‘Encyclopedia Galactica’ reference-site would be best served by starting now, while the living memories of its constituents can still provide the perspective for what is already becoming an endless pantheon of images, ideas, theatre, and history. And I find it strange that no one has yet popularized a phrase that means ‘all audio-visual media, including the oldest nickelodeon flip-cards, animations, silent films, early TV broadcasts, et al., all the way up to today’s (tonight’s, really) new prime-time episode or cinema release, or TV commercial or news report’. It is an undeniable stream of impactful media that has no single name.

‘Media’ is a word that gets thrown around a lot by people who don’t care about etymology. The Latin word media connotes the ambiguous, the neutral, the moderate, the middling. Prior to the digital era, it was mostly used as a term for the materials used in a work of art, for example: marble carving, tempera on wood, oil on canvas. The implication (I suppose) was that an artist’s tools were in a neutral state until used in a work of art—that red is merely red, ink is merely ink—and this was, for the most part, accurate. But technology changed that. Marshall McLuhan famously opined, “The medium is the message”—he was referring to television—but the point applies to movies, YouTube, and video-blogs, as well.


At present the medium we use most is electricity—but it is a refined, controlled, and programmed type of electricity, which we use to create music, literature, images, animations, and videos. We can call it ‘electronic media’, but that doesn’t signify much—like the word ‘art’, it has several meanings, and no specific meaning. Post-modern creativity has a real problem with nomenclature—it is so much more intricate a process than the early arts that the terminology can end up sounding like the title of a doctoral thesis in physics. But when we attempt a sort of shorthand, we end up calling them images or audios or videos—and, again, it means too much, and nothing in particular.

 

The one aspect that is diligently worked upon is the ‘genre’. In many ways, McLuhan’s quote could be re-phrased, “the genre is the message”. But that’s only part of it—‘message’ is an old-fashioned concept as well. Most entertainment industry ‘art’-work is used to sell ad-time, or charge a ticket for. So, a fully post-modern McLuhan might say, “The genre is the market-demographic”. Genre is also fascinating in that it implies a sensibility, a preference of content—that’s a pretty gossamer concept for a ‘pipe’ which entertainment-producers intend to siphon revenue through.

In some ways, we regular folks ought to consider being annoyed about market-demographics—but Hollywood would just blame sociologists, and rightly so. Ever since Sociology (the science of people in large numbers) proved that, while no individual’s behavior can be predicted, the behavior of people in groups can be predicted accurately—and the larger the sample-size (number of people), the more accurate the predictions are—ever since the 1950s, really—advertisers, marketers, promoters, campaign managers, even insurance salespeople have been finding more ways to use this information to prime their revenue pumps, and keep them flowing.
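
(An aside of my own, for the numerically curious, and nothing the sociologists or the ad-men spelled out: the ‘bigger sample, better prediction’ effect is just the standard error of an average shrinking with the square root of the group’s size.)

\[ \mathrm{SE}(\bar{X}) = \frac{\sigma}{\sqrt{n}} \]

For a simple yes/no preference that splits roughly 50/50, σ is about 0.5, so a sample of a thousand people pins the group average down to within about three percentage points (at the usual 95% confidence), while a sample of a million pins it to about a tenth of a point, even though any single person in either sample remains a complete mystery.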

It’s insulting—the fact that we can be predictable, as part of a group, is almost as dispiriting as if we were predictable as individuals—as if we only thought of ourselves as individuals. Here’s another insulting concept—I heard someone the other day saying something about ‘there are sixty million people in LA—so even if you’re one in a million, there’s sixty others just as good as you.’


Now that Earth’s population has reached seven billion, we ought to accept the fact that our ‘media-surroundings’ will be controlling our perspectives, our aspirations, and our plans—and that China has a point when it comes to locking down the sources of internet communication. ‘Crowd-sourcing’ is a new, but still primitive, form of getting a group of people to act as a single unit—the evolution of crowd-sourcing and propaganda and news-manipulation in the age of the internet has a massive potential, not just for putting unheard-of power in the hands of an individual, but of taking power away from more plodding, ancient centers of command, like governments and corporate executives.

We don’t study ourselves as much as we study what is in front of us—we always run towards the glamour in the wood—we never stop to question ourselves, our motivations, our priorities. Arthur C. Clarke was fond of pointing out, in the 1960s and 1970s, that humanity was racing to explore space when we had yet to explore two-thirds of our own planet. He was referring to the oceans, of course, and, as always, Clarke was right. We are still a long way from total exploration of our own planet—and we are doing a much faster job of destroying it, so if we wait long enough, there won’t be any undersea life left to explore.


By the same token, we don’t study our desires and urges, either. The study of entertainment is as important, and undeveloped, today as psychology was in Freud’s time. Few people took psychology seriously at first, and we still don’t see a whole lot of progress in that area—it is unpleasant to study humanity, ourselves, when it comes to the ‘dirty’ parts, the childish or selfish or cruel parts of our personae. So, too, would we prefer to enjoy our movies and TV shows and YouTube videos without anyone being a killjoy by pointing out what our entertainment choices say about us.

Layers of info are growing thicker and thicker over the sphere of civilization—safety tips, how to do well in school, how to get a job, how to keep a job, how to date, how to marry, how to raise children. Old living rooms never had remote controls—and old folks never had to learn to use them. Old car dashboards never had a buzillion buttons and slides, and old drivers only had to learn how to shift gears and step on the brake. Our lives are hemmed ‘round with protocols, user-manuals, assumptions (such as assuming you know what the ‘don’t walk’ light means when you’re standing on a street corner). We have to key in multiple digits from a number pad to enter our homes, pay with our credit cards, withdraw from an ATM, or log on to a computer. Even total idiots who do nothing but wander the streets are, nowadays, required to know a great deal about our public works and utilities to avoid the ‘death-traps’ that otherwise surround them in a modern city.

What used to be called propaganda is now an immersive experience, from cradle to grave, and if we don’t analyze our input, we will never know how used, manipulated, or conned we are in our daily lives. When our children began watching TV, we were very careful to explain how it’s all fake, how it’s all trying to sell something, and how its ultimate goal is to make money by piquing our interest for an hour or a half hour.