When You Knew Who the Heavyweight Champ Was – Part 2

Joe Frazier and Muhammad Ali

To reiterate my last column, we Boomers grew up knowing who the world heavyweight champion was. At press time, there are four of them, with names I’ve never heard of.

In 1921, the World Boxing Association (originally the National Boxing Association) was formed. It presumed to call the shots on who the champion was in each weight division, and everyone seemed cool with that. Thus, the whole world agreed that the reigning champions had names like Max Baer, Joe Louis, Max Schmeling, Gene Tunney, and Rocky Marciano.

Then, in 1963, another boxing authority was formed: the World Boxing Council. Uh-oh. Now, we had the potential to have TWO different champions.

And, wouldn’t you know it, that’s exactly what happened. Ernie Terrell (brother of Jean Terrell, who would later replace Diana Ross in the Supremes) was named champion by the WBA in 1965 after that body stripped Ali of its title over his rematch with Sonny Liston, while Ali remained the champ as far as the WBC was concerned.

But there were only two names to remember, and only one after Ali brutally savaged Terrell in 1967 in revenge for Terrell insisting on calling him “Clay,” an insult to the recent convert to Islam.

Once again, all agreed who the champion was.

For some reason, in 1983, the International Boxing Federation was formed. And for yet another reason, the World Boxing Organization was formed in 1988. Now we have four different governing bodies deciding who the champions are.

What a mess. Combine that with fighters like Ali and Jerry Quarry who are debilitated by their careers, the deaths of Duk Koo Kim and others in the ring, and the psychopathic behavior of former champion Mike Tyson, and it’s easy to see why the popularity of boxing is a shadow of what it was when we were kids.

Make no mistake: I really don’t mind.

When You Knew Who the Heavyweight Champ Was – Part 1

Ever heard of Wladimir Klitschko? Oleg Maskaev? Ruslan Chagaev? Sultan Ibragimov? According to the FOUR different authorities that govern professional boxing, each of those men is currently a heavyweight champion.

What a stinking mess. What happened, anyway? Remember when the champ was Floyd Patterson, and everyone knew it? Or Joe Frazier? Or the Greatest of all, Muhammad Ali?

I don’t regret professional boxing’s fall from grace and loss of popularity. It is a sport with tragic consequences for many of its participants. Who doesn’t fight tears at the sight of Muhammad Ali wracked with Parkinson’s syndrome? However you feel about pugilism, had Ali gone to work in another trade, odds are he would nowadays be a sixty-something who would never shut up (and who would be loved by all who knew him). Still, I do look back fondly on the days when everyone knew the name of the heavyweight champ.

Oleg Maskaev, right

When I was an electrician in Southern California in 1981, I worked on a sprawling house being built for one of the various heavyweight champions of the time: Mike Weaver. Perhaps you have heard of him. I had. But how many people knew he was the reigning world champion (according to the WBA)?

So the demise of the universally acclaimed champion had already taken place by then. While some would say the last one was Ali, I surmise it was actually Larry Holmes. After all, everyone knew about Holmes, even if many didn’t like the fact that he was the one who humiliated the aging, loudmouthed champion we had all grown to love (nobody outside of the US would even sanction the fight).

So what did happen to boxing, anyway? Why can nobody name the heavyweight champions nowadays, even if they COULD pronounce their names?

In this case, I think the Greatest is the one responsible.

In researching this piece, as with most of them, I consulted Wikipedia. It had this to say:

Prior to 1921, champions were acknowledged by the public at large. A champion in that era was a fighter who had a notable win over another fighter and kept winning afterward. Retirements or disputed results could lead to a championship being split among several men for periods of time.

Tune in tomorrow for more.

When We Thought All Leaders Were Assassinated

Watching Tom Brokaw’s 1968 on the History Channel brought back a memory that had been buried too deeply for me to dig up without a little help.

After the unspeakable assassinations of that year, I, and many other kids, assumed that leaders were destined to die violent deaths.

My mother’s shock at JFK’s death was still fresh in my mind on April 4, 1968, when local programming was interrupted to announce that Dr. Martin Luther King had been gunned down in Memphis. While the former had left her hysterical, her reaction to King’s death was more controlled.

It had nothing to do with racism. My mother was born in the Texas Hill Country to hard-working parents. My grandfather, who lost a lung to WW1 mustard gas, ran his own gas station. My grandmother was a schoolteacher.

My grandmother was also a paradox. The world she grew up in taught her to use the dreaded “n word” with no embarrassment whatsoever. Yet, the way she lived her life proved that she had no real animosity against any ethnic groups.

I thought it was hilarious that the taboo word would fly out of her mouth repeatedly in the course of normal conversation. But if my mother ever heard ME use it, I was beaten as hard as if I had cussed.

No, my mother’s resigned reaction to King’s death was because she had seen it coming. Many, many great men had died in the fight for civil rights, and she, and many other adults, knew that some racist peckerwood would get the unquestioned leader of the movement sooner or later.

But we kids weren’t so in tune to the events of the time. All we knew was that another famous leader had been gunned down.

The angst became heavier on June 5 of that year. After giving a speech in Los Angeles, and apparently headed for the Presidential nomination, Robert Kennedy was gunned down by some Palestinian nutcase who had an issue with Bobby’s stance towards Israel.

That did it for me. As Hubert Humphrey squared off against Richard Nixon, I wondered who would be killing them. I also wondered why anyone would even want a job that seemed to come with a certain violent death sentence attached.

It seems naive, of course, but it was a real part of my world. According to Brokaw’s 1968, I wasn’t the only one, either. Teachers reported that entire elementary school classes held the same belief.

I really don’t remember if my own classmates felt that same way. But I would be quite interested in finding out how many of you Boomer readers once thought that all leaders were destined to be gunned down.

When We Converted to the Metric System – NOT!

1970’s posters advising us that we had better learn the metric system!

The things we Baby Boomers were destined to accomplish! We would be the generation that would usher in cheap, clean nuclear power! We would be driving flying cars by 2000! And we would take the lead in adopting the efficient, easy-to-use metric system!

OK, enough with the exclamation points already. Obviously, all three of these particular dreams were overblown.

However, it may surprise you to know just how close we are to being a metric nation. Read on.

It was all started by those lovable masters of illogic, the French, who decided we needed a logical system of measurement. According to metric scholar Pat Naughtin:

The metric system used all around the world has three parts. In France in the 1790s, it was named the “decimal metric system”. The system part came from John Wilkins in England, the metric part came from Burattini in Italy, and the decimal part came from the USA. Benjamin Franklin, Thomas Jefferson, and George Washington were very active in getting the French “philosophes” to use decimal numbers for the “decimal metric system”.

US Dept. of Commerce brochure of the 70’s

OK, raise your hand if you knew that our founding fathers were part of the team behind the metric system. THIS history buff didn’t!

During the 60’s and early 70’s, Great Britain, Australia, and Canada all began a systematic conversion to the system that most other countries had already officially adopted: the metric system. That left the US, Liberia, and Burma as the holdouts still using the old English units. Thus, talk began spreading among legislators, educators, and manufacturers about switching over.

There was really no choice in the matter. Go metric, or lose every economic and intellectual advantage you have over the rest of the world.

Thus, the metric system began being taught to us Boomer kids in schools.

At this point, it would be good to point out that pharmaceutical manufacturers had been using the metric system since early in the 20th century. So had much of the tooling industry. When it came to teeny tiny amounts, it just made more sense to them to use grams and millimeters.

But it was in the 70’s that many others followed suit. Food manufacturers, for example. Cereal boxes began being labeled in metric weights, with the standard weight in parentheses. The implication was that the METRIC system was the preferred one.

The liquor industry was an eager adopter. I know that by the time I could legally purchase hootch in 1980, the half-gallon, fifth, and pint were gone, replaced by their metric equivalents. Wine was sold in metric quantities as well. Interestingly, the working man’s preferred libation, beer, stubbornly resisted change.

Indeed, it was stubborn resistance by the working stiffs among us (including, for the longest time, ME) that kept the US from jumping in headlong with the rest of the world and becoming an official metric nation. We cringed at the sight of kilometers on our speed limit signs. We rolled our eyes at temperatures given in Celsius. We were disgusted when our SAE wrenches that we might have inherited from our fathers and grandfathers no longer fit these newfangled nuts and bolts on our cars.

A flyer published in Singapore in the 70’s shows an approach that might have made the transition easier: children, teach your parents the metric system!

But instead, we tended as a generation to agree that our feet, pounds, miles, and gallons were just fine, thank you.

However, the metric system continued to be adopted despite our indifference and/or opposition. Our American-made cars began to sport speedometers that read in both MPH and KPH. Some gas stations extended the functionality of their old two-digit-price-limited pumps during the Arab oil embargo of 1973 by selling the now-expensive stuff by the liter. The Olympics turned us into grudging experts on what constituted a meter, a kilogram, or a kilometer. Domestic cars even began to be manufactured with metric bolts.

That brings us to today. The US continues to be one of three blips on the world map that are still officially non-metric. But in reality, we are as much, or even more so, metric than some official countries. It just doesn’t say so on our company letterhead.

One last thing: for an inaccurate, but close enough, conversion from Celsius to Fahrenheit, just double the number and add thirty. Thus, 30 degrees Celsius becomes 90 degrees “Americun”. Actually, it’s 86 degrees, but at least you know it’s HOT!
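For the curious, here is a quick back-of-the-envelope check, written as a throwaway Python sketch purely for illustration (the exact formula is F = C × 9/5 + 32):

```python
def quick_f(celsius):
    """The column's shortcut: double the Celsius reading and add 30."""
    return 2 * celsius + 30

def exact_f(celsius):
    """The exact conversion: multiply by 9/5 and add 32."""
    return celsius * 9 / 5 + 32

for c in (0, 10, 20, 30, 40):
    print(f"{c:>3} C -> quick: {quick_f(c):5.1f} F   exact: {exact_f(c):5.1f} F")

# The shortcut is dead-on at 10 C (both give 50 F) and drifts a few degrees
# elsewhere: 30 C reads 90 F instead of the true 86 F.
```

The two agree exactly at 10 degrees Celsius and drift apart by about one Fahrenheit degree for every five Celsius degrees you move away from that point.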

When There Was a Bad Draft in the Air

The older members of the Boomer generation got to see lots of cool things. They watched Howdy Doody. They wore coonskin caps. They got to play with baking powder submarines.

However, they also held a dread of one day turning eighteen. The draft was on, and a particularly nasty war was ongoing. Kids (and I mean that literally, as I was certainly a kid when I was eighteen) had to make profound decisions. Would they opt for ROTC? Would they volunteer for a more appealing form of service than that of the swamp-wading, booby-trap-avoiding Army grunt? Or would they stay in school, apply for CO status, or head for Canada?

The draft began with a proclamation signed by Abraham Lincoln during the Civil War. Back then, a one-time payment would exempt you from having to serve.

After that, drafts would be implemented during times of war and suspended afterwards. In 1940, FDR signed the Selective Training and Service Act which created a draft during a time of peace, although the writing was clearly on the wall regarding the US’s future involvement in WWII.

But after the War, the draft stayed. Fewer young men were drafted in peacetime, but the possibility was there nonetheless. As wars like Korea and Vietnam escalated, more and more youths were sent the dreaded letter from Uncle Sam.

In 1969, a lottery was held that you did NOT want to win. Up until then, the government’s policy was to draft older men first as needed, meaning your odds of being called up kept increasing until you aged past whatever cutoff was in place. But the lottery drew birthdates at random to determine the order in which men would be selected. The later your birthdate was drawn, the less likely you were to be called.

Twenty-year-olds were the primary target of the lotteried draft. If you turned twenty on September 14, 1969, you were virtually guaranteed to be called up; that was the first date drawn. June 8 was drawn 366th (leap-year birthdays were included, so there were 366 dates in the drum), so you had a pretty good chance of avoiding the dreaded letter if you were born on that day.

The draft started with twenty-year-olds, then progressed through each older year up to twenty-five. Then it dropped to nineteen, then eighteen.
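For the mathematically inclined, here is a toy sketch of how such a birthday lottery works, written in hypothetical Python purely for illustration (it is not a reconstruction of the actual 1969 drawing or its results):

```python
import random
from datetime import date, timedelta

# Build all 366 possible birthdays (February 29 included), using the leap
# year 1968 purely as a calendar template.
birthdays = [date(1968, 1, 1) + timedelta(days=i) for i in range(366)]

random.shuffle(birthdays)  # stand-in for capsules being pulled from the drum

# The order in which a date is drawn becomes its lottery number.
lottery_number = {d.strftime("%B %d"): i + 1 for i, d in enumerate(birthdays)}

# A low number meant an early call-up; a number near 366 meant you were
# probably safe. (In the real 1969 drawing, September 14 was #1 and
# June 8 was #366.)
print("September 14 drew number", lottery_number["September 14"])
print("June 08 drew number", lottery_number["June 08"])
```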

However, even though both forms of the draft were set up to spare eighteen-year-olds, the fact is that many of them were still drafted. Curious.

As Vietnam slowed down, so did the draft. In 1973, it was discontinued altogether. I was fourteen. I was very, very happy. So was every other Boomer male who had evaded compulsory military service. In 1975, even registration was stopped.

But in 1980, Jimmy Carter reinstated registration due to the Soviet invasion of Afghanistan. He intended to send a message to the Kremlin. Of course, now the Soviet Union is no more, and Russia is a democracy, and WE’VE invaded Afghanistan. Despite those strange twists of political intrigue, registration continues.

But today’s youths largely view it as a mere rite of passage. We who remember JFK, however, can recall a time when “draft” had a much more sinister connotation than a good cold beer or a chilly breeze in one’s house.

When the Seven Deadly Words Were Really Deadly

The year was 1972. George Carlin, the brilliant comedian best known at the time for his portrayal of the “Hippy Dippy Weatherman” in appearances with Johnny Carson and on The Flip Wilson Show, released an album called Class Clown. The album, which appeared long before parental advisory labels existed, contained a magnificent, highly offensive routine called “Seven Words You Can Never Say on Television.”

The Seven Words, which you can view here in all their profane glory, exemplified Carlin’s rapier-sharp intelligence when it came to dissecting how society works. Clark Gable had used what was then known as “the D word” way back in 1939 in Gone With the Wind, and in the years between then and 1972, many formerly taboo words had become acceptable on broadcast television.

But there was no doubt about it: in 1972, there was NO WAY you would hear any of Carlin’s Deadly Seven on broadcast television.

How times have changed.

George Carlin being arrested in 1972

Carlin, who taped the album in Santa Monica, California, was arrested later that year in more conservative Milwaukee for performing the sketch onstage. The charge? Violating obscenity laws. It was tossed by a judge soon afterwards, but the bust still earned the comedian a great deal of infamy that only enhanced his rebellious reputation.

What would happen to him today in Milwaukee? Well, obviously, no charges would be filed; equally obviously, nobody would bat an eye. You hear worse than that at a hip-hop concert. That is one of the differences, for better or worse, between the world we Boomers live in now and the one that we grew up in.

The Deadly Seven began to fall obsolete shortly after Class Clown’s release. One of the first to go was the “S word,” frequently heard at sporting events. The dreaded “F word” was also picked up by parabolic microphones on the sidelines of football games and such. And Carlin pleaded the case that the “T word” didn’t even belong on the list; well, it too was heard on many a live broadcast.

Scene from The Deer Hunter

But the list came crashing down for me in 1982, when a local Los Angeles station, KTLA I believe, aired the uncut version of The Deer Hunter. At each commercial break, the audience was warned that what they were seeing was the original R-rated movie.

I don’t know how they managed to pull it off without getting nailed by the FCC, but I remember pulling it in on a pair of rabbit ears over the airwaves. And, if I recall correctly, that film contained every single one of Carlin’s Deadly Seven. Repeatedly.

Nowadays, even the cleaned-up A&E version of The Sopranos is liberally sprinkled with “S words.” SNL is known for regularly receiving FCC wrist-slaps for “F-bombs.” And let’s face it, an episode of Deadliest Catch, while technically “bleeped,” leaves very little to the imagination as to what “French” is being spoken.

But once upon a time, George Carlin nailed the state of television censorship with his unforgettable Seven Deadly Words. Ironically, while coming up with seven words too obscene to say on TV is quite impossible today, there are probably hundreds of words too Politically Incorrect to ever make the airwaves. And lovable Otis the drunk on Andy Griffith, or Foster Brooks at the Dean Martin Roasts? Horrors! That could never happen today! Our village-raised children should never see such excesses! Why, the next thing you know, they would be too upset to listen to Akon on their iPods!

To repeat myself yet again, what a long, strange trip it’s been, being a Boomer.

When Lucy Was on Television

We Boomer kids had a few constants in our lives growing up, some good, some bad. There would be coverage of the Vietnam War every night on the news. Dad would install a new license plate on the car every January. And Lucille Ball would have a hit television show.

Lucy was best known, of course, as Lucy Ricardo, beloved bride of Ricky, in a show that many rank as the most popular ever. I Love Lucy ran for six seasons beginning in 1951, and the duo went three more years on The Lucy-Desi Comedy Hour. And just because you were too young to catch it the first time didn’t mean you had to miss it. It was the first series to be filmed in a studio instead of being broadcast live, and Desi and Lucy shrewdly gained all rights to the show, meaning they made untold millions licensing it for syndication.

It’s nice when the artists win, instead of the executives.

Anyhow, it is unlikely that a single Boomer in the US has never seen an episode of I Love Lucy. To this day, it remains one of the most popular syndicated shows on television.

But just because I Love Lucy sailed off into the prime time sunset didn’t mean Lucille Ball was done with television. Far from it.

Scene from The Lucy Show

In 1962, Lucy, now amicably divorced from Desi, began starring in The Lucy Show. It was a perennial ratings giant for CBS, and it starred Lucy’s buddies Vivian Vance and Gale Gordon, who played the constantly angry Mr. Mooney. Gordon had been the first pick to play Fred Mertz on I Love Lucy, but he was committed to another series (Our Miss Brooks) at the time. However, he did make some guest appearances. When Lucy was offered her new show, Gordon was immediately selected to play the part of the banker, Mr. Mooney.

Wouldn’t you know it, he was committed yet again to another series, this time playing Mr. Wilson on Dennis the Menace. But by the show’s second year, he was available, and he replaced a previous banker character played by Charles Lane.

Lucy’s own Desilu Studios produced the show, so Lucy was the show’s boss. When she sold the studio to Gulf and Western Industries in 1967, she decided she didn’t want to work on a show over which she no longer had full creative control. So The Lucy Show disappeared, to be immediately replaced by Here’s Lucy.

Scene from Life with Lucy

Here’s Lucy starred Gordon, Mary Jane Croft (who had assumed the role of a best friend for Lucy after Vivian Vance’s 1965 departure), and, of course, Lucy herself.

The show featured all new characters, but the audience had a hard time telling them from the former ones. They had different names, but as far as we were concerned, it was still Mrs. Carmichael and Mr. Mooney. And Lucy, of course, still called the shots.

Here’s Lucy was as popular as the previous show, consistently landing great ratings until 1974, when its popularity sagged just a bit. At this point, what happened is not exactly clear. Either Lucy herself declared that a great show had run its course, or CBS pulled the plug on the last old-style comedy of the 60’s to enhance its reputation for making bold new moves with shows like All in the Family. Either way, it was still in the Top Thirty when it was canceled.

So for the first time in memory, Lucy was not on television every week.

Lucy and Gale Gordon tried a 1986 comeback, Life with Lucy, but by then the humor formula of the old shows no longer worked. I recall it having the same premise as her two previous shows, just with everyone having gotten a lot older. I think it would have been a success had it been released in the 60’s. Two short months after its debut, it was gone.

Nowadays, the very concept of the sitcom has been largely shoved aside. Long-running shows like Everybody Loves Raymond, The King of Queens, and Friends are quite rare. Something called reality TV has unfortunately proven popular, with numerous clones of the (IMHO) disturbing genre getting high ratings.

Ah, for a simpler day, when you could count on Lucy and Mr. Mooney to be on TV reliably every week.

When Litigation Wasn’t So Blasted Commonplace

Oh, lord. I’m opening myself up to cease-and-desist orders and libel lawsuits here.

Well, I have freedom of speech. So here goes…

According to an article by Emily Olsen in the Summer 2005 issue of the Georgetown Journal of Legal Ethics, this summed up the stance of the American Bar Association once upon a time:

In 1908, the American Bar Association (“ABA”) established and promulgated its first ethics code, known as the Canons of Professional Ethics, which condemned all advertisement and solicitation by lawyers. Academics at the turn of the century generally viewed advertising as not appropriate for the legal profession. They believed that only tricksters used legal advertising in order to improve their reputation and an honest lawyer worked to earn his good name. “In the case of the lawyer, advertising of one’s own willingness to be trusted as a man of unselfish devotion frosts the rose before it has a chance to bloom.”

Wow, shades of the NRA (and I am NOT anti-gun, before anyone’s hackles get raised) coming out against machine guns and sawed-off shotguns in the hands of the general populace in the 1930’s. Once upon a time, common sense was much more common.

Well, the times, they have a-changed.

In the mid-1970’s, the State Bar of Arizona had the good sense to take action against a law firm that, in its view, was crossing the line with its advertising.

In 1977, the US Supreme Court sided against Arizona. On that day, floodgates were opened, leading to massive full-page Yellow Pages ads promising you wealth and justice if only you would hire the firm in question to sue the crap out of whoever you felt had wronged you.

Perhaps not a contemptible idea in philosophy, in practice it has transformed a society whose individuals once, by and large, blamed themselves for their mistakes into one that lays the blame on the state, the nation, the doctor, or perhaps McDonald’s or some other wealthy corporation that can darned sure spare the cash.

Prop 65 warning, telling me the light fixtures I just bought might cause cancer

The same article that I quoted from outlines a comprehensive list of rules that lawyers must now follow when advertising their trade.

Sadly, that list doesn’t bar bad taste. Thus, we are bombarded by things like garish full-page ads in phone books and magazines promising large payoffs if the potential plaintiff who reads the blurb chooses to enlist the firm (no win, no fee!!!) to go after the real or imagined injurer with barrels a-blazing.

And society has paid for the litigation free-for-all that widespread lawyer advertising has contributed to.

Now in all fairness, I’m a big John Grisham fan. Grisham himself is a licensed attorney who instead uses his literary talents to make a nice living. And yes, I’m familiar with Grisham stories like The Rainmaker, in which a corrupt insurance company has to be sued before it will see the error of its ways.

But if I recall, none of the plaintiffs in the book found Rudy Baylor in a large, intrusive Yellow Pages ad or on a massive billboard or on an often-rerun television commercial.

Thanks for letting us know

And yes, corporate corruption exists that victims must use legal means to fight against.

But if you find it necessary to write down a 1-800 phone number from an afternoon ad in order to seek justice, then you have issues of your own.

You see, lawyers have thrived on word-of-mouth referrals since time immemorial. I have had a day or two in court, and each time, my lawyer was someone who was recommended by a trusted friend or by a previously-utilized attorney who was unable to handle the particular case.

When we were growing up, a lawyer’s advertisement was his shingle, or perhaps a discreet bold-type phone listing. And, if I recall, medical bills were reasonable enough to pay on the spot upon dismissal from the hospital. McDonald’s didn’t have to put large warnings on their coffee cups designed to let the particularly low-witted know that the contents therein were hot.

But nowadays, if the average Joe slips on the ice, or trips and injures his ankle, or (horrors!) spills hot coffee on himself, instead of warning himself to be more careful next time, he frequently finds himself reciting an 800 number that he’s heard so many times on TV ads that he has it memorized. And that is one of the many differences between the world we grew up in and the one that we now inhabit.

BTW, for a defense of the infamous McDonald’s decision that is completely unaware of how ironically funny it is, click here.

When Everybody Smoked

If a TV show or movie about the 50’s or 60’s is REALLY authentic, it shows nearly everyone above the age of 21 having a smoke.

Our generation was perhaps the smokingest one in history, at least during those two decades. And no wonder! We were bombarded with ads on TV and radio, in newspapers and magazines, and on billboards. And we had a pretty good idea that it was bad for us, but we weren’t 100% sure.

In 1947, Smoke! Smoke! Smoke! (That Cigarette), written by Merle Travis and Tex Williams, painted a dark picture of tobacco addiction and its ultimate effect: death. But it was not until 1966 that the US government finally required labeling on cigarette packs stating “Caution: Cigarette Smoking May Be Hazardous to Your Health.”

In my house, the warning had an immediate effect. Mom was a Salem chain smoker, and as soon as I was old enough to read that warning, I began hounding her to stop. She finally did, a few years later.

Smoking began declining late in the 60’s, but it was still extremely commonplace. The idea of a smoke-free restaurant, or even a smoke-free SECTION of a restaurant, was inconceivable. If you went out to eat, you smelled cigarette smoke. It was a given.

All cars had ashtrays. So did the vast majority of homes. Movie stars like John Wayne hawked cigarettes, though Wayne himself filmed many anti-smoking ads late in life as he fought cancer.

Speaking of the ads! Cigarette advertising on TV was banned in 1970. Yet, even though I was only ten when I heard my last cigarette commercial, I can easily recite at least twenty jingles and slogans that I heard over and over.

Smoking was simply cool. That’s why so many teenagers did it. There was a stern notice on cigarette machines warning minors not to operate them, but it was never enforced. Thus, many eighteen-year-olds already had two-pack-a-day habits.

The same year the warning appeared on cigarette packs, a commercial aired that would be replayed for at least twenty years afterwards. It showed a kid imitating his father’s every move, including picking up a pack of cigarettes and looking it over right after his father lit up.

Smoking has a negative reputation these days. It is no longer allowed in many establishments, and you can’t smoke on US domestic flights. But in the 50’s and 60’s, it was everywhere. We grew up smelling it and got to where we didn’t even notice it anymore.

When Cursing Got Your Mouth Washed Out With Soap

One of the most obvious differences between the present day and the world we Boomer kids grew up in is the number of naughty words flying through the air. What would our grandparents think if they heard modern-day conversations at the shopping mall? Anyone who watches network television is now subjected to a number of George Carlin’s famous Seven Deadly Words on a regular basis. Shocking stuff to someone who might have just time-traveled here from 1965.

Profanity, I discovered, has a very interesting history. Taboo words have been largely generational. Thus, thumbing one’s nose is nowadays considered a childish insult. But go back a hundred and fifty years, and “cocking a snook,” as it was then known, was as obscene as the modern-day one-fingered salute.

The scatological S-word has taken the opposite track. Once, it was as proper to use as, say, the term “feces.” But somewhere along the line, it gained a reputation for vulgarity.

One thing’s for sure, though. Words and expressions that were sternly forbidden by society in general, and our parents in particular, are now quite commonplace, for better or (mostly) worse.

The Mona Lisa racily thumbing her nose on a 1911 postcard

But other pendulums swing in opposite directions. Take ethnic terms, for instance.

1960’s Miami, Oklahoma was ethnically diverse, to a degree. The degree consisted of two races: white and Native American. Of course, back then, the latter race was “Indian.” But nowadays, that word has taken on some tarnish. Thus, you don’t hear it as much as you did back then.

We kids also grew up using the infamous “N” word with great innocence and lack of ill will. We used it as a playful insult, the kind of name you’d call a friend in jest. If you were really mad at someone, the N word would NOT be in the arsenal of insults you would fling at them.

Perhaps the absence of blacks in 1960’s northeast Oklahoma is why we used the word so freely. I would never have dared utter it during trips to Tulsa, as we knew that it was indeed a strong insult when used on those whose family histories included slavery.

But at the schoolyard, one of the favorite tricks to play on gullible friends was to say “Guess what?” “What?” “You’re a N- and I’m not!”

Nowadays, that expression, when used by anyone other than a black person, carries the same social stigma, or perhaps one even stronger, as that of the classic F word.

The whole lightening up of the on-air use of salty language has to be traced back to Rhett Butler’s famous adiós statement to Scarlett O’Hara. The rumor has long been that David O. Selznick was fined $5,000 for putting the word “damn” in the film. However, the fact is that the Motion Picture Association board passed an amendment to the Production Code on November 1, 1939, to ensure that Selznick would be in compliance with the code. The amendment allowed the use of two words, hell and damn, as long as their use was occasional and necessary to the storyline. The first hurdle had been removed.

The Bawdy Bard

It didn’t take long for those two words to be used more than occasionally. Profanity steadily increased throughout the 40’s and 50’s. In 1966, Who’s Afraid of Virginia Woolf? was released tagged “Suggested for Mature Audiences” (the MPAA letter ratings were still two years away), due to its use of the “GD” combination. The next year, the F-bomb made one of its earliest appearances in two British films, and the S word turned up in the American release In Cold Blood.

My own wake-up call came in 1973, when I saw Cops and Robbers. I was stunned to hear language that had only been heard behind the school woodshop in a film rated PG.

Television was quick to follow. G-D was one of the earliest harder curses to make it on the air, with “son of a bitch” close behind. I remember Alan Alda using the latter to great effect in a 1979 MASH episode, insulting a South Korean officer who was transporting a female civilian to her execution.

Thus today, television profanity is either out in the open or else just barely bleeped, often with the beginning and end of the word or phrase left audible.

However, to bring all of this into perspective, perhaps we should look at the works of William Shakespeare.

The bawdy bard liberally sprinkled his works with words like Gadzooks, Zounds, God’s bodkins, God’s body, by God’s mother, and most horrifying of all, “God’s blessing on your beard.” In Shakespeare’s time, combining the use of God with a sarcastic reference to a man’s beard was right up there with today’s “M-F!”

In other words, Shakespeare’s plays were largely of the R-rated variety, or at least PG-13.

Obviously, there’s a balance in there somewhere.

But by and large, many if not most Boomers fondly look back on a time when one used foul language at the risk of a mouthful of soap bubbles, and one was protected from such offensiveness on television and at the movie theater.