Stupid Is the New Smart
Stupid is the new smart.
Consider the nostrums of the digital era. “Video gaming is just a new form of literacy.” “Reality shows . . . challenge our emotional intelligence.” “Who cares if Johnny can’t read? The value of books is overstated.” “If you’re not on MySpace, you don’t exist.” Watching cartoons is “a kind of mental calisthenics” for small children. “The truth is, we need multitasking as much as we need air.”
World of Warcraft is not altogether different from The Canterbury Tales. Vicarious existence through reality television contestants enables us to enjoy healthier interactions with real people. Ignorance is underrated. Living is interacting through online intermediaries. Let your television babysit your kids. Attention deficit disorder is strength, not malady.
Welcome to Idiotville, population seven billion.
Pop culture is a wasteland.
The strange paradox of television is that viewing options have decreased even as the number of channels has dramatically increased. Viewers have their choice of hundreds of stations devoted to reality television, celebrity gossip, Faces of Death–lite home-video shows, and envy-inspiring vignettes on how wealthy people live. Worse still is television’s invasiveness. Standing in the supermarket checkout line, waiting in the airport terminal, pumping gas at the service station, standing on the train, sitting in the backseat—there is no escape. People are afraid to be alone with their thoughts.
The newsstand is increasingly indistinguishable from the supermarket checkout aisle. Celebrity news and shark attacks pass for what matters. Even political coverage resembles a dressed-up version of high-school gossip: He’s popular/she’s not. Can you believe that he is sleeping with her? Fight! Did you hear the outrageous thing he said? Something important generates interest only when reduced to its most trivial aspect. Declining readership has perversely motivated newspaper and magazine editors to race to the bottom to capture “readers” who don’t read by dumbing down content and dramatically shaving word counts. All this has left Susan Jacoby to quip in The Age of American Unreason, “It is only a matter of time before a publication markets itself as ‘The Magazine for People Who Hate to Read.’ ”
The cultural rot is especially visible on the silver screen, where Hollywood advertises its lack of creativity by green-lighting sequels, remaking old movies, and providing cinematic treatments of long-canceled television shows. All mediocre movies are destined to be remade as even worse movies. What got red-lighted so that The A-Team, Herbie Fully Loaded, The Green Hornet, and The Day the Earth Stood Still could get green-lighted? The success of one 3-D movie unleashes a barrage of 3-D movies. An industry built on creativity persists on copycatism.
In popular music, the visual bizarrely trumps the audio. No amount of studio wizardry can make Miley Cyrus, Britney Spears, or Katy Perry sound like Aretha Franklin, Kate Smith, or Mama Cass. Of greater consequence, no amount of airbrushing can make today’s Aretha Franklins, Kate Smiths, or Mama Casses look like Miley Cyrus, Britney Spears, or Katy Perry—who clearly possess the most important prerequisite to a successful career as a recording artist. Are fat, ugly, and old people necessarily bad singers too?
The World Wide Web has ironically made life more insular. The popularity of social networking sites extends the clique’s reach online; Twitter broadcasts the inane musings of one’s favorite celebrities in 140 characters or less; and BlackBerries allow for the illusion of keeping in touch by staying beyond the touch of friends and loved ones. The Staten Island teenager who fell down a manhole while texting, the Sacramento high school senior who sent and received 300,000 texts in one month, and the Pennsylvania woman who obliviously texted her way into a mall fountain are signs of the times.
Lost in virtual life, people have lost track of how to live the real one. New amusements have not made old distractions obsolete. Americans spend more than two and a half hours every weekday watching television, a figure that has risen since the advent of the digital age. By way of comparison, Americans spend about twenty minutes per weekday reading and about fifteen minutes “relaxing/thinking.” How we spend our leisure time is largely a waste of time.
Pop culture, of course, is not high culture. It has always appealed to the lowest common denominator. But reaching the lowest common denominator has never required pop culture to go so low. More disturbing, and injurious, is the go-along-to-get-along mentality of institutions ostensibly committed to cultural betterment. Libraries, schools, and museums rationalize assaults on culture as defenses of it.
At tony Cushing Academy in western Massachusetts, $40,000 in tuition doesn’t even get you a library anymore. “When I look at books, I see an outdated technology, like scrolls before books,” the prep school’s headmaster notes, adding, “This isn’t Fahrenheit 451.” It is, and 1984, too. In the place of twenty thousand discarded books, the school spent $500,000 on an Orwellian “learning center” complete with three giant flat-screen televisions and a cappuccino machine. School officials guessed that only a few dozen books had been checked out at any one time. “When you hear the word ‘library,’ you think of books,” one student explained to the Boston Globe, “but very few students actually read them.” The solution to this problem, so obvious to the administrators at this preparatory school, was to abolish books.
“I don’t read books,” a Rhodes Scholar and former student body president of Florida State University explains. “Sitting down and going through a book from cover to cover doesn’t make sense.” He Googles his way to the answer. A Duke University professor of literature candidly confesses, “I can’t get my students to read whole books anymore.” These aren’t dropouts scorning literacy but rather the young adults touted as the best and the brightest. Intelligent people are using reason to rationalize intellectual laziness as progress and ridiculing time-tested methods of acquiring knowledge, wisdom, and understanding as outdated.
Much of K–12 schooling involves educating for a standardized test, superficial learning that does to the mind what Botox, steroids, and plastic surgery do to the body. The type of education predominant in college is professional training that prepares cogs to fit into the economy rather than liberally educated citizens ready for the responsibilities of freedom. Institutions that shun broad knowledge graduate shallow people with narrow interests.
Quest for Learning is a public school in Manhattan where students play video games, blog, create podcasts, film YouTube-style videos, and partake in other digital activities that young people overdose on outside of the classroom. They also call their teachers by their first names, refer to their school as a “possibility space,” and forgo traditional lettered grades for “pre-novice,” “novice,” “apprentice,” “senior,” and “master.” Crassly mixing philanthropy with business, the Bill and Melinda Gates Foundation helps fund the school. Teacher Al Doyle calls spelling “outmoded,” says that podcasting is “as valid as writing an essay,” and regards memorization as irrelevant in light of search engines. “Handwriting?” Doyle remarks. “That’s a 20th-century skill.” It was surprisingly unsurprising when the New York Times Magazine said of the school’s executive director, “Until a few years ago she knew little about educational pedagogy and was instead immersed in doing things like converting an ice-cream truck into a mobile karaoke unit that traveled around San Jose, Calif., with a man dressed as a squirrel dispensing free frozen treats and encouraging city residents to pick up a microphone and belt out tunes.”
“In the 21st century, libraries are about much more than books!” the American Library Association boasts in conjunction with its National Gaming Day, “the largest, simultaneous national video game tournament ever held!” In the midst of branch closings and budget cuts, public libraries have acquired a new product to lend: video games. “The literacy aspect is huge,” maintains Linda Braun of the Young Adult Library Services Association. “Many video games have books related to them. And there is a lot of reading that goes on with actual game play.” In addition to lending discs, libraries host massive video-game tournaments and feature on-site consoles allowing patrons to play. “A library is no longer just a place for books,” argues Ryan Donovan, a public librarian in Manhattan. He says that gaming involves “a high degree of literacy and problem solving skills” and will “hopefully attract a new audience to NYPL [New York Public Library].” “Video games have evolved,” explains Allen Kesinger, organizer of the National Gaming Day at a library in Southern California. “They have become a medium to deliver sophisticated, emotionally charged stories.” He claims that “this strong focus on narrative” will help libraries “attract hesitant readers.” It is an open question whether games will serve as a gateway to books, as some librarians hope. Settled is their role in transforming libraries from centers of education to centers of amusement, from quiet sanctuaries in a noisy world to extensions of that high-decibel environment in which “shh” is the only verboten sound.
Science writer Steven Johnson, author of a book called Everything Bad Is Good for You, lauds the virtues of television zombies and mesmerized gamers: “Parents can sometimes be appalled at the hypnotic effect that television has on toddlers; they see their otherwise vibrant and active children gazing silently, mouth agape at the screen, and they assume the worst: the television is turning their child into a zombie. The same feeling arrives a few years later, when they see their grade-schoolers navigating through a video game world, oblivious to the reality that surrounds them. But these expressions are not signs of mental atrophy. They’re signs of focus.”
It is later than you think.
It wasn’t always so.
For much of the twentieth century, there was a concerted effort among intellectuals to spread knowledge and wisdom far and wide. Correspondingly, many regular people took full advantage of the great educational effort. Rather than mind-numbing amusements invading places of learning, learning invaded the leisure space. Blue-collar intellectuals were those most fervently dedicated to the idea of a well-rounded, educated citizenry.
A blue-collar intellectual is a thinker who hails from a working-class background and whose intellectual work targets, in part or whole, a mass audience. Given that blue-collar intellectuals benefited from such outreach efforts when they were more blue collar than intellectual, it is hardly surprising that they would lead such efforts when they found themselves in positions to do so.
This book focuses on a half dozen blue-collar intellectuals:
• Before Will and Ariel Durant partnered to write books, they joined in a scandalous marriage uniting a teacher with his child-bride student. Will, a seminarian excommunicated from his church, and Ariel, a pariah within her immigrant family for marrying a gentile, appeared destined for divorce. But the partnership yielded appearances on best-seller lists from the 1920s until the 1970s, and a Pulitzer Prize in 1968. Despite serving as the de facto professors of world history to millions, the Durants have strangely escaped the interest of actual professors of history.
• Mortimer Adler, an Ivy League Ph.D. who held neither high school nor college degree, fittingly launched the most successful adult education program in history, the Great Books Movement. As much a salesman as a scholar, Adler successfully marketed ancient texts to television-age America.
• Milton Friedman received state scholarships to Rutgers, New Deal employment, and government research grants. Then the pragmatic libertarian became the twentieth century’s most effective exponent of the free market, winning the Nobel Prize in 1976. Whereas the other giant of twentieth-century economics, John Maynard Keynes, the scion of a famous economist, was educated at Eton and Cambridge and cavorted in an aristocratic bisexual clique, Friedman was the progeny of Jewish immigrants whose Rahway, New Jersey, home doubled as a sweatshop.
• Dubbed “Ike’s favorite author,” Eric Hoffer captured America’s imagination in two prime-time CBS specials and left an indelible mark on political discourse through his landmark book, The True Believer. He was an intellectual everyman who migrated from skid row anonymity to Rose Garden chats with the president. Though readers continue to consult the unschooled but well-educated longshoreman philosopher, writers have overlooked one of the twentieth century’s most fascinating lives.
• So destitute that he shared the same cot with his brother into adulthood, so awkward that even dorks brushed him off, Ray Bradbury elevated his lowly finances and meager social status by selling five million copies of Fahrenheit 451, penning teleplays for Alfred Hitchcock Presents, The Twilight Zone, and ultimately his own Ray Bradbury Theater, and placing stories in Mademoiselle, Harper’s, Playboy, and The New Yorker. In the process of lifting himself up, the “poet of the pulps” elevated not only his readers but heretofore marginal genres such as science fiction as well.
Blue-collar intellectuals spoke to educated laymen without talking down to them. In the process, they uplifted the masses and rescued ideas from the academic ghetto. Such sins are not easily forgiven.