Wednesday, July 25, 2012

Should student loan debt be reduced or forgiven?


“Before any great things are accomplished, a memorable change must be made in the system of education and knowledge must become so general as to raise the lower ranks of society nearer to the higher. The education of a nation instead of being confined to a few schools and universities for the instruction of the few, must become the national care and expense for the formation of the many.” ~John Adams
There was a time when the greater purpose of higher education in the United States was creating informed citizens who could find a place and engage in a participatory democracy.  Americans have always prided themselves on being a beacon of democracy to the rest of the world; an example to counter dictatorial, totalitarian, or fascist regimes. We as a nation have promoted democracy as the way forward for countries who wanted to throw off the political and economic shackles of repression.
One of the instruments of promoting democracy took the form of access to higher education. Higher learning was seen as a bastion of everything from enlightenment thought to industrialization. We built cathedrals to higher education; Harvard University, for example, was founded 140 years before the American Revolution. The development of creative thinkers with a well-rounded education that included the Arts and Humanities was held in high regard as a moral, cultural and economic imperative.
After the Second World War, the GI Bill allowed millions of veterans to pursue higher education; for the first time in the nation’s history, higher learning was accessible to virtually all classes of people. This was truly a transformational moment in American history; most of the men and women who benefited from the GI Bill were the first in their family to go to college. These events coincided with a temporary spike in civic engagement that lasted well into the 1960s; voter participation was particularly strong at this time (Pintor, Gratschew & Sullivan, 2003).
It is clear from John Adams’ quote that the Founders intended a role for the Federal Government in the education of our nation. However, if Adams were transported from his time to the present, he would recognize nothing in our current education policy that demonstrates this priority. In fact, for the last thirty years or so, public education, including higher education, has either been under attack for supposed inefficiencies, with its funding continuously subject to cuts, or has been commodified by corporate interests into little more than a vocational factory supplying their too often temporary, low-paying, benefits-poor labor force. But perhaps the most telling aspect of the shift in curriculum emphasis in higher education is the sharp rise in tuition, with the concomitant need for ever larger student loans. Student loan debt has exploded in recent years, with proposed solutions ranging from forgiving most or all of the debt to abolishing the loan program altogether (Raum, 2012). The question is: should student loan debt be reduced or forgiven? As someone who believes that we cannot call ourselves a fully functioning democracy unless every citizen has equality of opportunity, I argue not only that student loan debt should be reduced or forgiven, but also that tuition costs should be reduced, and that all of this can be paid for if our national priorities are in order. In this paper I will show how our priorities have been skewed away from a democratic notion of education to one that largely benefits corporations and university administrators.
If we are to comprehend what the Founders intended for educating the American populace, and how education can feed the soul of democracy, it is instructive to understand the forces that have worked to undermine democracy (while disguised as pro-democracy), bifurcate our education system along class lines, and blur the distinction between the economic (capitalism) and the political (democracy), all in order to tip the balance of power toward big business and away from the average citizen and their voting voice.
In the 1920s, government, PR and advertising firms, and business leaders, spurred by the success of President Woodrow Wilson’s World War I propaganda service, banded together to form public opinion and habits around “creating artificial wants, imagined needs, a device recognized to be an effective technique of control” (Chomsky, 2004). Edward Bernays, a member of Wilson’s propaganda service and one of the founders of the PR industry, claimed that “the general public are ignorant and meddlesome outsiders whose role in a democracy is to be spectators, not participants” (Chomsky, 2004). These business and government leaders sought to create a “philosophy of futility” and “lack of purpose in life” (Chomsky, 2004) through the focus on meaningless and superficial consumption. They knew that, just as in Adolf Hitler’s Germany, a passive, disinterested public is one that is also easy to control and manipulate; one that will turn command of their lives over to presumably better educated, business-savvy men of property. Though the New Deal cultural programs and introduction of the GI Bill served to ameliorate some of the darker aspects of this form of brainwashing, the process of cultivating a passive American public has continued mostly unabated ever since.
Another aspect of this anti-Democratic agenda was to champion the notion of two sets of standards for education as it related to democratic participation. Journalist Walter Lippman, in concert with the aforementioned parties, reported on his version of “democracy” in the 1920s, stating “…representative democracy entailed creating two modes of education – one mode would be for the elite, who would rule the country and be the true participants in the democratic process, and the other branch of education would be designed for the masses, whose education would train them to be obedient workers and passive spectators rather than participants in shaping democratic public life” (Giroux, 2004).  On the other hand, in the tradition of John Adams and the Founders, Henry Giroux makes the case that “Progressives like W.E.B. Dubois, John Dewey and Jane Addams rejected such a divergence of educational opportunity outright. They believed that education for a democratic citizenry was an essential condition of equality and social justice and had to be provided through public and higher education” (Giroux, 2004).
In 1964, Mario Savio, a leader of the Free Speech Movement, recognized what was happening to higher education and to the students hoping to benefit from it. Before his now-famous comments on “the operation of the machine,” he told his audience that University of California President Clark Kerr had compared the Board of Regents to the shareholders of a corporation and himself to its manager, which for Savio meant that the students were raw materials, to be sold in the marketplace. Savio strenuously objected to this characterization, stating, “we are not employees, we are human beings!” (Savio, 1964).
The commodification (we as individuals are a commodity, and our entrance to a higher social class will cost us) and vocationalization (the shift from the humanities to a business-oriented or corporate training model) of higher education has resulted in ever-rising tuition costs. University presidents are hired from the ranks of the corporate world and are paid accordingly (as long as they are generating the appropriate amount of revenue). Universities and corporations are teaming up to build entire degree programs around entrance directly into that corporation’s work force, going as far as including the corporate name in the titles of the programs and courses; surely these firms are looking for a financial payoff. The direct influence of corporations on college and university curriculum and administration is a recent phenomenon, but it is no coincidence that its appearance on the scene coincides with the steep rise in tuition. In the 1980-81 school year, average tuition and fees for public and private four-year colleges and universities combined were $8,672; in the 2009-10 school year, they were $20,986 (What are the…2011). During that same span, median household income was virtually stagnant, ranging from approximately $37,000 in 1980 to approximately $46,000 in 2008 (Median household income, 2009). Clearly, for the vast majority of Americans, college tuition is not affordable.
Perhaps one is not convinced that there is a connection between corporate influence at universities and higher tuition fees. Consider an additional tack, then. In France, there are no university tuition fees; in Germany, students or their families pay only $1,000 per year to attend university (Sheng, 2010). One argument that occasionally surfaces against reduced fees or free college education runs, “Not one French university appears in the top forty of world universities; you get what you pay for” (Sheng, 2010). According to this line of reasoning, higher tuition fees correlate with a better education. Then why are American companies going out of their way to hire foreign workers (Bort, 2011), often paying them less than they would have to pay an American worker? These workers are supposed to be arriving here after attending inferior schools, whether in Asia or Europe, because higher tuition fees supposedly signal a better education. So we pay higher tuition to have, ostensibly, the best university education, yet there are no jobs for us when we have completed that education, while people who studied in other countries, presumably with lesser standards and rigor because they pay little or nothing, are being paid less to do the very jobs we trained for at supposedly better colleges and universities.
In an interview with the San Jose Mercury News, the chief executive of Intel, Craig Barrett, discussed the integration of India, China, and Russia into the new global economy this way: “I don’t think this has been fully understood in the United States. If you look at India, China, and Russia, they all have strong educational heritages…The big change today from what’s happened over the last 30 years is that it’s no longer just low-cost labor that you are looking at. It’s well educated labor that can effectively do any job that can be done in the United States” (Herbert, 2004). What about America’s educational heritage? Long before the student loan debt problem reared its ugly head, Americans had been led to believe that we have the best educational system in the world. Now we are supposed to believe that it is a good thing to have corporate logos plastered all over campuses, professors and administrators more concerned about the bottom line than about educating future citizens, and college sports programs (sponsored by corporations) raking in hundreds of millions while tuitions skyrocket; and all of this only to see good-paying jobs either shipped out of the country or filled by foreign workers from “strong educational heritages.”
Then there is the matter of our colleges and universities becoming increasingly vocational. This is not working out very well for Americans, either. The few manufacturing jobs left in this country are now often filled by workers who went to college, where in the past most of these positions were filled by less educated workers. Does a college education make them more productive? It appears not. Productivity levels in most of Europe are higher than in the U.S. (including in France); the number of hours worked in manufacturing in Europe has trended higher for several years; and workers in these countries are better paid (International comparisons of…2011). If vocationalizing higher education was supposed to improve the corporate bottom line, it does not appear to be working well for anyone (unless the company moved the jobs overseas), least of all for workers, some of whom are paying those high tuitions. We are paying higher tuitions to work longer hours at lower-paying jobs. And we often cannot afford to pay back the student loans.
“Where did this idea come from that everybody deserves free education? Free medical care? Free whatever? It comes from Moscow. From Russia. It comes straight out of the pit of hell” (Moyers, 2003).
Perhaps the mindset that Texas State Representative Debbie Riddle demonstrates in this statement is at the root of many of our society’s ills; it also brings us back to Edward Bernays and Walter Lippman and the indoctrination of the American public into a condition of apathy. What kind of a society do we want? Do we believe in democracy, or do we just pay lip service to the ideology? Do we want to educate our people to compete in a global economy? If our priority is to educate Americans toward full participation in democracy and global competition, nothing less than a complete shift in how we organize our society is in order. I am sure the people in Europe and all around the world, whose way of life comes “straight out of the pit of hell” yet who are better educated and more productive than we are, would agree that a good place to start would be to reduce or forgive student loan debt, lower college tuition substantially, and place more emphasis on a humanities-driven, well-rounded higher education. No less than the future of the country is at stake.
References
Bort, J. (2011, December 6). Despite high unemployment, U.S. companies are hiring from overseas at record pace. Business Insider. Retrieved from http://articles.businessinsider.com/2011-12-06/news/30481039_1_h-1b-h1b-visa-petitions-visa-program
Chomsky, N. (2004). On nature and language (pp. 182-183). Cambridge, UK: Cambridge University Press.
Giroux, H., & Searls Giroux, S. (2004). Take back higher education. New York: Palgrave Macmillan.
Herbert, B. (2004, January 26). Education is no protection. New York Times. Retrieved from http://www.nytimes.com/2004/01/26/opinion/education-is-no-protection.html
Median household income. (2009). Unpublished raw data, Stanford University, Palo Alto, California. Retrieved from http://www.stanford.edu/class/polisci120a/immigration/Median Household Income.pdf
Moyers, B. (2003, May 14). Now with Bill Moyers [Videotape recording]. Retrieved from http://www.pbs.org/now/transcript/transcript220_full.html
Pintor, R., Gratschew, M., & Sullivan, K. (2003, January). Voter turnout rates from a comparative perspective. Retrieved from http://www.idea.int/publications/vt/upload/Voter turnout.pdf
Raum, T. (2012, April 3). Explosion in student loan debt reaching crisis proportions, but largely flying under radar. Coast Reporter. Retrieved from http://www.coastreporter.net/article/GB/20120403/CP01/304039992/-1/sechelt/explosion-in-student-loan-debt-reaching-crisis-proportions-but&template=cpArt
Savio, M. (1964). Mario Savio on the operation of the machine [Web]. Retrieved from http://www.youtube.com/watch?v=PhFvZRT7Ds0
Sheng, J. (2010). [Web log message]. Retrieved from http://jimsheng.hubpages.com/hub/Comparison-of-cost-of-higher-education-around-the-world
U.S. Department of Education, National Center for Education Statistics. (2011). What are the trends in the cost of college education? (NCES 2011-015). Retrieved from Institute of Education Sciences website: http://nces.ed.gov/fastfacts/display.asp?id=76
U.S. Department of Labor, Division of International Labor Comparisons. (2011). International comparisons of manufacturing productivity and unit labor cost trends. Retrieved from Bureau of Labor Statistics website: http://www.bls.gov/web/prod4.supp.toc.htm

Monday, June 4, 2012

Rebellion or Acquiescence


The following scenario might have taken place in 1968 in the United States, or Paris, or Prague.
A few would-be revolutionaries meet at a sidewalk café, and over espresso light the spark of rebellion. Meetings are held; ideas are expressed, articulated, parried, and regurgitated; pamphlets are published; events take place, and the media takes notice. So does the clandestine belly of the state beast. The passionate minority take to the streets; violence and chaos ensue, and the status quo appears to be threatened. The media and government propaganda machine springs into action, vilifying the voices of change at every turn. Military units are called in to quell the disorder, and suppression finds retaliation. Those in the silent majority who might have felt pangs of sympathy toward the expressed grievances of the vocal minority remain silent. The media and the academy reflect on events, and perhaps a few textbooks are rewritten, but the history is largely whitewashed. Meeting places are surveilled and closed down. For a while, identity checks are stepped up. The world returns to normal. All that commotion, and what is the result? The status quo is retained. And the media and government pat themselves on the back as another rebellion is quashed.
For two hundred and thirty-five years, almost the entirety of our history, Americans have mostly stopped short of the kind of rebellion that brought about the creation of our republic. Of course, there are always a few individuals willing to be bloodied, jailed, or even to die for their cause. However, the relative modicum of comfort that most Americans enjoy, despite the economic uncertainty that envelops the majority of the working class and poor today, not only prevents them from seeking redress through revolutionary means themselves, but also makes them squeamish when they see others protest. These attitudes are often accompanied not only by complete apathy toward the political process and those who participate in it, but also by a palpable disdain for community organizing or involvement.
What makes so many Americans turn away from or rail against revolutionary ideology? Most would offer simple explanations, such as a “conservative mindset” that views protestors as ungrateful and misguided at best, dirty and traitorous at worst. It should not be difficult for a young protestor today, at least one who knows history, to understand why they are looked upon with condescension, if not outright reviled. They need only read about or see how those in most vocal opposition to the Vietnam War, or those who rallied for social justice in that era, were treated and how they are remembered. As we shall see, part of the problem is that most Americans do not know their history; this applies both to those who respond to protestors and movements with passivity and incredulity, and to the protestors themselves, who often do not understand the roots of the so-called “conservative backlash.” In light of the burgeoning “Occupy” movement taking hold throughout the world, it is instructive to examine these roots, and why the same forces that checked rebellion in the 60s, using the same methods, are out in force to try to eliminate the Occupy phenomenon and ensure that those who have always demonstrated indifference toward the political process remain on the sidelines.
The question of why so many Americans are apathetic about politics, or even about the issues that affect them and their neighbors on a local or personal level, is a concern (though an indirect one) of MIT professor of linguistics Noam Chomsky, who has reported extensively on the antiwar movements of both the Vietnam era and the more recent conflicts in the Middle East, as well as on the use of propagandistic language. Chomsky might say that we need look no further than the advertising and public relations industries, and their activities starting in the 1920s, for the origins of this passivity. They learned their lessons and received their marching orders from the state. In his research into the origins of twentieth-century propaganda campaigns, Chomsky found that the United States and Britain founded state propaganda agencies during World War I. The goal of Britain’s “Ministry of Information” was to “control the thought of the world” and particularly “American intellectuals, who could reasonably be expected to be instrumental in bringing the U.S. into the war” (Chomsky, 2002). U.S. President Woodrow Wilson formed the “Committee on Public Information,” which proved enormously successful in turning a “country of pacifists into hysterical jingoists and enthusiasts for war against the savage Huns” (Chomsky, 2002).
The success of these programs caught the attention of both Adolf Hitler and the American business community; one used it to win on the propaganda front leading up to and during World War II, the other utilized its power to “shape attitudes and beliefs” (Chomsky, 2002); both targeted the civilian populations of their respective countries with a propaganda onslaught unparalleled in world history at the time. A founder of the PR industry, Edward Bernays, commented in his industry manual Propaganda that “it was the astounding success of propaganda during the war that opened the eyes of the intelligent few in all departments of life to the possibilities of regimenting the public mind” (Chomsky, 2002). Soon government and industry enlisted the assistance of esteemed journalists such as Walter Lippman, also a member of Wilson’s propaganda office, to advance their agenda. Lippman called for nothing less than “the manufacture of consent,” aiding business and the state in unleashing a tidal wave of warlike propaganda designed to “put the public in its place” (Chomsky, 2002). In a disturbing precursor to contemporary political rhetoric, Lippman wrote that “the general public are ignorant and meddlesome outsiders” whose function in a democracy is “to be spectators, not participants” (Chomsky, 2002). Apart from the fact that this statement would not represent a dictionary definition of democracy, let alone any sensible person’s understanding of the concept, Lippman’s invective finds its contemporary companion in Dick Cheney’s dismissal of the American public’s growing opposition to the Iraq war; when told that two-thirds of Americans felt the war was not worth fighting, Cheney’s one-word response was: “So?” (Raddatz, 2008). Mr. Cheney clearly subscribes to the “regimentation of the public mind” playbook, meddlesome outsiders be damned!
That citizens in a supposedly functioning democracy might temporarily forget their place in the world and state an opinion is but a minor inconvenience to people like Cheney and Lippman.
Bernays and his allies in business and government set to work on their grand designs. In order to indoctrinate the public into the proper state of passivity, creating “artificial wants and imagined needs” (Chomsky, 2002) was required. This work was big business starting in the 1920s. Manuals of the time stated that industry should seek to create a “philosophy of futility” and “lack of purpose in life” (Chomsky, 2002). They hoped to do this by finding ways to “concentrate human attention on the more superficial things that comprise much of fashionable consumption” (Chomsky, 2002). These leaders sought no less than the brainwashing of the American public into mindless consumers. The delivery systems for this conditioning have taken many forms across the decades, from print media to radio, television to the internet, and their persistence and ubiquity have made both advertising and the means by which it is delivered, as much as the products they present to us, inseparable parts of the zeitgeist. Today this is arguably most evident in the exalted status of, and fascination with, commercials on Super Bowl Sunday. Talk around the water cooler is not necessarily about the products advertised during the big game (as many people do not remember the merchandise peddled), but about the slapstick humor employed in the ads. Perhaps Marshall McLuhan did not have this sort of phenomenon in mind when he famously stated, “the medium is the message” (McLuhan, 1964), as his “message” had more to do with the unintended consequences of new technology, while the argument put forth here is that the consequences have always been intended: passivity and disinterest toward anything but the most superficial aspects of life.
The advertising and public relations industries, with ample assistance from government, launched their campaign with a flourish in the late 1920s with the ascendancy of radio. As Mark Pendergrast points out in “Uncommon Grounds: The History of Coffee and How It Transformed Our World,” in 1929 “Americans spent $842 million on new radios, a 1,000 percent increase from seven years earlier” (Pendergrast, 1999). Almost every one of those radios was tuned to “The Amos and Andy Show,” which was initially sponsored by Pepsodent toothpaste. Because food and drug products were impervious to the vagaries of the Depression, soon products like Maxwell House coffee were the “prominent sponsors of shows featuring entertainment industry titans such as Bob Hope and Gloria Swanson” (Pendergrast, 1999). The business/government/advertising alliance had its hook; not only were products promoted by well-known show business names, but the most popular items had addictive or habit-forming properties (coffee, cigarettes, Coca-Cola), ensuring a steady stream of consumption regardless of the country’s economic realities.
During the Second World War, high-level US planners began devising ways to indoctrinate Americans into lives of conspicuous consumption, while simultaneously drawing up military plans and foreign policy doctrines addressing the need to protect the way of life that they envisioned. Young American men returning home from war overseas had the GI Bill waiting for them: a chance to educate themselves in order to move into the middle class in short order. Access to VA home loans also proved a boon to the post-war economy, as it hastened the move of millions of families to the newly created suburbs that saw their beginnings in places like Levittown, New York (almost entirely white, as blacks were mostly denied access to housing outside of urban centers). Government programs such as the Tennessee Valley Authority, with its rural electrification projects, and later the Interstate Highway System, a means of more effective transport of goods, provided good-paying jobs to returning vets and to graduates benefiting from the GI Bill. The burgeoning middle class now had the means to purchase all the products advertised to them through the new medium of television. Never in the history of mankind had a more effective means of indoctrination and propaganda been conceived, as we shall examine later. Television proved to be the major spur to post-war consumer activity. The 1950s saw the largest percentage increase in economic activity in world history, and American consumers had the purchasing power to realize not only their own dreams of upward mobility, but also the business/government/advertising alliance’s hopes for a placated, passive public, all the while boosting the bottom line of virtually every sector of the corporate community. These achievements in prosperity and material comfort, coupled with the invidious potential of television, would prove the underpinnings of the “conservative backlash” that met the social upheaval of the two decades that followed.
While many Americans were indulging in the fruits of a newfound prosperity unimaginable just a half-decade before, with what could easily be described as a patriotic fervor (consumption is good for the country and thus patriotic), the 1950s also saw the confluence of political and economic doctrines which, while appearing to work independently of one another, were in fact quite compatible in aiding the advance of what became known as the Military Industrial Complex and the Cold War, with the concomitant engendering of a collective public mindset built around an unquestioning, authoritarian brand of patriotism. According to Noam Chomsky, high-level government officials, during and immediately following World War II, “delineated a ‘grand area’ that the U.S. was to dominate, including the Far East and Middle East, with its all-important energy resources” (Chomsky, 2011). President Dwight Eisenhower recognized the Middle East as “the most strategically important area in the world” and “probably the richest economic prize in the world” (Chomsky, 2010). Throughout the latter part of the twentieth century, American military, strategic, and foreign policy has adhered to these goals. In fact, the “Clinton Doctrine” extended these strategies by declaring that the U.S. “is entitled to resort to unilateral use of military power to ensure uninhibited access to key markets, energy supplies, and strategic resources” (Chomsky, 2004). Unbeknownst to the vast majority of the American public then and now, U.S. military power was projected in the form of CIA-backed military coups in the Middle East (Iran, 1953) and Central and South America (Chile, 1973), these being just two of many examples; and, of course, in the intervention and eventual invasion of Vietnam, which was in keeping with the “domino theory” that communist (i.e., Soviet) influence in strategic markets would play out like an avalanche if left unchecked, a Cold War “card” that is played when convenient even today.
The American people may be blissfully unaware of this history of intervention (or of the largely market-based reasons for intervention or invasion), but the peoples of the countries affected by these policies harbor no such willful ignorance; they know their history and act (and vote) accordingly, a situation problematic for today’s American policy makers and their allies. Thus, it should come as no surprise that in poll after poll, whether in the Middle East or Latin America, the U.S. is regarded as a greater threat to peace than the likes of Iran or Cuba.
While these plans were carried out largely under the radar of the American public, a future Nobel Prize-winning economist, Milton Friedman, began training his students at the University of Chicago in his theory of “economic shock treatment.” In her startling account of a half century of economic “shock and awe,” “The Shock Doctrine: The Rise of Disaster Capitalism,” Naomi Klein documents how anyone who stood in the way of the United States and its allies’ exploitation of the resources (human and otherwise) of countries all over the globe found themselves disappeared, tortured, and killed in the name of the “free market.” A quote from Friedman encapsulates his doctrine: “Only a crisis – actual or perceived – produces real change” (Klein, 2007). Friedman’s disciples from the Chicago School advised military coups in Argentina and Chile, among others (Friedman himself met with General Pinochet when his military junta took power), and they continue to spread their pernicious ideology today, though not without considerable resistance, particularly in Latin America. Moreover, the rise of democratically elected leftist governments in that region of the world can be seen as a direct response to the decades of violence and repression which at their core are closely connected to Milton Friedman’s “pure market capitalism.” Unfortunately for the poor people in these corners of the globe (and the people who advocated and fought for them), Friedman’s policies could not take hold democratically, as only a shocking crisis (coups, natural disasters, etc.) could bring about the necessary submission.
After a crisis, when a state is at its most vulnerable, government planners, advised by Friedman disciples (often natives who had studied under Friedman in Chicago), swoop in and destroy anything resembling a social safety net, while privatizing everything in their path; these are extremely unpopular measures to the majorities who elected socialist leaders such as Salvador Allende, who had improved conditions for the poor in Chile in the three years before his death during the CIA-backed coup of 1973. One may ask: how is this connected to American apathy? Because for nearly fifty years the American public has been led to believe that our intentions in foreign policy and our use of force are benign, that we are fostering democracy in all corners of the globe, and that any resistance to this mission is communism or socialism, and therefore evil; the populace is unable to make the important distinction between Stalin’s communism and a social welfare state in 1973 Chile that bears a striking resemblance to much of Western Europe today. If you believe your country is morally right, you will support its policies and hold anyone who opposes them in severe contempt. It was relatively easy to foster these beliefs in the 1950s, the country still alive with the warm glow of appreciation for our righteous leaders and brave soldiers of World War II and Korea. However, the 1960s ushered in an era in which some Americans, particularly students, not only found fault with the official story, but were willing to challenge that story and the leaders who sang its praises.
“…Americans have more often made photography partisan. Pictures got taken not only to show what should be admired but to reveal what needs to be confronted, deplored – and fixed up.” –Susan Sontag
At the dawn of the 1960s, the presidential election featuring John F. Kennedy and Richard Nixon proved a harbinger of how the mass media would shape public opinion – for liberals, for conservatives, and even for those inclined not to choose sides. Kennedy’s youthful charm and good looks contrasted with Nixon’s sweaty, tired countenance during the much-discussed television debates of the 1960 election, which likely propelled Kennedy to the White House; few would dispute that JFK was the first “television president,” as he quickly became expert at using the medium to full effect. And though Kennedy’s assassination is often viewed as the event most responsible for the politicization of the generation coming of age at this time, one could easily argue that the self-immolation of the Buddhist monk Thich Quang Duc, which took place months before JFK’s murder, played an equally important role in this newfound political awareness. Photographs of Duc’s death, a protest against the Diem regime in South Vietnam, were widely disseminated and certainly a hot topic of conversation on campuses worldwide; in fact, this incident produced some of the first television images associated with the war in Vietnam. Within a year of these events, the Free Speech Movement began at the University of California at Berkeley, and an era of protest and political unrest commenced.
The ability of the media, particularly television, to shape political discourse is the thrust of Edward P. Morgan’s “What Really Happened to the 1960s.” The premise of Morgan’s text is worth quoting in full. He argues that:
 “the mass media of the sixties era helped to invite and spread that era’s protest activity, but they did so on terms reflecting broader structures of which they were part. As a result, they simultaneously helped to shape, marginalize, and ultimately contain protest movements. Along with the powerful ideological voices who enjoy significant, if not dominant, access to the media, they have been the major facilitators of our diversionary politics and warlike discourse ever since” (Morgan, 2010).
Morgan provides as evidence, among many other examples, the protests at the 1968 Democratic Convention. He notes that the protestors, in response to the obvious police brutality inflicted upon them, chanted “The whole world is watching.” Morgan says, “The protestors, confident that television coverage of police brutality would turn viewers against the police, found that in fact the majority of the American public responded by siding with the police” (Morgan, 2010). Jerry Rubin said, “Television creates myths bigger than reality” (Copeland, 2010). When portraying violence, the media in general and television in particular focus on the most sensational – the lone violent protestor, for example – which simultaneously satisfies viewers’ voyeuristic bloodlust and helps them rationalize their opposition to protest and revolutionary ideology, even when it may ultimately be in their interest to side with a populist stance. In this way, television concentrates the public’s attention on the superficial aspects of life (consumerism) while also creating an oppositional mindset toward dissent, through a framing of events that concurrently entertains and appalls. Much of the generation that grew up during the Depression and was the first to enjoy the spoils of the 1950s growth in middle-class prosperity found the 1960s protestors to be dirty, ungrateful hippies who simply needed to be silenced, get a job and wave the flag. The conservative backlash, or what Nixon referred to as “the silent majority,” came to oppose the war only because of the perceived waste of American human and financial capital in a primitive country it also viewed as ungrateful, with no regard for the toll of the war on the Vietnamese.
These notions are arguably best reinforced by images such as the August 1967 Time magazine cover photo showing an American soldier walking alongside an injured Vietnamese child with the caption “To Keep a Village Free,” or the horrific footage of the aftermath of the My Lai massacre. Essentially, we were there to give the Vietnamese the freedom to be consumers like us – which is extremely difficult to achieve when your country is napalmed and bombed to bits by its so-called liberators. The same ideology can be found four decades later in the current call to “spread democracy,” which of course is precisely what World War II planners envisioned, if one is willing to substitute the terms “free markets” and “capitalism” for democracy.
“I admit it – the liberal media were never that powerful, and the whole thing was often used as an excuse by conservatives for conservative failures.” – William Kristol, New Yorker, May 22, 1995
Which brings us to the present, with the War on Terror, the “made for television” Gulf War of 1991, and the invasion to “liberate” Iraq. Our contemporary political discourse contains all the elements I have introduced previously in a toxic boil of apathy, mindless consumerism, hatred toward “the other” and jingoistic isolationism. This sorry state of societal affairs made convincing the American public and its representatives in Congress that invading the sovereign country of Iraq was a good idea a relatively facile endeavor. To accomplish the task, the Bush administration employed its old friends in television, who in recent years had become frighteningly competent at propaganda, primarily due to the advent of 24-hour cable news and the consolidation of media in the hands of a few powerful moguls. As Jerry Mander points out in his long-forgotten but important work, “Four Arguments for the Elimination of Television,” the very nature of the technology, “with the light constantly flickering upon our retinas,” causes a state of hypnosis – not in the usual sense of a catatonic feeling, but much like a passive mental attitude; “since there is no way to stop the images, one merely gives over to them. Thinking only gets in the way” (Mander, 1977). With this sort of technology in the hands of people who eagerly employ political pundits willing to dissemble, distort and lie to advance their ideology, the war machine had virtually no opposition. Once the invasion was complete and the “Mission Accomplished” banners came down, the media went to work demonizing any domestic opposition to American policies.
This demonization was and still is directed primarily at the academic community and public intellectuals, providing the opportunity for champions of neo-conservatism to engage in a double-whammy of destruction of the commons: tearing down public education, which has always had a very tenuous hold on the foundations of our democracy, and silencing dissent – or at least destroying the reputations of those who dare question the status quo and uphold one of the linchpins of a free republic. Henry Giroux points to “a growing sentiment on the part of the American public that people who suggest that terrorism should be analyzed, in part, within the context of American foreign policy should not be allowed to teach in the public schools, work in government, and even make a speech at a college” (Giroux, 2003). If any “mission” has been accomplished, it is that which has kept the majority of the American people blissfully unaware of the atrocities committed in their name, for the sake of “protecting our way of life.” George W. Bush’s encouraging Americans to “go shopping” as a collective response to the terror attacks of 9/11 demonstrates perfectly how materialism and apathy have created a public that is not only passive but full of fear as well. Dick Cheney infamously said, “The American way of life is non-negotiable,” to which I would counter with a quote from Henry Giroux: “Is it utopian to believe that humans are capable of democratic conversation that builds toward a wider democratic future, or is it utopian to believe that the global system can continue to forge ahead on its current destructive path because its flaws will be effectively eclipsed by some as yet unforeseen technological fix?” (Giroux, 2003). Perhaps the American way of life is non-negotiable, but it is also unsustainable.
“Since memory is actually a very important factor in struggle…if one controls people’s memory, one controls their dynamism. And one also controls their experience, their knowledge of previous struggles.” –Michel Foucault
One of the definitions assigned to conservatism is “reluctance to change.” The conservative backlash that the media/government/business alliance formulated starting in the 1920s, strengthened and implemented in the 1960s, and still uses with great effect today contains, as one of its strongest pillars, fear of change. One of my favorite unattributed quotes is “When we seek permanence, that’s when our troubles begin.” We have become an apathetic, complacent, fearful society. The Occupy Movement has shown us many things, but chief among them are that the conservative backlash is still powerful (protestors are still characterized as dirty and lazy) and that Americans long for permanence without struggle, revolutionary or otherwise. It is much easier to be a consumer than a citizen, and we have essentially handed over the keys of citizenship, of participatory democracy, to our corporate masters. Our memory has been erased, if it was ever there to begin with. It will be fascinating to see whether the Occupy Movement can mobilize enough of our passive consumers to become citizens for the first time, before it’s too late.

References
Chomsky, N. (2002). On nature and language. Cambridge, UK: Cambridge University Press.
Chomsky, N. (2004). Understanding power. New York: The New Press.
Chomsky, N. (2010). Hopes and prospects. Chicago: Haymarket Books.
Foucault, M. (1996). Film and popular memory. In Foucault live (interviews, 1961–1984) (p. 127). New York: Semiotext(e). (French original 1974.)
Giroux, H. (2003). The abandoned generation: Democracy beyond the culture of fear. New York: Palgrave Macmillan.
Klein, N. (2007). The shock doctrine: The rise of disaster capitalism. New York: Metropolitan Books.
Mander, J. (1977). Four arguments for the elimination of television. New York: William Morrow.
Morgan, E. P. (2010). What really happened to the 1960s: How mass media culture failed American democracy. Lawrence, KS: University Press of Kansas.
Sontag, S. (1977). On photography. London: Picador Macmillan.

Sunday, April 8, 2012

Nature, nurture and survival

Upon arrival in my first Human Development class, I had no specific expectations beyond learning something new, as I had not given much thought to development. Although questions and issues of development surround us at work and in our personal lives, outside a formal learning environment we pay them scarcely any attention, consumed as we are by the hectic workaday world. It quickly became clear, however, that observing, reflecting on, and nurturing development are key to virtually every human activity. A small percentage of the population seems genetically predisposed to excel at fostering the development of their fellows; those rare teachers, managers and other leaders spring to mind when one reflects on one’s own development. The majority of us, however, must “develop” our ability to nurture.
Speaking of nurture: the age-old “nature vs. nurture” debate entered the classroom parlance almost immediately, and though we did not have to take sides, I leaned heavily toward the notion of nurture as the primary driving force in development. However, once one begins to delve into the research, one finds any number of theories that challenge one’s beliefs and inclinations – which is eminently useful in one’s development. My first revelatory moment came with the discovery of the theories of Jean-Jacques Rousseau.
“Everything is good as it comes from the hands of the Maker of the world but degenerates once it gets into the hands of man.” –Jean-Jacques Rousseau
Returning to a term used earlier, inclinations, reveals the thrust of Rousseau's work. He advanced the theory that children are best developed through a “naturalistic education,” in which a child would have “no other guide than his own reason by the time he is educated” (Gianoutsis, 2008), an education derived from “inclinations, not from habits” (Gianoutsis, 2008). Where I stand in accord with Rousseau is in the notion that inclinations, much like the biology of a plant, are what guide a child in early development; as children acquire reason, they become able to merge with adult society while maintaining the tools to “ignore society's ills” (Gianoutsis, 2008). Where I differ with Rousseau is in his insistence on isolating the child in nature. Of course, communing with nature is clearly desirable, and today's youth do not spend enough time in natural pursuits. However, what Rousseau proposed would be labeled “home schooling” today, and despite the benefits of pedagogy and curriculum designed specifically for the individual child, the pitfalls of social isolation are too great to dismiss. I shudder to think of a world where everyone, or even a majority of the population, is home-schooled. In the early years of the twenty-first century, we already observe the implications of the insulating nature of technology; adding a sequestered educational environment to these difficulties seems a recipe for further societal breakdown. On the whole, then, what we are discussing here is Rousseau's belief in the strengths of a child's natural inclinations set against the limitations of early development – the inability to reason. In a sense, Rousseau believed that only when a child could pair the experience of rewarding responses with the association of ideas would he or she be prepared to face the big, scary outside world. Within his theories I tend to focus primarily on the natural inclinations, which include play.
Rousseau states, “I felt before I thought, which is the common lot of man” (Gianoutsis, 2008). He posited that children learn best “through their senses, through investigating and exploring the natural world” (Gianoutsis, 2008). After reading Rousseau and the theorist I will discuss next, I came to realize that I was in fact more firmly in the “nature” camp in the nature/nurture debate.
It is my belief that Rousseau's naturalistic theory of development aligns with at least one aspect of contemporary pedagogical practice: the approach employed in the Finnish educational system. The Finns believe that children should not begin their formal education until the age of seven, because before that age they learn best through play. Though Rousseau believed that his subject, Emile, should begin traditional schooling at age twelve, when presumably he would have reached that cherished “age of reason,” I am certain he would be a strong proponent of the methods and biological timeline adhered to by the Finns, who are widely regarded as having one of the best educational systems in the world.
My second revelatory experience came while researching a paper on early childhood development, as I attempted to find contemporary developmental theorists situated at the natural, or genetic, end of the eternal debate. Though Noam Chomsky is hailed as the country's preeminent linguist, he is rarely viewed as an expert in child development. He is also seen by some as a philosopher, though others would argue, quite sincerely, that he is not a philosopher at all – that linguistics is an entirely different discipline, albeit a neighboring one. I am not going to argue that point; it is little more than a question of definition (though I hasten to add that Rousseau was viewed first as a philosopher). Nonetheless, I believe that to dismiss his masterful work as having no significant implications for child development is a grievous oversight.
The central focus of my argument for Chomsky as a vitally important developmental theorist comes in the form of his theory of a “Language Acquisition Device.” Chomsky first put his ideas forward in the late 1950s, as part of a critique of behavioral psychology. He argued that the way we actually acquire the use of language, its relationship to experience, and therefore its relationship to the world, is quite different from what traditional philosophy has always maintained. Behavioral psychologists have tended to characterize the human individual entering the world as an undifferentiated lump of malleable stuff, to be molded and shaped by its environment; through processes of stimulus and response, penalty and reward, the individual developed and learned, including the learning of language. I would dare say that the acquisition and use of language is the most important aspect of our development; this I believe to be self-evident.
Chomsky further argued that the accepted wisdom surrounding development, characterized as almost entirely environmental, could not possibly explain how virtually all human beings, regardless of their intelligence, do something as amazingly difficult as master the use of a language – even when they are not deliberately taught it, as most people surely are not – and how they do this at such an extraordinarily young age and in such an extraordinarily short period of time. He explains that for this to happen we must be genetically pre-programmed to do it, and that all human languages must have a basic structure that corresponds to this pre-programming. This is what he calls our “Language Acquisition Device.” Perhaps neuroscientist Terrence Deacon explains it best: “Infants are predisposed to learn human languages, acquiring within a few years an immensely complex rule system and a rich vocabulary at a time when they cannot even learn elementary arithmetic” (Chomsky, 2002). My personal discovery of Chomsky's ideas has proved a life-changer. Not only is his work in philosophy and linguistics vital to our understanding of human development, but his scholarly research and writing on backdoor political machinations in the Western world – all part of the public record but largely neglected by mainstream media and suppressed as inconvenient to the powers that be – and on the use of propaganda as an advertising and public relations tool add to our understanding of development in ways both enlightening and disturbing. I will address these issues later, as an entirely different form of human development (or, perhaps more accurately, as practices that impede development). Needless to say, Professor Chomsky has become the most influential developmental theorist of my experience.
After studying the work of Rousseau and Chomsky, I felt that I was firmly planted in the natural, biological camp of development. It would be easy to argue that both biology and environment contribute to our understanding of the world, how we behave, and how we learn. However, this is a cop-out; we can all agree that we learn naturally AND are nurtured, but we must lean one way or the other – or so I thought. Then, in our Mid-Child/Adolescent Development course, we read “A Tribe Apart” by Patricia Hersch. I was completely blown away. These were kids from middle-class families who were having sex, taking and dealing drugs, and committing crimes – the types of acts that I had always assumed were committed by children from poor or broken homes – and all came from families that claimed to be Christian. What we discover when reading a book like “A Tribe Apart” is that even children from Christian homes can have their growth severely stunted if their parents are neglectful. I thought about what it must be like to be brought into a hyper-competitive world like ours when generations of your family have struggled with poverty, addiction, mental illness, or just poor parenting. Imagine you are one of the millions of people in this country who are unable to find a job commensurate with your education and abilities, and all that is left for you is work that pays just above minimum wage, or at best $8–$9 an hour. In order to pay the bills, you must work two, or perhaps even three jobs. None of these part-time, and in some cases even full-time, jobs have health insurance. Because your hours vary with each job, you have no time to improve your situation by going back to college, or even to vocational or trade school. Even if you made all the right choices in cutting expenses, you still might not have any money left to enjoy even the least expensive leisure activities – and even if you did, you would not have the time.
There are no funds left after living expenses to save for retirement, and now your elected officials are talking about cutting what little retirement you have been paying into: Social Security. Owning a home is out of the question – and if you managed to buy one on what you make, you would be in danger of losing it if you lost even a little of your meager income. Even keeping up adequate transportation puts you in a precarious position month to month, though you are willing to sacrifice and use public transit, if it is available in your area, to get where you need to go. If you have other mouths to feed, well, you are hanging by a thread. And I am speaking of people who have some education beyond high school; millions more have no choice but menial labor at minimum wage. This is called “living for work.” Would you want to live like that? Are you a paycheck or a layoff away from being there? Millions of us are on the precipice.
Now imagine you are a child born to parents who must live for work and, worse yet, are indifferent in their parenting: the sort of parents who do not read to their children, who do not take the time to nurture and truly care for the needs of their kids. Or your parents were abusive – emotionally, physically, or both. Perhaps they are also unable to maintain their own relationship, and their struggles are magnified through conflict, abuse and the eventual break-up of the marriage. Add to this the economic difficulties that so many folks experience in this country and you have a recipe for disaster. These are the teens and young adults we see on the streets of Portland every day. These are not children born of privilege; they are neglected and abused.
When I decided I wanted to become a teacher, I had no illusions that it would be an easy vocation, or that I would be paid handsomely to do it. However, when I contemplate what people like the ones I have just described go through every day just to live, with little hope for the future in the face of the greed that surrounds their existence, I know not only for whom I am devoting my life's work, but also that I may be asked to serve in a capacity with greater responsibility to society at large. Regardless of what path I choose in my quest to serve others as humbly and faithfully as I can, Patricia Hersch's book has crystallized for me the importance of parental nurturing in the development of children: it is neither unfair nor an overstatement to say that parental nurturing can be the difference between a life of relative comfort and a life of struggle, between an abundance of health and an early death. But what of those who are left to teach and counsel the neglected and abused children in our society? How are we best to support them?
We as a nation are, to put it kindly, enigmatic when it comes to the education of other people's children. Hillary Clinton was reviled in many circles for suggesting that it might “take a village” to raise a child; for some, particularly her political opponents, this was akin to socialism. Yet year after year we expect our public school teachers essentially to “babysit” troubled children while giving them fewer and fewer resources, let alone support. Then, when punishment becomes necessary, these neglectful parents are up in arms about outsiders interfering with the raising of their child. Rick Weissbourd, a lecturer at the Harvard Graduate School of Education, writing in Educational Leadership, seems to resign himself and his fellow educators to the notion that “the public believes that schools are largely responsible for remedying the problems associated with the steady increase in delinquency, disrespect and greed among today's students” (Weissbourd, 2003). He goes on to say that “the moral development of students does not depend primarily on explicit character education efforts, but on the maturity and ethical capacities of the adults with whom they interact” (Weissbourd, 2003). So the question remains: does it take a village to raise a child? And if it does, how can we expect educators to bear the brunt of the moral education (and by extension, the educational development) of our children while we simultaneously slap their hands and tie them behind their backs through lack of funding and support? We know that classroom discipline is essential to developing the minds of students, yet we send teachers to the schools with the most severe discipline problems ill-prepared to keep order, ensuring that these schools and their students remain trapped in a vicious cycle of falling educational standards and the concomitant dearth of funding.
Because many other countries, with nowhere near the financial wherewithal of the United States, achieve educational success that surpasses ours by virtually every meaningful measure, one can only conclude that it is a matter of priorities – and that the development of our children is not a priority. The study of human development has solidified my belief that the American way of life will wither and die if we do not find the will to provide every child with a well-rounded education.
Previously I stated my intention to address impediments to human development; I believe these must be addressed if we are to continue to evolve as a species. In Noam Chomsky's book “On Nature and Language” I found a chapter discussing the “project of keeping the public uninformed, passive and obedient” (Chomsky, 2002) – another way to view it is the use of propaganda to train the public to become mindless consumers. It started in this country during the presidency of Woodrow Wilson, who established the Committee on Public Information, whose aim was to win support among a passive public for joining the First World War. As Chomsky discovered, the effort “had enormous success, including scandalous fabrications that were exposed long after they had done their work, and often persist even after exposure” (Chomsky, 2002). This form of propaganda caught the attention of Adolf Hitler, who employed similar practices leading up to and during the Second World War. However, it was the American business community's discovery of the potential for propaganda to “shape attitudes and beliefs” (Chomsky, 2002) that had grave implications for the course of the American century, and perhaps for the whole of humanity. Chomsky quotes one of the founders of the PR industry, Edward Bernays (who belonged to Wilson's propaganda agency): “It was the outstanding success of propaganda during the war that opened the eyes of the intelligent few in all departments of life to the possibilities of regimenting the public mind” (Chomsky, 2002). What the business community and political leaders feared more than anything was a true democracy. They viewed the general public as “meddlesome outsiders” who were to keep to the sidelines when it came to democracy – “spectators, not participants” (Chomsky, 2002).
Walter Lippmann, arguably the preeminent journalist of the twentieth century, was brought into the fold by industry and government to help in “the conscious and intelligent manipulation of the organized habits and opinions of the masses” (Chomsky, 2002). The task of the media, government, and the public relations and advertising industries, as manuals of the period explain, is to “impose a philosophy of futility” and “lack of purpose in life” (Chomsky, 2002). They further explained that they must find ways to “concentrate human attention on the more superficial things that comprise much of fashionable consumption” (Chomsky, 2002). In other words, for the better part of the last century, Americans have been brainwashed into being mindless consumers – first through the print media and radio, then through television. In a real sense, one could argue that our development as a species was short-circuited. Can we evolve beyond this?
Former President Jimmy Carter thought we should try. In 1979 Carter, his presidency in turmoil and his conscience heavy, spoke to the American people from the Oval Office, in a speech that many analysts, then and now, judged as political suicide. Ironically, the immediate positive reaction gave him a tremendous boost in the polls; however, it was a very short-lived moment in the sun. Carter spoke of “a fundamental threat to American democracy” and “the growing doubt about the meaning of our own lives” (Carter, 1979). He called this a “crisis of confidence.” Carter was not just speaking of energy consumption when he said “too many of us tend to worship self-indulgence, and consumption” (Carter, 1979). He added, “Human identity is no longer defined by what one does, but by what one owns” and “we have discovered that owning things, and consuming things, does not satisfy our longing for meaning” and “piling up material goods cannot fill the emptiness of lives which have no purpose” (Carter, 1979). More powerful words have rarely been spoken by an American politician. And I think it is no small irony that these words reflect precisely what our government officials, along with the PR and advertising industries, hoped to engender in us generations ago so that we would be nothing more than passive “spectators” in the democratic process. This process has continued unabated for decades.
The introduction of television, with its insidious emphasis on advertising, has profound implications for human development, and is a technology that aligns perfectly with what the powerful government and business interests hoped to achieve all those years ago to stifle participatory democracy. In his groundbreaking but now largely forgotten 1977 book, “Four Arguments for the Elimination of Television,” Jerry Mander posits that television “places in our minds images of realities that are outside our experience, causing changes in feeling and utter confusion as to what is real and what is not” (Mander, 1977). He goes on to make the case that technology does not always support human development; in fact, particularly with television, it may cause devolution of the species. Mander reminds us that “pre-technological peoples were surrounded by nature, and they developed an automatic intimacy with the natural world” (Mander, 1977). This connection with nature forms the core of our development as a species (and this correlates with Chomsky's theory of a Language Acquisition Device and Rousseau's naturalistic education). Twentieth and twenty-first century Americans were “the first in human history to live predominantly inside projections of our own minds” (Mander, 1977), mostly because of television. When we rely on technology to lend form to our experience, we short-circuit our development; thus, when we encounter nature, we must refer to technology to confirm that what we have experienced is real, a negative feedback loop that is perpetuated by still more technology. More recent studies have shown how images and voices from television enter directly into our unconscious minds, unfiltered.
Mander showed us not only that television is a form of sensory deprivation, which causes, among other things, hyperactivity, but also that the very nature of the technology, “with the light constantly flickering upon our retinas” (Mander, 1977), causes a state of hypnosis – not in the usual sense of a catatonic feeling, but much like a passive mental attitude; as Mander says, “Since there is no way to stop the images, one merely gives over to them. Thinking only gets in the way” (Mander, 1977).
Aside from the obvious implications for subliminal manipulation in advertising (such as creating “needs” that were not there before), the best example I can find of television's (and technology's) interruption of human development is in propaganda. When a political pundit tells us, despite all evidence to the contrary, that France's health care system is “a disaster” (and provides no facts to support this statement), the viewer takes that statement into their unconscious mind, unfiltered, and ultimately is much more likely to cast a ballot for a candidate who espouses the same beliefs. I challenge anyone to explain to me how that sort of manipulation, through a medium which has proven so effective in shaping the attitudes and opinions of the masses, does not stand in the way of human development and societal progress. “Thinking only gets in the way.”
One can easily make the case that technology, which employs imagery that far too often promotes oversexualization, with particularly serious implications for women, stands in opposition to human sexual development. It is alarming how devoid of understanding boys and even men are regarding female sexuality. Given that we are naturally sexual beings, one would think that we could have evolved beyond our patriarchal history and embraced the mysteries of the sexual female, for the benefit of both sexes. Though I believe that religious dogma plays an enormous part in our inability to evolve and develop as sexual beings, it is the objectifying images that we create, through all forms of media but made especially effective by the hypnotizing effects of television, which cause the most harm. This is where the quote from Rousseau regarding “everything degenerating once in the hands of man” comes into focus. In “The Second Sex,” Simone de Beauvoir probably put it best as to the lot of women in an oversexualized, male fantasy-dominated culture: “One is not born, but rather becomes, a woman. No biological, psychological, or economic destiny explains the figure that the human female assumes in society; it is the whole of civilization that creates this product that we call feminine” (de Beauvoir, 1948). In other words, just as we have created unreal environments that cause confusion when we are finally confronted with reality, we have also created images of ourselves as sexual creatures that do not match our natural inclinations. To say that our pre-technological ancestors would not recognize anything that we call humanity today, sexual or otherwise, would be an understatement. And de Beauvoir made these observations before the advent of television!
The controversial psychiatrist R.D. Laing once said that “the growing incidence of mental illness these days may be explained in part by the fact that the world we call real and which we ask people to live within and understand is itself open to question” (Horwitz, 2003). Personally, I believe that increases in mental illness can be attributed to the hyper-competitive nature of our society, which attaches status and meaning to financial success; when you are one of the losers in that competition (and invariably there are many, with more to come as the reality of economic stagnation plays out), you may find it a struggle to find a purpose for your existence. Allan Horwitz, commenting on misconceptions of mental illness in “Creating Mental Illness,” believes that “much of what we regard as mental illnesses today are simply cultural constructs” (Horwitz, 2003); this parallels Jerry Mander's theory that television has caused people to create realities in their minds that are actually another person's imaginings, real yet not real. Perhaps we are all correct, on some level. Regardless, facing reality seems to be a problem for many in our culture; the reasons are numerous and varied, but it would be difficult to argue that the instances and theories I have presented do not play a great part in this problem. The inability to face reality can be found in our numerous forms of escapism (television and movies, even sports) and in our mainstream media, which report distortions as fact. When determining what method of therapy aligns closest to my beliefs, as well as to what I have learned about human development and counseling theory, and what I believe will be of most benefit in today's mental health environment, I found myself gravitating to Gestalt therapy.
The films of John Cassavetes, my favorite film director, are, in my view, an excellent example of Gestalt filmmaking. Cassavetes' work was often mistaken for improvisation, so real were the characters and situations. Cassavetes viewed every frame in his films as a journey of discovery. Gestalt therapy's emphasis is “on what is being done, thought and felt at the moment rather than on what was, might be, could be, or should be” (Yontef & Simkin, 1993). The comparison of Cassavetes' work to Gestalt therapy is an instructive and worthwhile endeavor, which I will continue as a theme for my integrated therapy.
Gestalt therapy is a phenomenological-existential therapy introduced by Frederick and Laura Perls in the 1940s. It teaches therapists and patients “the phenomenological method of awareness, in which perceiving, feeling, and acting are distinguished from interpreting and reshuffling preexisting attitudes” (Yontef & Simkin, 1993). As with existentialism, Cassavetes felt that people are continuously remaking and discovering themselves. Possibly the best example of how Cassavetes' work parallels Gestalt therapy is this observation from biographer Ray Carney in “The Films of Cassavetes”: “His films can only teach us new understandings by forcibly denying us old ones” (Carney, 1994). This is precisely what the Perls were bringing to the world with their theories: understanding the difference between what is “residue from the past and what is actually being perceived and felt in the current situation” (Yontef & Simkin, 1993). Cassavetes never worked in the “what was” or “what should be”: characters in a Cassavetes film were never static; we never knew their motivations or character fully coming into a scene, as they were always evolving. Whether we want to face it or not, this is who we are as humans, perpetually reinventing ourselves, consciously or unconsciously. One of my favorite quotes, of unknown origin, is: “When we seek permanence, this is when our troubles begin.”
It is not at all inaccurate to describe Gestalt therapy as reality-based. This is why I appreciate making the connection to the notion of permanence; we may be able to find some sense of permanence in material things (such as a house or car), but there is nothing permanent about who we are. That is our reality. Gestalt therapy aims to separate us from our longing for permanence, unless of course we are talking about a permanent state of thoughtful self-knowledge. Only in this way can we come to grips with our ever-changing moods and work within that reality to become the people we want to be.
As Gary Yontef illustrates, “the Gestalt therapist works by engaging in dialogue rather than by manipulating the patient toward some therapeutic goal” (Yontef & Simkin, 1993). Unlike the films of Cassavetes, who wanted to shake people out of their vegetable torpor with real-life dialogue and situations, most conventional Hollywood movies are constantly out to manipulate our emotions and to present us with characters whose motivations are clear from start to finish, and who have goals that are set and achieved. The manipulation of our emotions often comes in the form of music, particularly now that technology allows someone to take two often wildly divergent forms of music and create a third. In contrast, the music in a Cassavetes film (if there is any) is purposely disorienting, which is how we encounter music in our daily lives and, more to the point, how we interact with others in the real world. We never know how an encounter with another human being is going to play out, until it happens. As Yontef stresses, “The Gestalt therapist allows contact to happen, rather than manipulating, making contact, and controlling the outcome” (Yontef & Simkin, 1993). Similarly, as Carney shows us, “Cassavetes' characters don't have identities, until they discover themselves and their possibilities in this process of exploring their differences, through dialogue interaction” (Carney, 1994).
Am I suggesting that viewing the films of Cassavetes could be therapeutic? Perhaps, and there is absolutely no doubt that his work makes for an uncomfortable experience for most; I have seen people actually squirm in their seats trying to comprehend all that they see in his films. However, this is precisely what Gestalt therapy purports to do for us; it forces us to extricate ourselves from our past and work within the here and now. But how are we to do that when most of us spend so much of our lives trapped within unreal environments constructed by others? It would seem that, first and foremost, all of us need to spend more time in the experiential world and less in the passive, one-dimensional world.
It is indeed a paradox; we possess natural inclinations which are nurtured, or, far too often, not nurtured, by our environment. In concluding this paper, I submit that because our natural inclinations for development can be nurtured only under the proper environmental circumstances, and yet we operate within environments constructed primarily from other people's imaginations and transmitted through a medium that distorts what is real, technology, which has been presented as an important part of our evolutionary process and subsequently our development, is instead an impediment to our development. While I would not necessarily go so far as to say that we should return to a full communion with nature and reject the whole of technology, I do believe that most of us could benefit from a dose of therapy that wakes us to the reality of who we are, helps us to find meaning and purpose beyond the trappings of consumption, and, for those of us fortunate enough to have abilities in the spheres of helping and leading people, moves us to show others that throughout our lives, at every age, it truly does take a village to develop a human being.
References
Carney, R. (1994). The films of John Cassavetes. Cambridge, UK: Cambridge University Press.
Carter, J. (Speaker). (1979). President Carter's address to the nation. [Web Video]. Retrieved from http://www.youtube.com/watch?v=KCOd-qWZB
Chomsky, N. (2002). On nature and language. Cambridge, UK: Cambridge University Press.
de Beauvoir, S. (1948). The second sex. Paris: Vintage.
Gianoutsis, J. (2008). Locke and Rousseau: early childhood education. The Pulse, 4(1), 1-19. Retrieved from http://www.baylor.edu/content/services/document.php?id=37670
Horwitz, A. (2003). Creating mental illness. Chicago: University of Chicago Press.
Mander, J. (1977). Four arguments for the elimination of television. New York: William Morrow.
Weissbourd, R. (2003, March 31). Moral teachers, moral students. Educational Leadership, 60(6).
Yontef, G., & Simkin, J. (1993). Awareness, dialogue and process: essays on gestalt therapy. Gouldsboro, ME: Gestalt Journal Press.