100% life from concentrate
The “conversation about race” that public figures periodically claim to desire, the one that is always either about to happen or is being prevented from happening, has been going on, at full volume, at least since the day in 1619 when the first African slaves arrived in Jamestown. It has proceeded through every known form of discourse — passionate speeches, awkward silences, angry rants, sheepish whispers, jokes, insults, stories and songs — and just as often through double-talk, indirection and not-so-secret codes.
What are we really talking about, though? The habit of referring to it as “race” reflects a tendency toward euphemism and abstraction. Race is a biologically dubious concept and a notoriously slippery social reality, a matter of group identity and personal feelings, mutual misunderstandings and the dialectic of giving and taking offense. If that is what we are talking about, then we are not talking about the historical facts that continue to weigh heavily on present circumstances, which is to say about slavery, segregation and white supremacy.
But of course we are still talking about all that, with what seems like renewed concentration and vigor. Nor, in a year that is the sesquicentennial of the Gettysburg Address and the semicentennial of the Rev. Dr. Martin Luther King’s “I Have a Dream” speech, are we simply looking back at bygone tragedies from the standpoint of a tranquil present. The two big racially themed movies of the year, “Lee Daniels’ The Butler” and Steve McQueen’s “12 Years a Slave,” are notable for the urgency and intensity with which they unpack stories of the past, as if delivering their news of brutal bondage and stubborn discrimination for the first time.
And one of the strange effects of this country’s anxious, confused, hopeful and delusional relationship to its history of racism is that such narratives often do feel like news, or like efforts to overcome willful amnesia. The astonishing experiences of Solomon Northup, Mr. McQueen’s protagonist, a free man from Saratoga, N.Y., who was kidnapped and sold into slavery in the Deep South in 1841, are not being presented to the American public for the first time. Northup’s memoir was an antebellum best seller, nearly as widely circulated in abolitionist circles as “Narrative of the Life of Frederick Douglass” and Harriet Beecher Stowe’s “Uncle Tom’s Cabin.” A screen adaptation, directed by Gordon Parks and starring Avery Brooks in the title role (played by Chiwetel Ejiofor in Mr. McQueen’s version), was broadcast on PBS in 1984.
Some of the film’s representations of cruelty — whippings, hangings, the sexual abuse of a young female slave named Patsey by her sadistic master — will also stir the memories of those Americans (like me) for whom “Roots” was a formative cultural experience. In 1977, when the mini-series, based on Alex Haley’s book, was first broadcast, it was heralded not only for its authenticity and comprehensiveness, but also for its newness. This was the first time such a story had been told in such breadth and detail, and with so much assembled talent. It continued, two years later, with “Roots: The Next Generations,” which was in some ways more groundbreaking for bringing attention to the often neglected decades of struggle and frustration that fell between the end of the Civil War and the birth of the modern civil rights movements.
Such stories, of course, do not stay told. The moral, economic and human realities of slavery — to keep the narrative there for a moment — have a way of getting buried and swept aside. For a long time this was because, at the movies as in the political and scholarly mainstream, slavery was something of a dead letter, an inconvenient detail in a narrative of national triumph, a sin that had been expiated in the blood of Northern and Southern whites.
D. W. Griffith’s “Birth of a Nation” may look now like a work of reactionary racism, but it is very much an artifact of the Progressive Era, embraced by President Woodrow Wilson and consistent with what were then understood to be liberal ideas about the destiny and character of the American republic. In Griffith’s film (adapted from “The Clansman,” a best-selling novel by Thomas Dixon), the great crime of slavery had been its divisive and corrupting effect on whites. After Reconstruction, the nation was re-founded on the twin pillars of abolition and white supremacy.
Which is also to say on the basis of terror and disenfranchisement. But that side of the story was pushed to the margins, as was the harshness of slavery itself, which was obscured by a fog of sentimentality about the heritage and culture of the Old South. This was the iconography of “Gone With the Wind,” and while the pageantry of that blockbuster seems dated (to say nothing of its sexual politics), the old times it evokes are not forgotten, as Paula Deen might tell you.
The appeal of “Roots” lay partly in its status as a long-delayed, always-marginalized counternarrative, an answer to the mythology and romance that had shrouded popular representations of the American past. Much doubt has been cast on the accuracy of Haley’s book (which was marketed as a novel based on family sources), but the corrective power of the mini-series lay in its ability to reimagine the generation-by-generation sweep of American history from a perspective that had not before been synthesized on screens or public airwaves.
In retrospect, “Roots” — which arrived on television a dozen years after the legislative high-water mark of the civil rights movement, in the more immediate wake of Richard Nixon’s Southern strategy and the first “Rocky” movie — may have succeeded so widely with white and black audiences because it simultaneously opened and closed the book on America’s racial history. There were a lot more American families like the one in the mini-series, each with its own distinctive saga of captivity, freedom, migration and resilience.
Those intimate stories had never been shared in such a wide and public fashion, and the reception of “Roots” had a catalyzing effect on the imagination of many black writers. Until then, slavery had been something of a taboo in African-American literature, whose thematic center of gravity was in the urban North and whose theme was the black experience of modernity. But “Roots,” clumsy and corny as parts of it look now, helped beget radical and ambitious novels like Toni Morrison’s “Beloved,” Charles Johnson’s “Middle Passage” and Edward P. Jones’s “The Known World.” They, along with artists like Kara Walker, took slavery as an imaginative challenge and an artistic opportunity.
But there was also a sense, after “Roots” and after “Beloved” claimed its place in the canon, that it had all been said. The white audience, moved by duty, curiosity and sincere empathy, could now move on. The horrors of the past, especially when encountered on television, cast a soothing and forgiving light on the present, where some of us could be comforted, absolved, affirmed in our virtue through the simple fact of watching.
But after such forgiveness, what knowledge? Post-“Roots,” a Hollywood consensus took shape that replaced the old magnolia-scented mythology with a new one, almost as focused on the moral condition of white people, but with a different political inflection. The existence of racism is acknowledged, and its poisonous effects are noted. But it is also localized, in time and geography, in such a way as to avoid implicating the present-day white audience. The racists are clearly marked as villains — uncouth, ugly, ignorant in ways that no one watching would identify with — and they are opposed by a coalition of brave whites and noble, stoical blacks. At the end, the coach and his players, the preacher and his flock, the maid and her enlightened employer shame the bigots and vindicate the audience.
There are variations on this theme, of course, but it is remarkably durable. It links, for example, “The Help,” Tate Taylor’s mild and decorous look at master-servant relations in Mississippi in the early 1960s (based on Kathryn Stockett’s novel), with “Django Unchained,” Quentin Tarantino’s violent and profane (if no less fantastical) examination of the same subject in the same state a little more than a century before. In both cases, a white character (Emma Stone’s writer; Christoph Waltz’s itinerant dentist) helps a black protégé acquire the ability to humiliate the oppressors. The weapon might be a book, a pie or a hail of gunfire, but the effect is the same. Justice is served and everyone cheers.
Some of us, perhaps including the white directors, are cheering for ourselves. Look how bad it used to be. Thank goodness — our own goodness — that it isn’t anymore. And of course it is never just the way it used to be. The abolition of slavery and the dismantling of Jim Crow really happened, against considerable odds and thanks to blacks and whites who took risks that later generations can only regard with awe and patriotic pride. The challenge is how to complete a particular story and leave the audience with the understanding that the narrative is not finished, that the past, to modify everyone’s favorite Faulkner quote, is not quite past.
Recent work by academic historians has emphasized the extent to which the exploitation and oppression of African-Americans — the denial of their freedom as workers and their rights as citizens — is embedded in the national DNA. Walter Johnson’s “River of Dark Dreams” shows how the Cotton Kingdom of the 19th-century Deep South, far from being a backward outpost of feudalism, was a dynamic engine of capitalist expansion built on enslaved labor. “Fear Itself,” Ira Katznelson’s revisionist study of the New Deal, shows how the great edifice of American social democracy, passed with the support of Southern Democrats, rested on and upheld the color line. And while neither white supremacy nor slavery has legal standing or legitimate defenders in America today, it would be hard to argue that their legacy has been expunged, or to confine their scope to the benighted actions of a few individuals.
Racism is part of the deep structure of American life, which is to say a persistently uncomfortable and also a persistently interesting subject, a spur to artistic creation as well as historical research. “The Butler” and “12 Years a Slave” may not be telling entirely new stories, but they are trying to tell them in new ways. Mr. McQueen infuses what looks like a conventional costume drama with the unflinching rigor that has characterized his previous films, “Hunger” and “Shame.” Mr. Daniels, the director of “Precious” and “The Paperboy,” blends melodrama, naturalism and brazen theatricality into a pageant that knowingly flirts with self-parody even as it packs a devastating emotional punch.
Some of that impact comes at the end of “Lee Daniels’ The Butler,” which pointedly asks the audience to consider what has and has not changed. It is not much of a spoiler to say that Barack Obama is elected president, an event that is especially sweet and piquant for the title character, a black man who worked in the White House through the administrations of every president from Eisenhower to Reagan.
And it is certainly not a spoiler to note that in real life, Mr. Obama’s election and re-election have not ushered in an era of colorblind consensus. On the contrary, the fervor of the opposition to the president, and its concentration in the states of the former Confederacy, have at least something to do with the color of his skin. But Mr. Obama’s victories, and the resistance to them, have opened a new and complicated chapter in a continuing story, which means also a new interest in how the past looks from this particular present.
This chapter is being portrayed on screen by filmmakers like Mr. Daniels and Mr. McQueen, and written in journalism, fiction and memoirs by a rising generation of African-American writers that includes Ta-Nehisi Coates, Kiese Laymon and Jesmyn Ward. Ms. Ward’s new book, “Men We Reaped,” is a Southern coming-of-age story that evokes a long tradition of black autobiography going back to slave narratives and (in its title) the words of Harriet Tubman and an old chain-gang work song. Ms. Ward’s stories of black men in tragic circumstances seem both ancient and contemporary, echoing back to the lives of those less lucky than Solomon Northup and connecting with the fates of Trayvon Martin and Oscar Grant III, whose 2009 killing by transit police in Oakland is the subject of the recent film “Fruitvale Station.”
What links these episodes is the troubling reality that now — even now, we might say, with a black president and a culture more accepting of its own diversity than ever before — the full citizenship, which is to say the full acknowledged humanity, of African-Americans remains in question. The only way to answer that question is to keep talking, and to listen harder.
In the popular imagination and in conventional discourse — especially in the context of highly charged news events such as the shooting of Trayvon Martin — prejudice is all about hatred and animosity.
There is little doubt that hate-filled racism is real, but a growing body of social science research suggests that racial disparities and other biased outcomes in the criminal justice system, in medicine and in professional settings can be explained, at least in part, by unconscious attitudes and stereotypes.
Subtle biases have been linked, for example, to police cadets being more likely to shoot unarmed black men than unarmed white men. (Some academics have also linked the research into unconscious bias to the Trayvon Martin case.)
Calvin Lai and Brian Nosek at the University of Virginia recently challenged scientists to come up with ways to ameliorate such biases. The idea, said Harvard University psychologist Mahzarin Banaji, one of the researchers, was to evaluate whether there were rapid-fire ways to disable stereotypes. Groups of scientists “raced” one another to see if their favorite techniques worked. All the scientists focused on reducing unconscious racial bias against blacks.
“Within five minutes, you have to do something to somebody’s mind so that at the end of those five minutes you will now show a lower association of black with bad. And so this was run really like a competition to see which ones of them might work to reduce race bias and which ones don’t,” Banaji said.
The results were as surprising for what they didn’t find as for what they did. Teaching people about the injustice of discrimination or asking them to be empathetic toward others was ineffective. What worked, at least temporarily, Banaji said, was providing volunteers with “counterstereotypical” messages.
“People were shown images or words or phrases that in some way bucked the trend of what we end up seeing in our culture,” she said. “So if black and bad have been repeatedly associated in our society, then in this intervention, the opposite association was made.”
Banaji, who has been a pioneer in studying unconscious biases, said she has taken such results to heart and tried to find ways to expose herself to counterstereotypical messages, as a way to limit her own unconscious biases.
One image in particular, she said, has had an especially powerful effect: “My favorite example is a picture of a woman who is clearly a construction worker wearing a hard hat, but she is breast-feeding her baby at lunchtime, and that image pulls my expectations in so many different directions that it was my feeling that seeing something like that would also allow me in other contexts to perhaps have an open mind about new ideas that might come from people who are not traditionally the ones I hear from.”
interesting article by larry alex taunton that breaks down some of the walls between the religious & non-religious while also highlighting how much one can gain by being tolerant of beliefs outside of his/her own. give it a read and share your thoughts on the piece in the comments. via the atlantic:
“Church became all about ceremony, handholding, and kumbaya,” Phil said with a look of disgust. “I missed my old youth pastor. He actually knew the Bible.”
I have known a lot of atheists. The late Christopher Hitchens was a friend with whom I debated, road tripped, and even had a lengthy private Bible study. I have moderated Richard Dawkins and, on occasion, clashed with him. And I have listened for hours to the (often unsettling) arguments of Peter Singer and a whole host of others like him. These men are some of the public faces of the so-called “New Atheism,” and when Christians think about the subject — if they think about it at all — it is this sort of atheist who comes to mind: men whose unbelief is, as Dawkins once proudly put it, “militant.” But Phil, the atheist college student who had come to my office to share his story, was of an altogether different sort.
Phil was in my office as part of a project that began last year. Over the course of my career, I have met many students like Phil. It has been my privilege to address college students all over the world, usually as one defending the Christian worldview. These events typically attract large numbers of atheists. I like that. I find talking to people who disagree with me much more stimulating than those gatherings that feel a bit too much like a political party convention, and the exchanges with these students are mostly thoughtful and respectful. At some point, I like to ask them a sincere question:
What led you to become an atheist?
Given that the New Atheism fashions itself as a movement that is ruthlessly scientific, it should come as no surprise that those answering my question usually attribute the decision to the purely rational and objective: one invokes his understanding of science; another says it was her exploration of the claims of this or that religion; and still others will say that religious beliefs are illogical, and so on. To hear them tell it, the choice was made from a philosophically neutral position that was void of emotion.
Christianity, when it is taken seriously, compels its adherents to engage the world, not retreat from it. There are a multitude of reasons for this mandate, ranging from care for the poor, orphaned, and widowed to offering hope to the hopeless. This means that Christians must be willing to listen to other perspectives while testing their own beliefs against them — above all, as the apostle Peter tells us, “with gentleness and respect.” The non-profit I direct, Fixed Point Foundation, endeavors to bridge the gaps between various factions (both religious and irreligious) as gently and respectfully as possible. Atheists particularly fascinate me. Perhaps it’s because I consider their philosophy — if the absence of belief may be called a philosophy — historically naive and potentially dangerous. Or maybe it’s because they, like any good Christian, take the Big Questions seriously. But it was how they processed those questions that intrigued me.
To gain some insight, we launched a nationwide campaign to interview college students who are members of Secular Student Alliances (SSA) or Freethought Societies (FS). These college groups are the atheist equivalents to Campus Crusade: They meet regularly for fellowship, encourage one another in their (un)belief, and even proselytize. They are people who are not merely irreligious; they are actively, determinedly irreligious.
Using the Fixed Point Foundation website, email, my Twitter, and my Facebook page, we contacted the leaders of these groups and asked if they and their fellow members would participate in our study. To our surprise, we received a flood of inquiries. Students ranging from Stanford University to the University of Alabama-Birmingham, from Northwestern to Portland State volunteered to talk to us. The rules were simple: Tell us your journey to unbelief. It was not our purpose to dispute their stories or to debate the merits of their views. Not then, anyway. We just wanted to listen to what they had to say. And what they had to say startled us.
This brings me back to Phil.
A smart, likable young man, he sat down nervously as my staff put a plate of food before him. Like others after him, he suspected a trap. Was he being punk’d? Talking to us required courage of all of these students, Phil most of all since he was the first to do so. Once he realized, however, that we truly meant him no harm, he started talking — and for three hours we listened.
Now the president of his campus’s SSA, Phil was once the president of his Methodist church’s youth group. He loved his church (“they weren’t just going through the motions”), his pastor (“a rock star trapped in a pastor’s body”), and, most of all, his youth leader, Jim (“a passionate man”). Jim’s Bible studies were particularly meaningful to him. He admired the fact that Jim didn’t dodge the tough chapters or the tough questions: “He didn’t always have satisfying answers or answers at all, but he didn’t run away from the questions either. The way he taught the Bible made me feel smart.”
Listening to his story I had to remind myself that Phil was an atheist, not a seminary student recalling those who had inspired him to enter the pastorate. As the narrative developed, however, it became clear where things came apart for Phil. During his junior year of high school, the church, in an effort to attract more young people, wanted Jim to teach less and play more. Difference of opinion over this new strategy led to Jim’s dismissal. He was replaced by Savannah, an attractive twenty-something who, according to Phil, “didn’t know a thing about the Bible.” The church got what it wanted: the youth group grew. But it lost Phil.
An hour deeper into our conversation I asked, “When did you begin to think of yourself as an atheist?”
He thought for a moment. “I would say by the end of my junior year.”
I checked my notes. “Wasn’t that about the time that your church fired Jim?”
He seemed surprised by the connection. “Yeah, I guess it was.”
Phil’s story, while unique in its parts, was on the whole typical of the stories we would hear from students across the country. Slowly, a composite sketch of American college-aged atheists began to emerge and it would challenge all that we thought we knew about this demographic. Here is what we learned:
They had attended church
Most of our participants had not chosen their worldview from ideologically neutral positions at all, but in reaction to Christianity. Not Islam. Not Buddhism. Christianity.
The mission and message of their churches was vague
These students heard plenty of messages encouraging “social justice,” community involvement, and “being good,” but they seldom saw the relationship between that message, Jesus Christ, and the Bible. Listen to Stephanie, a student at Northwestern: “The connection between Jesus and a person’s life was not clear.” This is an incisive critique. She seems to have intuitively understood that the church does not exist simply to address social ills, but to proclaim the teachings of its founder, Jesus Christ, and their relevance to the world. Since Stephanie did not see that connection, she saw little incentive to stay. We would hear this again.
They felt their churches offered superficial answers to life’s difficult questions
When our participants were asked what they found unconvincing about the Christian faith, they spoke of evolution vs. creation, sexuality, the reliability of the biblical text, Jesus as the only way, etc. Some had gone to church hoping to find answers to these questions. Others hoped to find answers to questions of personal significance, purpose, and ethics. Serious-minded, they often concluded that church services were largely shallow, harmless, and ultimately irrelevant. As Ben, an engineering major at the University of Texas, so bluntly put it: “I really started to get bored with church.”
They expressed their respect for those ministers who took the Bible seriously
Following our 2010 debate in Billings, Montana, I asked Christopher Hitchens why he didn’t try to savage me on stage the way he had so many others. His reply was immediate and emphatic: “Because you believe it.” Without fail, our former church-attending students expressed similar feelings for those Christians who unashamedly embraced biblical teaching. Michael, a political science major at Dartmouth, told us that he is drawn to Christians like that, adding: “I really can’t consider a Christian a good, moral person if he isn’t trying to convert me.” As surprising as it may seem, this sentiment is not as unusual as you might think. It finds resonance in the well-publicized comments of Penn Jillette, the atheist illusionist and comedian: “I don’t respect people who don’t proselytize. I don’t respect that at all. If you believe that there’s a heaven and hell and people could be going to hell or not getting eternal life or whatever, and you think that it’s not really worth telling them this because it would make it socially awkward…. How much do you have to hate somebody to believe that everlasting life is possible and not tell them that?” Comments like these should cause every Christian to examine his conscience to see if he truly believes that Jesus is, as he claimed, “the way, the truth, and the life.”
Ages 14-17 were decisive
One participant told us that she considered herself to be an atheist by the age of eight while another said that it was during his sophomore year of college that he de-converted, but these were the outliers. For most, the high school years were the time when they embraced unbelief.
The decision to embrace unbelief was often an emotional one
With few exceptions, students would begin by telling us that they had become atheists for exclusively rational reasons. But as we listened it became clear that, for most, this was a deeply emotional transition as well. This phenomenon was most powerfully exhibited in Meredith. She explained in detail how her study of anthropology had led her to atheism. When the conversation turned to her family, however, she spoke of an emotionally abusive father:
“It was when he died that I became an atheist,” she said.
I could see no obvious connection between her father’s death and her unbelief. Was it because she loved her abusive father — abused children often do love their parents — and she was angry with God for his death? “No,” Meredith explained. “I was terrified by the thought that he could still be alive somewhere.”
Rebecca, now a student at Clark University in Worcester, Mass., bore similar childhood scars. When the state intervened and removed her from her home (her mother had attempted suicide), Rebecca prayed that God would let her return to her family. “He didn’t answer,” she said. “So I figured he must not be real.” After a moment’s reflection, she appended her remarks: “Either that, or maybe he is [real] and he’s just trying to teach me something.”
The internet factored heavily into their conversion to atheism
When our participants were asked to cite key influences in their conversion to atheism — people, books, seminars, etc. — we expected to hear frequent references to the names of the “New Atheists.” We did not. Not once. Instead, we heard vague references to videos they had watched on YouTube or website forums.
Religion is a sensitive topic, and a study like this is bound to draw critics. To begin with, there is, of course, another side to this story. Some Christians will object that our study was tilted against churches because they were given no chance to defend themselves. They might justifiably ask to what extent these students really engaged with their Bibles, their churches, and the Christians around them. But that is beside the point. If churches are to reach this growing element of American collegiate life, they must first understand who these people are, and that means listening to them.
Perhaps the most surprising aspect of this whole study was the lasting impression many of these discussions made upon us.
These students were, above all else, idealists who longed for authenticity. Having failed to find it in their churches, they settled for a non-belief that, while less grand in its promises, felt more genuine and attainable. I again quote Michael: “Christianity is something that if you really believed it, it would change your life and you would want to change [the lives] of others. I haven’t seen too much of that.”
Sincerity does not trump truth. After all, one can be sincerely wrong. But sincerity is indispensable to any truth we wish others to believe. There is something winsome, even irresistible, about a life lived with conviction. I am reminded of the Scottish philosopher and skeptic, David Hume, who was recognized among a crowd of those listening to the preaching of George Whitefield, the famed evangelist of the First Great Awakening:
“I thought you didn’t believe in the Gospel,” someone said.
“I do not,” Hume replied. Then, with a nod toward Whitefield, he added, “But he does.”
country music star brad paisley and rap icon james smith aka ll cool j created a buzz with their recent collabo “accidental racist” (listen to it below & check out the lyrics here):
the thing is most of the buzz that i’ve heard hasn’t been positive. some people view the song as a source of unintentional comedy; for others, it’s a source of shame and anger. while i think the two artists had good intentions in making the song, here are 4 lyrics from the track that symbolize where they went wrong:
“lookin’ like i got a lot to learn but from my point of view/i’m just a white man comin’ to you from the southland” (brad paisley): paisley starts out fine in acknowledging that he has a lot to learn (we all do on some issues). his reasoning, though, for what he doesn’t know is problematic. the “i’m just a white man” line echoes the “i’m just a simple man” trope found throughout country music. however, paisley’s use of it here to explain his “accidentally racist” behavior doesn’t fly. it’s 2013, when many people, especially those with paisley’s financial means, have a wealth of knowledge literally at their fingertips if they want it. whatever his intentions were for wearing a dixie flag t-shirt in the song, even a so-called simple, white man from the south (and paisley’s west virginia barely qualifies) should be fully aware that the racist history associated with the flag (a history that even paisley acknowledges in the song that we can’t just rewrite) will make the shirt offensive to many people. choosing not to avail yourself of such knowledge or even worse, knowing better and not acting on it, isn’t an accident. it’s a failure (in this case, potentially a harmful one).
“i try to put myself in your shoes and that’s a good place to begin/but it ain’t like i can walk a mile in someone else’s skin” (paisley): again, paisley starts off right by trying to put himself in another person’s shoes. again, however, he downplays his capacity for change in the matter with the rest of the line. of course, you can’t literally walk in another person’s skin (unless you’re eddie murphy). still, pointing that out here makes it seem like the understanding he’s seeking is beyond his scope when in reality, it’s not. a good part about empathy is that it can allow us to feel/understand another person’s struggles without fully experiencing them for ourselves (if we open ourselves up to do so).
“dear mr. white man” (ll cool j): the start of uncle l’s verse might seem innocuous to some, but for me, it hearkens back to the forced one-way flow of respect given by blacks to whites dating back to american slavery times. sadly, this reading of the lyric actually fits the humble, subservient tone mr. smith takes with the rest of his verse (it’s ironic that he later equates himself to quentin tarantino’s django when throughout the song he sounds more like stephen, sam jackson’s character in the movie). smith’s flaccid approach is surprising coming from the same man who did this. worse though, he seems to place himself on unequal footing when in this particular dialogue, a widespread establishment of equal standing is needed to address our nation’s racial problems.
“if you don’t judge my gold chains/i’ll forget the iron chains” (ll cool j): here, smith tries to make a deal with his white counterparts: if they stop judging him based on stereotypes, he’ll forget their ancestral sin of slavery. while the rhyme might make it seem like an equal exchange, it’s far from it. the most troubling part is the premature concession to just forget the iron chains and the many other atrocities associated with america’s enslavement of black people. the purpose of remembering slavery isn’t to hold a grudge. history, regardless of the pain/embarrassment it may stir up, provides us with proper context for how we became the individuals/society we are today. such context is important when discussing race and the socio-economic issues linked to it (without it, we might mistake the progress of things like affirmative action for a final solution to an ultimately more complex problem).
in the end, the biggest mistake made by the “accidental racist” singer & the “accidental uncle tom” rapper isn’t as much about racism as it is about ignorance. paisley tries to use an “aww, shucks” attitude to dodge the responsibility demanded of him by his access to knowledge, but it doesn’t work. for ll, his overly accommodating approach + premature “let bygones be bygones” attitude undermines the time/effort needed to truly heal 200+ years’ worth of wounds.
there’s a great quote from malcolm x that comes to mind: “don’t be in such a hurry to condemn a person because he doesn’t do what you do, or think as you think or as fast. there was a time when you didn’t know what you know today.” i don’t think anyone should simply condemn paisley or smith for their missteps (especially when they might be attempting something that other mainstream artists don’t on the regular). nevertheless, i do hope that the different reactions to the track will help them know better and do better in the future.
I was fortunate to grow up in the Bronx of New York City, and more fortunate to go to boarding school and college (and graduate school…and law school) afterwards. Being able to do so has had its countless blessings and opportunities, but it has also exposed me to a pet peeve: the use of the word “ghetto.”
I often hear the term from Blacks who grew up in more fortunate areas as they light-heartedly chastise each other for inappropriate behavior, interests or possessions. Although it can be applied to a plethora of things, I hear it most in reference to dress, dialect, loud voices, arguing/fighting and conspicuous consumption. Is there any truth to these stereotypes? Sure. You’d be hard pressed to find a stereotype that didn’t come from some truth, however minute. But the truth has nothing to do with its use.
While E. Franklin Frazier may be able to explain it better, I suspect that by pegging what they consider deplorable actions as ghetto, the Black bourgeoisie attempt to separate themselves from the stigmatized Black identity that has permeated American media since The Birth of a Nation. In fact, Blacks of all backgrounds (Caribbean, African, American) have tried to distinguish themselves from the stereotypes. But instead of challenging the stereotypes, the Black community has accepted them and convinced itself that they simply don’t apply to its own members.
So, let’s back up.
What’s a ghetto? Well, unlike its common connotation, the ghetto is simply a section of a city predominantly occupied by a group of people ostracized for social (think racism, religious intolerance, etc.) or economic reasons. Consider checking out Theater of Acculturation. The ghetto has been around for centuries and is not reserved for Blacks. Ghetto is not loud. Ghetto is not poor. Ghetto is not uncivilized. Sure, there may be loud, poor, and less civilized people in the ghetto. But are there not poor people throughout America and the world? Do college kids not get rowdy?
What about the Black ghetto? While the Black ghetto of America has its vices like any other place, it has also been responsible for culture-changing music, dance, food, fashion and people. When you say that’s ghetto, do you mean Justice Sonia Sotomayor? General Colin Powell? Do you mean me?
Having grown up in the South Bronx of New York, I have been quiet, respectful, caring, hard-working, ambitious and eloquent (I also don’t drink or smoke…ever). I’m ghetto, but not in the way the word is falsely used.
Yesterday, I heard two Black girls from well-to-do backgrounds loudly joke about a party (one they planned on going to) being ghetto, and I thought to myself as I studied for my exam, “maybe it’s just bourgeoisie…”
No, I would never try to attribute a set of behaviors to a particular demographic or geographic region. I just charge that you, despite the contemporary minstrelsy disguised as music and other media influences, resist the urge to do so.
The greatest rapper ever, and one of the best American musicians ever, is Jay-Z (more No. 1 albums than any solo artist, with no No. 1 songs).
After him, it’s Dylan, Dylan, Dylan and Dylan.
Much respect to Tupac, Nas, Eminem, Joe Budden, Busta Rhymes, TI, Kanye, LL Cool J, Scarface, Ghostface and all the other great rappers of my time. I don’t want to imply that the minstrelsy line applies to them.
Dwight Draughon is a ghetto boy with a BA from Princeton, MA from American and anticipated JD from Howard. He’s a dude that lives his life with a bunch of rules headlined by one: “I will NOT lose!”
1. I won’t touch civility so as not to offend anyone, but you get the point.
sam richards, a senior lecturer of sociology at penn state university, believes that empathy is central to his field of expertise. in his ted talk, he puts this relationship into action by encouraging his audience to stand in an iraqi insurgent’s figurative shoes.
related: dr. richards heads up the world in conversation project, which “explores the relationships between people of different cultural and ancestral groups through small group dialogues.”