ASMSA Expanding Global Programs

At the beginning of September, I was fortunate enough to go with a group from my school to Japan. The ASMSA (Arkansas School for Mathematics, Sciences and the Arts) group went to Tokyo, then up to Hanamaki (Iwate Prefecture) to sign a sister city agreement with Hanamaki Kita High School, and then down to Osaka to sign a global learning partner agreement with Tennoji High School. 

If you are interested in learning more about the trip, my school, and our global programs, please check out this article. It even has a few choice quotations from me! Feel free to ask any questions you may have in the comments section or email me. A few photos are below.

Me in front of the Kamakura Great Buddha (大仏). It is about forty feet tall.

Local newspaper article from our Sister High School signing ceremony.

One of the many beautiful floats in the Hanamaki Autumn festival.

Lovely Osaka skyline view from our hotel.

Mount Vernon's Environment as Washington’s Greatness

George Washington is a rightly revered figure in this country. However, I argue here that, unless you understand the Mount Vernon estate, you cannot fully appreciate how impressive a thinker and manager Washington was. Put another way, Mount Vernon's landscape and environment exemplify Washington’s genius better than any of his other accomplishments.

A variety of perspectives exist concerning Washington. While some have elevated the first president to god-like status (see “The Apotheosis of Washington”), others have taken a more nuanced view. Thomas Jefferson, for example, once said that Washington’s mind was not “of the first order,” but later wrote, “On the whole, his character was, in its mass, perfect, in nothing bad, in few points indifferent; and it may truly be said, that never did nature and fortune combine more perfectly to make a man great, and to place him in the same constellation with whatever worthies have merited from man an everlasting remembrance.” (Excerpt from Jefferson’s letter to Dr. Walter Jones)

When lauding the General, most people focus on Washington leading the Continental Army to victory in the Revolutionary War, heading the Constitutional Convention, or serving as the nation’s first President. These are undoubtedly remarkable achievements. But Washington’s most impressive gifts were less in leadership than in his incredible managerial mind. The environment and various interactions with it at Mount Vernon best display this.

This is a big assertion, so what I want to do with this blog post is highlight a few examples from the Mount Vernon grounds that display Washington’s keen vision and managerial sense.

Dung House

Sometimes this organizational mind trended toward mundane (but still very important) details. Mount Vernon had what is believed to be the first American structure devoted to composting. The “repository for dung” was an open-walled structure that housed manure and other organic materials that could be turned into fertilizer. In a 1785 letter to George William Fairfax, Washington said that the best farmer was “Midas like, one who can convert everything he touches into manure, as the first transmutation towards Gold.” Using manure as fertilizer is not terribly special, but Washington had a keen sense of its importance, and that sense was all part and parcel of controlling farm production at every step.

Washington carefully crafted the plantation and its environment not only to suit his taste but as an expression of his wealth and incredible agricultural mind. For example, Washington had a phenomenal greenhouse. Greenhouses do not seem special to us today, but Washington’s was an impressive one of his own design. His winter visitors would have been amazed to taste fresh coffee, oranges, lemons, and limes, and many of them likely had never seen a palm tree in person. During the winter, an enslaved laborer maintained a fire around the clock that warmed the greenhouse with radiant floorboard heating, keeping all of the sensitive tropical plants alive. Having a greenhouse was expensive, in terms of both money and labor, and Washington went to great lengths to make sure that his was both pleasing and extraordinary. (The photo is from an insurance engraving of how the greenhouse looked during Washington’s life. Image credit Jason Steinagle.)

Front Gate and Bowling Green

Similarly, Washington took great pains to make sure that his house was viewed only on his terms. For example, while the mansion house Washington inherited (much smaller than his eventual additions would make it) had a single, straight driveway, Washington eventually changed that to a double, bell-shaped path. And in the middle of those paths was his bowling green, an immense stretch of grass. Again, this does not seem remarkable to twenty-first-century viewers, but it was an ostentatious display of wealth at the time. The green had to be mowed with hand scythes (of course by enslaved persons), who also used stone rollers to keep the lawn flat and even. The bowling green therefore represented a Sisyphean undertaking—in the summer, by the time mowers reached the end they had to start over at the beginning.

Mount Vernon lawn over the years. Image credit Luke Pecoraro.

And Washington wanted viewers on the Potomac to have a similarly spectacular view. He planted trees so that they perfectly framed the mansion house from the river. As you can see in the photo (the current grounds recreate, as closely as possible, the estate as Washington left it in 1799), the approach to the mansion house is flanked on both sides by trees that obscured the house until a river traveler reached the perfect vantage point. The trees conceal the house until it peeks out from behind their foliage at just the proper moment, revealing itself in all its intended grandeur.

View of Mount Vernon from the Potomac. Image credit Jason Steinagle.

Mount Vernon Back Porch

It gets better. Washington thought that the natural hill interfered with this carefully constructed Potomac view and therefore had his enslaved workers shave down the hill in front of his back porch so that the landscape would not disturb the curated experience. You can see the curvature in the photo to the right. Look at how the slope dips down from left to right.

Mount Vernon Tulip Poplar

One final fun connection—there are about a dozen trees still on the Mount Vernon estate that were alive during Washington’s life. Washington deliberately planted some of these, such as two tulip poplars (among the very tallest trees on the estate) placed symmetrically on either side of his driveway to achieve the greatest effect. My school, the Arkansas School for Mathematics, Sciences and the Arts, has a tulip poplar descended from one of Washington’s trees.

George Washington’s brilliance is thus best understood through the meticulously curated and managed Mount Vernon estate. This is not to say that creating the “perfect” mansion grounds was Washington’s most important accomplishment. Not at all. But Washington’s mind was at its most powerful—and, frankly, unmatched—when he worked as a manager and organizer. It is why a high school in Arkansas is honored to have a small piece of the Mount Vernon landscape on its campus.

Plaque marking the Tulip Poplar at ASMSA. Image credit Corey Alderdice.

Remembering Mount Vernon's Enslaved Population

This past week I was fortunate enough to participate in the Gilder Lehrman Institute (GLI) “Era of George Washington” summer teacher seminar. Every summer, the GLI very generously fully funds around thirty of these seminars across the country (with one in Edinburgh!). At each one, about two dozen K-12 teachers, K-12 librarians, park service rangers, and others get together with a senior scholar and a GLI master teacher to discuss a particular historical topic and learn how to better teach that topic.

Mount Vernon Sunset

Gordon Wood, Alva O. Way University Professor emeritus at Brown University, led the “Era of George Washington” seminar along with Gloria Sesso, GLI master teacher. Perhaps the best part of the seminar was its location: Mount Vernon. Not only did we get to do a number of private tours that went well beyond what a typical visitor might see, but we also had free run of the grounds after hours. You really cannot beat enjoying a sunset from George Washington’s back porch—simply magnificent. 

I am going to write at least one more post about Mount Vernon in the near future, but I wanted to start off with a post about the cemetery for the plantation’s enslaved population. Located not too far from Washington’s final tomb, the cemetery had three different posted markers, and, in conjunction, the three demonstrate changing interpretations of slavery.

1929 Marker

The first marker, placed in 1929, said it was in memory of “the many faithful colored servants” of the Washingtons. It would be unfair to criticize the marker for using the word “colored” to describe enslaved people of African descent, because that was considered an appropriate term at the time—see, for example, the NAACP (National Association for the Advancement of Colored People, founded in 1909).

What is problematic, however, is calling the enslaved workers “faithful.” These individuals typically resisted their enslavement in a variety of ways, and many tried to escape (some successfully, some not—the most noteworthy failed escapes from George Washington were those of seven men and women who were re-enslaved after the Revolutionary War when the British returned them to Mount Vernon; seventeen had run away to the British earlier in the war). But, perhaps most insidiously, the marker plays into the trope of the benevolent slave master who presumably was a sort of father figure to the enslaved. They, in turn, loved and respected the master and were “faithful” in return for his kind treatment. The “mammy” stereotype, for example, fits well within this particular interpretation.

But, unsurprisingly, it’s a bad interpretation. Calling the enslaved “faithful” to the Washingtons does damage to their history, perspective, and historical agency. And it seems particularly distasteful at a cemetery. 

Times change, though, and there are other markers there that do a better job. I particularly want to stress how impressed I am that the Mount Vernon Ladies Association (MVLA; it owns Mount Vernon) has not removed the 1929 marker. It would be very easy to try and cover up an unpleasant past, but the MVLA is respectful of the past and leaves the 1929 marker to help educate people about why past interpretations have been eschewed for modern ones. Clearly, the MVLA has made significant strides over the course of nearly a century. Kudos to the organization.

1983 Marker

Students from nearby Howard University, a historically black university, helped the MVLA establish the second marker in 1983 after a minor kerfuffle, when the Washington Post reported that the original marker had become overgrown and forgotten. Chastised and embarrassed, the MVLA has maintained a permanent marker at the site ever since. The 1983 marker is beautiful and moving, but it too has language that can be unpacked.

The 1983 marker shifted from calling the enslaved “colored” to calling them “Afro Americans.” More importantly, any language about them being “faithful” was excised, as were all references to their masters. The memorial is about the enslaved peoples of Mount Vernon, and the new marker appropriately focuses exclusively on them. The marker itself rests upon a three-tiered dais, with each level representing one of "faith," "hope," and "love." Love, being the greatest of these, is at the top.

Archaeological Team Marker

Interestingly, however, the enslaved workers are called “slaves.” While this was considered appropriate language in the past (even likely five years ago), in recent years language that emphasizes the humanity of enslaved peoples has become preferred. A shift from calling them “slaves” to calling them “enslaved people” focuses first on their personhood while emphasizing that enslavement was a violence done to the enslaved. 

The current-day sign explaining the archaeological dig in the cemetery thus uses this language and also mentions William Lee by name. Lee, sometimes called Billy Lee (he preferred “William”), served as Washington’s valet for many years, including during the entire Revolutionary War. Mentioning Lee by name further helps recover the past of enslaved peoples and also attempts to keep their humanity as intact as possible. The newest sign thus represents an important shift, as does the archaeological effort to locate every grave within the cemetery.

Gravesite

You can see in the final photo a small section of the dig. I added a red line to highlight the gravesite just above it. The grave is a slightly darker yellow than the surrounding soil because a deep grave would stir up darker yellow Virginia clay. Even well over a century later, this difference can be seen just a few inches below the surface. The Mount Vernon archaeological team is doing a laudable job to help uncover and preserve the history of enslaved individuals on the property. 

In all, the Mount Vernon Ladies Association should be commended for shifting both their language and attitudes in ways that better respect the lives of Mount Vernon’s enslaved population. And the evolution of graveyard markers helps demonstrate that shift, reminding us that our understandings of the past can sometimes tell us more about ourselves than they do past peoples.

Celebrating Student Research

With this blog post, I want to celebrate research papers written by three of my U.S. history students this semester. The students below** each wrote superb essays highlighted by strong research, clear prose, and insightful arguments. Each paper may be downloaded from the link available on the paper's title.

One of the class objectives on my course syllabus was: “Demonstrate an appreciation for the historian’s craft, including the ability to develop and critically evaluate arguments based on evidence, especially primary sources, and separate long-held assumptions and myths from historical interpretations that are supportable by evidence.”

To that end, at least once a week this school year we worked with primary sources (we use several excellent document readers to supplement our not-so-excellent textbook). With the research paper, I expected students to put what they had learned all year into practice and become historians themselves.

The paper could be on any topic in U.S. history; it had to be four to five pages long, present original primary-source research, be placed within the historiography, and argue for a larger idea within U.S. history. (I always push my students to find “the big idea,” just as my mentors did to me.) Each of these student essays did that.

Quite a few of my students wrote very good papers, but these three in particular stood out to me. Our school does not offer any sort of formal awards for our U.S. history courses, but I wanted to recognize these students for their excellent work. It is lamentable how few avenues exist to praise research done by undergraduate and high school history students. This is my small attempt at a corrective measure.

**     **     **     **     **

Carson Cato, “Sputnik and the American People”

Cato’s (he goes by his last name) paper stood out for a number of reasons. Most especially, I was impressed with the way he used New York Times articles as a proxy for U.S. public opinion on Sputnik. While his conclusion is not surprising—Sputnik caused the U.S. public to feel “trepidation and inferiority”—his research truly substantiates the idea. Moreover, the paper is well written, well placed within the historiography, and thoughtful in its treatment of the connections among the media, government, and public opinion.

Calista Keck, “The Lynching of Jesse Washington”

Cali’s paper details the 1916 lynching of Jesse Washington in Waco, Texas. Even though the event was horrific, Cali excelled in crafting a clear narrative full of argumentative vigor. With sharp research, especially including visual sources, Cali demonstrated that, while the residents of Waco normalized the assault and considered it justice, a significant portion of the rest of the country, led by the NAACP, found the event reprehensible. The end result was greater public recognition of the practice of lynching, which eventually helped lead to the practice’s downfall.

Landon Middleton, “Christian Socialism from 1890 to the 1920s”

Landon’s project really took off after we studied the Red Scare of the 1950s. His original research question centered on why anti-communist forces of the 1950s put “In God We Trust” on U.S. currency and generally believed that socialism and Christianity were entirely antithetical. That question led Landon down the rabbit hole until he arrived at the fin-de-siècle Christian socialism movement. His final essay combined contemporary writings with secondary sources to trace the evolution of Christian socialism from a social movement into a political movement. In the end, I was most impressed with the nuance with which he described historical Christianity and socialism, showing a panoply of beliefs among past peoples.

**As a reminder, I teach at the Arkansas School for Mathematics, Sciences, and the Arts, a public, residential high school for academically advanced 11th and 12th graders. Since each of these students is a minor, I secured both their and their parents’ permissions to post their essays and identify them by name.

A Magnificent Moment

Being an educator is frequently an enervating experience. We work long hours, feel pressure from multiple fronts, and do so for little pay. But we love it, most of us. We seize upon particular moments where everything comes together like the carefully designed ending of a novel. I had one of those moments a few weeks ago.

Over spring break, one of my U.S. history students went to France with her family and visited Normandy Beach. In class, we were in the middle of a unit on World War II, which straddled the break. When she returned, she came to my office hours somewhat sheepishly, and started talking with great feeling about her trip.

My student told me how moving it was to visit the site of the D-Day landing, and how standing on the beach was a watershed moment. She told me that, as she stood at the liminal space where the waves lapped at the shore, she stared up at the hills where Nazi machine gun nests would have been. After describing this moment to me, she said, “I don’t think I would have gotten it if not for your class—I wouldn’t have been able to understand what it must have been like and put myself in those soldiers’ shoes on both sides.”

This was, of course, quite a moving moment for me as well. We always wonder whether our lessons actually mean anything to our students. For me, the notion of perspective likely remains my most fundamental idea in history education—if you cannot understand an event from the multiple perspectives of all the historical actors involved, you do not truly understand it. But to hear my student telling me that her experience on Normandy Beach mattered so much because of the idea of perspective? It was, in all honesty, a tremendous moment as a teacher.

What she did next blew me away. My student pulled out a small bag and removed from it a small glass jar (pictured right). The jar had sand from Omaha Beach that she had carried for me all the way across the pond. (If you are wondering, the sand is incredibly fine, almost like a dust. It is, for lack of a better word, beautiful, perfect sand.)

For one of the few times in my life, I was speechless. Not only had she thought of me during that experience, but it had been impactful enough that she went out of her way to bring back something of that memory to share with me. It was simply a magnificent moment as an educator.

I know this comes off as a “humble brag,” but I do not really care. The interaction meant a great deal to me, and I wanted to preserve that feeling here and share it with others. And I hope that each of you has a similar experience sometime soon, even if I know you probably will not. Because, if you are a teacher, you know that these moments are few and far between—that is why we need to hold onto them so tightly.

Down with Textbooks (or at least one of them)!

My school has a lousy United States history textbook. Professors Mark C. Carnes and John A. Garraty are both well respected and lauded historians, but that does not mean that The American Nation is a good book.

Okay, maybe “lousy” is a bit too strong, but I can confidently say that the book has numerous problems. Some of these complaints are probably somewhat minor.

For example, the book spends two full pages on the 1997 movie Titanic (pp. 614-615), but only two sentences (and a painting!) on the actual 1912 sinking (p. 618). The two-page insert on the movie attempts to make a connection to changing sexual values at the time, but is rehashing Leo and Kate’s tryst the best way to do that?

Other issues range from completely puzzling to annoyingly humorous. One passage equates Afghanistan war veterans suffering from Traumatic Brain Injury (TBI) with World War I veterans who suffered from “shell shock” (p. 610). These conditions are not the same, and it does a disservice to our students to conflate the two and remove the historicity from “shell shock.”

Less harmful, but much more boring: an earlier chapter contains essentially a full page on “Higher Education in [Colonial] New England” (pp. 70-71). My first thought was to question why this is valuable to students (I still do not know). After reflection, the section seems to be no more than an excuse to praise Harvard as a shining beacon of intellectual achievement while dismissing Yale as a lesser institution (at least in its origins). Coincidentally, Professor Carnes did his undergraduate studies at Harvard. Hmm.

Some of the book’s problems are more noteworthy, however. In particular, I have significant problems with its treatment of the black experience in U.S. history. Early twentieth-century lynchings are not even mentioned in the text (only in a photo caption on p. 579), and Ida B. Wells is not included at all. Contrast that with the five paragraphs on “Crack and Urban Gangs” in the 1980s.

The fact that a section on “Crack and Urban Gangs” exists at all is troubling (p. 837), especially considering crack did not become a problem until a few years AFTER President Reagan declared the War on Drugs (see Michelle Alexander, The New Jim Crow, p. 5). Spending several paragraphs talking about drive-by shootings and how “Black on black murder had become a significant cause of death for African Americans in their twenties” is not just problematic—it is a problem.

Why is that worthy of inclusion in such detail (no less the wrong details, in my opinion), but lynching basically goes unmentioned?

The two-page insert on President Obama (pp. 824-825) probably bothers me the most. The two photos it uses are: (1) Obama, at 2 years of age, being held by his mother; and (2) Obama smoking marijuana in college in 1980. Now, do not misunderstand me—I actually sort of adore the “pot-smoking Obama” photo. I do not, however, think it is the best photo for inclusion in a textbook, especially considering other presidents are not portrayed this way at all. (We are essentially told that the authors could not find any photos of Ronald Reagan other than his charming cowboy look (p. 819). The man spent how long in Hollywood, and there are no photos of him drinking and carousing? Hmm.)

The worst part of the section on Obama is that one of the “Questions for Discussion,” after making sure the reader knows Obama had mixed-race parents, asks students, “President Obama identifies himself as black. Do you agree?” This is truly an awful question, not only because it does not MATTER whether students agree about Obama’s racial self-identification, but because it teaches students that it is their right to question persons of color and to require those people to authenticate their racial identity. I squirm just thinking about trying to moderate that discussion in class.

And, as an environmental historian (I will admit my axe to grind), I am a bit surprised that the environment gets largely left out as an explanatory factor in U.S. history. I did not expect the book to be driven by environmental history (as are two excellent U.S. history surveys: Down to Earth by Ted Steinberg and Republic of Nature by Mark Fiege). But I did hope, several decades into the field’s existence, that a textbook might include the field in some real way by cutting down on its seemingly excessive economic history. (Or the textbook could have removed one of the SIX chapters, out of thirty-two total in the book, that essentially cover the years 1878-1914. By comparison, the years 1914-1960—two world wars, the Great Depression, the beginning of the Cold War, etc.—also get six chapters.) Environmental history is not a fringe sub-discipline anymore, especially considering that the field’s flagship journal, Environmental History, had the second-highest impact factor of any history journal from 2000 to 2010!

All that said, I will admit that, in general, I am not a huge fan of textbooks. I think that, while they are the most economical way to deliver large chunks of information to students, they are frequently boring, expensive, overly focused on details, and any number of other undesirable things. Textbooks seem to teach our students that learning history is like preparing for bar trivia—memorize enough names and dates and you too can be a historian. (Non-sequitur: I love bar trivia.)

Perhaps the biggest problem with a textbook, however, is that it is difficult for students to realize that textbooks have perspectives and biases too (just like we all do). The textbook presents itself, by its very existence, as an unassailable tome of knowledge. It is perfect. And thus when the students get to a three-quarter page photo of a sixty-year-old John Garraty running a marathon in 1980 (why is this included in a section on aging Baby Boomers? p. 845), they do not question why Carnes and Garraty found space for that but not any number of the other things they omitted.

Perhaps the one thing the textbook does get right, in the end, is giving me ample opportunities to point out to my students that some historical interpretations are not as good as others.

Review of "Unverified"

Since this is the internet, I feel the need to post a “spoiler alert.” This blog post is about the documentary film “Unverified,” and may “spoil” things for you if you have not seen the film. Read at your own discretion.

Last week, Bradley Bethel, a former University of North Carolina academic adviser in athletics turned documentary filmmaker, released “Unverified: The Untold Story Behind the UNC Scandal.” The film’s website described the documentary’s purpose as follows:

“Beginning in 2011, the story of UNC’s 'fake classes' made national headlines as a massive athletics scandal. Caught between university deans unwilling to accept responsibility and news media eager to implicate athletics, UNC’s academic counselors for athletes found themselves accused of complicity and without the means to defend themselves. Bradley Bethel was a reading specialist for UNC athletes and was outraged by the way the press portrayed his colleagues. Refusing to remain silent, he set out to defend those falsely accused and give them a platform to tell their side of the story. In the process, he realized the problem was even bigger than the media. Following Bradley over the course of a year, UNVERIFIED challenges the headlines and tells a story more complicated and heartbreaking than the one we’ve heard in the news.”

Bethel approached me several months ago about my willingness to watch the film when it came out and “review” it on my blog. (I feel that calling this a “review” gives too much authority to me.) I want to emphasize at the outset that Bethel made zero qualifications about the type of review I should write or its content. The thoughts in the following post are completely my own.

Previously, I wrote about Bethel and how I was disappointed in a Daily Tar Heel editorial about him and his film. The disclaimer at the beginning of that post is still appropriate here (in short: I am a UNC alumnus, love UNC athletics, and still have not met Bethel outside of the internet, overwhelmingly Twitter).

For most people, Bethel first entered the conversation about the UNC Scandal with a blog post titled “Truth and Literacy at UNC.” In that and many subsequent posts, Bethel attacked the veracity of claims made by Mary Willingham, a former reading specialist in the UNC athletics department, and Jay Smith, a chaired professor of history at UNC. (An independent investigation with external reviewers later demonstrated that Willingham’s claims about athlete illiteracy were false.) Bethel took on other people as well, but Willingham and Smith drew the majority of his fire.

Perhaps the cruelest cut in Bethel’s film is that Willingham plays an exceedingly minor role and Smith, though pictured, is not mentioned at all. By failing to give Willingham and Smith significant roles, Bethel effectively marginalizes the role the two played in the whole ordeal. Et tu, Brute?

The film was, in all honesty, not what I expected it to be. While I anticipated a film that probed the media’s treatment of the UNC scandal—which it did in many ways—what Bethel produced is actually a much more personal film. If you are expecting a documentary that does nothing but dissect media inaccuracies over and over, this is not your film. (For a primer on the scandal, even if it is one whose facts Bethel somewhat disputes in his film, see here.)

Early on in the documentary, Bethel recounts a story where he failed to stand up to a childhood bully and how he has felt guilty about that his entire life. He vowed as a child that, if presented with an opportunity to defend his friends again, he would not run from the bully.

Much of the film, then, follows Bethel as he interviews various figures and defends his friends and fellow academic counselors Beth Bridger and Jaimie Lee, and to a lesser degree former senior associate athletic director John Blanchard. Bridger and Lee were terminated for their supposed role in UNC’s paper class scandal, and Blanchard announced his retirement in 2013 in the midst of the scandal. Bethel is, however, most certainly the central character of the film. At one point, he states, "I know of good people within or associated with athletics whose integrity has been questioned and for some whose careers have ended because of being mischaracterized." The film is his chance to tell his friends’ side of the story and defend their integrity.

For the most part, Bethel handily succeeds at this goal. He skillfully presents a variety of viewpoints (including many current and former athletes) from those involved in all aspects of the scandal, excepting the media and UNC’s current administration, who we are told all refused to be interviewed (more on that later).

The points the film makes over and over again: How could those in athletics have known what was actually happening in the African American Studies Department? Moreover, why would those athletics folks have ever thought to question anyone in academics, let alone a department head? (ESPN analyst, lawyer, and former dook** basketball player Jay Bilas especially makes this second point in the film.)

These basic questions and others eluded UNC administrators and Kenneth Wainstein (the former federal prosecutor paid $3.1 million to investigate the scandal), “Unverified” contends, because it was easier to blame athletics and low-level employees and to protect academics. The media played into this by not questioning their sources appropriately (especially Willingham) and by presenting a sensationalist view designed to get clicks on the internet. It is, after all, easier to sell a morality play to the public than to present nuanced stories with fewer clear villains and heroes.

The film thus starts off talking about the media’s sensationalism and ends up being more about questioning whether Kenneth Wainstein and those in power at UNC were fair to everyone involved and afforded them due process. As UNC journalism professor Adam Hochberg points out in the film, anyone terminated with cause because of the scandal was fired by the university, not the media.

The documentary’s central point ends up being that Bethel thinks his friends Beth Bridger and Jaimie Lee lost their jobs not because they did anything wrong, but because they were easy targets. Firing low-level support staff who make $40,000 a year is a lot easier, he contends, than asking tenured full professors and UNC deans why they did not have better control over academics. (Former UNC Chancellor James Moeser even says in the film that the AFAM department got a bit of a pass because nobody in the administration wanted to be seen as being harsh to the “black” department—my quotation marks, not his.)

Bethel ends his film with the revelation that the NCAA’s notice of allegations (in which the NCAA laid out its interpretation of UNC’s wrongdoing) did not mention Bridger, Lee, or Blanchard at all. The film is, therefore, really about trying to tell a narrative that restores agency to ordinary people who had their power and careers wrested away by large bureaucracies and the media. (Bridger claims that she was actually fired without cause simply because her name appeared several times in the Wainstein report.)

In this way, “Unverified” is a complete success. It tells a nuanced story and gives voice to ordinary people (even if it does not always ask those people the hard questions). While media outlets often play only short clips of interviews, the film frequently lets the camera roll, giving aggrieved parties a chance to vent their frustrations and explain how they feel they have been misrepresented. But the documentary deftly avoids becoming a “gripe session,” and instead moves with pace to focus on a larger narrative (though that focus seems to shift halfway through from the media to UNC’s administration and Kenneth Wainstein).

One especially nice moment, which probably best illustrates Bethel’s point about media sensationalism, comes during his interviews with former football player Deunta Williams. Williams claims that ESPN’s show “Outside the Lines” misrepresented him and his comments. He was especially upset that “Outside the Lines” filmed extensive footage of him at home and driving around, saying it would be used to show how successful he was. When the show aired, however, it only called Williams a “fast food worker.”

In reality, Williams is a restaurant owner, high school football coach, and real estate investor. He has several employees and, from what the documentary shows, seems to be doing well. Unlike ESPN, “Unverified” does show Williams walking around his house and working hard, and does show him driving around town in his new Audi. Where the media “talks the talk,” Bethel shows, he and his movie “walk the walk.”

While in general I was impressed with “Unverified,” I did have some concerns or critiques. Bethel makes it clear that various media outlets sensationalized or reported falsehoods about the UNC scandal, but this is not done in as focused a way as I would have expected or liked. Instead, we get references scattered all over the film. (Although, to his credit, Bethel does make two media inaccuracies clear: (1) the UNC scandal was academic, not athletic in nature; and (2) his fellow academic support counselors have been misportrayed.) If you were not fairly familiar with the UNC scandal going into this, you would not necessarily get some of Bethel’s finer points. A short segment at the beginning correcting inaccurate media claims would have been helpful. 

Also, at times I wondered whether the focus of the film was really “The Untold Story Behind the UNC Scandal” or instead Bradley Bethel himself. I am not at all suggesting that Bethel comes off as arrogant, narcissistic, or self-aggrandizing. But there were times—such as both of his interviews with Hochberg—where it almost felt like the viewer was intruding on Bethel’s therapy session. While focusing the film on Bethel helps tie some of the documentary’s larger narratives together, it also means that viewers get a lot of Bethel and what seem to be his inner thoughts.

And, finally, one awkward scene occurred when Bethel called Joe Nocera, the New York Times reporter who wrote about the UNC scandal. On speakerphone (and on camera), Bethel asked if Nocera wanted to be interviewed for the film. Nocera was dismissive and somewhat rude, and he emphatically declined to be interviewed. So why was that speakerphone call taped and included in the documentary? If Nocera did not want to be in the film, then his wishes should have been honored.

These are, in the end, fairly minor complaints. “Unverified” is a good documentary that ultimately is about how people in power get to make decisions that influence the rest of us. It is about authenticity and narrative in journalism, and how, while the connection is nebulous, the media can greatly affect seemingly innocent people in profound ways. And the movie is about standing up to entrenched power structures that can bury “the little guy” because standing up to those power structures is the right thing to do.

I think two tweets from Bethel sum up his thoughts and general perspective after completing the film: 

Who could argue with a desire that journalists—and by extension all of us—were more open about the biases and perspectives that we all carry?

My final thought on the film is this: anyone who watches “Unverified” will be happy that Bethel gives voice to those who have been stripped of their dignity and reputation by large, bureaucratic organizations that frequently seem more concerned with protecting their own power and authority than doing what is right. In that way especially, the documentary is a job well done. Moreover, all viewers, no matter their thoughts about the scandal, will finish with a more nuanced view of what happened and a keener eye toward recognizing both media sensationalism and how they fit into the power structures in their own lives.

**As an unrepentant Tar Heel, I just cannot bring myself to type out the most commonly accepted spelling of that university in Durham, NC.

Selling Nature: Mountain Valley Water

My latest research project centers on Mountain Valley Water, a premium bottled water company located in Hot Springs, Arkansas. In 1928, the company became the first nationally distributed bottled water in the United States, with a distribution network stretching from California to New York City. And Mountain Valley proudly proclaims that everyone from presidents to celebrities to racehorses has quaffed the beverage. (Eisenhower once mentioned the company by name in a press conference, and the famed Secretariat was even a patron.)

My newest position at the Arkansas School for Mathematics, Sciences, and the Arts is a great gig, but with a 5/5 teaching load I don’t have an overabundance of time for research. Not only is Mountain Valley a local company, which made it fairly easy to find primary sources, but its corporate identity also fits well into my research interests.

The longer paper argues that Mountain Valley’s history represents interconnected issues of nature, health, and capitalism. For this shorter blog post, however, I just wanted to share two interesting advertisements I found. They offer a brief glimpse into this company’s fascinating history and these larger themes.

The first ad is from the Arkansas Gazette, printed on 6 November 1939. Hot Springs has possessed, essentially from the time humans first encountered its waters, a reputation for being a healthy place. The springs that bubbled forth were revered as a natural cure for any number of diseases, especially rheumatism. As can be seen in the advertisement, Mountain Valley emphasized not only that consuming its water could improve human health, but also that the product was the “natural aid” to do so.

In another ad, this one from the Courier-Journal (Louisville, Kentucky), printed on 11 November 1939, Mountain Valley leveraged the same notions. On a basic level, the ad describes how, long before Euro-American settlers conquered the area, American Indians knew of the water’s supposedly curative properties. Through this line of argument, the advertisement augmented previous health claims with a notion of permanence.

But by hearkening back to Indian land usage and environmental understandings, the company drew upon popular notions of Nativeness to emphasize a connection to the natural world. As Shepherd Krech argued in The Ecological Indian (1999), popular stereotypes of Indians portrayed them as both “ecologist and conservationist,” particularly as “noble savages.” (The word “savage” derives from the Latin silva, meaning woodland.) In this case, advertising that American Indians used the area to cure illnesses bolstered claims of the springs' natural powers and emphasized Mountain Valley’s connection to the environment.

Unsurprisingly, a great many other Mountain Valley Water advertisements exist—brochures, pamphlets, newspaper ads, and more—and the company even ran a Time magazine campaign in 1940. But, as I hope to show in longer, published formats, since its founding in 1871 the company has developed an identity predicated on connecting a salubrious natural world to wholesome bodies. Healthy environments in this case meant healthy bodies, and hopefully healthy profits.

Images are courtesy of the Garland County Historical Society's archives.

Withering Heights: The Battlefield Geography of Antietam

I’m very pleased that my colleague John Hess has agreed to write a guest blog post to help commemorate the anniversary of the Civil War battle at Antietam/Sharpsburg. Fought along Antietam Creek near Sharpsburg, Maryland, on 17 September 1862, the battle produced over twenty-two thousand casualties and remains the single most violent day in U.S. history.

John visited the historic battlefield site this summer and took a number of photographs that reveal history in a way that written description alone often cannot. Simon Schama helped us realize in Landscape and Memory (1995) that the natural world can profoundly influence popular memories and histories. But a twist on that idea can be true as well: we can forget what landscapes look like to the detriment of how we remember and understand historical events.

Anne Kelly Knowles (Middlebury College) has used GIS technology to change how we view Civil War history, particularly the Battle of Gettysburg. For example, by melding topography and top-down geography she has questioned whether Confederate General Robert E. Lee could actually see much of the battlefield, possibly helping explain some of his militarily ineffective decisions. In a similar way, we now turn to John’s guest post for how his photos can help explain why the battle at Antietam unfolded the way it did.

**Note: Obviously, landscapes and environments can and do change over time. The authors recognize that, while the battlefield in 1862 likely looked very similar to how it appears now, it did not look exactly as it does today.

**     **     **     **     **

One hundred and fifty-three years ago today, Robert E. Lee’s Army of Northern Virginia met the Union Army of the Potomac, under the command of General George B. McClellan, near Antietam Creek in a rural area of western Maryland. After victories over Union armies during the Peninsula Campaign and at Second Bull Run, Lee embarked on an invasion of the North in the fall of 1862. He hoped that Confederate victories on Northern soil would bring European recognition of the Confederacy, particularly from Great Britain and France.

The invasion began well enough for Lee. Dividing his army into three columns, he sent General Stonewall Jackson to capture an important Union garrison and its supplies at Harpers Ferry, Virginia, while the rest of the army moved into Maryland. Lee’s invasion may have worked against the ever-cautious McClellan, but fate then seemingly intervened. Two Union soldiers found a copy of Lee’s orders wrapped around some cigars and immediately forwarded them to McClellan. The Union general now possessed the ability to destroy the Army of Northern Virginia, but he once again moved slowly.

On September 17, 1862, the armies met east of the small town of Sharpsburg, Maryland, along Antietam Creek. McClellan outnumbered Lee almost two-to-one, as rebel regiments were still marching from Harpers Ferry. Nevertheless, the Confederates held geographic advantages on the battlefield, and what followed was the bloodiest single day in American history.

Above: A panorama of the northern part of the battlefield. This picture looks west towards Sharpsburg.

The engagement began with Union attacks on the northern edge of the battlefield (shown above). Union regiments advanced out of the North Woods southward towards Confederate positions over a broad, flat field. Marching north to south (right to left in the picture), the soldiers found the terrain ideal for keeping formation. But that flatness also exposed Union regiments to withering fire from rebel artillery and provided no cover from flying shrapnel.

Above: The view northward across the old cornfield.

The Union advance produced one of the bloodiest engagements of the Civil War. In the picture above, which looks northward into where the cornfield once stood, you can see how open the terrain was. Advancing Union soldiers would have made easy targets for Confederate artillery batteries. Now imagine a large cornfield, with stalks nearly as tall as a grown man. Confederate soldiers waited on the southern edge of the cornfield and met Union soldiers with a wall of soft lead as they moved out of it. A young Union private, marching in line out of the cornfield, would have had to adjust to the relative brightness after the cornstalks had blocked out much of the sun. Coupled with massed Confederate rifle fire, that disorientation would have made the battle in the cornfield confusing and chaotic. The infamous cornfield ultimately changed hands several times during the early morning hours, and the fighting produced thousands of casualties on both sides; in some units, 60% of the men were killed or wounded in just a few hours.

Above: Looking west towards the Confederate positions in the West Wood.

After the Union eventually secured the cornfield, a fresh division under the command of General John Sedgwick joined the battle, and the new forces pivoted to the west with hopes of rolling up the Confederate flank. Sedgwick’s division of some 5,000 men advanced into the West Wood, where it met withering fire from three different directions: artillery from the west, along with infantry and cavalry units to the south and southwest. The close confines of the West Wood, as seen above, combined with the smoke of battle to produce chaos. Soldiers and commanders would have had little idea what was going on beyond their immediate vicinity. Additionally, the forest broke up unit formations, meaning it was difficult, if not impossible, for Union regiments to attack en masse, negating their numerical superiority. As a result, Northern units suffered horrendous casualties. The Philadelphia Brigade, for example, lost some 500 men in less than twenty minutes of fighting in the West Wood.

The advance into the West Wood also brought the fighting to a small, white-washed church built by a sect of German pacifists: the Dunkers. Since it was so near to the forest, the little church became a small battlefield unto itself, as Union and Confederate regiments fought to control this small, but important geographic landmark just south of the West Wood.

Above: The reconstructed Dunker church. It was a focal point of fighting as the battle shifted to the south and west during the morning hours.

As fighting raged in the West Wood, two Union divisions advanced (east to west) in the center of the battlefield. Some 10,000 Union soldiers attacked approximately 2,500 Confederate soldiers defending the center of the line in the sunken farm road seen below.

Above: Looking south along the sunken farm road. Union regiments advanced from the left.

The sunken road stretched for several hundred yards in the middle of the battlefield and provided a natural trench for Confederate regiments. As a result, rebel soldiers could fire and reload in relative safety, as long as Union soldiers were kept at bay.

For several hours, the attacking Union divisions were repulsed again and again as they assaulted the improvised trench. As you can see below, Confederate soldiers facing east towards the advancing Union regiments had clear fields of fire and, once again, Union riflemen had no real cover against Confederate fire. Silhouetted against the sky and lined up in formation, Union soldiers made easy targets for the defenders. The result was a bloodbath. The 2,500 Confederates held off the 10,000 Union attackers for hours from what became known as “Bloody Lane.”

Above: The open field over which Union regiments advanced towards Bloody Lane.

Above: The clear fields of fire possessed by the Confederate defenders.

Above: The view a young Confederate rifleman would have had at Bloody Lane.

Eventually, however, Union regiments got around the Confederate flank of Bloody Lane and could fire down the gulley’s length. The terrain that had so benefited the Confederates for hours during the morning now became their undoing. Now without cover, the Confederate defenders were nearly annihilated. With the center of their defense broken, the remaining Southern regiments withdrew towards Sharpsburg and fighting died down in the center.

Well south of the fighting in the north, the battle continued at a small bridge that crossed Antietam Creek, perhaps the most famous landmark of the battle. A Union corps under the command of General Ambrose Burnside unsuccessfully tried to force the crossing throughout the morning with small, piecemeal attacks, but was repulsed each time.

Above: The bridge over Antietam Creek, known as “Burnside’s Bridge.”

The Confederates held a major geographical advantage at Burnside’s Bridge, as it was ultimately called. Confederate regiments held the high ground on the western side of the creek. As you can see below, the heights provided them with a commanding defensive position from which they unleashed a deadly hail of fire upon the attacking Union soldiers.

Above: The view from the Confederate positions on the west side of the creek.

To an attacking Union soldier, the Confederate position was undoubtedly daunting. Union riflemen had to assemble on the east side of the creek within range of Confederate rifle fire. They then had to cross the bridge under constant fire and fight their way up the relatively steep heights in the hot sun. The pictures below only begin to capture the challenges faced by Union soldiers. As a result, the outnumbered Confederate regiments managed to hold off the Northern units throughout the morning.

Above (two photos): The perspective of a Union soldier on the east side of Antietam Creek and as he advanced across the bridge.

Only in the afternoon did Burnside finally attack across the bridge in force, and Union regiments at last ousted the Confederate defenders occupying the heights. The superior terrain of the Confederate position, much like at Bloody Lane, had held up the Union attack for hours and prevented a quick Union victory.

At this point, the Union attack had not gone according to plan, despite the North outnumbering the Confederates nearly two-to-one. Nevertheless, disaster now faced Lee and the Army of Northern Virginia. Burnside had outflanked Lee’s army and advanced his corps towards Sharpsburg. If he could capture the town, the Confederate army would be trapped in Maryland with little hope of escape. The war might come to a quick end.

The few Confederate units standing in Burnside’s way took up positions on a rise in between Antietam Creek and Sharpsburg. This rise, pictured below, provided a solid defensive position, but by now the Confederates were so heavily outnumbered that the advantage in terrain did not halt the Union advance. Union regiments quickly pushed up the hill and advanced towards Sharpsburg.

Above: The final Confederate positions south of Sharpsburg. This is where Confederate regiments fell back to after Union forces finally captured Burnside’s Bridge.

The battle, and perhaps the war, appeared lost until, in a move straight out of a Hollywood movie, thousands of Confederate reinforcements under the command of General A.P. Hill arrived from Harpers Ferry, some seventeen miles away. They reached the field from the southwest at the last moment and blunted the final Union assault in the late afternoon. Reluctant to take more casualties, Burnside withdrew his corps, and the fighting ended that day.

McClellan, always overly cautious, refused to attack again the next day despite holding a major numerical advantage. Then, two days after the end of the battle, Lee and the Army of Northern Virginia slipped back across the Potomac River, and McClellan, despite prodding from Lincoln, refused to follow.

The clash of the Union and Confederate armies at Antietam provides an excellent demonstration of the importance of terrain in a battle. Heavily outnumbered, Confederate regiments took advantage of the chaotic West Wood, the improvised trench at Bloody Lane, and the heights at Burnside’s Bridge to hold back superior Union numbers. Despite being outnumbered almost two-to-one, Lee and the Army of Northern Virginia secured a tactical draw in the battle because of an excellent use of defensive terrain and due to the general advantage held by the defense during the Civil War (there were also the usual problems of bad Union generalship). But the result was the single bloodiest day in American history, with some 3,600 killed on the field and another 20,000 injured. More importantly, the battle looked enough like a victory for the Union that President Abraham Lincoln finally issued his preliminary Emancipation Proclamation. The war to save the Union, caused by division over slavery, became a war to end slavery in America.

U.S. gun violence and #blacklivesmatter

Two different incidents on Twitter caught my eye recently, and I have wondered if they are related.

ESPN talking head Bomani Jones tweeted a link to a Guardian article titled “Horror, live for all to see: another week in American gun violence.” The article was specifically about two recent events in the United States: two journalists were shot to death on live television by a disgruntled former employee, and a 14-year-old boy held his class and teacher hostage with a firearm. But, more broadly, the piece was about the culture of gun violence in the U.S. that leads to 88 deaths per day due to shootings (about 32,000 a year).

Jones editorialized, “the world now gawks at us like we did south america and the middle east in the ‘80s and ‘90s. and it should.” 

In an incident that superficially seems unrelated, NBA player Kendall Marshall was criticized on Twitter by a fan for using the “#blacklivesmatter” hashtag made popular after Michael Brown was killed in a police shooting in Ferguson, Missouri. (I will not name the fan because s(he) is not a public figure.) That fan thought that Marshall should instead use the “#alllivesmatter” hashtag. Moreover, the fan thought Marshall was only using #blacklivesmatter to increase his “street cred.”

Marshall sarcastically replied, “street cred babyyyyy.” 

Both overall U.S. gun deaths and deaths by police shooting have a racial tinge. The Pew Research Center reports that, though blacks represent just 13% of the U.S. population, they comprise 55% of shooting homicide victims (homicides made up roughly a third of all gun deaths over the studied period, with suicides accounting for most of the rest). In terms of police shootings, CUNY assistant professor Peter Moskos claims that blacks are 3.5 times more likely to be killed by a police officer than whites. However, Moskos did clarify that, when adjusted for homicide and felonious crime rates, whites were more likely to be killed by police than blacks. (Methodological quandaries abound with all of these measurements.)

Drawing meaning from these numbers is difficult at best, but relative to their share of the population, blacks are more likely to die from a firearm in the United States than whites. The potential reasons for that are varied, and a Google search will turn up quite a few of them. Many of those explanations are politically colored, and thus I will not proffer my own.

What does seem obvious to me, however, is that gun violence is a significant problem in this country. I have no idea how to fix that, but we as a nation should want to try. And we must realize that, amidst all those gun deaths, the black community bears a disproportionate amount of the carnage. It is no surprise that #blacklivesmatter became popular.

In the end, I think the Guardian story gave me overall context for the debate about #blacklivesmatter vs. #alllivesmatter. Of course all lives matter—saying otherwise is nonsensical. But, considering that blacks are indeed more likely to die from gun violence, is it any surprise that so many people have found it necessary to insist that black lives do indeed matter?

The #blacklivesmatter campaign is not about saying that only black lives matter, but instead an insistence that black lives be considered part of all lives. Thus the phrases #blacklivesmatter and #alllivesmatter should be synonymous (even if #alllivesmatter started largely in opposition to #blacklivesmatter as an attempt to derail that movement). But, political disharmony being what it is, proponents of the two phrases often see themselves as antithetical to each other.

The Guardian article and Bomani Jones’ commentary, combined with Kendall Marshall’s confrontation with a fan, demonstrate several things to me: (1) gun violence in the United States is a serious problem; (2) black bodies disproportionately bear that violence; (3) we need to de-politicize the idea of stopping gun deaths; (4) we need to respect that, no matter the reasons why they are more likely to be shot, the black community is right to be hurt, demand change, and insist that their lives matter as much as whites’.

I have no idea how to fix any of these problems, and I fear that venturing a guess how to do so would show my ignorance in one way or another. Really, I guess I am just sad that we would let so many of our fellow countrypersons, of all races but especially minorities, die without making an honest attempt, as a nation, to do something to change that.

On LeBron James, Statistics, and the 2015 NBA Finals

Wednesday, 17 June was the saddest day of 2015 for me—no more basketball until NCAA and NBA seasons start back in the fall. To get me out of my post-basketball doldrums I wanted to do a blog post on the 2015 NBA Finals.

There are probably professionals doing what I am doing here (and doing it better), but I still wanted to crunch the numbers on the Cleveland Cavaliers’ team and individual statistics. Specifically, I wanted to try to put what LeBron James did in context. His performance was, not being hyperbolic, transcendent. (FYI most of the numbers below, unless stated otherwise, are taken either directly from ESPN’s box scores or calculated by me using those box scores.)

Because his All-Star teammates Kyrie Irving and Kevin Love (not to mention Anderson Varejão) mostly did not play in the Finals series due to injuries (Irving played most of game 1 while below 100% healthy and left in overtime with a broken kneecap), James was forced to take on an incredibly high workload. He did his best to “carry” his team to a championship. Even though his team lost, I think his play deserves a deeper look.

ESPN noted on its stats Twitter page (@ESPNStatsInfo) that James was the first player in NBA history to lead a Finals series in points, rebounds, and assists. He averaged 35.8 points, 13.3 rebounds, and 8.8 assists—nearly a triple-double average!

Moreover, James put up those numbers against arguably one of the best NBA teams of all time. The numbers gurus at Nate Silver’s 538 Sports have a rating system called Elo (borrowed from chess), and the 2015 Golden State Warriors (the opponent of James’ Cleveland Cavaliers) ended up with an 1822 Elo rating. That’s the second-highest team Elo score in NBA history, behind Michael Jordan’s record 72-win 1996 Chicago Bulls (team Elo of 1853).
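For anyone curious how an Elo rating actually moves from game to game, here is a minimal sketch of the generic chess-style update that systems like 538’s build on. The K-factor and the absence of home-court and margin-of-victory adjustments are my simplifying assumptions, not 538’s actual NBA formula.

```python
# A minimal, generic Elo update (chess-style). The K-factor of 20 and the
# absence of home-court and margin-of-victory adjustments are simplifying
# assumptions on my part, not 538's actual NBA implementation.

def expected_score(rating_a, rating_b):
    """Probability that team A beats team B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def update_elo(rating_a, rating_b, a_won, k=20):
    """Return updated ratings for teams A and B after one game."""
    exp_a = expected_score(rating_a, rating_b)
    actual_a = 1.0 if a_won else 0.0
    return (rating_a + k * (actual_a - exp_a),
            rating_b + k * (exp_a - actual_a))

# Example: an 1822-rated team beating a 1600-rated team gains only a few
# points, because the model already expected it to win.
print(update_elo(1822, 1600, a_won=True))
```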

And it is not like top-notch talent surrounded James either. 538 Sports ranked his supporting cast 59th out of the last 60 Finals teams (two teams a year for the last 30 years). Ouch.

How bad were LeBron James’ teammates in the Finals? While many have tried to dismiss James’ stat line as inefficient, he was actually arguably more efficient shooting the ball than his teammates over the course of the series.

Chart 1

First off, the obvious question: Did James shooting a low percentage hurt his team? We can turn to game-by-game +/- scores for that. +/- is a statistic borrowed from hockey that simply measures how much a player’s team outscored, or was outscored by, its opponent during his minutes on the court. Outscore your opponents 55-50 during your on-court time in a game? You get a +5 for the game.
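If you want to see the mechanics, here is a minimal sketch of that calculation. The stint totals are made up to match the 55-50 example above, not taken from the actual Finals play-by-play.

```python
# A minimal sketch of plus/minus: sum the scoring margin over a player's
# on-court stints. The stint data below is hypothetical, chosen to match
# the 55-50 example in the text.

def plus_minus(stints):
    """Sum of (own points - opponent points) across a player's on-court stints."""
    return sum(own - opp for own, opp in stints)

# One game in which the player's team outscored opponents 55-50 during his minutes:
game_stints = [(30, 28), (25, 22)]   # (own points, opponent points) per stint
print(plus_minus(game_stints))       # -> 5, i.e. +5 for the game
```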

Chart 2

In all but one game LeBron James had a better +/- than his teammates (higher numbers are better), and for the series the Cavaliers were outscored by 18 points during the few minutes James sat. Considering James played 275 out of 298 possible minutes (resting an average of only 3 minutes 50 seconds a game, even including two overtime games), that statistic is meaningful.

James’ 275 minutes on the court? The Cavaliers were outscored by 25 points (over the series, a point lost roughly every 11 minutes). James’ 23 minutes off the court? Outscored by 18 points (a point lost roughly every 1 minute 17 seconds).
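Here is a quick back-of-the-envelope check of those rates, using the series totals quoted above; the arithmetic is mine.

```python
# Back-of-the-envelope check of the on/off-court rates quoted above,
# using the series totals from the text.

on_minutes, on_margin = 275, -25     # Cavs outscored by 25 with James on the floor
off_minutes, off_margin = 23, -18    # Cavs outscored by 18 with James on the bench

def minutes_per_point_lost(minutes, margin):
    """How many minutes it took, on average, to lose one point of margin."""
    return minutes / abs(margin)

print(round(minutes_per_point_lost(on_minutes, on_margin), 1))      # ~11.0 minutes per point
print(round(minutes_per_point_lost(off_minutes, off_margin) * 60))  # ~77 seconds per point (about 1:17)
```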

With shooting splits of 40/31/69 (field goal, three-point, and free throw percentages), James certainly could have shot the ball better. But his teammates combined for 38/29/71 shooting splits—just marginally worse, but worse nonetheless.

James’ Effective Field Goal Percentage (which gives extra weight to made three-pointers) was virtually identical to his teammates’: 43.1% vs. 43.2%. The same is true for his True Shooting Percentage (which weights points scored against field goals and free throws attempted): 47.7% vs. 47.9%. (All of these numbers are somewhat poor.)
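For readers who have not run into these measures, here are the standard formulas. The sample inputs are illustrative numbers of my own, not James’ actual Finals box-score totals.

```python
# Standard definitions of the two efficiency measures mentioned above.
# The sample inputs are illustrative, not James' actual Finals totals.

def effective_fg_pct(fgm, fg3m, fga):
    """eFG% = (FGM + 0.5 * 3PM) / FGA -- extra credit for made threes."""
    return (fgm + 0.5 * fg3m) / fga

def true_shooting_pct(points, fga, fta):
    """TS% = PTS / (2 * (FGA + 0.44 * FTA)) -- folds free throws into one number."""
    return points / (2 * (fga + 0.44 * fta))

# Hypothetical series line: 80 makes (13 of them threes) on 200 attempts,
# 215 points, 60 free throw attempts.
print(round(effective_fg_pct(80, 13, 200), 3))    # roughly 0.43
print(round(true_shooting_pct(215, 200, 60), 3))  # roughly 0.47
```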

Let’s say we call all that a wash—LeBron James’ shooting efficiency was about the same as his teammates’. If that is the case, how can we criticize James for being inefficient unless it is to criticize the whole team? James shooting the ball certainly was no worse an option than his teammates shooting it. (Considering the difficulty of the shots he had to take, it was probably better. More on that below.)

It would not be unfair to quibble about whether specific players should have gotten more shots relative to James, of course. Center Timofey Mozgov shot 55% for the series and power forward Tristan Thompson shot 50% (the guard trio of Matthew Dellavedova, JR Smith, and Iman Shumpert put up the bulk of the team’s poor shooting numbers, with combined 29/28/69 shooting splits—truly miserable). But Mozgov and Thompson got a great many of their made field goals off of assists (neither can consistently create his own offense) and offensive rebound putbacks (frequently off shots that James missed close to the rim after drawing a second defender). Increasing their workload would have been difficult and likely would have lowered their shooting percentages (fewer easy shots, as described above).

No, we need to recognize that LeBron James shooting the ball typically was the team’s best option for two reasons:

1.     James’ teammates could pick more optimal times to take shots, but James had to “carry the load” so to speak

2.     James’ teammates got to play with him, and that makes a difference

First, it needs to be pointed out that LeBron James had an astronomically high usage percentage (Usg%) of 46.7% in the NBA Finals. That means that almost half of his team’s possessions while he was on the court ended with him “using” the possession (attempting a field goal or free throws, or turning the ball over). For comparison’s sake, for the playoffs overall, James’ Usg% was 37.6%, leading the NBA, and the next highest Usg% was 2015 MVP Stephen Curry’s at 31.0%. Anything over 30% is very, very high. Anything over 40% is almost unheard of. James’ 46.7% looks like a number only found in video games.
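For anyone who wants to see how that number gets computed, here is the usage-rate formula in its commonly cited form (the Basketball-Reference-style definition, as I understand it): the share of team possessions a player “uses” while on the floor via a field goal attempt, free throws, or a turnover. The single-game stat line plugged in at the bottom is hypothetical, not an actual Finals box score.

```python
# Commonly cited usage-rate formula (Basketball-Reference-style definition,
# as I understand it). The sample inputs below are hypothetical.

def usage_pct(fga, fta, tov, mp, team_fga, team_fta, team_tov, team_mp):
    """Estimated percentage of team possessions a player used while on the floor."""
    player_poss = fga + 0.44 * fta + tov
    team_poss = team_fga + 0.44 * team_fta + team_tov
    return 100.0 * (player_poss * (team_mp / 5.0)) / (mp * team_poss)

# Hypothetical game: 35 FGA, 12 FTA, 4 turnovers in 46 of 240 team minutes,
# on a team that totaled 85 FGA, 25 FTA, and 12 turnovers.
print(round(usage_pct(35, 12, 4, 46, 85, 25, 12, 240), 1))   # ~42.8
```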

The Cavaliers’ team game plan depended heavily on James to create scoring opportunities for himself and his teammates. Why? Because his teammates simply could not do so with All-Stars Kyrie Irving and Kevin Love out injured. Which brings us to the second point.

LeBron James’ teammates got the benefit of playing with him, while he did not. The Cavaliers totaled 95 assists as a team during the NBA Finals, but James had over half of those—53 assists over the six Finals games compared to 42 by his teammates. That means that James assisted on about 45% of all the shots his teammates made.

(The team rate was about 48%. But remember, they were passing to James AND to each other—LeBron James, as good as he is, cannot assist himself. Also, James turned the ball over only 21 times over the course of the series compared to his teammates’ 52. And even though his teammates got to pass to James and to teammates left open by the defensive attention James drew, they still turned the ball over more and came up with fewer assists, despite collectively using about seven percentage points more of the team’s possessions than James did.)
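A rough sanity check of those assist shares: the 53 and 42 assists come from the box scores, but the made-field-goal totals below are numbers I am backing out of the quoted percentages, not official figures.

```python
# Rough check of the assist shares discussed above. The assist counts are from
# the box scores; the made-field-goal totals are my own assumptions, backed out
# of the ~45% and ~48% figures quoted in the text.

james_ast, teammate_ast = 53, 42
teammate_fgm = 118   # assumed: implied by James assisting ~45% of teammates' makes
team_fgm = 198       # assumed: implied by the ~48% overall team assist rate

print(round(james_ast / teammate_fgm, 2))               # ~0.45 of teammates' makes assisted by James
print(round((james_ast + teammate_ast) / team_fgm, 2))  # ~0.48 team assist rate
```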

James is arguably the best passer in the NBA today (I would argue he is among the best in NBA history), and that makes the game easier for everyone around him. In addition, since the Warriors knew James would be the focal point of the Cavaliers’ offensive game plan (again, a 46.7% Usg%), he typically drew much more defensive attention than his teammates—the Warriors’ best defenders, double teams, etc. Not only did James get his teammates more easy, assisted baskets than they got him and each other, but his presence on the court made the game much easier for them than it was for him.

All that to say, LeBron James just completed perhaps the greatest NBA Finals performance in history. Against one of the best teams of all time, playing with one of the worst supporting casts of the last 30 years, James did everything he could. He played an incredible number of minutes each game. And it was not just his minutes that soared—he shouldered the burden of shooting and creating for his teammates who, as mentioned, could not really do so themselves. And while some have criticized his efficiency, having James shoot that much was relatively the better option, and he was at least as efficient as the rest of his teammates combined.

We should sit back and appreciate what just happened. While James’ team lost the series four games to two, numbers suggest that without him the Cavaliers would have gotten swept in blowout fashion. No other player in the NBA could do what LeBron James just did, and perhaps no player in NBA history ever has.

LeBron James and the Cavaliers may have lost the Finals, but by getting to watch that performance the fans won.

Guest Post for University of Washington Press

The University of Washington Press recently asked me to do a guest blog post Q&A to promote their new book Proving Grounds: Militarized Landscapes, Weapons Testing, and the Environmental Impact of U.S. Bases, edited by Edwin Martini.

The project’s UW Press editor, Ranjit Arab, introduced the blog post this way:

"The essays in Proving Grounds: Militarized Landscapes, Weapons Testing, and the Environmental Impact of U.S. Bases give us the most comprehensive examination to date of the environmental footprint of U.S. military bases both at home and abroad. Though critical of the military’s presence across the globe, the book does point to a few examples where the armed forces were actually ahead of the curve—at least compared to the private sector—in terms of self-regulation. Still, the majority of cases in Proving Grounds look at the damaging consequences—both intended and unintended—of building bases and testing weapons, from wiping out indigenous plant and wildlife to the contamination resulting from the disposal of Agent Orange after the Vietnam War.

In Chapter 2, historian Neil Oatsvall looks at how deeply policymakers engaged with environmental science at the dawn of the nuclear testing era. Contrary to popular belief, he finds, U.S. leaders actually did take scientific considerations seriously as they tried to take a lead in the burgeoning nuclear arms race. However, though their intentions may have been well-meant, given the limits of their environmental knowledge at the time, they were clearly in over their heads. We asked Neil to elaborate on this contradiction."

Read the whole post here.

When Academic Free Speech Goes Wrong: The Jerry Hough Edition

Duke University chaired professor of political science Jerry Hough recently created discord with his “controversial” remarks about race in the comments section of a New York Times editorial. Responding to that editorial, “How Racism Doomed Baltimore,” Hough attempted to draw distinctions between ethnic groups and explain why some groups were rioting in Baltimore and others were not.

In doing so, Hough demonstrated how NOT to use academic free speech. By labeling himself as a professor at an elite university, Hough attempted to use his academic standing to lend his ideas a gravitas that was completely undeserved. When speaking as an “expert” you simply cannot let personal opinion masquerade as well-reasoned, well-argued, and well-supported ideas.

In a six-paragraph mess, Hough claimed that Asian Americans were not rioting as a group in Baltimore because “they didn’t feel sorry for themselves” when confronting racism “at least as [bad]” as what black citizens had faced, and instead “worked doubly hard.” He then made what he thought was a lucid point about how, in his estimation, “Every Asian student has a very simple old American first name that symbolizes their desire for integration. Virtually every black has a strange new name that symbolizes their lack of desire for integration.” To top it all off, when questioned about his noxious string of garbage, Hough drew back his bowstring for one more zinger: “In writing me, no one has said I was wrong, just racist.”

What bullshit, Professor Hough.

I am telling you that you are wrong (as I expect many other people would have, had you possessed the sense to listen). Your comments simply do not measure up in the face of history, this country’s demographics, or common sense.

First off, the assertion that “Asians were discriminated against at least as badly as blacks” cannot be supported with U.S. history. Asian Americans certainly have faced considerable racism in the United States. The Chinese Exclusion Act of 1882 did not have a spontaneous genesis, but came from decades of anti-Chinese sentiment and action. And it is no surprise that only Japanese-Americans were interned in mass numbers during World War II and not Italian- or German-Americans (can you imagine anyone interning Joe DiMaggio?).

Even acknowledging this considerable racism (just look at a sampling of the visual culture from WWII), anti-Asian sentiment in this country pales in comparison to the triangular slave trade. The forced migration and chattel slavery of millions of blacks is one of the most horrific mass acts human beings have ever perpetrated on each other. Add to it the blatant institutional racism this country maintained for many decades, such as Jim Crow laws, and the claim that “Asians were discriminated against at least as badly as blacks” is simply wrong.

Second, the idea of names marking a desire for “integration” makes no logical sense. Would Hough argue that President Barack Obama and his family never tried to integrate because of his name? Moreover, Hough assumes that a normative U.S. culture can only exist as a white, Euro-American culture. What he really implies with his use of “integration,” deriding “virtually every black” for “a strange new name,” is that only by choosing European-derived names can non-whites ever truly be real Americans.

Moreover, though I have no hard evidence (and neither does he), I would suspect that his characterization of names by ethnicity falls flat across the entire population. Perhaps his assertion holds true within the very small, typically wealthy subset of the population that is Duke University students—I would not know, because as a UNC alumnus I try to avoid that campus in Durham. But “every Asian student” in the country, or even a large majority? That seems very unlikely. (And he clearly has not been to a toddler playgroup lately—lots of white people name their kids strange things too, as I know from being a stay-at-home parent.)

Finally, he makes demographic claims without a hint of evidence. He asserts that an “enormous” number of interracial relationships exist between Asian Americans and whites while concurrently arguing that “black-white dating is almost non-existent.” If that were true (it is not), we could not be surprised given our country’s history—Martha Hodes’s White Women, Black Men is an excellent work on the subject. However, Hough presents no evidence for his claim. That is because he is flat wrong.

The reality is that while white-Asian marriages are more common than white-black marriages (about 8% of all U.S. marriages are interracial, with white-Hispanic being the most common interracial pairing in the nation), the numbers are relatively close. For example, in 2010, of the more than a quarter-million new interracial marriages, about 15 percent were white-Asian while about 12 percent were white-black. The difference is statistically significant, but it is not the gaping chasm that Hough suggests.

It is the height of irresponsibility to leverage your academic career into the authority to champion unsupportable ideas. This is exactly what Jerry Hough did with his comments. Do I think his ideas are racist? Yes. But that is not the point I am trying to make here.

What I am saying is that if we are going to value academic free speech and defend its merit in a public and higher-ed culture that increasingly devalues academic freedom, then we need to practice responsible free speech with sound arguments, clear logic, and good evidence. Jerry Hough did none of these things, and that is a significant problem.

I am not saying I have to agree with you, but you do have to make even the slightest amount of sense.

 

Addendum: My friend Allie Mullin read this blog post and said that Hough seems to want “to compare races in essentially an Oppression Olympics. From an academic standpoint that's the only quantification he makes, and it lacks factual grounding and ignorance of intersectionality." I agree with her.

In Defense of Lecture... Well, sort of

My former advisor, Matthew Booker, sent me an interesting blog post by Grant Wiggins about high school history teachers and lecturing in class. According to a survey conducted by the website, history teachers were more likely to lecture than teachers in any other discipline, with a majority lecturing for half the period or more (sometimes the whole period).

Wiggins basically comes down against lecturing, arguing that there are better ways to achieve our pedagogical goals as history teachers and that most of what we want to convey in lecture can be conveyed to students via printed materials. He even goes so far as to assert that he “can only see two good reasons for lecturing at length”:

1.     “You have done original research that isn’t written down in a book”

2.     “You have rich and interesting knowledge based on research that can overcome confusions and missing elements in the current course”

 I have mixed thoughts about lecturing. In class, there is no way to convey as much information as quickly as we can through lecture. For sure, students can and should get some (or perhaps all) of that information via their readings. But readings are not interactive. And it is hard for textbooks to model critical thinking and demonstrate how to analyze and use evidence to build historical interpretations—skills highly valued in our discipline.

My thinking on lecture changed dramatically during my last semester of TAing, for a professor in her last semester before retirement. Even in a 70-person lecture class she made it a point to draw students into discussion while lecturing. The mix kept students on their toes but also disseminated lots of information. Part of her success was that she was very good at it, but the technique itself was also great. Since then, whenever I lecture, I use this interactive approach.

 My guess is that many educators find lecture a necessary evil. One defense I will make of high school history teachers lecturing is this: Frequently those history courses have 30+ students, and when you have that many high school students in a room a lot more of your time becomes classroom management than teaching. If we (as a society) could do a better job of getting that number down to 15-20 (or fewer!) students then teachers would have more flexibility to do collaborative activities and have more interactive discussions.

Anyway, all that to say, straight lecture probably is not the best classroom practice, even if it sometimes seems necessary. However, mixing lecture with discussion, Socratic-style teaching, historical role-playing and games, etc. can be effective, in my opinion. Mixed methods help keep students interested and can cater to different learning styles. And, to be honest, it helps if you are good at it. We have all sat in classes with good lecturers whom we wanted to hear talk and in classes with bad lecturers where we probably nodded off.

When lecturing becomes less about conveying information and more about involving students in the creative aspects of the historical discipline, it is worth including in our classrooms. Please leave supporting or dissenting opinions in the comments! I am very interested in hearing what others have to say.

Whose Scandal?: The UNC scandal and Bradley Bethel

I was disappointed by a recent editorial in the Daily Tar Heel, the award-winning student newspaper of my alma mater, the University of North Carolina. The editorial concerned Bradley Bethel, a former UNC learning specialist who worked to support athletes academically, and his work putting the university’s recent academic-athletic scandal in what he feels is proper context.

A quick note on biases: I am extremely proud of my undergraduate degree from UNC, I worked very hard for that degree (and took none of the aberrant classes), I still am an avid supporter of UNC athletics (all sports, not just the revenue ones), and Bethel and I sometimes communicate on Twitter (we have never communicated outside of that medium).

All that out of the way, the editorial unfairly maligns Bethel’s previous work and his current film project, “Unverified”. That film seeks to challenge what Bethel rightly perceives as a sensationalist media narrative. In the introductory video on the film’s Kickstarter page, Bethel says of the recent UNC scandal, “Now, the true story is not entirely pretty. Some of the facts will be embarrassing for the university. But it is a story much different than the media’s sensationalized narrative.”

Contrast that to the Daily Tar Heel’s opinion that begins, “A film dedicated to proving that UNC’s athletic-academic scandal was imagined by headline-hungry journalists is difficult to take seriously.” The editorial continues to call Bethel’s film “delusional” and an “embarrassment.”

These comments are patently unfair.

Much of Bethel’s point of view comes from challenging claims made by Mary Willingham, a former reading specialist at UNC. Many have even called Willingham a “whistleblower” in the UNC scandal for her work, even though, as far as I know, she meets none of the criteria for that label. (She did not identify the academic misconduct—there had already been several investigations into the misdeeds by the time she became a national name.)

Willingham drew great public attention for releasing a study of 183 UNC athletes that supposedly demonstrated 60% of them read between fourth- and eighth-grade levels, with perhaps as much as a tenth of those athletes reading below a third-grade level.

Bethel has rightly challenged Willingham’s methodology and conclusions. Three external reviewers came to similar conclusions that discredited Willingham’s findings. It later came to light that Willingham had likely plagiarized significant chunks of her MA thesis, which further undermines her academic credibility. Willingham also seems to have violated FERPA.

For his trouble, Bethel has had his mental health questioned by Willingham’s co-author (for their book Cheated), Jay Smith, distinguished professor of history at UNC. Smith is a truly excellent historian (check out his impressive CV here), but writing the provost to express concerns about Bethel’s mental health is, frankly, ghastly.

All that to say, even though prominent media members have used Willingham as a source without questioning the veracity of her findings, she is far from an ideal resource. (Those media members especially include CNN’s Sara Ganim, Pulitzer Prize winner for her work on the Jerry Sandusky scandal, and the Raleigh News & Observer’s Dan Kane.)

And I have yet to see an evidence-based refutation of Bethel’s larger points: Willingham and Smith erred in their claims about athlete literacy at UNC, most media members repeated those claims without scrutiny and sensationalized them, and the UNC academic-athletic scandal as a whole has been largely misunderstood because of that. If you contend Bethel is so embarrassing, please first point out how he is wrong.

I am not trying to say that nothing bad happened at UNC—far from it. Though I love my alma mater dearly, I have been deeply, tremendously embarrassed by the events that took place. It is clear that significant academic improprieties occurred. That misconduct was, being as charitable as possible, at least characterized by institutional capture by athletics personnel (though it should give pause that athletes accounted for less than half of the aberrant enrollments in fraudulent classes). Like Bethel, I am not trying to deny that substantial wrongdoing occurred.

I do, however, think that the Daily Tar Heel editorial is the most recent unfair attempt to disparage Bethel. At times I disagree with Bethel (who wouldn’t?), but I have found his work to be meticulously researched and argued. He probably goes overboard sometimes. He is not, however, delusional or an embarrassment.

At the end of the movie The Dark Knight, Jim Gordon says of Batman, “He’s the hero Gotham deserves, but not the one it needs right now.” I don’t know if any of that applies to Bethel—hero, as needed or deserved—but I do know that bringing dissenting facts to light in a respectful fashion is always needed. That’s what Bethel has done and intends to do with his film.

Respectfully providing evidence-based refutations of widely held beliefs is at the heart of academic discourse. Who would be embarrassed of that?

Five Reasons Not to Embargo

To embargo your dissertation or not to embargo? The question has been debated within the historical discipline. Recently I had a conversation on Twitter with Michael D. Hattem (@MichaelHattem) on the subject, particularly over his guest AHA blog post. On Twitter he made the perfectly reasonable assertion that the decision calls for caution and individual choice.

With that in mind, I have written this blog post with a few quick reasons why I think graduate students in history should consider NOT embargoing their dissertations. I do not expect this to end the discussion at all, but I hope to help provide a counterpoint to a strong narrative that asserts any grad student who cares about an academic future should embargo.

(1) Other scholars really do have a harder time finding and reading embargoed work

My own dissertation is available open access through the University of Kansas ScholarWorks site. (Go read it!) Since I finished it in May 2013, dozens of people have downloaded my dissertation, including, as of this post, eighteen views from seven different countries outside the United States. I do not know eighteen people outside the U.S. who might be interested in my work! Perhaps those folks would have emailed me to ask for a copy of my dissertation, but I am doubtful of that.

Another quick story: I am friendly acquaintances with a prominent scholar in my field. Out of the blue that scholar sent me an email last year asking for more information about a source I cited in my dissertation. Would that scholar have emailed me to ask for my dissertation? Perhaps—it is definitely possible. But the ease of getting a copy of my work made it far more likely that the scholar would read it. Put simply, embargoing your dissertation means that fewer people will read it.

(2) Embargoing creates a culture of fear

Every graduate student I have talked to who has said they are embargoing is doing so because they fear that not doing so will somehow make their dissertation unpublishable and thus hurt their job and career opportunities. The fear is palpable in their comments, thick with worry during such a difficult and uncertain moment to be seeking employment in our profession.

Graduate school, at times, seems designed to psychically damage young, bright, hardworking people. This discussion plays into that by helping to convince new PhDs that the reason they have not found a job is that they are not working hard enough, not publishing enough, not doing something they should be doing, etc. For many fields this is nonsense—as has been stated time and again, there are simply too many applicants for too few positions. This means that many good candidates will not get any position, let alone the dreamed-of tenure-track job. Do not let fear convince you that your difficulty on the job market is purely because of some step you are not taking.

(3) I have real doubts that most editors care

I have never heard of a press or series editor who cared whether a potential author embargoed or not. More than that, I have never even heard of one asking me or anyone else, in any context, whether we did. In this discussion I have heard from a number of academics at all stages of their careers that “editors care.” Who are these editors? Why do they care? Because…

(4) No matter the embargo length, it is not forever

This means that any potential book would be out after, or only shortly before, the embargo ended. Most embargoes run one to three years. It is nigh impossible to get a first book out within three years’ time, as the amount of work involved in revising a dissertation (intended to demonstrate to a committee that you are ready to join the profession) into an academic monograph (intended to make a scholarly contribution to your field) is substantial. Moreover, those substantial revisions mean that when the embargo does end the book will be something very different than the dissertation ever was. As an example, one friend is starting completely from scratch with the book manuscript, using much of the dissertation’s research but adding to it so substantively with new research and arguments that merely revising would have made a total mess of things.

Even the AHA’s recommendation of six years means that, presuming someone was fortunate enough to secure a tenure-track job immediately out of graduate school and had to have their first monograph out before tenure review in five years, the dissertation embargo would end within a year or two of the book’s printing anyway.

In the historical discipline the embargo simply does not hold the work out of the public realm long enough to make much of a difference. In other disciplines this calculus can change completely. In the natural sciences, for example, patents and the patent-application process are frequently involved. Such people have very, very different concerns than a history PhD looking to publish a monograph. Since the embargo will likely end before the book is published, the supposed gains from embargoing seem moot.

(5) Embargoing wastes time

I put this reason last, but I do not think it is entirely inconsequential. Embargoing may only take a few hours of your time to get the appropriate signatures and turn in the correct forms, but why spend that time if you do not have to do so? Spend that time instead reading another book in the historiography, revising an article, or even preparing your book proposal for a press.

Or… spend that time doing something entirely unrelated to work. Graduate school frequently seems to make people feel guilty about not working every minute of their lives. That feeling does not end with PhD conferral, either. Instead of dealing with the embargo, read for pleasure, watch a movie, or have dinner with friends and family. Start practicing what it will be like to be a professional who balances a career and a personal life right now, because hopefully that is what you will be soon.

 

At the end of his post, Hattem notes, “conflicting anecdotal evidence and a lack of metrics exacerbate the problem and call for caution and individual choice.” Absolutely fair in some ways—all I have presented above is anecdotal.

But I would assert this: Bowing to fear of the unknown is no way to live your life. Caution is one thing, but there is a difference between reasonable caution, like looking both ways before you cross the street, and unreasonable caution, such as an unwillingness to walk on street grates for fear that you might fall in (one of my wife’s phobias). If you are going to embargo, make sure you are doing it for concrete reasons that directly affect you and not merely because of some career or job market bogeyman you fear.

My Rules for Job Market Sanity

Like a lot of people, I’ve been thinking about the job market a lot lately. Really, I have been thinking about it a lot for the last three years. To that end, I’ve come up with a list of rules that I try to follow to keep myself sane. These may not be for everyone, but they have been a way for me to cope with the big bundle of rejection that is the academic job market.

1) Know your profile

This one is a little more complicated than it will sound, but here goes: You need to know what sort of institution will be most interested in you as a candidate. No matter your dreams of working at an elite institution, if you have no published works to your name, you are extremely unlikely to get an Ivy League gig (even if you went to an excellent, Ivy League-caliber school). No matter your dreams of working at a small liberal arts college with lots of interaction with students, if you have never taught a single course (or have very limited teaching experience), you’re extremely unlikely to get a job at a school that truly values teaching.

These are not happy things to think about, but they are part of being realistic about who you are as a candidate and what the job market is like. Market yourself appropriately.

The flip side of this is that you have to recognize when you are a good candidate. Having a good profile and good application materials does not, very sadly, mean you will get a job (or even interviews!). Do your best not to get discouraged and keep working hard. This is much harder—knowing that you can be a good candidate and still not land a position. It really is true that there are too many good candidates and too few jobs.

2) Be honest about how hard you’re willing to work on the job search process

If you’re willing to spend the time you can probably apply to dozens (or more) of jobs. You can write a new cover letter for each position and finely tailor each document to the position. One friend on a search committee told me that some teaching philosophy statements were so carefully tailored to the school that the candidates had researched not only all the courses on the books but also which courses had been taught recently and by whom. Those candidates then laid out a detailed plan for how their courses would fit into the department. That took a lot of time and effort!

To be blunt, I am not willing to work that hard for almost any application. Everyone has to decide what makes sense for them.

The flip side of this, of course, is that no matter how hard you work it may not result in a position. Hard work is not a determining factor past a point. Just because you put 10+ hours into an application, it does not mean that you will get an interview.

3) Know that you probably do not understand the dynamics of a search

The more I deal with search committees (having been on both sides of the process), the more convinced I am that candidates have little sense of what is actually going on with a search committee. Even being part of a search committee and in the room during deliberations is not always enough to have a full sense of everything that is going on.

Without revealing too many details, I was crushed not to get an interview for a particular job because I was very familiar with the institution, department, and faculty. I thought I had tailored my application materials perfectly (I was willing to put in the work on this particular job). It turns out, talking to a faculty member later, that the department/institution had decided to go off in a direction that was totally unexpected to me (and my friend on the faculty, who was tangentially involved in the search), and I never could have fit what they actually wanted.

No matter what the job ad says or what you think you know about the department, faculty, and institution, you do not fully understand the search. Just accept it.

4) Know that the job market is neither 100% merit-based nor completely random.

Someone much wiser than me told me this, and I have found it to be so very true. Some of this is related to no. 3 above, some of it is this very (VERY) nebulous idea of “fit,” and some of it is the mere fact that different people value different things out of a colleague. If you are a good candidate and give it enough time you will get opportunities and, if you have a little luck, get a position.

5) Be happy, truly happy, when your friends get interviews and job offers

This is one that I said out loud my first year on the job market but did not mean until halfway through my second year. If you are lucky you will have talented friends who are also on the job market with you, and hopefully they will get interviews and job offers. You will lose out on many, many positions to people you have never met and may never meet. Wouldn’t you be happier if one of your friends got that job offer instead of someone you do not know? It is a tough idea to accept in your heart, but accepting the logic of it is the first step.

6) Finally, know that not getting a position does NOT mean you are a bad scholar or teacher

We live in a rough time to be on the academic job market. As I said in no. 1, even good candidates can miss out on interviews and job opportunities. One of my friends has a good teaching record, multiple publications (one forthcoming in the top journal in his/her field), and multiple prestigious fellowships (including a Fulbright). This friend has gotten no interviews to my knowledge, and it completely baffles me. I would think this person would be one of the first snatched off the market, but it has not happened.

Just because one year (or even two) on the market does not produce a job offer, it does not mean that the future will be the same. Staying on the market may mean putting other career or maybe even family opportunities on hold, and you will have to weigh what potentially landing an academic job in the future is worth to you. But you just cannot feel too bad about a few (or even a lot of) strikeouts.

Remember, jobs are like spouses—you only need one to say yes.

We Still Need to Kill the Conference Interview

In my head I’m thinking about some variation of the Eagles’ “Hotel California” here (please kindly ignore that I buggered the number of syllables):

In the conference hotel lobby they are gathered for the feast,

They stab it with blog posts and tweets, but they just can’t kill the beast.

There are a lot of reasons why the conference interview needs to go, and I’m far from the first to write about it. In terms of justice, we’re asking the most financially vulnerable in our profession to shell out big bucks. It’s not just about the candidates, though. One of the most persuasive arguments I’ve read is David Perry’s that the conference interview format is a financially poor decision for colleges and universities. He has some great links in there to read other perspectives on why the conference interview should end.

I’ll be honest that I don’t really have much new to add other than this: a colleague of mine recently received an invitation to interview at the American Historical Association (AHA) next month. The pure joy at receiving an interview invitation quickly receded to dread when my colleague realized the interview was being held at the AHA job center, necessitating conference registration. The colleague holds a non-tenure-track teaching position, and thus falls into the “employed” registration category of $220. My colleague was not planning on registering before (just go do the interview and leave), but now must find the money for this extra expense.

I initially wrote a much longer post about the financials of attending a conference, but I deleted most of it to focus on the question of registering for a conference where you have an interview. Some may say that you should register anyway because that’s what you do. But unless you’re really excited about lots of particular panels (which doesn’t always happen at the big conferences), what’s the point? To get into the exhibit hall? To pay extra to go on a field trip or to attend a luncheon?

Perhaps some readers have a dissenting viewpoint, and I’d like to hear it if they do. I’m willing to be swayed. In my current thinking, however, it seems that having the interview at the AHA’s job center instead of a hotel suite is just transferring some of the cost from the interviewing institution to the interviewees. And that stinks.

Conference interviews really don’t have much to do with the attendant conferences, and taking those interviews is expensive enough already. Why would adding an extra cost ever be a good idea?

How far does public service go?

Whenever I see a New Yorker article by Jill Lepore, I know it’s a piece I want to read. She’s a Bancroft Prize-winning distinguished professor at Harvard for a reason.

Her latest, “The Great Paper Caper,” is a fascinating chronicle of U.S. Supreme Court Justice Felix Frankfurter (in office 1939-62) that not only recounts his time on the bench but, more importantly, questions what should happen with the papers of Supreme Court justices after they retire. As alluded to in the title, a great chunk of Frankfurter’s papers were stolen from the Library of Congress.

This incredible wrongdoing is compounded by the fact that, unlike the case with other public servants working at public institutions, nothing compels the justices to release their official papers to the public. Lepore notes that the Federal Records Act (1950) excludes the Supreme Court, and subsequent additions, like the Presidential Records Act (1978), have not changed that. This means that access to the complete Frankfurter papers had been (and still is) limited, and any lost documents cannot even be remotely replaced for the public.

The “great paper caper,” then, is not only the theft of Frankfurter’s papers from the Library of Congress, but also the potential “theft” from the public by justices who would not release their unedited papers without legal compulsion.

Before delving into the “point” of this blog post, I want to say that there are few things more reprehensible in the scholarly world than stealing documents from an archive. The act is on par with or worse than plagiarism, data falsification, and other research fraud or academic misconduct. I actually got a little sick to my stomach reading the article’s opening vignette about the stolen papers.

After my initial revulsion subsided, I started thinking about the wisdom of allowing Supreme Court justices to decide the manner in which their papers are released to the public or even if those ever will be. Lepore does an admirable job presenting an even-handed account of why the current system may be preferable to one where federal law mandates the release of judicial papers, but I remain entirely unconvinced.

If we find it appropriate not only to release the full papers of every president, but at times even to go as far as recording Oval Office conversations, what excuses do we truly have for the Supreme Court? As Richard Nixon showed us, when presidents are allowed to censor the historical record regrettable things can happen. Lepore’s article gives a few instances where former justices have censored their papers in ways that are detrimental to the public welfare.

In the end, I think that if you’re holding an office as lofty as United States President or Supreme Court Justice, your public service does not end when you leave office. Part of the deal—part of what you owe the country—is a full accounting of your actions in the historical record. As much as is possible, your personal life should remain as personal as you want it to be. Your actions in an official capacity, however, are no longer yours—they belong to the nation and its peoples.

Obviously this opinion is influenced by my training and career as a professional historian. I owe a great debt to many governmental library holdings, without which my scholarship simply would have been impossible.

More than that, however, I would argue my belief is informed by being a civic-minded citizen. If we know anything about U.S. policymakers it is that their deliberations are often complicated—their decisions are rarely as simple as they seem to the public at the time. We deserve, as members of a democratic republic, to have the capacity to hold our elected officials (and their appointees) to a full reckoning. That is impossible without access to the papers created in the official capacities of their duties.

In addition, I believe that the best way to keep people trustworthy is to make sure that they have no occasions to be untrustworthy. Even great, honorable persons can be tempted to commit wrong when they know they can get away with it. The ability to heavily censor or not release papers gives members of the Supreme Court the ability to do so. Perhaps this is an incredibly pessimistic view of justices’ moral fiber, but I believe it is a simple recognition of human desire to escape punishment when it easily can be achieved.

Some may argue that justices need to be free from the fear of recrimination so that they can render proper, constitutional verdicts. But if they are scared of the deliberations behind their decisions being made public, shouldn’t they also be afraid of making those sorts of decisions? Justices owe the release of their papers to the public. Perhaps more importantly they owe it to themselves so that they can fulfill their charges at the highest level.

Whether you agree or disagree, feel free to leave a comment.

Farming as a trend

Almost a year ago The New Yorker started appearing in my mailbox. To this day I have no clue who purchased the subscription for me, but I’ve become a bit addicted to the weekly offering. An article by Alec Wilkinson on the new magazine Modern Farmer caught my eye. I will admit, I have never read an issue of the award-winning Modern Farmer, so what follows is more a rumination on Wilkinson’s piece than anything else.

Like many New Yorker pieces it is as much a character study as anything else, this one of Ann Marie Gardner, Modern Farmer’s founder and editor. The article adroitly compares Gardner to the magazine. One person reviewing Modern Farmer said, “I wonder who the ideal reader is. My assumption is that it’s people who will never farm.” Another remarked, “There was not anything actually written by a farmer.” Both descriptions fit Gardner.

Later the reader is presented with a story of Gardner buying chickens for dinner. The local farmer selling her the chickens slaughters the birds on the spot for Gardner, causing her anguish. At the end, Wilkinson writes, “Sniffling, she wrote a check for $84.93, and took the chickens, which I had to carry, because when she touched them she discovered that they were still warm.”

I’m not going to pretend that I grew up on a farm (because I did not), but I did grow up with family friends who were farmers. I remember quite fondly what a barn full of curing tobacco smells like, but I never spent my summers picking it like my mother or grew up on a farm full of it like her mother.

I do know, however, that at its core farming is about killing some beings so that other beings can live. This is obvious when eating meat—children’s author E.B. White of Charlotte’s Web fame once called hog slaughter first-degree murder while simultaneously acknowledging how delicious bacon tastes. The same can be true for plants, however. Wheat cannot be eaten while it is still alive, nor can many other crops. We know this on a visceral level, but why are so many of us still so squeamish when we are reminded that our dinner used to breathe and eat just as we do? (My wife absolutely refuses to handle raw meat.)

Do I have a larger point that Wilkinson’s piece does not make? I am not sure. But I do know that farming is trendy right now. Organic produce is all the rage, and so is eating local, slow food, etc. These are not bad things (even if “organic” becomes commoditized like many entities in this country). Cooking reality shows are too numerous to count these days (I happen to be a big fan of Top Chef). Yet if we, as a nation, increasingly care so much about our food, why is historian Matthew Booker researching why many people have “lost faith” in food? Perhaps this “foodie” trend is just that—a trend.

Perhaps we should be less interested in the idea of farming and more interested in what farming actually is. Farming is death. Farming is tedium. Farming is a business. Farming is being confronted with tough choices. It is also many other things, and none of them are inherently bad. Knowing that, it is more than a little strange that the editor and founder of a magazine titled Modern Farmer would go out of her way to get fresh, local chickens for a dinner party and then get weepy because her dinner was slaughtered while she waited (FYI I recognize the gendered element of this portrayal).

Agriculture has become the latest intersection of culture and environment where a great many people in our society feel that they have a stake or expertise (or both). What wilderness was a century ago, agriculture is today. That is not necessarily a bad thing. I just wonder if we are concerned about our cultural interactions with agroecosystems because doing so is trendy or because we truly appreciate what agriculture is and the importance it has on the world and our lives. Maybe it does not matter either way.

We all have to eat, you know.