Thursday, January 12, 2012

Suggested reading list for HIS 101

As we gear up for a new semester, it occurred to me that I should post a suggested reading list for my students. Now, I know that because of your busy schedules, most of you have a hard enough time finding the time to read required books, but allow me the indulgence of telling you what I think you should also be reading (or listening to if you have an audiobook membership somewhere).



Upton Sinclair, The Jungle.
     This classic tells the story of immigrants in Chicago at the turn of the twentieth century. It's fascinating, horrifying, edifying....makes us really glad to live when we do and not back then. This book also supposedly helped the passage of landmark legislation that protects all of us--the Pure Food and Drug Act of 1906, which led to the creation of the Food and Drug Administration.
Jack Finney, Time and Again.
      Finney uses a rather implausible time-travel device to bring middle-class life in 1880s New York City to the pages of this remarkable book. If you can suspend your disbelief about how our hero gets to the 1880s, the book will reward you amply with fabulous period detail about daily life in the nation's most important city. Students who have read this book in classes where I have required it rave about it!
 
John Steinbeck, The Grapes of Wrath. 
      Steinbeck's masterpiece tells the story of a family caught up in the horrors of the Great Depression. Like The Jungle, it makes us happy not to have lived in the 1930s. If you have never read this book, you really must.

Monday, January 9, 2012

Humanities and the current push for "STEM" subjects

By now, every parent of a college-age or soon-to-be-college-age student has heard the pundits decrying America's paucity of math-science-technology majors. We'll be left behind, they cry. The U.S. will become a has-been economy if we don't train more engineers, they warn.

This kind of talk is all well and good, but it fails to note at least two important factors: first, not all students are going to be good at math, science, or technology-related fields; and, second, college is not meant to be vocational training. A good college education should teach students how to think, how to reason, how to make a fact-based argument, and how to appreciate what has come before.

Students who major in history, political science (an oxymoron, perhaps?), literature, sociology, and other such fields learn skills that will be valuable in any occupation. They learn to read carefully and retain content, to question and keep an open mind about what they find, to research thoroughly, and to pull facts and details from disparate sources to create a new argument. How can these skills not relate to any career? How can they not be good for the modern American economy?

Saturday, December 17, 2011

"Step foot"?

Like most college professors, I am now grading many papers. And this brings up all sorts of grammar- and usage-related questions and leads me to ask, again: what is being taught these days? One of the new phrases I'm finding is this:

"from the time the first settlers step foot on the continent..."

Step foot? Where did this come from? And why are teachers not correcting this? It's set foot.

One can either step, or one can set foot, but it makes no sense to say that one steps foot. I am mystified as to where these things come from, why they take hold, and why they're not strangled immediately by teachers.

Here's another one that amazes me. "He lead the miners to the lead mine."

Is no one correcting the incorrect spelling of the past tense of the verb "to lead"? I see this a lot.  It's led.

And why does everyone these days write dates incorrectly? It's not December 17th, 2011. It's December 17, 2011. I even see college administrators do this frequently. It looks so silly with that ordinal hovering over the comma. That alone should let people know that it's not correct.

And permit me one more: there's no such word as alright.  It's two words: all right.


All right, enough from me.

Thursday, November 17, 2011

Of received wisdom and turkeys

My subject today is turkeys--no, not the Thanksgiving kind. I am talking today about wild turkeys and received wisdom. Do we not usually call someone who is clueless or oblivious a "turkey"? I imagine we've all done that at one time or another.

But I was fortunate enough to catch a documentary last night on our local PBS station about a man named Joe who incubated sixteen wild turkey eggs that he found. He managed to hatch them all, lost only three in their early months, and brought the other thirteen to adulthood. He was, in essence, their mother.

It is a beautifully made story about the raising of these turkeys, but aside from being interesting, it carries a strong yet subtle message: challenge received wisdom.

Here's how: in spending every day with his brood, walking through the woods and examining the flora and fauna through their eyes, Joe learned that they were anything but stupid, anything but clueless, anything but oblivious. They had amazing instinctual knowledge and they had a language. They had distinctly different sounds for different animals they encountered. Our dismissive colloquial use of the term turkey is clearly rooted in a misconception about the intelligence of these birds.

So how does that connect with received wisdom? I often talk with students about historians' need to question historical truisms. What I mean by that term is stories like Washington throwing a silver dollar across the Potomac, or the idea that Pocahontas saved John Smith, or that the Salem witch trials were merely the result of some girls' collective hysteria. If historians never questioned these "old chestnuts," how would we as a people ever learn the truth about our past?

So: the collective wisdom that turkeys are bird-brains is obviously untrue (and we know that now because someone took the time to investigate it); neither Washington nor anyone else threw a silver dollar across the Potomac; John Smith was probably spared because he acted bravely in the face of imminent death; and the Salem witch trials had more to do with old grudges and uppity women than with collective hysteria.

Question what we think we know. Maybe it's correct, but maybe it's not.

"My life as a turkey" on PBS

Sunday, October 30, 2011

Mud and Fire: the connection

While preparing this week's class on early San Francisco, I wanted to look into the state of the streets in the wild, boom-town early years of the Gold Rush. Not to my surprise, I learned that the streets of San Francisco in the early 1850s were not much better than the streets of other cities of the time. In fact, because of the frequently soggy weather in that part of the world, they were probably worse.

As in all but the most commercially dense areas of the biggest cities, city and town streets were mostly unpaved. A favorite joke of the time told of a man who was up to his neck in mud on a New York street. When an alarmed neighbor ran out to assist him, he cheerily called out, "No worries, I have my horse under me." That this was a popular joke tells us that many people could clearly identify with the man's predicament.

In San Francisco, "instant city" that it was (I take that phrase from a 1975 book by Gunther Barth), paving of streets was not a major priority until fires moved the issue up the priority list. The streets were so bad that one enterprising person put up a sign at the intersection of Clay and Kearny reading: "This street is impassable, not even jackassable."

What connection between muddy streets and fires, you ask? Good question.

It seems that as a stop-gap (something to be expected in an instant city more interested in making money off gold-rush miners than in quality-of-life matters), the city fathers put down planks over the mud.

Now, it doesn't take the brilliance and training of an urban planner to know that there are going to be big problems with that--those planks are guaranteed to be slippery when wet and a fire hazard when dry.

And that's exactly what happened. Between 1849 and 1854, San Francisco suffered through six major fires, the flames of which were certainly encouraged by that dry planking on the streets. Only after the sixth disastrous fire did the city begin laying its first pavement.

Friday, October 14, 2011

Short answer exam writing advice

In my classes these days, instead of essay answers, I ask students to write several short answers. The questions I ask are designed to be answered in two to five well-planned sentences. I insist that before writing anything, students sit and think a while, making a brief list. Only after that list is completed should they think about composing the answer.

Why? Because we live in the age of Twitter, in the age of four-second sound bites. Have you ever seen a long blog post and groaned? Few people have the time or the inclination to read long essays anymore.

So, am I doing my students a favor by insisting that they think carefully and then write concisely? I think so.

Here is a sample question (taken from my Roosevelt to Reagan course material) with a sample of how I expect students to tackle such a question.

Question: Assume that you are on a debate team. The topic is “Resolved: That the Cold War was not inevitable.” You are to argue the affirmative--that it was not inevitable. What factors would you include in your argument? [No need for lengthy explanations: just list the factors you would use to argue that the Cold War was not necessarily inevitable.]

There’s the question. Now here’s an example of how you might do a quick list during your thinking time. [This is by no means a complete list, but it shows how you would start your thinking and planning before writing your two-to-five-sentence answer.]

A. Had been allies
B. Americans should have understood Soviet need for security
          25 mil dead
          Russia invaded
C. Overwrought intelligence
          Nitze
          Doolittle Report
D. Leaders exaggerated dangers
          Khrushchev
          Truman, Ike, Kennedy

Hint: While making this list, I thought of item C last. So, it was on my list last. Then, looking over the list, I realized that it would be better to bring this up sooner in my answer, so I moved it up. This is what I mean by thinking and planning out the answer before you start writing.

Monday, October 10, 2011

Expressing ourselves

About thirty years ago, the phrase "express yourself" was all the rage. While I was never quite sure what it meant, I think it was an attempt to encourage people to voice their individuality....or something.

Whatever it was then, it's still important now. I bring this up because of an email exchange I had with an old friend, Peter Ruscitti. He wrote something to me that expresses far better than I can the essence of why I keep harping on proper grammar, usage, and syntax with my students. Here's what Peter wrote:

"In an information age, it’s not enough just to know something.  You have to know how to express it."  

This may seem obvious, but it's not. Think about it for a second. How many Americans today believe that what you say (or write) is the only thing that matters, ignoring the fact that how you say it (or write it) determines how well your message gets through?

There's an easy way to learn how to become a better writer, too. It's not hard, and this is not a secret. 

The way to become a better writer--with minimal effort--is to read good books. Yup, you need to read as much as you can and as often as you can. 

But when you read, pay attention to how the author phrases things--pay attention to the rhythm and the clarity. Pay attention to the word choice. And, yes, pay attention to the punctuation. 

If you do, you'll find yourself instinctively developing the same good habits of composition when you sit down to write. 

All you have to do is pay attention. In this information age, isn't that everything? 

Thursday, September 29, 2011

New data on number of Civil War deaths

I'm always yammering at my students that we need to re-examine "received wisdom," the facts that are handed down from generation to generation without critical analysis. Now, a Binghamton University professor is doing just that. Associate Professor J. David Hacker, using demographic research, has significantly revised upward the estimated number of Civil War deaths and, by doing so, has shaken up received wisdom. If Hacker is correct, the traditional number used--618,222--is far too low. I've always used 620,000 to 640,000 when I teach the Civil War in my HIS 100 classes. But Hacker thinks that even that number is too low. His new research shows that the number should be at least 650,000 and may be as high as 850,000. So he averages the two and finds 750,000 to be the most accurate number.
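If you want to see the arithmetic behind that 750,000 (Hacker's actual method is a careful demographic analysis; the simple midpoint below is only my shorthand for how the figure falls out of his range):

\[ \frac{650{,}000 + 850{,}000}{2} = 750{,}000 \]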

Similar to the approach I have always advocated in my HIS 100 classes, Hacker thinks that Civil War deaths affected far more people than the soldiers' numbers alone indicate. And he agrees that the new estimate confirms that Civil War military deaths are still greater than all the military deaths in all the other American wars combined. "A higher death toll," he says, "implies that more women were widowed and more children were orphaned as a result of the war than has long been suspected. The war touched more lives and communities more deeply than we thought, and thus shaped the course of the ensuing decades of American history in ways which we have not yet fully grasped."


Now, add to the numbers of people affected at the time the numbers of people affected since then. By that, I mean that we should imagine all the generations of people who were never born because their great-great-great-great grandfathers died on a Civil War battlefield, in a prisoner-of-war camp, or in a military hospital. That would be the really meaningful number, but I have no idea how that could be estimated.   

Wednesday, September 21, 2011

For my "Roosevelt to Reagan" students and anyone else interested in American reaction to the Holocaust


I have just come across the announcement of a symposium held in NYC this past Sunday that may answer some of your questions about FDR's inaction on the plight of European Jews during World War II. I only wish I could have gone! Here's the posting, copied in its entirety:

"New research on President Frankin D. Roosevelt's controversial views regarding Jews and other minorities will be presented by scholars at a conference in New York City on September 18, 2011.
The conference, "While Six Million Lived: America and the Jewish Refugee Crisis, 1933-1939," is the ninth national conference of The David S. Wyman Institute for Holocaust Studies.  It will take place on Sunday, September 18, 2011, at the Fordham University School of Law, 140 West 62 St., New York City, from 10:00 am to 5:00 pm.
Highlights of the conference:
10:00 am - Dr. Rafael Medoff on "FDR, 'Jewish Blood', and Immigration"
10:45 am - A ceremony honoring the leaders of the Virgin Islands for their 1939 effort to aid
Jewish refugees;  U.S. Rep. Donna Christensen (D-Virgin Islands) will accept an award and speak.
11:15 am - Prof. David S. Wyman on "The Search for Havens"
1:00 pm -  "Cartoonists and the Plight of German Jewry, 1933-1939," a panel discussion with cartoon historian Craig Yoe and illustrator Sal Amendola
2:30 pm - Prof. Laurel Leff on "The American Medical Community and Jewish Refugee Doctors"
3:30 pm - Prof. Stephen Norwood on "American Universities and Nazi Germany"
Dr. Ari Babaknia, the eminent physician and scholar, will chair the conference. Prof. Thane Rosenbaum, scholar and novelist (and Fordham faculty member) will serve as Master of Ceremonies.
For more information, please call the Wyman Institute at 202-434-8994 or visit www.WymanInstitute.org."


I have contacted the conference organizer to see if I can get access to the papers presented. I'll update you as soon as I hear anything. Research that uncovers new ways of considering the actions of historical actors is always exciting!

Wednesday, September 14, 2011

The most abused word in the English language: "there's"

I have posted about this before, but I feel the need to do so again.

What is it with people--educated people, even professional broadcasters--using the singular verb "there's" followed by a clearly plural word? Here's an example I just heard on National Public Radio that sent me running to my computer and this blog:

A highly regarded reporter was just telling the studio broadcaster about what's happening in Kabul, Afghanistan. She was on the scene, describing the aftermath of the recent attacks. Here's what she said:

"There's several bodies on the floor."

Now, this has become so commonplace in the last ten years that nobody (except me and a few other grammar-y types) even notices.

"There's several bodies"?  I'd love to ask this reporter if she would ever say "There is several bodies." The answer, of course, would be "no."

You might think I'm making a mountain out of the proverbial molehill here (pardon the cliche), but if you use that type of language, don't you sound a bit less than educated? Why would you choose to do that, just because everyone else does, when you could choose to say it correctly (and just as easily), like this:

"There are five bodies on the floor." It doesn't take any longer.

Listen for this type of thing--I promise that you'll be shocked at how widespread it is--and once you realize how terrible it sounds, you'll stop doing it. It's not hard and doesn't cost anything.

And there are my thoughts for the day.

Wednesday, September 7, 2011

Ten years after 9/11 might be a good time to reflect on what has changed.

While the 9/11 attacks are still very fresh in the minds of most people in my generation, I have come to realize that most of my students were very young in 2001 and have different memories of that fateful and terrible year.

I recall my daughter, who was an undergrad at UMass Amherst at the time, telling me that there was so much in-class discussion of the tragedy in the days following that students were starting to get tired of it all. They were more eager to have instructors get back to the necessary coursework than to have yet another session for students to express their feelings.

And so I had assumed that most young people today would feel the same way and would roll their eyes at much of the commemorative tenth-anniversary talk--until I realized that most students today were much younger than my daughter was at the time and that they have very different memories.

Since I teach constitutional and legal history, I thought it might be interesting for my students to learn what has changed in our legal world since 9/11. So here I present an article from today's New York Times (please note that I have italicized the title--that's what you should do, too) by Adam Liptak, a noted constitutional writer, on how much has changed in these last ten years relative to civil liberties. Liptak concludes that "criminal law itself changed surprisingly little in the wake of the attacks. What did change was how law enforcement conceived its mission."

So, for those of you interested in law school or in a career in law enforcement, this should be of particular interest. Read on!

September 7, 2011
Civil Liberties Today
By ADAM LIPTAK

There is a place for alarmism when threats to civil liberties are concerned. Too much worry about our freedoms is better than too little, particularly in the face of a government shrouded in wartime secrecy after the Sept. 11 attacks.

But there is also a place, a decade later, for sober reflection. By historic standards, the domestic legal response to 9/11 gave rise to civil liberties tremors, not earthquakes. And even those changes were largely a result of reordered law enforcement priorities rather than fundamental shifts in the law.

Consider the USA Patriot Act, which was short for this Orwellian mouthful: Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001. The law, more than 300 pages long, sailed through Congress seven weeks after the attacks with scant dissent. It quickly became a sort of shorthand for government abuse and overreaching.

The Patriot Act undeniably expanded the government’s surveillance powers and the scope of some criminal laws. But this was, in truth, tinkering at the margins and nothing compared with the responses of other developed democracies, where preventive detention and limitations on subversive speech became commonplace.

“In comparative perspective, the Patriot Act appears mundane and mild,” Kent Roach, a law professor at the University of Toronto, writes in a new book, “The 9/11 Effect: Comparative Counter-Terrorism.”

The story is different as one moves beyond domestic criminal law. Detentions at Guantánamo Bay, extraordinary renditions and brutal interrogations all tested the limits of the appropriate exercise of government power in wartime. The American government held people without charge for almost a decade, engaged in torture as that term is understood in international law, and sent people abroad for questioning to countries known to engage in what everyone must agree is torture.

But criminal law itself changed surprisingly little in the wake of the attacks. What did change was how law enforcement conceived its mission.

Almost immediately after the attacks, Attorney General John D. Ashcroft announced “a new paradigm.” Preventing terrorist acts, he said, was now more important than punishing crimes after the fact. There were echoes here of “Minority Report,” the 1956 Philip K. Dick story (and 2002 movie) that depicted a world in which the police catch criminals before they can act, based on their thoughts rather than their actions.

The new paradigm encouraged the arrests of people thought to be dangerous for, as Mr. Ashcroft put it, “spitting on the sidewalk,” or for immigration offenses, or as material witnesses. It increased surveillance of religious and dissident groups. It ramped up the use of a law barring even benign support for organizations said to engage in terrorism, putting pressure on activities long thought to be protected by the First Amendment. And it inserted informants into Muslim communities, giving rise to a culture of suspicion and charges of entrapment.

The number of people directly affected by these changes was, in the greater scheme of things, small. The indirect chilling effect on free speech, association rights and religious freedom was impossible to measure. But by the standards of the Alien and Sedition Acts of 1798, the Palmer raids of 1920, the internment of Japanese-Americans during World War II and the McCarthy era, the contraction of domestic civil liberties in the last decade was minor.

Arrest Early, Charge Broadly

As they generally have in the past, the courts acquiesced in the government’s efforts to combat terrorism. True, the Supreme Court placed some limits on the executive branch’s ability to hold prisoners at Guantánamo Bay. But decisions in criminal and immigration cases tell a different story.

“The courts have been failing terribly,” said Susan N. Herman, the president of the American Civil Liberties Union and the author of “Taking Liberties: The War on Terror and the Erosion of American Democracy,” which will be published in October.

The Supreme Court, she said, routinely refuses to hear cases in which lower courts uphold the government’s position in cases involving national security. “They’re not interested in civil liberties challenges,” she said of the justices. “They’re only interested when the government loses.”

The goal of stopping terrorism before it happens caused federal law enforcement officials to make early arrests and then to rely on charges that required little proof of concrete conduct. Prosecutors often charged defendants accused of involvement in terrorism with conspiracy or “material support” of groups said to engage in terrorism.

Those laws were already in place, said Robert M. Chesney, a law professor at the University of Texas. “The difference is,” he said, “they just weren’t being used.”

After the Sept. 11 attacks, things changed. In just the first five years, prosecutors charged more than 100 people with providing material support to terrorist groups. That support often took tangible form, like providing weapons, and it generally seemed directly linked to the advancement of violent ends.

But some prosecutions were based on sending money to groups that engaged in both humanitarian work and violence. And last year, in Holder v. Humanitarian Law Project, the Supreme Court ruled that it could also be a serious felony merely to urge terrorist groups to use peaceful means to resolve disputes. Such speech, the court said, amounted to material support and could be made criminal notwithstanding the protections of the First Amendment.

Chief Justice John G. Roberts Jr., writing for the majority, stressed that the material-support law applied only to speech directed by or coordinated with terrorist groups. People “may say anything they wish on any topic” without running afoul of the law, the chief justice said, so long as they are speaking independently.

Aggressive use of material support and similar laws, critics responded, chipped away at two principles that had been thought settled for about half a century. One was that mere membership in a subversive organization cannot be made a crime. The other was that the abstract advocacy of even the violent overthrow of the government must be tolerated under the First Amendment.

The Humanitarian Law Project decision “is akin to the kind of criminalization in the McCarthy era of speech and guilt by association,” said David D. Cole, a law professor at Georgetown who represented the challengers in the Humanitarian Law Project case as a lawyer with the Center for Constitutional Rights.

A second law already on the books, this one allowing the arrest and detention of material witnesses — people said to have evidence of others’ crimes — was misused, critics say, as a shadow preventive detention regime. Instead of using the law to make sure people with information about the wrongdoing of others would turn up to testify, these critics said, prosecutors used the law to hold people themselves suspected of links to terrorism.

Guilty Until Proven Innocent

Laws concerning immigration offenses were also used to detain people suspected of terrorism, according to a 2003 report from the Justice Department’s inspector general. The report said that the usual presumptions of the legal system were turned upside down after the attacks. People detained on immigration charges were considered guilty until proven innocent and were often held for months in harsh conditions after they were ordered released.

In decisions in 2009 and May of this year, the Supreme Court blocked two lawsuits seeking to hold Mr. Ashcroft accountable for what the plaintiffs said were abuses in the use of the material-witness and immigration laws.

“It should come as no surprise,” Justice Anthony M. Kennedy wrote for a five-justice majority in one of them, “that a legitimate policy directing law enforcement to arrest and detain individuals because of their suspected link to the attacks should produce a disparate, incidental impact on Arab Muslims, even though the purpose of the policy was to target neither Arabs nor Muslims.”

In the decade since the attacks, the government also became notably more aggressive in the use of informants and sting operations, sowing distrust in some parts of Muslim communities. In one such operation, an imam in Albany was ensnared in a fictitious plot involving shoulder-launched missiles and the assassination of a Pakistani diplomat in New York.

Defending the 15-year sentence meted out to the imam, Yassin M. Aref, prosecutors said the new paradigm of prevention justified the tactics. “The Federal Bureau of Investigation has an obligation to use all available investigative tools,” prosecutors wrote in a 2007 appeals court brief, “including a sting operation, to remove those ready and willing to help terrorists from our streets.”

Protections ‘Seriously Diluted’

Not all new tactics in combating terrorism in the United States were based on existing laws. “In electronic surveillance, you did have a big change,” said John C. Yoo, a law professor at the University of California, Berkeley, who became known for his aggressive legal advice and expansive view of executive power as a Justice Department official in the Bush administration.

In 2002, for instance, a special federal appeals court, the United States Foreign Intelligence Surveillance Court of Review, granted the Justice Department broad new powers to use wiretaps obtained for intelligence operations in criminal cases. “This revolutionizes our ability to investigate terrorists and prosecute terrorist acts,” Mr. Ashcroft said at the time.

After revelations concerning the warrantless wiretapping of international communications, Congress largely endorsed the program. Those legal changes, joined with striking advances in technology, have allowed the government broad ability to gather information.

“The Fourth Amendment has been seriously diluted,” said Professor Herman, who teaches at Brooklyn Law School. She added that she was struck by “the amount of surveillance that’s been unleashed with less and less judicial review and less and less individualized suspicion.”

Both the Bush and Obama administrations have been criticized by liberals as employing excessive secrecy and, in particular, for invoking the state secrets privilege to shut down civil litigation challenging things like rendition and surveillance programs. By international standards, though, the public has learned a great deal about secret government activities.

“That so many of the abuses committed by the executive in the wake of 9/11 have come to light is another sign of American exceptionalism,” Professor Roach wrote, “as manifested by the activities of a free press that is unrestrained by official secrets acts found in most other democracies.”

Opinions vary about whether efforts to fight terrorism in the United States have inflicted collateral damage on political dissent, religious liberty and the freedom of association.

“If you look at it historically,” said Professor Yoo, “you might say, ‘I can’t believe we’re at war,’ when you see how much speech is going on. Civil liberties are far more protected than what we’ve seen in past wars.”

Professor Cole was less sanguine.

“Since 9/11, the criminal law has expanded, ensnaring as ‘terrorists’ people who have done no more than provide humanitarian aid to needy families, while privacy and political freedoms have contracted, especially for those in Muslim communities,” he said. “On the one hand, the past 10 years have shown that criminal law can be used effectively to fight terrorism; on the other, it has also demonstrated that the demand for prevention can all too quickly lead to the abuse of innocents.”
 


Thursday, September 1, 2011

Grammar Goody--Comma rules, part II

Yippee! Another comma rule.


Put a comma after an introductory phrase or clause. To wit:

Although she never put on her shoes before taking out the garbage, her toes managed to survive nicely without breaks or sprains. 

Without that comma after garbage and before her toes, you'd have a really weird couple of words there:

...garbage her toes...

That would be neither good nor clear.

This rule used to be hard and fast. Now it's softening somewhat, so you will sometimes see this comma omitted when the introductory clause or phrase is short. But I advise you to ALWAYS put it in. It's never going to be wrong if you do.

So, I would write a sentence like this next one WITH the comma (even though in some circles it's not considered wrong to omit it):


Although short, the clause was effectively placed and had maximum impact.

The bottom line on introductory clauses or phrases: always set them off from the rest of the sentence with a comma.

Tuesday, August 30, 2011

Grammar Goody: When to use commas

You're stumped on how and when to use commas, right? That's because if you were taught comma rules at all, no one held your feet to the fire and actually made you apply them. So, gradually, you forgot.

Fortunately for you, commas are among the easiest of punctuation marks to learn the proper placement of.

This last sentence reminds me of an old story about Winston Churchill, the great British prime minister during World War II. He was once scolded for ending a sentence with a preposition (as I just did above when I wrote "the proper placement of"). Churchill had had enough of such pedantry, and he replied with something like, "That is a criticism up with which I will not put."

But I digress. Back to commas. We'll start with rule #1 today and continue with others in succeeding days (I know, I know, you just can't wait, but you'll have to be patient...)

1. Put a comma before "and" if the "and" is connecting two independent clauses, as in this:


She decided to go to the wedding, and she knew exactly what she would wear. 


So, what's an independent clause? It is a part of a sentence that can stand on its own as an independent sentence. In our example above, we have two independent clauses.

(1) She decided to go to the wedding (could stand as a complete sentence on its own)
AND
(2) she knew exactly what she would wear (also could stand as a complete sentence on its own).


Now, that comma may seem unnecessary, but it's not. It's required in good writing.


This rule is also cool with "or," "but," or "nor"--not just "and."


Going back to my lame example sentence above, the comma placement would be the same if we substituted "but" for the "and."


She decided to go to the wedding, but she had no idea what to wear. 





Friday, August 12, 2011

Coffeehouses and politics


While preparing for my History of the American City course, I came across this stunningly beautiful historic painting of the Tontine Coffeehouse in lower Manhattan. It was painted by Francis Guy in 1795 and shows the hustle and bustle of NYC at the turn of the nineteenth century. If you were a mover and shaker and you moved or shook anything, you did it at the Tontine Coffeehouse (on the left of this painting). Unlike today's middle- and upper-class Starbucks, this was a truly class-less (not tasteless) institution where anyone could and did go. It functioned as CNN, eBay, the Iowa caucuses, the (currently bonkers) stock exchange, and a central blog all rolled into one. People went to the Tontine to get the latest news, buy or sell things, talk politics, buy stocks, or wheel and deal.

Don't you just wish you could be transported back to this scene to see what was going on? (I'll assume the answer is yes, but I would also wish you a twentieth-century gas mask to take along: remember, nobody had deodorant, there was no garbage collection, and people used outhouses...)

Monday, July 18, 2011

Movie Censorship in the Wall Street Journal


Last Thursday's Wall Street Journal published an article by Bruce Bennett on pre-Code movies and their censorship in New York State. Click on the link next to the Journal above to read the online version of the article. I supplied some of the information used by Mr. Bennett, and I am quoted in the article.

It's nice to know that all those hours logged in the archives pay off when non-academic readers learn about historical research!

Thursday, May 5, 2011

Ever wonder why I harp on the Constitution so much in class?

If you've ever wished I would just stop talking about the Constitution so often in class, here's my explanation for taking so much of your time on this subject:

"Fewer than half of American eighth graders knew the purpose of the Bill of Rights on the most recent national civics examination, and only one in 10 demonstrated acceptable knowledge of the checks and balances among the legislative, executive and judicial branches, according to test results released on Wednesday."   [Source: NYTimes, 5 May 2011]

It is clear that there is decreasing attention paid to the Constitution in American elementary and secondary education. And, if we don't know how the government is supposed to work, how can we hold our government accountable when it fails to live up to its goals? How will we know if rights are being denied?

This is serious stuff.

And so I'll go on lecturing on the Constitution and making students actually read the darned thing.

Tuesday, May 3, 2011

What is criminal justice?

By that provocative title I mean: what constitutes justice for those charged with crimes? Recently, the US Supreme Court heard several cases about what constitutes minimum standards of representation for the indigent.

Another case has just come before the US Supreme Court asking that very question.

Richard Rosario seemed to have a pretty airtight alibi when he was accused of murder in the Bronx in 1996. He claims to have been 1,000 miles away from the murder scene. But two eyewitnesses picked him out of a lineup. Rosario can show that he was in Florida for the entire month surrounding the murder. And dozens of people volunteered to vouch for Rosario's whereabouts in Florida, but prosecutors did not follow up, relying instead on the eyewitnesses. Now, my criminal justice majors will know that eyewitness identification is notoriously unreliable, and uncorroborated eyewitness testimony is the single leading cause of wrongful convictions. It is also, according to experts, the type of evidence best refuted by an alibi.

That should have boded well for Rosario. Yet the prosecutors went ahead with his murder trial. His court-appointed lawyer asked for and got money to send an investigator to Florida to check out the alibi witnesses but never followed through. When a new court-appointed lawyer was assigned to Rosario's case, she mistakenly believed that the investigator-funds request had been denied. Rosario was convicted.

On appeal, the alibi-witness information was allowed, but the judge refused to overturn the conviction, saying that Rosario's defense had been "skillful" and that the lack of alibi testimony at the murder trial was not material to his appeal because the lawyers' mistake had not been intentional.

"What?" you may be wondering. "What difference does it make that the mistake was not intentional?" It still resulted in Rosario having a less-than-rigorous defense. A federal district judge later agreed with the appeals judge that individual mistakes do not matter in judging whether a defendant was adequately represented; only the overall performance matters. But if that individual mistake could cost the defendant the trial, should that not be taken into consideration? Apparently not, according to a full federal court review that took place last year.

Now the case is before the Supreme Court. We'll have to wait to hear what the nine think about the adequacy of counsel for the poor.

Source: Adam Liptak, The New York Times, May 2, 2011.