Biden should spend less time with historians and more with moderates
A liberal president enters the White House in a time of national crisis. He campaigned as a moderate but soon reveals his intent to govern from the left of the center-left. His bold agenda has plenty of fans among journalists and academics who celebrate the expansion of the welfare state. They write stories and deliver soundbites likening the new chief executive to FDR. The end of Reaganism, they say, is at hand.
I’m referring, of course, to President Barack Obama. Shortly after his election in 2008, Time magazine portrayed him as Dr. New Deal, complete with fedora and cigarette holder. “It would seem that Obama has been studying the 1932 Great Depression campaign of Franklin D. Roosevelt,” wrote E.J. Dionne in his syndicated column. “Conservatism is Dead,” announced the New Republic. “It has been that kind of presidency,” gushed Jon Meacham in 2009. “Barack Obama, moving as he wishes to move, and the world bending itself to him.”
Take a moment to recover from that last bit of purple prose. Then recall that two years after Obama’s victory, Republicans won the House. In 2014, Republicans kept the House and won the Senate. And two years after that, Republicans won complete control of the federal government. Conservatism didn’t die—the New Republic did. (It’s been reborn as a monthly.)
Now the same wonks and historians who compared Obama to the architect of managerial liberalism downplay his tenure in office as overly cautious, modest, and risk-averse. They’ve settled on a new, new FDR: Joe Biden. And Biden is ready to play the part. Even if it means risking Democratic control of Congress.
Biden met recently at the White House with a group of historians who, according to Axios, share his view that “It is time to go even bigger and faster than anyone expected. If that means chucking the filibuster and bipartisanship, so be it.” Biden’s “closest analogues,” Michael Beschloss told the news outlet, are FDR and LBJ. E.J. Dionne says Biden represents “a new disposition through which pragmatic forms of government activism add up to a quiet political revolution.” And Biden “loves the growing narrative that he’s bolder and bigger-thinking than President Obama,” writes Mike Allen. No doubt he does.
You would think that, in the midst of all the pandering and praise, the scholars who talked to Biden might have provided him some actual historical perspective. Every president Biden is said to recall, including Reagan, had to endure numerous setbacks, crises, unforced errors, and unanticipated consequences of their own policies. By 1938, the New Deal was exhausted, the economy hadn’t recovered from the Depression, and FDR won his final two terms largely on the basis of his international stature. LBJ’s landslide in 1964 was followed by a shellacking in 1966 and the collapse of the Democratic coalition in 1968. The GOP lost 26 seats in the House in 1982, forfeited control of the Senate in 1986, and when he left office Reagan handed his vice president a giant deficit, the Savings and Loan debacle, and a zealous special prosecutor.
The historians urging Biden to go big on policy aren’t analysts. They are partisan cheerleaders. If they stepped back, they would see that Biden is weaker than the presidents he admires and that vulnerable Democrats are warning the majority against overreach.
The Biden team gave Axios four reasons the president is ready to ditch the filibuster and push through a $3 trillion infrastructure and green energy bill, changes to election law in the “For the People Act,” and possibly an immigration amnesty: (1) Biden has “full party control of Congress, and a short window to go big”; (2) “party activists” are “egging him on”; (3) “he has strong gathering economic winds at his back”; and (4) “he’s popular in polls.”
But the same evidence could also be read as an argument for caution and restraint. Biden has less support in Congress than any of the presidents he emulates. (Reagan never controlled the House, but often had a majority of conservative Democrats plus Republicans.) At the moment, Biden’s party has 219 seats in the House and 50 in the Senate—meaning he can lose just two votes in the lower chamber and none in the upper one. It’s one thing to enact significant legislation on a partisan majority. It’s something else to enact such legislation on a partisan majority of one during a time when a positive COVID test upsets the whip count.
Nor is following “party activists” a certain route to political success. Economic winds change direction. And while Biden is popular, his disapproval rating in the January Gallup poll was second only to Donald Trump’s. Negative partisanship drives Biden to steamroll the Republicans. It also exposes him to political rebuke.
Some Democrats are beginning to express qualms about various aspects of Biden’s approach. Maine Democrat Jared Golden was the only member of his party to vote against Biden’s American Rescue Plan. Henry Cuellar of Texas was among the first congressmen to draw attention to the crisis on the southern border. Filemon Vela, also of Texas, announced his retirement the other day, a few months after his vote share dropped to 55 percent from 60 percent in 2018. Several House Democrats have said they disagree with Nancy Pelosi’s outrageous plot to expel Iowa Republican Mariannette Miller-Meeks and replace her with Rita Hart, who lost by six votes last year. And West Virginia senator Joe Manchin has yet to cosponsor the election bill at the center of the Democrats’ campaign to end the filibuster.
In these early days, Biden’s presidency has been less a transformation than a continuation of the partisan stalemate that has existed since the end of the Cold War. Parties win elections, misread electoral victories as ideological endorsements, overreach, and pay for it at the polls. The Democrats for whom the bill will come due first are well aware of this dynamic. They may not be as good on television as Jon Meacham or Michael Beschloss, but they have plenty of insight into the aspirations and concerns of swing voters. Biden may want to have them over to the East Room. Before they are out of work.
The Magna Carta created the moral and political premise that, in many ways, the American founding was built upon. The Magna Carta came to represent the idea that the people can assert their rights against an oppressive ruler and that the power of government can be limited to protect those rights. These concepts were clearly foundational and central to both the Declaration of Independence and the United States Constitution.
First, a bit of history about Magna Carta — its full name was Magna Carta Libertatum, Latin for “Great Charter of Freedoms,” but it became commonly known simply as Magna Carta, or the “Great Charter.” It was written in 1215 to settle an intense political dispute between King John of England and a group of barons who were challenging King John’s absolute right to rule. The terms of the charter were negotiated over the course of three days. When they reached agreement on June 15, 1215, the document was signed by the King and the barons at Runnymede outside of London.
This was a time when kings asserted an absolute right to rule, claiming that they were above the law and that they had been personally chosen to rule by God. Even questioning the King’s power was both treasonous and an act of defiance against God himself.
The Magna Carta limited the king’s absolute claim to power. It provided a certain level of religious freedom or independence from the crown, protected barons from illegal imprisonment, and limited the taxes that the crown could impose upon the barons, among other things. It did not champion the rights of every Englishman; it focused only on the rights of the barons. But it was an important start to the concept of limiting the absolute power of governments and kings who claimed God had given them the right to rule.
Magna Carta is important because of the principles it stood for and the ideas that it came to represent — not because it lasted a long time. Shortly after signing the charter, King John asked Pope Innocent III to annul it, which he did. Then there was a war known as the First Barons War that began in 1215 and finally ended in 1217.
After King John died in 1216, the regency government of John’s nine-year-old son, Henry III, reissued Magna Carta after stripping out some of its more “radical” elements, in hopes of reuniting the country under the young king’s rule. That didn’t work, but at the end of the war in 1217, the original Magna Carta’s terms became the foundation for a peace treaty.
Over the following decades and centuries, the importance of Magna Carta ebbed and flowed depending on the current king’s view of it and his willingness to accept or abide by its concepts. Subsequent kings further legitimized or confirmed the principles of Magna Carta, often in exchange for some grant of new taxes or another political concession. But the seed of limited government and individual rights had been planted and continued to grow.
Despite its relatively short political life as a working document, Magna Carta created and memorialized the idea that the people had the right to limit the powers of their government and they had the right to protect basic and important rights. By the end of the Sixteenth Century, the political lore of Magna Carta grew and the idea of an ancient source for individual rights became cemented in the minds of reform-minded political scholars, thinkers and writers.
Obviously, as written in 1215, it wasn’t a document that protected the rights of the average Englishman; it protected only English barons. But the concepts of individual rights and the limitation of governmental power had taken hold and were starting to mature. Magna Carta was the seed of those powerful concepts of freedom and constitutionally limited government. By the 17th and 18th Centuries, those arguing for reforms and greater individual rights and protections used Magna Carta as their foundation. These ideas are at the very center of both the Declaration of Independence and the United States Constitution.
As English settlers came to the shores of North America, they brought with them charters under the authority of the King. The Virginia Charter of 1606 promised the English settlers all the same “liberties, franchises and immunities” as people born in England. The Massachusetts Bay Company charter acknowledged the rights of the settlers to be treated as “free and natural subjects.”
In 1687, William Penn, an early American leader, who had at one point been imprisoned in the Tower of London for his political and religious views, published a pamphlet on freedom and religious liberty that included a copy of the Magna Carta and discussed it as a source of fundamental law. American scholars began to see Magna Carta as the source of their guaranteed rights of trial by jury and habeas corpus (which prevented a king from simply locking up his enemies without charges or due process). While that isn’t necessarily correct history, it is part of the growth of the seed of freedom and liberty that Magna Carta planted.
By July 4, 1776, the idea that government could, and should, be limited by the consent of its citizens and that government must protect individual rights was widely seen as springing from Magna Carta. The beautiful and important words penned by Thomas Jefferson in the Declaration spring from the fertile soil of Magna Carta:
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. — That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed — That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.
Obviously, Thomas Jefferson’s ideas of liberty and freedom had developed a great deal since Magna Carta was penned in 1215. But, it is impossible to read Magna Carta and the Declaration of Independence and not see the common DNA.
It is also clear that when the Founders debated, drafted, and ratified the U.S. Constitution, they were creating a set of rules and procedures to limit and check the power of government and to guarantee basic, individual rights.
The Fifth Amendment to the Constitution, which guarantees that “no person shall be deprived of life, liberty, or property, without due process of law,” embodies a concept that comes from Magna Carta. Our constitutional guarantee of “a speedy trial,” found in the Sixth Amendment, is also rooted in the political thought that grew from Magna Carta. The Constitution’s guarantee of the “privilege of the writ of habeas corpus” (Art. I, Sec. 9) is likewise a concept that grew from Magna Carta.
Even the phrase “the law of the land” comes from Magna Carta’s history. And now we use that phrase in the United States to describe our Constitution which we proudly label “the law of the land.”
To this day, Magna Carta is an important symbol of liberty in both England and the United States.
The Declaration of Independence and the U.S. Constitution are in my estimation the two most important and influential political documents ever written. What they did to provide, promote, and protect the freedom, opportunity and security of the average person is almost impossible to overstate. As British Prime Minister William Gladstone said in 1878, “the American Constitution is the most wonderful work ever struck off at a given time by the brain and purpose of man.”
I believe Gladstone was correct. But, Magna Carta was an important development in political thought and understanding about government power and individual rights. It is difficult to imagine the Declaration of Independence or the U.S. Constitution without the foundational elements provided by Magna Carta.
It was only a matter of time before cancel culture scored a hit in its fight to eliminate the Founders from our collective memory. On Tuesday, the San Francisco board of education, by a vote of six to one, opted to rename more than 40 schools named for historic figures whose lives can no longer stand up to woke scrutiny.
The action results from a resolution adopted in May 2018, which laid the groundwork to scrub from schools the name of anyone who “engaged in the subjugation and enslavement of human beings; or who oppressed women, inhibiting societal progress; or whose actions led to genocide; or who otherwise significantly diminished the opportunities of those amongst us to the right to life, liberty, and the pursuit of happiness.”
Among the names being struck is that of George Washington: first president of the United States, leader of the revolution that secured the nation’s independence from Great Britain, and the man whose leadership established a model followed by every one of his successors in the more than 200 years since he was first inaugurated.
The stated reason for the move is that Washington, as everyone readily admits, was a slave owner. By contemporary standards, that’s apparently enough to cancel out everything else he did. Slavery is an odious, inexcusable practice—and always has been. One cannot help but feel, however, that the energies being spent attacking the father of our country and other Founders over their participation in it would be better spent marshaling the forces necessary to eradicate slavery where it still exists.
All this was predictable. Washington was once a venerated American institution, standing apart from every other president and every other leader. He was the reason the United States came into existence and the reason it survived its infancy. No leader, before or after, can match his record of accomplishment, which is why we, his grateful descendants, used to observe his birthday as a national holiday.
That all began to change during the last great period of social unrest, back in the 1960s, when for the sake of efficiency the celebration of Washington’s birthday was moved to the closest Monday as part of an effort to add a few more three-day weekends to the calendar.
Once the first president’s birthday was moved, the effort to obscure it became that much easier. To avoid adding a federal holiday with all its attendant expenses to the calendar, Washington’s birthday was combined with Abraham Lincoln’s to create “Presidents’ Day,” accelerating the toppling of the man from Mt. Vernon from the cultural pedestal upon which he had been deservedly perched for nearly 200 years.
We are losing—and may have already lost—the commonality of purpose that made the United States what one former president at least called “the last, best hope” for mankind. The lack of any formal observance of Washington’s birthday, which is just now upon us once again, corresponds with a growing lack of understanding of the kind of man he was, his indispensability to the cause of American independence and just what it was he accomplished.
To the extent people know him now, it’s not because of the excellent scholarship of historians like Ron Chernow but because of the way he appears as a supporting character in the life of Alexander Hamilton in the eponymous musical devoted to the life of “the ten-dollar founding father.”
Rather than present a more balanced approach to his life and story, those who would condemn Washington seek now to eradicate him from the pantheon of American heroes worthy of our continued respect and admiration.
Washington is, as I’ve written before, the model citizen-statesman. His counsel regarding America’s role in the world is still valuable. His wisdom is eternal. He is a great man of history who, being in the right place at the right time, changed the course of human events in a way that set all Americans thereafter on the path of what another founding father famously called “the pursuit of happiness.”
What Washington told us about the need to limit the powers of the central government so that human freedom might flourish is as relevant today as it was then. His wise leadership produced what has become the greatest, freest, most prosperous, most generous society ever to exist. No American should be allowed to forget that, no matter how large a blot his ownership of slaves left on his copybook.
Washington was a towering figure, standing head and shoulders above almost all his contemporaries. We need to return him to his place of honor on the calendar and in our hearts and to understand why his name was considered worthy of being put on schools in the first place. Those who are trying to rewrite history so they may change it should be ashamed. We let them succeed at our own peril. We cannot forge ahead together towards a better day if we do not understand where we began, warts and all. It is time for Congress to strike a blow against historical revisionism by restoring Washington’s Birthday on the calendar and moving it back to February 22, where it belongs.
The following is adapted from an online lecture delivered at Hillsdale College on November 6, 2020.
Every generation of Americans, from the beginning, has had to answer for itself the question: how should we live? Our answers, generation after generation, in war and in peace, in good times and bad times, in small things and in great things through the whole range of human affairs, are the essential threads of the larger American story. There is an infinite variety of these smaller American stories that shed light on the moral and political reality of American life—and we keep creating them. These fundamental experiences, known to all human beings but known to us in an American way, create the mystic chords of memory that bind us together as a people and are the necessary beginnings of any human wisdom we might hope to find.
These mystic chords stretch not only from battlefields and patriot graves, but from back roads, schoolyards, bar stools, city halls, blues joints, summer afternoons, old neighborhoods, ballparks, and deserted beaches—from wherever you find Americans being and becoming American. A story may be tragic, complicated, or hilarious, but if it is a true American story, it will be impossible to read or listen to it attentively without awakening the better angels of our nature.
Here’s one, about the beautiful friendship of two remarkable Americans.
Helen Keller was 14 years old when she first met the world-famous Mark Twain in 1894. They became fast friends. He helped arrange for her to go to college at Radcliffe where she graduated in 1904, the first deaf and blind person in the world to earn a Bachelor of Arts degree. She learned to read English, French, German, and Latin in braille and went on to become practically as world-famous as her dear friend, writing prolifically and lecturing across the country and around the world. Twain, with his usual understatement, called her “one of the two most remarkable people in the 19th century.” The other candidate was Napoleon.
Keller lived into the 1960s and shared some of her fond memories of Twain in an autobiographical book she published in 1929. In particular, she records recollections from her last visit to her friend in his “Stormfield” home in Redding, Connecticut, which she thought of as a “land of enchantment.” She preserves for us a vivid image not only of Mark Twain—Mr. Clemens, as she called him—but of her own vivacious mind. About Twain she writes,
There are writers who belong to the history of their nation’s literature. Mark Twain is one of them. When we think of great Americans we think of him. He incorporated the age he lived in. To me he symbolizes the pioneer qualities—the large, free, unconventional, humorous point of view of men who sail new seas and blaze new trails through the wilderness.
As they gathered around the hearth one night after dinner at Stormfield, she records,
Mr. Clemens stood with his back to the fire talking to us. There he stood—our Mark Twain, our American, our humorist, the embodiment of our country. He seemed to have absorbed all America into himself. The great Mississippi River seemed forever flowing, flowing through his speech.
When Twain took her to her room to say goodnight, he said “that I would find cigars and a thermos bottle with Scotch whiskey, or Bourbon if I preferred it, in the bathroom.”
One evening, Twain offered to read to her from his short story, “Eve’s Diary.” She was delighted, and he asked, “How shall we manage it?” She said, “Oh, you will read aloud, and my teacher will spell your words into my hand.” He murmured, “I had thought you would read my lips.” And so that is what she did. Upon request, and as promised, Twain put on his “Oxford robe,” the “gorgeous scarlet robe” he had worn when Oxford University “conferred upon him the degree of Doctor of Letters.”
Here is Keller’s recollection of the evening:
Mr. Clemens sat in his great armchair, dressed in his white serge suit, the flaming scarlet robe draping his shoulders, and his white hair gleaming and glistening in the light of the lamp which shone down on his head. In one hand he held “Eve’s Diary” in a glorious red cover. In the other hand he held his pipe. . . . I sat down near him in a low chair, my elbow on the arm of his chair, so that my fingers could rest lightly on his lips.
“Everything went smoothly for a time,” she wrote. But Twain’s gesticulations soon began to confuse things, so “a new setting was arranged. Mrs. Macy came and sat beside me and spelled the words into my right hand, while I looked at Mr. Clemens with my left, touching his face and hands and the book, following his gestures and every changing expression.”
Keller reflected that,
To one hampered and circumscribed as I am it was a wonderful experience to have a friend like Mr. Clemens. I recall many talks with him about human affairs. He never made me feel that my opinions were worthless. . . . He knew that we do not think with eyes and ears, and that our capacity for thought is not measured by five senses. He kept me always in mind while he talked, and he treated me like a competent human being. That is why I loved him. . . . There was about him the air of one who had suffered greatly.
Whenever I touched his face his expression was sad, even when he was telling a funny story. He smiled, not with the mouth but with his mind—a gesture of the soul rather than of the face. His voice was truly wonderful. To my touch, it was deep, resonant. He had the power of modulating it so as to suggest the most delicate shades of meaning and he spoke so deliberately that I could get almost every word with my fingers on his lips. Ah, how sweet and poignant the memory of his soft slow speech playing over my listening fingers. His words seemed to take strange lovely shapes on my hands. His own hands were wonderfully mobile and changeable under the influence of emotion. It has been said that life has treated me harshly; and sometimes I have complained in my heart because many pleasures of human experience have been withheld from me, but when I recollect the treasure of friendship that has been bestowed upon me I withdraw all charges against life. If much has been denied me, much, very much has been given me. So long as the memory of certain beloved friends lives in my heart I shall say that life is good.
When Helen Keller left the enchanted land of Stormfield on that visit, she wondered if she would ever see her friend again, and she didn’t. It was 1909, and Clemens would live just one more year. But, she writes for us, “In my fingertips was graven the image of his dear face with its halo of shining white hair, and in my memory his drawling, marvelous voice will always vibrate.”
Here’s another story about an American whose name the whole world knows.
Twenty-two-year-old Marion Morrison, known to his friends as Duke, was carrying a table on his head across the soundstage of a John Ford movie. He was working as a prop man at the Fox Studio in Los Angeles early in 1930. Director Raoul Walsh was looking for a leading man for an epic western film he was developing about a great wagon train journeying across vast deserts and mountains to California. Walsh didn’t want a known star to play the lead. He was looking for someone who would “be a true replica of the pioneer type.” He didn’t want the audience to see a part being acted; he wanted them to see the real thing—“someone to get out there and act natural . . . be himself.” Then he happened upon the young Duke Morrison lugging a table across a soundstage.
“He was in his early 20s,” Walsh recalled, “[and] laughing. . . . [T]he expression on his face was so warm and wholesome that I stopped and watched. I noticed the fine physique of the boy, his careless strength, the grace of his movement. . . . What I needed was a feeling of honesty, of sincerity, and [he] had it.” Within a few weeks, after a quick screen test, Duke would be signed up for the part of Breck Coleman, the fearless young scout in an ambitious film to be called The Big Trail; he would more than double his income, from $35 to $75 a week. He had to let his hair grow long and learn to throw a knife—and he would have a new name: John Wayne.
Already, as the young frontiersman in The Big Trail, the man the world would come to know as John Wayne is recognizable. He is more athletic and beautiful than we remember him from his later pictures, and he has a sweetness and shyness of youth that recedes over time, but he is “tough and in charge”; he has “a natural air of command.” The widescreen film is still visually stunning and interesting to watch, but it was an epic flop and left Wayne languishing in B-movie purgatory for almost a decade before John Ford decided to make him a star as the Ringo Kid in the great western Stagecoach.
Ford was inspired by something similar to what Raoul Walsh had seen in Duke Morrison. “It isn’t enough for an actor to look the part and say his lines well,” said Ford. “Something else has to come across to audiences—something which no director can instill or create—the quality of being a real man.” Ford added that Wayne “was the only person I could think of at the time who could personify great strength and determination without talking much. That sounds easy, perhaps. But it’s not. Either you have it or you don’t.” John Wayne had it. As James Baldwin wrote, “One does not go to see [Katharine Hepburn or Bette Davis, Humphrey Bogart or John Wayne] act: one goes to watch them be.”
And Duke Morrison decided that John Wayne would be the kind of man he—and the audience—wanted to believe in. Whatever his flaws, and Wayne’s characters had many, he would present on screen a character that had something admirable in it. This character took on added dimensions in his greatest films like Red River and The Searchers. But its essence was discernable from the earliest days. He had courage and self-reliance, obstinacy and even ruthlessness; but also generosity of soul and spirit. As his biographer Scott Eyman put it, he had the kind of “spirit that makes firemen rush into a burning building . . . because it’s the right thing to do.” He had “humor, gusto, irascibility”; he was “bold, defiant, ambitious, heedless of consequences, occasionally mistaken, primarily alone—larger than life.” As one of Wayne’s colleagues said, “John Wayne was what every young boy wants to be like, and what every old man wishes he had been.”
Wayne was 32 when he made Stagecoach and 69 when he made his last film, The Shootist, in which he plays the dying gunfighter, John Bernard Books. His oft-quoted line from that film would have been right at home in The Big Trail: “I won’t be wronged, I won’t be insulted, I won’t be laid a hand on. I don’t do these things to other people, and I require the same from them.” For 25 years, from 1949 to 1974, he was among the top ten box office stars every year but one. And he was more than a star for his time. Well into the 21st century, 35 years after his death, he was still listed as one of America’s five favorite movie stars; he became “indivisibly associated with America itself.”
On his 72nd birthday, May 26, 1979, as Wayne lay dying of cancer in UCLA Medical Center, the United States Congress, in a unanimous bipartisan vote, approved an order signed by President Jimmy Carter for striking a Congressional Gold Medal in his honor. Wayne would be the 85th recipient of the Medal. The first recipient was George Washington. Winston Churchill was awarded the Medal just a few years before John Wayne. As President Carter said, Wayne’s “ruggedness, the tough independence, the sense of personal conviction and courage—on and off the screen—reflected the best of our national character.” Wayne’s friend, actress Maureen O’Hara, testifying before Congress, said: “To the people of the world, John Wayne is not just an actor, and a very fine actor, John Wayne is the United States of America. He is what they believe it to be. He is what they hope it will be. And he is what they hope it will always be.”
And finally, here’s a story about an American whose name you may not know, but will want to.
“We Are All Americans”
Ely Parker was born in 1828 to Elizabeth and William Parker of the Tonawanda Seneca tribe of the Iroquois Confederacy in western New York. Parker became a leader in his tribe at a very young age. Trained as a civil engineer, he earned a reputation in that field. In 1857, when he was 29 years old, he moved to Galena, Illinois, as a civil engineer working for the Treasury Department, and there his life took a fateful turn.
He became friends with a fellow named Ulysses S. Grant. In these years, Grant was an ex-Army officer working as a clerk in his father’s store. Parker later liked to tell the story of coming to Grant’s aid in a barroom fight in Galena, the two of them back to back, fighting their way out against practically all the other patrons. At about five feet eight inches and 200 pounds, the robust Parker referred to himself as a “Savage Jack Falstaff.”
When the Civil War came on, Parker tried several times to join the Union Army as an engineer but was turned down because he was not a citizen. When he approached Secretary of State William Seward about a commission, he was told that the war was “an affair between white men,” that he should go home, and “we will settle our own troubles among ourselves without any Indian aid.”
Eventually, with Grant’s endorsement, Parker received a commission, with the rank of captain, as Assistant Adjutant General for Volunteers. By late 1863, he had been transferred to Grant’s staff as Military Secretary. He soon became familiarly known as “the Indian at headquarters” and was promoted to lieutenant colonel and later to brigadier general. He may have saved Grant’s life or at least prevented his capture one dark night during the Wilderness Campaign in 1864, when Grant and his staff, unbeknownst to themselves, were riding into enemy lines.
But Parker is rightly most remembered for something that happened in the parlor of a private residence in the village of Appomattox Court House on April 9, 1865.
In the days preceding, Union armies had captured the city of Petersburg and the Confederate capital of Richmond. Grant and the Federal Army of the Potomac had put Confederate General Robert E. Lee and the Army of Northern Virginia in such a position that in the late afternoon of April 7, Grant, sitting on the verandah of his hotel headquarters in Farmville, said to a couple of his generals, “I have a great mind to summon Lee, to surrender.” He immediately wrote a letter respectfully inviting Lee to surrender and had it sent to him under a flag of truce. It took Lee a couple of days of desperate failed maneuvers to come around to the idea. But by the morning of April 9, Lee had concluded that “there is nothing left me to do but to go and see General Grant, and I would rather die a thousand deaths.”
They agreed to meet in the village of Appomattox Court House to discuss terms.
Grant had been riding hard for days on rough roads in rough weather. When he met Lee in the parlor of the brick house where they had arranged to meet, he had on dirty boots, “an old suit, without [his] sword, and without any distinguishing mark of rank, except the shoulder straps of a lieutenant general on a woolen blouse.” Lee was decked out from head to toe in all the military finery he had at his disposal.
After introductions, and not much small talk, Lee asked Grant on what terms he would receive the surrender of Lee’s army. Grant told him that all officers and men would be “paroled and disqualified from taking up arms again until properly exchanged, and all arms, ammunition, and supplies were to be delivered up as captured property.” Lee said those were the terms he expected, and he asked Grant to commit them to writing, which Grant did, on the spot, and showed them to Lee.
With minor revisions, Lee accepted, and Grant handed the document to his senior adjutant general, Theodore Bowers, to “put into ink.” This was a document that would effectively put an end to four years of devastating civil war. Bowers’ hands were so unsteady from nerves that he had to start over three or four times, going through several sheets of paper, in a failed effort to prepare a fair copy for the signatures of the generals.
So Grant asked Ely Parker to do it, which he did, without trouble. This gave occasion for Lee and Parker to be introduced. When Lee recognized that Parker was an American Indian, he said, “I am glad to see one real American here.”
Parker shook his hand and replied, “We are all Americans.”
The American story, still young, is already the greatest story ever written by human hands and minds. It is a story of freedom the likes of which the world has never seen. It is endlessly interesting and instructive and will continue unfolding in word and deed as long as there are Americans. The stories that I think are most important are those about what it is that makes America beautiful, what it is that makes America good and therefore worthy of love. Only in this light can we see clearly what it is that might make America better and more beautiful.
In a heated presidential campaign year, two dates in history have illustrated our deep national divide. The New York Times spoke for liberal America when it declared last year that the real founding of the country was in 1619 when the first African slaves arrived on its shores. In short, the 1619 Project argued that what was distinctive and problematic about America was its economic system of capitalism and the original sin of slavery that established it.
President Trump responded for many conservatives last month when he proposed the creation of a 1776 commission, underscoring that the real founding of the country came with the Declaration of Independence and, a decade later, the Constitution. What makes America distinctive, in this view, is political freedom guaranteed by a unique constitutional system.
While this is an important debate, two other numbers speak more clearly and less divisively about today’s most serious problem with U.S. history: Twenty-four and 15. Those are the percentages of eighth graders who scored “proficient” or better in government/civics (24%) and U.S. history (15%) in National Assessment of Educational Progress test scores announced earlier this year. Secretary of Education Betsy DeVos rightly called these scores “stark and inexcusable.”
Sam Cooke’s 1960 song lyric is now literally true of America’s children: “Don’t know much about history.”
We fail to appreciate the profound effect civic ignorance has on the body politic. Only about 60% bother to vote, exercising what Founding Father Thomas Paine described as “the primary right by which other rights are protected.” Only 55% voted in 2016, and even fewer (40%-50%) do when there is no race for the presidency. Data published by the Organization for Economic Cooperation and Development shows that U.S. voting rates rank only 26th out of 32 highly developed democratic states. Young people’s trust in government has plummeted, with only 27% expressing trust in elected officials. Indeed, only 17% trust the government “to do what is right most of the time.” As one expert said, “How can you trust what you do not understand?”
At other times in our recent history, failures in our educational system led to alarm and action. The Soviets’ launch of Sputnik, the first satellite in space, in 1957, led to calls for improvement in science and technology education. A discouraging national report on the state of education generally, “A Nation at Risk,” launched a series of reading and math initiatives in the 1980s and beyond. Despite failing test scores and reduced curriculum offerings in civics education, however, little or nothing has been done.
In a recent article published by the Orrin G. Hatch Foundation, I have proposed a series of steps to reverse our civics decline. Happily, we do not have to wait for the gridlock and hyperpartisanship in Washington to go away in order to fix this because there are many important goals to be addressed at other levels, especially in the states and schools.
The main point is that we need to make civic education a national priority with extra emphasis everywhere. The federal government needs to restore and increase funding for civics that it practically eliminated in 2010. In fact, by one estimate, the federal government now spends $54 per schoolchild on STEM (science, technology, engineering, and math) education and a meager 5 cents per student on civics.
States that required multiple courses in civics and government in the 1960s in most cases now mandate only a single semester in civics education, with almost no attention to it in elementary and middle school. Studies show that teachers are often ill-prepared themselves to teach civics education. Is it any wonder that students in Rhode Island have sued their state for poor civics education?
Civics has taken a back seat in our schools to reading, math, and especially STEM. But can saving our democracy be any less important than getting good jobs in technology? That is what is at stake if we do not make a national commitment to strengthening civics education.
Generations before Facebook or Twitter, Tocqueville warned that censoring the press would endanger the survival of freedom and democracy in America.
With the recent suppression of a New York Post story damaging to Joe Biden’s presidential campaign, many Americans have finally had enough of the one-sided censorious behavior of tech giants. Less than three weeks before one of the most contentious and fraught elections in American history, Facebook and Twitter users were alarmed when it became clear they were prevented from sharing the Post’s article detailing the sordid dealings of Joe Biden’s son, Hunter.
Both citizens and lawmakers justifiably fear the enormous influence wielded by entities like Facebook, Google, and Twitter, and the rise of an unchecked tech tyranny in which one side of the political aisle has its views promoted while the other side has its views punished. Nearly two centuries ago, the author of one of the most penetrating insights on American life shared similar fears of what would happen should a free press remain free in name only.
Traveling across America in the 1830s, young French aristocrat Alexis de Tocqueville saw a nation filled with both promise and peril. Amidst boundless opportunities, an economically vibrant workforce, and an ever-increasing equalization of conditions, the potential for tyranny lurked underneath an otherwise promising future. Tocqueville feared some of the forces at work in the young republic could lead to despotism.
To prevent this future, Tocqueville sung the praises of two essential safeguards: a free press connected with freedom of association. Armed with these two weapons, Tocqueville argues the United States can help prevent a tyranny of the majority as well as the chilling and repressive effects of a nascent soft despotism. Yet, of the two, Tocqueville’s principal solution for America is a free press.
Unfortunately, as Tocqueville noted — and we’ve now witnessed — the free press he prescribes functions as a double-edged sword. To be sure, the press and modern media can help cultivate liberty. It can do a marvelous job of keeping the people informed of politics, sustaining their activity in local government, and helping to make their voices heard. In doing so, it can help train the populace in the necessary exercise of freedom. Liberty, after all, is like a muscle: if it is not used regularly it will atrophy.
By contrast, an unhealthy, ill-functioning press can create problems rather than prevent them. If the press or powerful media organs can influence such a vast number of people at once; if there isn’t enough volume granted to dissenting voices; if the levers of media and press control are too tightly concentrated, then a deadly homogenization of the American mind may occur.
When this happens, the former sovereignty of the people is transformed into something both helplessly docile and malevolent — worse, something deadly to liberty. These were the stakes back during the time of Andrew Jackson. Today, the situation is all the more dire.
In “Democracy in America,” Tocqueville writes Americans should strive to be continually “making liberty emerge from within the democratic society in which God makes us live.” One of the most effective avenues to pursuing this is to give some degree of local administrative power to bodies of private citizens such as would be found in newspapers, periodicals, or pamphlets — and today’s social media platforms.
A free press made up of numerous varied newspapers fulfilled this role in 19th-century America. In the 21st century, websites and social media should — hypothetically — join traditional print publications to prevent the dangers of the tyranny of the majority. When operating fairly and nobly, they provide a way for every voice to be heard.
Of course, a free press and media aren’t just useful vehicles for spreading ideas or forming associations, but for ensuring that new associations can connect their ideas over large distances. Furthermore, in a free nation, the press can and should help to disperse power — not concentrate it within itself. The answer to ideas some citizens disagree with is not to stifle, curtail, or limit such speech, it is to encourage more of it.
Beyond this, protecting freedom of the press is vitally important because it can often serve as an individual’s best or only means of appeal. Tocqueville writes:
A citizen who is oppressed has therefore only one means of defending himself; it is to address himself to the whole nation, and if it is deaf to him, to humanity; he has only one means to do it, it is the press. Thus liberty of the press is infinitely more precious among democratic nations than among all others; it alone cures most of the evils that equality can produce. Equality isolates and weakens men; but the press places beside each one of them a very powerful weapon, which the weakest and most isolated can use.
As Tocqueville observes in “Democracy in America,” opening and running an American newspaper in the 19th century was both relatively inexpensive and unregulated. As such, this meant a truly free press was an accessible weapon available to the common man to beat back the tyranny of the majority and the homogenization of the mind.
Thousands of newspapers operating throughout the country and representing various individuals, associations, and interests were both a way of protecting divergent opinions and a check against the rise of despotic or tyrannical forces. In the current climate of Big Tech censorship, men and women of all political stripes should be asking themselves whether this can be said of America any longer.
A healthy and truly free press is one of the mechanisms that can help prevent the public from being manipulated into having one set of “approved” opinions. Freedom of the press, says Tocqueville, does not just hold important influence over the success or failure of political parties, it makes its power felt “over all of the opinions of men”; not only that, it modifies both the laws and the mores of a society.
Indeed, if laws can affect the mores of a society, and the mores of society can affect the laws, something that can simultaneously change both is a weapon capable of either awe-inspiring good or tremendous evil. Tocqueville argues a free press has the power to do just that.
What happens if this power is used to stifle speech rather than spread it? The result, unfortunately, is not good for any polity featuring democratic institutions. As University of Oklahoma professor Donald J. Maletz puts it: “Tocqueville associates democracy with freedom of the press as a matter of principle.” As one goes, so goes the other. Forebodingly, Tocqueville calls the issue of how to handle a free press “the greatest problem of modern societies.”
Due to its non-institutional nature, a free press is unique in its role in helping prevent tyranny because it exists apart from the governmental arena. Separations of power and varied institutions are not enough to prevent tyranny if all interests involved are the same — you need associations or organizations outside of government as well.
Ultimately, the freedom of the press may well be the final bulwark of liberty against a rising tide of corruption. By Tocqueville’s reasoning, once the press ceases to be free, it’s hard for any society wishing to regain freedom for its citizens to do so, as the best avenues for opposition will be closed. Because of this reality, those who love liberty and value an open society must guard against any censorship of the press.
Tocqueville acknowledges in “Democracy in America” that an unfettered press can create problems, and is only so virtuous because it prevents more problems than it creates. Even so, Tocqueville goes on to powerfully proclaim one cannot be “moderate” in support of a free press. For Tocqueville, there’s no sustainable or workable “middle ground” when it comes to press censorship.
To “reap the inestimable advantages” brought by the freedom of the press, society must learn how to handle its potential pitfalls. This much is clear, however: liberty starts to evaporate the moment powerful entities within society start to censor its press or suppress the work of reporters and writers.
As historian Thomas G. West points out, James Madison saw free speech as a natural, retained right, not a privilege created by the government. West puts it in clear terms:
There is an absolute right to freedom of speech, just as there is an absolute or inalienable right to liberty in general. … For the founders, speech is simply a part of the overall natural right to liberty, which it is the main job of government to secure.
Indeed, the 1780 Massachusetts Declaration of Rights went so far as to say: “The liberty of the press is essential to the security of freedom in a state: it ought not, therefore to be restrained.”
In her analysis of “Democracy in America,” the University of Notre Dame’s Catherine Zuckert believes Tocqueville saw freedom of speech as an “essential part of liberal democracy.” She’s right. Tocqueville warned stifling press freedom, even a little, will lead to a chilling silence, and society will find itself “under the feet of a despot.”
The publication of Thomas Paine’s “Common Sense” shows the power of a free press during turbulent times. Paine’s pamphlet, which sold around 100,000 copies in 1776, is called by historian Gordon S. Wood “the most incendiary and popular pamphlet of the entire revolutionary era.” It is an exemplary case of a political tract in layman’s language that shaped the future of a continent — all made possible by the press.
Freedom of the press, when combined with associations, acts as an incentivization to participate and be active in politics. For Tocqueville, the relationship between newspapers and free associations is symbiotic and correlative: “newspapers make associations, and associations make newspapers.”
Properly functioning and free, the press can encourage debate instead of hindering it. It can foster statesmanship instead of leading to the rise of despots. The exchange of ideas and the proliferation of the best new civic and societal notions can be a chief tool in preserving the essential balance between liberty and virtue in America.
While the left’s current stranglehold on corporate media is formidable, Tocqueville would at least be partially hopeful that the rise of conservative voices on the internet, in new media, and on outlets like Facebook, YouTube, and Twitter will at least put up a fight to uphold liberty — that is, as long as they aren’t silenced in turn by the very platforms that are supposed to aid in the spread of ideas.
The “press” may look a lot different than in 1831, but it remains pivotal in the struggle to preserve freedom. Until enough Americans unite their voices and demand that tech giants like Facebook and Twitter stop their oppressive censorship of the very press and media outlets essential to the health of our republic, things will only get worse, and Tocqueville’s worst nightmares will inch closer to becoming reality.
It always amazes me just how stupid reporters are. Maybe stupid isn’t the right word, ignorant is more like it. How do people who claim to be the arbiters of what is news not follow the news? Seems like knowing what you’re talking about would be an important component of journalism, especially since journalism considers itself “the first draft of history.” But for too many of these left-wing teleprompter readers and Democratic Party stenographers, history just started yesterday.
MSNBC anchor Katy Tur is known not for her depth of knowledge on important issues, but her basic ignorance of things that happened in her lifetime is disturbing. In a debate in 2017 with a Republican congressman (because why wouldn’t a “news” anchor debate a Republican?), she exposed how unaware she was of something that happened in 2012 – when then-President Barack Obama told then-Russian President Dmitry Medvedev to tell Vladimir Putin he’d have “more flexibility” after the election. It was news to Tur, whose excuse was, “To be fair, I didn’t touch politics in 2012. I almost exclusively covered fires and shootings in NYC area.” Apparently New York City doesn’t have cable news or newspapers.
But all the ignorance of things that happened before today isn’t limited to television personalities. Colby Itkowitz, who covers national politics for the Washington Post, showed just how oblivious a reporter could be and still hold a job. Saturday, after President Trump signed executive orders related to tax policy and coronavirus relief, Colby tweeted, “Let’s ponder the most played out question of the last four years, but can you imagine if Obama had broken up a congressional stalemate over funding by simply signing an executive order and saying it was so? (jinx @pbump).”
This is particularly stupid for a number of reasons. First, in tagging her co-worker Philip Bump, she showed she was quite proud of beating him to this declaration, and that this sort of talk is common around the Post. Second, President Obama changed large sections of Obamacare with the stroke of his magic pen well within her lifetime. Third, if history didn’t start until Trump was elected, you’d at least think a reporter covering national politics for a major newspaper would be aware of the legal challenges to the DACA program, especially since the Supreme Court just ruled on it in June.
All of these escaped Itkowitz’s notice, somehow. When her ignorance was made apparent to her, she did what all good “journalists” would do – deleted the tweet and pretended it never happened.
Lest you think it’s just the younger media types who are ignorant of history, the senior-citizen set appears to have a memory rivaling Joe Biden’s as well.
New York Times columnist Maureen Dowd wrote a column Saturday, titled “No Wrist Corsages, Please,” about how it had supposedly been since 1984 that Democrats last ran a man and a woman together on their presidential ticket. “It’s hard to fathom, but it has been 36 years since a man and a woman ran together on a Democratic Party ticket, writes @MaureenDowd,” the Times tweeted about the column Dowd had written proclaiming the same.
I understand why liberals would want to forget the 2016 election, and why everyone would like to forget Hillary Clinton, but you’d think someone in the multi-person editorial process that takes place before anything gets published by the Times would have a memory of it. (Not to mention ignoring the 2008 Republican “mixed-gender ticket.”) You’d be wrong. The correction, “An earlier version of this column incorrectly stated the history of the Democratic ticket. It has been 36 years since a man chose a woman to run as his vice-president on the Democratic ticket, not 36 years since a man and a woman ran together on a Democratic Party ticket,” is one for the record books.
These are but three examples of ignorance of recent history from people working in a profession noted for the smugness of its practitioners.
Sadly, journalism is important. Unfortunately, we aren’t getting any. We’re getting self-righteous lectures from arrogant know-nothings who, whenever possible, ignore their mistakes, which uniformly go in one direction – against Republicans. Is it any wonder that 86 percent of the public in a recent survey said they find either “a great deal” (49 percent) or “a fair amount” (37 percent) of bias in media? They used to at least pretend to be honest.
Of course, when you operate in an ever-shrinking bubble of likeminded colleagues, you don’t even notice the problem. A new study found “Beltway journalism ‘may be even more insular than previously thought,’” which the authors say raises “‘additional concerns about vulnerability to groupthink and blind spots.’”
If there’s no one in your circle who knows any better, you’ll never think you’re wrong and not know when you’ve crossed a line. If everyone you know is polishing their resume in the hope of getting a job in a Biden administration, you’d better update yours too. If Joe loses, you can fill that hole in your heart with the awards you’ll be showered with for your biased, incorrect reporting. And you don’t have to worry about being haunted by thoughts of betraying the ideals of your profession since history starts all over again tomorrow.
Over the last six weeks, America has been rocked to its cultural foundations by a wave of attacks on monuments and memorials to persons and events traditionally held to be historically significant. What began as an assault on statuary dedicated to the memory of former Confederate generals has evolved into an all-out war on the national narrative.
Nothing and no one is safe. Statues of George Washington, Abraham Lincoln, and slave-born abolitionist Frederick Douglass have all been vandalized recently, as have those dedicated to the memory of musicians Stevie Ray Vaughan and Jimi Hendrix.
Little of this makes sense. The protests that began in the aftermath of George Floyd’s death have evolved into riots, looting, and general mayhem stoked by anarchists and progressives who want to destroy not only Donald Trump but everything they believe he and his presidency represent.
They have a distorted view of history, as their defacing and destruction of statues of Washington, Lincoln, and others who led crusades on behalf of freedom and equality prove. Whatever they learned in school, it had little to do with the hard decisions and moral choices we may all at some point be called upon to make in life.
Would it have been better if the founders, because they could not agree to end slavery, had abandoned America’s bid for independence? Or if only the colonies that would abolish slavery had proceeded, leaving them to fight both the British crown and the colonies that remained tied to the King? Or is it, as most of us have long believed, that the struggle for the independence and equality of all men and women began with this effort of some to secure liberty for themselves and those like them? And that, for it, we owe them our gratitude and a certain degree of reverence?
Things have progressed well beyond the sensible and into the absurd. Reason no longer applies. The U.S. and Canadian press reported Tuesday that a memorial to victims of Communism under construction in Ottawa had been vandalized. According to The Post Millennial, the fence surrounding the site in the Canadian capital city was defaced with the phrase “Communism will win” spray-painted in yellow alongside three depictions of the Communist hammer and sickle.
The American memorial to the Victims of Communism, which was completed more than a decade ago and sits at the base of Capitol Hill, was similarly defaced with graffiti related directly to the Black Lives Matter movement in early June.
If this is meant to be some sort of cry for social justice, it is wrongly directed. Adolf Hitler, typically held up as the ultimate state sponsor of evil in the 20th century, if not of all time, led a Holocaust in which somewhere between 11 and 13 million people were killed, according to most estimates. The leaders of the countries and rebel bands that formed the international Communist bloc (Lenin, Stalin, Mao, Castro, Che, and others, all the way up to Kim Jong Un, who is still with us today) are responsible for the deaths of at least 10 times as many people.
Communism is neither just nor equitable. American schools don’t do a good job teaching that, if they teach it at all, which may be why those responsible in recent weeks for so much destruction in Seattle have left the Lenin statue there unmolested. They and those who’ve joined with them in cities like Richmond, Atlanta, Rochester, N.Y., and Washington, D.C., aren’t interested in rewriting American history. They want to erase it so they can replace it with a narrative of their own that leads to a justification of the demands they have today. History, before it can be rewritten, must be destroyed. The Confederate statues were just the beginning, low-hanging fruit, easy to get before the progressives could start reaching for objectives much higher on the tree.
There are certain incidents, indelibly etched on the memory of the American people, that have done much to shape our national character. Some, like 9/11, are still fresh in our minds. Others, like December 7th, 1941, are slipping away into the mists of time as the number of those who heard the dramatic news bulletins or experienced the attacks dwindles towards its inevitable destination.
Further in the past, events like Lexington and Concord, Washington crossing the Delaware, and the Battle of Yorktown have become the stuff of myth. No one who was alive when they occurred, and no one who knew anyone who was, still walks among us. We must rely on the historical record, embellished though it may at times be, to teach us what happened there.
Why these events are important though is a matter left to our judgment. Things change over time, as can be witnessed in the ongoing struggle to interpret — and reinterpret — the justifications for the American Civil War and the reasons men on both sides chose to fight.
It remains a divisive point in our history. At its end, some were led out of bondage and into a form of freedom, while others were, to a degree, subjugated as punishment for having been on the losing side. This was not what history tells us Abraham Lincoln wanted.
The vision of our martyr-president, laid out so eloquently by him in so many manuscripts and speeches still with us, was of a nation where all men and women were free and equal. He wanted a gentle peace, one that brought the people of the Union together once again as brothers and sisters. He made this clear many times, but perhaps best at the dedication of a cemetery for soldiers fallen in and around Gettysburg, Pennsylvania.
The battle itself is regarded as the turning point of the war. It was certainly a time of heroics, from Chamberlain’s Mainers surging down Little Round Top, out of ammunition and with bayonets fixed, to Pickett’s Charge and beyond. It was three horrific days of brother fighting brother, yet less than 100 years later veterans of the North and veterans of the South came together again in this same place as one, in memory of fallen comrades and looking ahead to a nation once again knitted together by toil and sweat and allegiance to the same Constitution.
Let us remember this on Memorial Day as we remember those who made the ultimate sacrifice in defense of our country and of the freedoms for which it stands as a bright light, signaling the preeminence of liberty on our shores to all the world.
FOUR SCORE AND SEVEN YEARS AGO OUR FATHERS BROUGHT FORTH ON THIS CONTINENT, A NEW NATION, CONCEIVED IN LIBERTY, AND DEDICATED TO THE PROPOSITION THAT ALL MEN ARE CREATED EQUAL.
NOW WE ARE ENGAGED IN A GREAT CIVIL WAR, TESTING WHETHER THAT NATION, OR ANY NATION SO CONCEIVED AND SO DEDICATED, CAN LONG ENDURE. WE ARE MET ON A GREAT BATTLE-FIELD OF THAT WAR. WE HAVE COME TO DEDICATE A PORTION OF THAT FIELD, AS A FINAL RESTING PLACE FOR THOSE WHO HERE GAVE THEIR LIVES THAT THAT NATION MIGHT LIVE. IT IS ALTOGETHER FITTING AND PROPER THAT WE SHOULD DO THIS.
BUT, IN A LARGER SENSE, WE CAN NOT DEDICATE — WE CAN NOT CONSECRATE — WE CAN NOT HALLOW — THIS GROUND. THE BRAVE MEN, LIVING AND DEAD, WHO STRUGGLED HERE, HAVE CONSECRATED IT, FAR ABOVE OUR POOR POWER TO ADD OR DETRACT. THE WORLD WILL LITTLE NOTE, NOR LONG REMEMBER WHAT WE SAY HERE, BUT IT CAN NEVER FORGET WHAT THEY DID HERE. IT IS FOR US THE LIVING, RATHER, TO BE DEDICATED HERE TO THE UNFINISHED WORK WHICH THEY WHO FOUGHT HERE HAVE THUS FAR SO NOBLY ADVANCED. IT IS RATHER FOR US TO BE HERE DEDICATED TO THE GREAT TASK REMAINING BEFORE US — THAT FROM THESE HONORED DEAD WE TAKE INCREASED DEVOTION TO THAT CAUSE FOR WHICH THEY GAVE THE LAST FULL MEASURE OF DEVOTION — THAT WE HERE HIGHLY RESOLVE THAT THESE DEAD SHALL NOT HAVE DIED IN VAIN — THAT THIS NATION, UNDER GOD, SHALL HAVE A NEW BIRTH OF FREEDOM — AND THAT GOVERNMENT OF THE PEOPLE, BY THE PEOPLE, FOR THE PEOPLE, SHALL NOT PERISH FROM THE EARTH.
NOVEMBER 19, 1863
Professional American historiography has made steady advances in the breadth and sophistication with which it approaches certain aspects of the past, but those advances have come at the expense of public knowledge and shared historical consciousness. The story of America has been fractured into a thousand pieces and burdened with so much ideological baggage that studying history actually alienates young Americans from the possibility of properly appreciating their past. Nearly 20 years ago I wrote a small book called The Student’s Guide to U.S. History for ISI Books. I was unable to include in its bibliography a high school or college level textbook on U.S. history, because there was not one suitable for recommendation.
But criticism of the status quo is easy. What is harder is to create a better alternative. That was my aim in writing Land of Hope: An Invitation to the Great American Story.
Land of Hope swims against the prevailing currents in several ways, not the least of which is that it is a physical book. It is no coincidence that the giant textbook publisher Pearson has just announced its plans to go digital-first with its own massive array of textbooks, 1,500 titles in all, including those in history. Students will eventually be required to use—and institutions will be required to offer—the constantly updated texts, tethering students and schools exclusively to the publisher’s digital platform. George Orwell, please call the Ministry of Truth.
In the early years of printing, printers would often display a truncated version of a Latin proverb: Littera scripta manet, which means, “The written letter remains.” The whole proverb reads: Vox audita perit, littera scripta manet, which can be translated, “The heard voice perishes, but the written letter remains.” It contrasts fleeting orality and settled literacy. What does such a proverb mean today, when our civilization—in which the great majority of inhabitants, as Christians and Jews, have been People of the Book—is fast becoming a civilization inhabited by People of the Screen, people tied to the ever-changing, ever-fluid, ever-malleable presentation of the past made possible by the nature of digital technology?
Land of Hope also goes against the current by not dumbing down the reading level. It is written with an underlying conviction that we should never sell short the capacity of young Americans to read challenging books if they are interesting and well-wrought. Such books are far more likely to stoke the fire of their imaginations and convey to them the complexity and excitement of history—history not as an inert recitation of facts, but as a reflective task that takes us to the depths of what it means to be human.
Let me mention three distinctive themes that run through the book, themes that are hinted at in the book’s title and are instructive about America’s character.
First, there is the theme of America as a land—not just an idea, but also a people and a nation; a nation with a particular history, connected to a particular piece of real estate. To understand our nation, it’s not enough to understand principles such as equality and liberty, as important as those are. We also have to understand how those principles were put into action, how they were developed, how they came to be forces in our national life. American history, to be sure, is inseparable from America’s principles and ideals, but America is not simply those things. It is a place with a venerable history created by men and women to whom our veneration is owed. Think of those who lie in Arlington National Cemetery and of countless others in the long history of such sacrifices made on behalf of our country. These things bind us to the land in visceral ways that go beyond ideas or principles.
Second is the theme of hope. The idea of America as a land of hope shouldn’t be misinterpreted as signifying a saccharine or sentimental view of America’s past, but rather as taking into account history’s spiritual dimension. We are creatures with free wills and aspirations, not merely tumbleweeds at the mercy of large historical forces. Hope is a quality of soul, something that’s not quantifiable or explicable in strictly material terms. It is a consistent characteristic of this country that we have always sought to rise above or move beyond the conditions that are given to us at birth—something not true of every people. To be an American is to believe that the status we are born into is never the final word. We have a spirit of striving, a spirit of hope that goes back to our very beginnings.
Third and finally there is the theme of story. Our narratives large and small are an essential part of the way that we Americans make sense of the world. As I write in the book,
The impulse to write history and organize our world around stories is intrinsic to us as human beings. We are, at our core, remembering and story-making creatures, and stories are one of the chief ways we find meaning in the flow of events. What we call “history” and “literature” are merely the refinement and intensification of that basic human impulse, that need.
The word need is not an exaggeration. For the human animal, meaning is not a luxury; it is a necessity. Without it, we perish. Historical consciousness is to civilized society what memory is to individual identity. Without memory, without the stories by which our memories are carried forward, we cannot say who, or what, we are. Without them, our life and thought dissolve into a meaningless, unrelated rush of events. Without them, we cannot do the most human of things: we cannot learn, use language, pass on knowledge, raise children, establish rules of conduct, engage in science, or dwell harmoniously in society. Without them, we cannot govern ourselves.
Nor can we have a sense of the future as a time we know will come, because we remember that other tomorrows have come and gone. A culture without memory will necessarily be barbarous and easily tyrannized, even if it is technologically advanced. The incessant waves of daily events will occupy all our attention and defeat all our efforts to connect past, present, and future, thereby diverting us from an understanding of the human things that unfold in time, including the path of our own lives.
The stakes were beautifully expressed in the words of the great Jewish writer Isaac Bashevis Singer: “When a day passes it is no longer there. What remains of it? Nothing more than a story. If stories weren’t told or books weren’t written, man would live like the beasts, only for the day. The whole world, all human life is one long story.”
Singer was right. As individuals, as communities, as countries: we are nothing more than flotsam and jetsam without the stories in which we find our lives’ meaning.
Of course, there are stories and then there are stories. French writer André Malraux once wrote, “A man is what he hides: a miserable little pile of secrets.” That’s one way of thinking about a man’s life, but it’s a reductive and simplistic way. We’ve all read biographies like that. But where in this approach is an account of a man’s striving, his ambitions, his ideals, his efforts at transcendence? Is it a fair and accurate account of a man to speak only or even mainly of his secrets and failings? Similarly, a nation’s history must be far more than a compilation of failings and crimes. It must give credence to the aspirational dimension of a nation’s life, and particularly for so aspirational a nation as the United States—arguably the most aspirational nation in human history.
A proper history of America must do this without evading the fact that we’ve often failed miserably, fallen short, and done terrible things. We have not always been a land of hope for everyone—for a great many, but not for all. And so our sense of hope has a double-edged quality about it: to be a land of hope is also to risk being a land of disappointment, a land of frustration, even a land of disillusionment. To understand our history is to experience these negative things. But we wouldn’t experience them so sharply if we weren’t a land of hope, if we didn’t embrace that outlook and aspiration. To use a colloquialism, we Americans allow ourselves to get our hopes up—and that is always risky.
Land of Hope’s epigraph is a passage that has long been a source of inspiration and direction to me. Written by John Dos Passos, a man of the radical left in his youth who later moved to the sensible right, it is from a 1941 essay, “The Use of the Past,” and it is uncannily relevant to the present:
Every generation rewrites the past. In easy times history is more or less of an ornamental art, but in times of danger we are driven to the written record by a pressing need to find answers to the riddles of today. We need to know what kind of firm ground other men, belonging to generations before us, have found to stand on. In spite of changing conditions of life they were not very different from ourselves, their thoughts were the grandfathers of our thoughts, they managed to meet situations as difficult as those we have to face, to meet them sometimes lightheartedly, and in some measure to make their hopes prevail. We need to know how they did it.
In times of change and danger when there is a quicksand of fear under men’s reasoning, a sense of continuity with generations gone before can stretch like a lifeline across the scary present and get us past that idiot delusion of the exceptional Now that blocks good thinking. That is why, in times like ours, when old institutions are caving in and being replaced by new institutions not necessarily in accord with most men’s preconceived hopes, political thought has to look backwards as well as forwards.
Isn’t that marvelous? There’s so much to unpack in it, but of special relevance today is his rather rough denunciation of “that idiot delusion of the exceptional Now.” This phrase expresses something that nearly all of us who teach history run up against. It’s harder than usual today to get young people interested in the past because they are so firmly convinced that we’re living in a time so unprecedented, enjoying pocket-sized technologies that are so transformative, that there’s no point in looking at what went on in the eighteenth and nineteenth centuries. To them the past has been superseded—just as our present world is forever in the process of being superseded.
While this posture may be ill-informed and lazy, a way to justify not learning anything, it also represents a genuine conviction, amply reinforced by the endless passing parade of sensations and images in which we are enveloped—one thing always being succeeded by something else, nothing being permanent, nothing enduring, always moving, moving, moving into a new exceptional Now. But it is a childish and disabling illusion that must be countered, in just the way that Dos Passos suggests.
Even in confronting the challenging questions of American history, most notably the existence of slavery, there are deep lessons to be learned. By the time of the Constitutional Convention in 1787, the institution of slavery had become deeply enmeshed in the national economy, despite all the ways that its existence stood in glaring contradiction to our nation’s commitment to equality and self-rule as expressed in the Declaration of Independence. Hence there was real bite to the mocking question fired at Americans by British writer and lexicographer Samuel Johnson: “How is it that we hear the loudest yelps for liberty among the drivers of negroes?”
How, we wonder today, could such otherwise enlightened and exemplary men as George Washington and Thomas Jefferson have owned slaves, a practice so contradictory to all they stood for? As I write in the book:
There is no easy answer to such questions. But surely a part of the answer is that each of us is born into a world that we did not make, and it is only with the greatest effort, and often at very great cost, that we are ever able to change that world for the better. Moral sensibilities are not static; they develop and deepen over time, and general moral progress is very slow. Part of the study of history involves a training of the imagination, learning to see historical actors as speaking and acting in their own times rather than ours; and learning to see even our heroes as an all-too-human mixture of admirable and unadmirable qualities, people like us who may, like us, be constrained by circumstances beyond their control. . . .
The ambivalences regarding slavery built into the structure of the Constitution were almost certainly unavoidable in the short term, in order to achieve an effective political union of the nation. What we need to understand is how the original compromise ceased to be acceptable to increasing numbers of Americans, especially in one part of the Union, and why slavery, a ubiquitous institution in human history, came to be seen not merely as an unfortunate evil but as a sinful impediment to human progress, a stain upon a whole nation. We live today on the other side of a great transformation in moral sensibility, a transformation that was taking place but was not yet completed in the very years the United States was being formed.
A related lesson of history is that acts of statesmanship often require courage and imagination, even daring, especially when the outcome seems doubtful. Take the case of Lincoln. So accustomed are we to thinking of Lincoln in heroic terms that we forget the depth and breadth of his unpopularity during his entire time in office. Few great leaders have been more comprehensively disdained, loathed, and underestimated. A low Southern view of him, of course, was to be expected, but it was widely shared in the North as well. As Lincoln biographer David Donald put it, “Lincoln’s own associates thought him ‘a Simple Susan, a baboon, an aimless punster, a smutty joker.’” Abolitionist Wendell Phillips called him “a huckster in politics, a first-rate, second-rate man.” George McClellan, his opponent in the 1864 election, openly disdained him as a “well-meaning baboon.” For much of that election year, Lincoln was convinced, with good reason, that he was doomed to lose the election, with incalculable consequences for the war effort and the future of the nation.
To quote the book again:
We need to remember that this is generally how history happens. It is not like a Hollywood movie in which the background music swells and the crowd in the room applauds and leaps to its feet as the orator dispenses timeless words, and the camera pans the room full of smiling faces. In real history, the background music does not swell, the trumpets do not sound, and the carping critics often seem louder than the applause. The leader or the soldier has to wonder whether he is acting in vain, whether the criticisms of others are in fact true, whether time will judge him harshly, whether his sacrifice will count for anything. Few great leaders have felt this burden more completely than Lincoln.
In conclusion, let me suggest that the story of the ending of the Civil War in April 1865 might hold a lesson for those of our fellow countrymen today who seem to regard America’s past with contempt:
On April 9, after a last flurry of futile resistance, Lee faced facts and arranged to meet Grant at a brick home in the village of Appomattox Court House to surrender his army. He could not formally surrender for the whole Confederacy, but the surrender of his army would trigger the surrender of all others, and so it represented the end of the Confederate cause.
It was a poignant scene, dignified and restrained and sad, as when a terrible storm that has raged and blown has finally exhausted itself, leaving behind a strange and reverent calm, purged of all passion. The two men had known one another in the Mexican War, and had not seen one another in nearly twenty years. Lee arrived first, wearing his elegant dress uniform, soon to be joined by Grant clad in a mud-spattered sack coat, his trousers tucked into his muddy boots. They showed one another a deep and respectful courtesy, and Grant generously allowed Lee’s officers to keep their sidearms and the men to keep their horses and take them home for the spring planting. None would be arrested or charged with treason.
Four days later, when Lee’s army of 28,000 men marched in to surrender their arms and colors, General Joshua L. Chamberlain of Maine, a hero of Gettysburg, was present at the ceremony. He later wrote of his observations that day, reflecting upon his soldierly respect for the men before him, each passing by and stacking his arms, men who only days before had been his mortal foes: “Before us in proud humiliation stood the embodiment of manhood: men whom neither toils and sufferings, nor the fact of death, nor disaster, nor hopelessness could bend from their resolve; standing before us now, thin, worn, and famished, but erect, and with eyes looking level into ours, waking memories that bound us together as no other bond;—was not such manhood to be welcomed back into a Union so tested and assured? . . . On our part not a sound of trumpet more, nor roll of drum; not a cheer, nor word nor whisper of vain-glorying, nor motion of man standing again at the order, but an awed stillness rather, and breath-holding, as if it were the passing of the dead!”
Such deep sympathies, in a victory so heavily tinged with sadness and grief and death. This war was, and remains to this day, America’s bloodiest conflict, having generated at least a million and a half casualties on the two sides combined, [including] 620,000 deaths, the equivalent of six million men in today’s American population. One in four soldiers who went to war never returned home. One in thirteen returned home with one or more missing limbs. For decades to come, in every village and town in the land, one could see men bearing such scars and mutilations, a lingering reminder of the price they and others had paid.
And yet, Chamberlain’s words suggested that there might be room in the days and years ahead for the spirit of conciliation that Lincoln had called for in his Second Inaugural Speech, a spirit of binding up wounds, and of caring for the many afflicted and bereaved, and then moving ahead, together. It was a slender hope, yet a hope worth holding, worth nurturing, worth pursuing.
We all know that it did not turn out that way, due in part to Lincoln’s death at the hands of John Wilkes Booth. But the story is illustrative nonetheless. If Chamberlain’s troops could find it in their hearts to be that forgiving, that generous, that respectful of men who had only days before been their mortal enemies, we certainly ought to be able to extend a similar generosity towards men in what is now, for us, a far more distant past. Lincoln himself said something similar, at a cabinet meeting on April 14, the very day of his assassination:
I hope there will be no persecution, no bloody work after the war is over. . . . Enough lives have been sacrificed. We must extinguish our resentment if we expect harmony and union. There has been too much of a desire on the part of some of our very good friends to be masters, to interfere with and dictate to those states, to treat the people not as fellow citizens; there is too little respect for their rights. I do not sympathize in these feelings.
That was good counsel then and now, and it is an example of the wisdom that the study of history can provide us. May such wisdom be an impetus for us to rediscover such a humane and generous example in our own times.
As part of its ambitious “1619” inquiry into the legacy of slavery, The New York Times revives false 19th-century revisionist history about the American founding.
Across the map of the United States, the borders of Tennessee, Oklahoma, New Mexico, and Arizona draw a distinct line. It’s the 36°30′ line, a remnant of the boundary between free and slave states drawn in 1820. It is a scar across the belly of America, and a vivid symbol of the ways in which slavery still touches nearly every facet of American history.
That pervasive legacy is the subject of a series of articles in The New York Times titled “The 1619 Project.” To cover the history of slavery and its modern effects is certainly a worthy goal, and much of the Project achieves that goal effectively. Khalil Gibran Muhammad’s portrait of the Louisiana sugar industry, for instance, vividly covers a regional institution that its victims considered the worst of all of slavery’s forms. Even better is Nikole Hannah-Jones’s celebration of black-led political movements. She is certainly correct that “without the idealistic, strenuous and patriotic efforts of black Americans, our democracy today would most likely look very different” and “might not be a democracy at all.”
Where the 1619 articles go wrong is in a persistent and off-key theme: an effort to prove that slavery “is the country’s very origin,” that slavery is the source of “nearly everything that has truly made America exceptional,” and that, in Hannah-Jones’s words, the founders “used” “racist ideology” “at the nation’s founding.” In this, the Times steps beyond history and into political polemic—one based on a falsehood, and one that, in an essential way, repudiates the work of countless people of all races, including those Hannah-Jones celebrates, who have believed that what makes America “exceptional” is the proposition that all men are created equal.
For one thing, the idea that, in Hannah-Jones’ words, the “white men” who wrote the Declaration of Independence “did not believe” its words applied to black people is simply false. John Adams, James Madison, George Washington, Thomas Jefferson, and others said at the time that the doctrine of equality rendered slavery anathema. True, Jefferson also wrote the infamous passages suggesting that “the blacks…are inferior to the whites in the endowments both of body and mind,” but he thought even that was irrelevant to the question of slavery’s immorality. “Whatever be their degree of talent,” Jefferson wrote, “it is no measure of their rights. Because Sir Isaac Newton was superior to others in understanding, he was not therefore lord of the person or property of others.”
The myth that America was premised on slavery took off in the 1830s, not the 1770s. That was when John C. Calhoun, Alexander Stephens, George Fitzhugh, and others offered a new vision of America—one that either disregarded the facts of history to portray the founders as white supremacists, or denounced them for not being so. Relatively moderate figures such as Illinois Sen. Stephen Douglas twisted the language of the Declaration to say that the phrase “all men are created equal” actually meant only white men. Abraham Lincoln effectively refuted that in his debates with Douglas. Calhoun was, in a sense, more honest about his abhorrent views; he scorned the Declaration precisely because it made no color distinctions. “There is not a word of truth in it,” wrote Calhoun. People are “in no sense…either free or equal.” Indiana Sen. John Pettit was even more succinct. The Declaration, he said, was “a self-evident lie.”
It was these men—the generation after the founding—who manufactured the myth of American white supremacy. They did so against the opposition of such figures as Lincoln, Charles Sumner, Frederick Douglass, and John Quincy Adams. “From the day of the declaration of independence,” wrote Adams, the “wise rulers of the land” had counseled “to repair the injustice” of slavery, not perpetuate it. “Universal emancipation was the lesson which they had urged upon their contemporaries, and held forth as transcendent and irremissible [sic] duties to their children of the present age.” These opponents of the new white supremacist myth were hardly fringe figures. Lincoln and Douglass were national leaders backed by millions who agreed with their opposition to the white supremacist lie. Adams was a former president. Sumner was nearly assassinated in the Senate for opposing white supremacy. Yet their work is never discussed in the Times articles.
In 1857, Chief Justice Roger Taney sought to make the myth into the law of the land by asserting in Scott v. Sandford that the United States was created as, and could only ever be, a nation for whites. “The right of property in a slave,” he declared, “is distinctly and expressly affirmed in the Constitution.” This was false: the Constitution contains no legal protection for slavery, and doesn’t even use the word. Both Lincoln and Douglass answered Taney by citing the historical record as well as the text of the laws: the founders had called slavery both evil and inconsistent with their principles; they forbade the slave trade and tried to ban it in the territories; nothing in the Declaration or the Constitution established a color line; in fact, when the Constitution was ratified, black Americans were citizens in several states and could even vote. The founders deserved blame for not doing more, but the idea that they were white supremacists, said Douglass, was “a slander upon their memory.”
Lincoln provided the most thorough refutation. There was only one piece of evidence, he observed, ever offered to support the thesis that the Declaration’s authors didn’t mean “all men” when they wrote it: that was the fact that they did not free the slaves on July 4, 1776. Yet there were many other explanations for that which did not prove the Declaration was a lie. Most obviously, some founders may simply have been hypocrites. But that individual failing did not prove that the Declaration excluded non-whites, or that the Constitution guaranteed slavery.
Even some abolitionists embraced the white supremacy legend. William Lloyd Garrison denounced the Constitution because he believed it protected slavery. This, Douglass replied, was false both legally and factually: those who claimed it was pro-slavery had the burden of proof—yet they never offered any. The Constitution’s wording gave slavery no guarantees and provided plentiful means for abolishing it. In fact, none of its words would have to be changed for Congress to eliminate slavery overnight. It was slavery’s defenders, he argued, not its enemies, who should fear the Constitution—and secession proved him right. Slaveocrats had realized that the Constitution was, in Douglass’s words, “a glorious liberty document,” and they wanted out.
Still, after the war, “Lost Cause” historians rehabilitated the Confederate vision, claiming the Constitution was a racist document, and so the legend remains with us today. The United States, writes Hannah-Jones, “was founded…as a slavocracy,” and the Constitution “preserved and protected slavery.” This is once more asserted as an uncontroverted fact—and Lincoln’s and Douglass’s refutations of it go unmentioned in the Times.
No doubt Taney would be delighted at this acceptance of his thesis. What accounts for it? The myth of a white supremacist founding has always served the emotional needs of many people. For racists, it offers a rationalization for hatred. For others, it offers a vision of the founders as arch-villains. Some find it comforting to believe that an evil as colossal as slavery could only be manufactured by diabolically perfect men rather than by quotidian politics and the banality of evil. For still others, it provides a new fable of the fall from Eden, attractive because it implies the possibility of a single act of redemption. If evil entered the world at a single time, by a conscious act, maybe it could be reversed by one conscious revolution.
The reality is more complex, more dreadful, and, in some ways, more glorious. After all, slavery was abolished, segregation was overturned, and the struggle today is carried on by people ultimately driven by their commitment to the principle that all men are created equal—the principle articulated at the nation’s birth. It was precisely because millions of Americans have never bought the notion that America was built as a slavocracy—and have had historical grounds for that denial—that they were willing to lay their lives on the line, not only in the 1860s but ever since, to make good on the promissory note of the Declaration.
Their efforts raise the question of what counts as the historical “truth” about the American Dream. A nation’s history, after all, occupies a realm between fact and moral commitments. Like a marriage, a constitution, or an ethical concept like “blame,” it encompasses both what actually happened and the philosophical question of what those happenings mean. Slavery certainly happened—but so, too, did the abolitionist movement and the ratification of the Thirteenth, Fourteenth, and Fifteenth Amendments. The authors of those amendments viewed them not as changing the Constitution, but as rescuing it from Taney and other mythmakers who had tried to pervert it into a white supremacist document.
In fact, it would be more accurate to say that what makes America unique isn’t slavery but the effort to abolish it. Slavery is among the oldest and most ubiquitous of all human institutions; as the Times series’ title indicates, American slavery predated the American Revolution by a century and a half. What’s unique about America is that it alone announced at birth the principle that all men are created equal—and that its people have struggled to realize that principle since then. As a result of their efforts, the Constitution today has much more to do with what happened in 1865 than in 1776, let alone 1619. Nothing could be more worthwhile than learning slavery’s history, and remembering its victims and vanquishers. But to claim that America’s essence is white supremacy is to swallow slavery’s fatal lie.
As usual, Lincoln said it best. When the founders wrote of equality, he explained, they knew they had “no power to confer such a boon” at that instant. But that was not their purpose. Instead, they “set up a standard maxim for free society, which should be familiar to all, and revered by all; constantly looked to, constantly labored for, and even though never perfectly attained, constantly approximated, and thereby constantly spreading and deepening its influence, and augmenting the happiness and value of life to all people of all colors everywhere.” That constant labor, in the generations that followed, is the true source of “nearly everything that has truly made America exceptional.”
Americans are losing interest in the Civil War—or at least they are losing interest in learning about it and visiting historic battle sites. The Wall Street Journal reported recently that the country’s “five major Civil War battlefield parks—Gettysburg, Antietam, Shiloh, Chickamauga/Chattanooga, and Vicksburg—had a combined 3.1 million visitors in 2018, down from about 10.2 million in 1970.” Gettysburg, America’s most famous and hallowed battlefield, drew fewer than a million visitors last year, and just 14 percent of the visitor total in 1970.
It is not just tourism that is falling off; the number of Civil War re-enactors is also declining. Many are growing old, and younger men are not stepping in to replenish their ranks. As one 68-year-old re-enactor, who recently helped organize a recreation of the Battle of Resaca in Georgia, told the Journal, “The younger generations are not taught to respect history, and they lose interest in it.”
But it’s not just that young people are not taught to respect history. They are often not taught history at all. To the extent they are, they are told that American history is a parade of horribles: slavery, genocide, bigotry, greed—a story above all of injustice and oppression, perpetrated by the powerful against the weak.
No wonder then, that recent public interest in the Civil War has mostly taken the form of a push to remove Confederate monuments from public places and rename buildings and roads bearing the names of Confederate leaders. We hear much about removing and renaming these days, but almost nothing about building more and better monuments, or reinvigorating public interest and education about the war.
In a country where large numbers of college graduates do not even know the half-century in which the Civil War occurred, but are convinced that Confederate monuments should come down, we should expect genuine interest in the Civil War to wane if not to disappear entirely, except perhaps as an object for political activism.
This problem of course goes well beyond the Civil War; it encompasses all of history. Consider the case of the College Board’s Advanced Placement U.S. History examination. In 2014, the National Association of Scholars issued a report exposing the exam’s heavy progressive bias, systematic downplaying of American virtues, and outright omission of important periods in American history. The report sparked enough outrage and bad press that the College Board revised its exam—this time including previously omitted figures like James Madison—but according to the NAS the course materials for the test were unchanged and reflected the same progressive bias.
In 2016, the NAS decided to take a closer look at another of the College Board’s offerings, the new AP European History examination, which, it turns out, reflected the same progressive bias as the American history exam. “The College Board’s persisting progressive distortion of history substantiates concerns that the 2015 APUSH revisions do not represent a genuine change of direction,” wrote the NAS’s David Randall, “but only a temporary detour in the College Board’s long march to impose leftist history on the half a million American high school students each year who prepare themselves for college by taking APUSH or APEH.”
(The exam, which purports to be about European history, omits all mention of Christopher Columbus, Michel de Montaigne, John Wesley, the Duke of Wellington, Florence Nightingale, and Václav Havel. It mentions Winston Churchill only “as a prompt for learning how to analyze primary sources.”)
Progressive bias in high school and college curricula is in part the long legacy of Howard Zinn, whose “A People’s History of the United States,” first published in 1980, presents a cartoonish, left-wing version of American history that pits “the people” against “the rulers” and casts the entire American experiment of democratic self-rule in a decidedly negative light. That approach is now common among professional historians, with the result that growing numbers of Americans don’t know much, or care to know much, about their own history.
As the historian Wilfred McClay said in a recent interview, the Zinn approach invites historical ignorance and indifference: “Why learn what the Wilmot Proviso was, or what exactly went into the Compromise of 1850, when you could just say we had this original sin of slavery?”
The danger here is not just that Civil War battlefields will eventually lie fallow for lack of visitors, but that we will unlearn the painful lessons of our past. To some extent, we’ve already started down that path.
Another recent NAS report, for example, examined the re-emergence of segregation on college campuses—what the authors call “neo-segregation.” In a survey of 173 schools, including small private colleges as well as major universities like the Massachusetts Institute of Technology and Yale University, the study found “42 percent offer segregated residences, 46 percent offer segregated orientation programs, and 72 percent host segregated graduation ceremonies.”
These segregated graduation ceremonies are not mandatory, of course, and are offered in addition to regular graduation ceremonies. But the fact that they have become so prevalent on college campuses should disturb anyone familiar with the history of segregation in America. Whether it’s segregation by race, as at Columbia University’s “Raza Graduation Ceremony” and “Black Graduation,” or by sexual orientation, as at the University of Texas’s “Lavender graduation” for LGBT students, the trend of self-segregation among minority college students is a cause for worry, especially at a time when divisions in civil society are deepening.
There’s a ruthless logic to this, just as there’s a ruthless logic to reducing American history to a catalog of the worst things we’ve ever done. If history is just another tool in the pursuit of political power, there’s not much of an impetus to get it right.
HAPPY FOURTH OF JULY! LET FREEDOM RING!
“No taxation without representation!” was the battle cry in America’s 13 Colonies, which were forced to pay taxes to England’s King George III despite having no representation in the British Parliament. As dissatisfaction grew, British troops were sent in to quell the early movement toward rebellion. Repeated attempts by the Colonists to resolve the crisis without military conflict proved fruitless.
On June 11, 1776, the Colonies’ Second Continental Congress met in Philadelphia and formed a committee whose express purpose was drafting a document that would formally sever their ties with Great Britain. The committee included Thomas Jefferson, Benjamin Franklin, John Adams, Roger Sherman and Robert R. Livingston. Jefferson, who was considered the strongest and most eloquent writer, crafted the original draft document. A total of 86 changes were made to his draft and the Continental Congress officially adopted the final version on July 4, 1776.
“So in the end I am left not with concerns but with gratitude: To Spielberg for making this movie, and to my fellow moviegoers, for only when movies succeed will others make similar movies in the future.”
by Walter Stahr
I read somewhere, not long before my first book was published, that being a published author would ruin the experience of going to a bookstore. I scoffed, but I soon learned that it was all too true. A bookstore will have no copy of your precious baby. Or it will have one or two copies, buried so deep in the back that nobody will see them. Or the store has a few copies well-placed, but nobody seems to be paying any attention to your book, much less buying it.