President Trump has signed an executive order that aims to tackle U.S. prescription drug spending, but won’t implement it until the public, including private sector drug makers, can comment.
The order pegs the prices of certain drugs covered by Medicare to the lower prices paid in other developed countries, whose governments impose strict price controls.
The order might save the federal government some money — at least temporarily — but at great expense to patients and long-term scientific progress.
History shows that adopting government price-setting inevitably stifles medical innovation and reduces patients’ access to lifesaving drugs. This move is particularly dangerous and baffling in the middle of a pandemic.
Trump is well-intentioned. But when it comes to healthcare policy, it’s not the thought that counts. Americans can only hope the president rescinds this order, which will have harmful long-term effects.
Right now, Medicare pays 80 percent more for drugs than government health insurers in other developed countries, such as the United Kingdom and Japan. So, the administration wants to tie Medicare reimbursements to the significantly lower prices in those countries.
It’s no accident that drugs are cheaper abroad. Many foreign nations have heavily socialized healthcare systems that regulate drug prices. If pharmaceutical companies don’t accept pitifully low reimbursement rates, foreign government officials simply ban firms from selling their medicines at all.
Pegging Medicare reimbursements to those artificially suppressed prices would, in effect, impose price controls here. That might have some superficial appeal — after all, who doesn’t like the idea of cheaper drugs — but price control schemes never end well.
Government price-setting invariably restricts patients’ access to novel therapies. Right now, 96 percent of all new cancer medicines invented worldwide between 2011 and 2018 are available in America. That’s because our country has a relatively free-market drug pricing system that gives firms a chance to earn back their research and development costs.
Contrast that with the United Kingdom and Japan, where patients have access to just 71 percent and 50 percent of those cancer drugs, respectively. In these countries, drug companies stand little chance of recouping their R&D costs and earning a profit on many drugs. So, they often stay away.
Even if drug companies do enter those markets, foreign patients often wait months — or years — to receive new drugs. While Americans typically have immediate access to breakthrough cancer therapies, patients in Japan wait 23 months, on average, after a drug’s initial launch before gaining access.
Imagine if the 44 million Americans on Medicare — 15 percent of the U.S. population — had to wait an extra year and a half before they could take a new immunotherapy. That horrific consequence explains why, in this case, most congressional Republicans don’t back the president on his executive order.
The administration’s plan would also decimate medical innovation. It takes several billion dollars and over a decade to create just one new drug. The existing pricing system incentivizes companies to make those research investments — and the results have been nothing short of miraculous. Cancer death rates have plummeted more than 25 percent over the last quarter-century, mostly thanks to new treatments.
In fact, in 2019, American life expectancy increased for the first time in four years. One of the key causes was better cancer treatments.
I applaud Trump’s effort to reduce drug prices. But there are ways to do so without bringing foreign price-setting to our shores.
He already got one way right. In the same ceremony, Trump signed another executive order to target middlemen in the drug supply chain called pharmacy benefit managers. These negotiators set the prices for drugs that end up on insurers’ list of covered treatments and on the shelves of local pharmacies.
The instinct to reform the practices of PBMs was spot on. PBMs receive significant rebates from manufacturers for adding a drug to an insurer’s formulary. But they don’t disclose those rebates or use them to lower patients’ costs at the pharmacy counter.
Requiring them to pass along savings directly at the point of sale will help achieve the president’s desired reduction in drug prices without costing Americans access to lifesaving cures.
Additionally, the administration could stop the unfair trade practice of banning an American medicine unless it’s sold at an artificially low price. That would stop developed countries from benefiting off the backs of American taxpayers, who foot the bill for new drug development.
Government price-setting would snuff out future medical breakthroughs while limiting patients’ access to existing drugs. The savings aren’t worth the cost in American lives. Let’s hope the administration decides to reverse course on its new executive order.
The pressure to reopen schools is on everywhere now that New York is doing it. This means something else big: Their hard opposition to school reopenings is politically devastating for Democrats.
Prominent Democrat politicians have started making huge concessions on reopening schools. Back in May, Democrats pounced after President Trump supported reopening. Despite the data finding precisely the opposite, it quickly became the Democrat-media complex line that opening schools this fall would be preposterously dangerous to children and teachers.
In July, when New York City Mayor Bill de Blasio unveiled a plan to put the city’s 1.1 million school kids back in schools half the week and “online learning” the rest of the week, New York Gov. Andrew Cuomo picked a public fight with him, saying, “If anybody sat here today and told you that they could reopen the school in September, that would be reckless and negligent of that person.”
Then on Friday, Cuomo cleared schools to open this fall, just a few weeks after making uncertain noises about the prospect as teachers unions breathed down his neck. That same day, New York Sen. Chuck Schumer, the Senate’s minority leader, joined the Democrat messaging reversal.
House Speaker Nancy Pelosi tucked the posture shift into a Saturday response to Trump’s latest executive orders, saying “these announcements do…nothing to reopen schools,” as if Democrats have been all along supporting school reopenings instead of the opposite. Just a few weeks ago, Pelosi was on TV bashing Trump and Education Secretary Betsy DeVos for encouraging school reopenings, saying, falsely, “Going back to school presents the biggest risk for the spread of the coronavirus. They ignore science and they ignore governance in order to make this happen.”
What gives? For one thing, New York’s richest people have fled during the lockdowns. If their kids’ tony public schools don’t offer personal instruction or look likely to maintain the chaos of rolling lockdown brownouts, those wealthy people have better choices. They can stay in their vacation houses or newly bought mansions in states that aren’t locked down. They can hire pod teachers or enroll their kids in private schools.
And the longer they stay outside New York City and start to make friends and get used to a new place, the less likely they are to ever return. Cuomo is well aware of this.
“I literally talk to people all day long who are now in their Hamptons house who also lived here, or in their Hudson Valley house, or in their Connecticut weekend house, and I say, ‘You got to come back! We’ll go to dinner! I’ll buy you a drink! Come over, I’ll cook!’” Cuomo revealed in a recent news conference. “They’re not coming back right now. And you know what else they’re thinking? ‘If I stay there, I’ll pay a lower income tax,’ because they don’t pay the New York City surcharge.”
Reopening means swimming against their anti-Trump base and against teachers union donors’ full-court press to amp up school funding and slash teacher duties. That means the below-surface financial and political pressure on Cuomo, Pelosi, and Schumer to make this kind of reversal must be huge. It’s likely coming not only from internal polling but also from early information about just how many people have left New York and New York City, as well as interpersonal intelligence from their influential social circles.
This means three things. First, the pressure to reopen schools is on everywhere now that New York is doing it. Second, Democrats’ hard opposition to school reopenings has been politically devastating. Third, all the push polls and media scaremongering promoting the idea that most parents shouldn’t and wouldn’t send their kids back to school have failed.
One of the most significant reasons they failed is that parents’ experience with online pandemic schooling was a horror show. Another is that private schools have clearly outpaced public schools’ response to coronavirus, both in offering quality online instruction when forced to close and in seeking to remain open as much and as safely as possible. All the while, teachers unions have been staging embarrassing tantrums over people on the public payroll actually having to do their jobs to get paid, even though epidemiologists have noted “there is no recorded case worldwide of a teacher catching the coronavirus from a pupil.”
Public schools have been so clearly shown up by private schools during the coronavirus panic that state and local officials have begun to target private schools specifically, carefully including them in every onerous government burden on school reopenings, to reduce the embarrassment and drag private schools down to the public school level as much as possible.
The most prominent recent example is in Maryland, where a local bureaucrat in one of the nation’s richest counties specifically banned private schools from safely teaching children in person, and is now battling with the state’s Republican governor over the edict. In North Carolina, many private schools are offering safe, face-to-face, five-day instruction, while most public schools are not.
Part of this is just that government bureaucrats hate individuals making their own decisions based on their own circumstances (a major reason for mask mandates, by the way). But also they’re scared because the coronavirus panic is expanding the massive fault lines inside public schooling. And public schools are a feeder system for Democrat support.
Before coronavirus hit, a near-majority of parents already thought a private school would be better for their kids than public school. People really are not happy with public education. Mostly they do it because they think it’s cheap.
But politicians’ handling of coronavirus has shown that public education is actually very expensive. The instability, the mismanagement, the lying, the public manipulation, all of it has tipped many people’s latent dissatisfaction with public schooling into open dissatisfaction. It’s a catalyst. Now many more people have decided to get their kids out of there, either by homeschooling, moving school districts, forming “pandemic pods,” or finally trying a private school.
Like all the rich people leaving locked-down locales, parents removing kids from locked-down public schools have scared public officials. If just 10 percent of public-school kids homeschool or join a private school for two years, that is a watershed moment for the social undercurrent of animosity towards public schools. That is especially true in the government funding era we’re entering, in which government debt and health and pension promises are set to gobble up education dollars faster than ever, a dynamic that was already ruinous before it was accelerated further by the coronavirus.
This is dangerous to Democrats’ political dominance because the education system tilts voters their way through cultural Marxism, and because public education is a huge source of Democrat campaign volunteers and funds. Now Democrats have detached people from their conveyor belt. The consequences will be huge.
Reopening public schools the way Democrats are doing is not going to stave off this tsunami, either. New York City’s “reopening,” for example, includes several days per week of distasteful online instruction, as well as a rule that a school will close for two weeks any time two inmates test positive for COVID. That’s a recipe for endless school brownouts that will drive parents and kids nuts. Humans simply can’t live under this manufactured instability, imposed by the pen and phone of whatever self-appointed petty little dictator feels like changing the rules today.
Democrats are trying to have it both ways. They’ve learned that parents are not going to put up with putting school indefinitely on hold when everything from swimming to climbing stairs is more dangerous to children. But they also want to maintain the fiction that coronavirus is an emergency situation that requires tossing trillions of dollars in deficit funding out of helicopters, keeping people cooped up and restive as an election nears, and purposefully choking the nation’s best economy since before Barack Obama got his hands on it.
Democrats are their own worst enemy. The problem is, the rest of us are so often their collateral damage.
The COVID-19 experience helps us decide what is essential and what isn’t
One effect of the lockdown is that we find ourselves making frequent decisions as to what is essential to our survival and happiness and what isn’t. Life gets stripped down to essentials, with all the extras becoming secondary, if that. Here are some ideas along these lines.
The first essential is food. Making food available for us to buy entails a massive industry. First, there is the source: the farmers and ranchers who provide our meat, fruit and vegetables. Their activities require thousands of acres of land and huge amounts of water for crops and livestock, which in turn depend on favorable weather. Bad weather can bring both floods and droughts.
Then there is also a vast capital expense required for equipment and labor to plant, cultivate and harvest the crops which feed both people and animals.
Next comes the immense supply chain, which involves transportation, processing and, ultimately, delivery to the thousands of stores and restaurants that make our food supply available to all of us. It is important to remember that this entire industry and all its parts must continue to operate at all times in order for us to survive. Any significant disruption could have disastrous consequences.
Closely related to food is water. Humans can survive longer without food than without water. The availability of water involves another massive industry as well as favorable weather. When we turn on a faucet and water appears, it is well to remember what has gone into that daily miracle.
The moral of these reflections is that 1) we are all radically dependent on the proper functioning of extremely complicated and expensive sources and supply chains for the very fundamentals of our existence, and 2) the survival of the human race depends on factors which are mostly beyond our control.
Among other things, these essentials remind us that everything depends on people working, pandemic or no pandemic.
The subject of “work” brings up another consideration: buildings may not be as universally essential as we thought. Our housing, to be sure, is essential. If we never thought about that before the “shelter in place” mandate appeared, staying home for three or four months certainly showed us the importance of our homes.
For many, however, the experience also demonstrated that the office is not essential to work. We have been forced to discover that, thanks to modern communication technology, much of the work we do can be as easily performed at home as in an office. So, offices do not appear on every list of essentials.
But work really is essential. We have discovered what we always knew: our work is what keeps us going and defines our place in this society, and, if we are not satisfied with the way things are, it provides alternatives for us to test and follow. Work is also critical for society as a whole because it sustains all those complex supply chains. Combined, they are the “economy” which is followed so closely by the news – and Wall Street.
Another essential which has been forced to the front of our attention by the pandemic is our family. In many cases, parents who work hard in often stressful circumstances have re-discovered the importance and the joys of marriage and parenthood by staying home for extended periods. They have become re-acquainted with their spouse and children, and spouses and children have in turn made their own discoveries.
Fathers especially sometimes become almost mythical figures to children who see them only for short periods, often in a disciplinary circumstance. The rest of the time their father is talked about but not there. Getting to know each other better is beneficial to all.
Hygiene is another subject which has drawn more attention in the last few months than in the last few years. We have been told ad nauseam how to wash our hands and sterilize every surface in sight. Like it or not, cleanliness – of person and environment – has become a new essential.
Shopping, restaurants, sports events and sports teams have fallen to lower priorities. All are missed – acutely by some – but there are other ways to get exercise and to prepare and consume food and drink, ways that involve much less risk of contracting disease.
Among the essentials most missed, however, are social events and interactions with other people. Some have discovered that the absence of crowds and gatherings is so important that being deprived has led to depression or worse. Others – often a significant number – have decided to seek communal activities, whether parties or protest marches, in spite of advice and even prohibitions to the contrary. To them, a full social life is essential, damn the consequences!
Just some contemplative thoughts (while working at home!).
On Thursday, the nation learned that second-quarter U.S. gross domestic product was down by a third, the biggest one-quarter drop on record. It’s an astounding measure of just how deeply the coronavirus lockdowns imposed by the governors, combined with their reluctance to reopen their states for business, have affected life in America.
If these were normal times, the story would lead everywhere. The news channels would have economists on all day talking about what it means, not just for President Donald J. Trump’s re-election prospects, but for the health of the dollar and the status of the recovery. Faced with numbers like that, the national conversation should be about whether America will ever regain its leadership role in the global economy. Instead, everyone is talking about whether the November 3 election can, or should, be postponed.
Welcome to Trumpworld, where the president of the United States lives rent-free inside the heads of nearly every talk show host, newspaper editor, political reporter, pundit and Democratic officeholder. They’re obsessed with him and he, over his almost four years in office, has become expert at pushing their buttons, as he did with a tweet on Thursday morning.
Almost immediately, and it’s hard to believe the president didn’t intend for this to happen, the airwaves and the Internet were full of conversations and prognostications about the election being put off, usually with the spin attached that Trump, by proposing it, was only putting off his inevitable defeat. What this kind of commentary misses is that he still hasn’t lost his ability to turn the conversation in any direction he wants, at any given time. Once the campaign starts in earnest, which will probably happen as soon as Joe Biden is locked in as the Democratic nominee, things will get really ugly, really fast.
Right now, the polls show Biden in the lead. They should. The nation has been through crisis after crisis, most of them not of Mr. Trump’s making but nonetheless, because he is the president, his responsibility. But Mr. Biden, who has spent all but the last four years of his adult life in one elective office or another, is still an unknown quantity to most voters. They don’t know him, and they don’t know what he’s going to do if he’s elected except that he’ll be different.
Now, that might be welcome. When the voters figure out the progressives running the Black Lives Matter Movement and the various Soros-funded groups and the other fringe elements of the Democratic Party are actually in charge of Biden’s campaign and will be in charge of the White House, things may change.
That doesn’t mean Mr. Trump will win. It just means the race is probably a lot closer than the polls show, and that it will get even closer before votes are cast. And, referring again to the president’s Thursday tweet, the concerns he voiced about the various vote-by-mail schemes proposed for the fall have some validity to them. Through the primary season, there have been reports of ballots going missing or to the wrong place, of multiple ballots going to addresses where an intended recipient no longer lived, and more. A national election, especially one as apparently consequential as this one, is not the time for a “make things up as we go along” experiment with the voting process.
Once things get going, expect Mr. Biden to throw mud at Mr. Trump, and the president to respond in kind. It’s unlikely either campaign will do much to show why its guy is the better candidate, and that’s a mistake. As any election expert will tell you, you must give people a reason to vote for you, not just against the opposition. Republicans after Reagan seemed to understand that better than the Democrats did, at least until very recently. Now, some of the Republican national leadership seems content to say, “If you don’t want socialism to come to America, vote for us. Don’t let us become the next Venezuela.”
It’s a sentiment many people share, but it’s not enough. “Yuck, Trump” might get Mr. Biden close, but it won’t bring him home to victory. There has to be more, and it needs to be heard not just by the base and the people who have already made up their minds, but by independents, undecideds and the disaffected in each party. The candidate who figures that out, not first but best, and who best explains how to get back that lost third of U.S. gross domestic product, probably wins.
In late spring, oil prices dipped below zero for the first time ever. Futures contracts for May delivery traded as low as negative $37 a barrel, as producers and speculators paid refineries and storage facilities to take excess crude off their hands.
In some sense, this historic moment was inevitable. Oil markets are completely saturated. Worldwide coronavirus lockdowns have depressed energy demand. And in March, Saudi Arabia and Russia announced they would increase production, thus exacerbating the glut.
President Trump has tried to help beleaguered U.S. producers. He recently mediated a deal between Saudi Arabia, Russia, and other major oil producers, who collectively agreed to cut production by nearly 10 million barrels a day.
But prices are still falling. And now, the White House is toying with other ways to prop up U.S. oil producers, ranging from tariffs on imported oil to direct cash payments to energy companies.
This desire to help energy companies, and the millions of workers they employ, is commendable — but ultimately counterproductive. In the long run, the industry will emerge stronger if the White House allows the free market to resolve this crisis.
This pandemic-induced economic crisis is going to be painful for the energy sector. Cost-cutting and layoffs are already underway.
But the industry is strong and adaptive, and has bounced back from past crises by investing in technology. In fact, economic pressure encourages the kind of innovation and belt-tightening that helps companies thrive in the long run.
The United States last faced low oil prices in 2014 and 2015, when Saudi Arabia ramped up output to try to cripple U.S. producers that specialized in fracking — a technique used to extract oil from underground shale rock. By early 2016, prices had dropped below $30 a barrel, well below what U.S. shale producers needed to break even.
The government didn’t come to the rescue, which forced frackers to get creative. They researched how to extract more oil for less, and came up with a variety of new techniques, like drilling several wells simultaneously and using drones to detect faulty equipment. As a result, the average break-even price for frackers dropped from $69 a barrel in 2014 to an average of $40 a barrel by 2017. Had the government tried to solve the problem by slapping tariffs on Saudi crude, the U.S. oil industry likely would have never set its all-time production record of 13.1 million barrels a day in February.
We can be confident the U.S. energy industry will apply its ingenuity to this crisis, too — because these days, it excels at invention. In 2019, the oil and gas sector increased adoption of digital technologies, including cloud data storage and new software. Over the next five years, digitizing could slash the cost of oil production by almost 10 percent.
By using sensor technology — tiny, data-tracking devices attached to oil-field gear — producer ConocoPhillips recently cut in half the amount of time it took to drill new wells in South Texas. Other companies are using data analytics to search for the best drilling locations.
In short, the pressures of a downturn are likely to encourage even more future-focused transformation. The industry doesn’t need to hide behind tariffs. If we trust the free market to encourage creativity, in the long run, we’ll all benefit from a cheaper and more efficient energy supply.
There’s a lot riding on whether the nation’s children go back to school in the fall. The restoration of the economy. The ability of many parents to return to work. The safety and continued education of our kids. All of that, one way or another, is contingent on the return to things as they were before COVID hit.
The science says it’s safe if reasonable precautions are taken. Even the Centers for Disease Control and Prevention and the American Academy of Pediatrics are, to one degree or another, on board. Keeping kids out of school might be more harmful, say the experts, than letting them attend.
Leading the fight against the return to normalcy is the usual cast of characters, many of whom oppose a normal school year simply because President Donald J. Trump and Education Secretary Betsy DeVos want it. That’s a reflexive response, hardly meaningful, as these are the same people who’d probably try to give up breathing if Trump said it was good for you.
Teachers and their unions are also resisting. You would have thought they’d be anxious to get back to work, especially since the science shows it is in the best interest of the children. But no, they’re on the frontlines arguing against any proposal that doesn’t at least cut back on the time that will be spent in the public-school classroom.
Some are going further. In Washington, D.C., where bad decisions by local politicians have caused the novel coronavirus to hit especially hard, public school teachers this week briefly lined up “body bags” outside the city’s administrative offices to pressure Mayor Muriel Bowser to keep the government-run schools closed.
It’s not in the kids’ best interests to do that. Yet the teachers’ unions, which are the first to proclaim themselves guardians of that sacred trust anytime something like a tax increase to fund education comes up, are leading the charge to keep schools closed, and then some. A coalition of unions, including those representing teachers in Chicago, Boston, Los Angeles, St. Paul, Milwaukee, Racine, Little Rock, and Oakland, has assembled a list of demands that is at best self-serving and, as they say, “non-negotiable.”
They won’t come back to work, they say, “until the scientific data supports it.” Which it does, even if they won’t acknowledge it. Also on the list is “police free schools,” a “moratorium on new charter or voucher programs and standardized testing,” a “massive infusion of federal money to support the reopening funded by taxing billionaires and Wall Street,” “Support for our communities and families, including (a) moratorium on evictions/foreclosures, providing direct cash assistance to those not able to work or who are unemployed, and other critical social needs,” and “All schools must be supported to function as community schools with adequate numbers of counselors and nurses and community/parent outreach workers.”
There may be a couple more, but you should understand their intent by now. The unions representing these teachers want to end any chance students, especially those in the inner cities, might have at a better education and a better quality of life than they knew growing up. They would do it by putting an end to accountability and to the competition posed by charter schools.
We shouldn’t be funding these people with our tax dollars. We should be doing education differently, starting with what we pay for. We should be funding learning instead of schools and children instead of teachers. What we’re doing now doesn’t work unless you’re a politician who backs things as they are because you get political support for doing so.
Thomas Sowell, the great economist and public intellectual who has long been a leader in the fight for education reform, once said, “Propagandists in the classroom are a luxury that the poor can afford least of all. While a mastery of mathematics and English can be a ticket out of poverty, a highly cultivated sense of grievance and resentment is not.” Yet that’s what we’re seeing in the demands the teachers’ unions and their coalition partners are making before they’re willing to let the schools reopen. They’re showing us they’re not in it for the kids as they claim. They’re in it for themselves and they’ve finally, because of the COVID crisis, exposed themselves for what they are.
Let me begin by stipulating that I do not consider myself an authority on the future of higher education. I have been too long absent from the field to have insights derived from recent experience. I retain, nevertheless, a keen interest in the topic. Following are some thoughts about what I would like the higher education of the future to look like.
Who is served by higher education?
Fundamentally, higher education, like all socialization, serves both the greater society and the individual: society, by increasing its cadre of specialized experts who maintain and advance society’s technology and life experience; the individual, by further defining and securing his/her role in society.
Humans are herd animals. We are born with the need to belong to a group of our fellow humans. Sociologists describe those groups as family, clan and tribe, depending on the size and intimacy of the group. “Family” is composed of those we are closest to and is the smallest of the groups. “Clan” denotes a larger, less intimate group, such as our cultural or religious or political associations. “Tribe” is the largest and least intimate of our associations, but equally important to the individual’s well-being, including nation, language, and history.
All humans are also curious. Our search for new knowledge and understanding never ceases, although the range and perspective of inquiry varies considerably from individual to individual, often from time to time for the same person over a lifetime.
Within this framework, higher (and all) education primarily serves the tribe by expanding the individual’s scope and perspective of inquiry or curiosity. Life itself is constantly providing the same service, but in a random and unpredictable fashion. Education is supposed to provide perspective and order to the individual’s ability to interpret these experiences in a meaningful context.
What are the criteria for evaluating whether or not higher education is providing a valuable experience?
The criteria are easy to identify, if difficult to evaluate. They are: Does higher education fulfill its obligation to society? And to the individual?
Higher education’s obligations to the greater society are twofold: cultural and technological. The knowledge and skills pertaining to an expansion of the individual’s understanding of his/her culture include the history, language and ideals of the society in which one lives. The second criterion is the same obligation in the realm of the society’s technology base, in the broader sense of “technology”, namely the “techniques” by which the society copes with the various challenges of its existence: food, heat, light, communication, transportation, lodging, water, to name a few of the obvious. The technology requirement presumes specialization in some aspect of these social needs.
Higher education’s obligation to enhance the individual’s well-being and success in his/her society includes more personal knowledge and skills. Included here are topics such as religion, a practical understanding of how society is organized and functions, how government works, problem-solving skills such as logic, research, and distinguishing factual from false data, and appreciation of the arts, including painting, architecture, music, and the like.
These are frequently areas of controversy. How to deal with dissent, to weave one’s own way through the thicket of varying opinions, false claims and disputed facts, represents a valuable but elusive skill which should be part of every college experience.
We have now set the stage for a discussion of the future of higher education:
Higher education exists to serve society and the individual by expanding his/her knowledge and skills in both the cultural and technological realms, along with the more personal competencies described above.
In this manner, higher education seeks to expand the individual’s success in “life, liberty and the pursuit of happiness”.
I have always been intrigued by the concept of “Individual Educational Plans” (IEP), defined as “a written plan/program [which]… specifies the student’s academic goals and the method to obtain these goals.” Originally signed by President Reagan in 1986 and enhanced periodically since, the IEP is required for all handicapped children.
What if IEPs were specified for ALL children? The practical implementation of such an idea was beyond our capabilities until the digital age arrived in education. Unfortunately, computers have been confined to two areas of education: teaching content (a problematic application) and administration. They have not been used extensively for the application to which they could be most fruitfully applied, namely the implementation of complex scheduling. Elsewhere I have designed the way in which computers could be used to implement IEPs for ALL children. Needless to say, I was ahead of my time (where I spent most of my later years in education!).
However, I believe such a plan could now be implemented for higher education with today’s technology. After all, we were able to execute a form of this pedagogy in the 1970s, before computers were even introduced, as I explain in the accompanying essay (see “The Fiddler and Me” attached).
The system would draw heavily from several sources: the Oxford University tutorial method of instruction, computer-based scheduling (which I helped introduce in my post-Crown Center career with Control Data Corporation) and doctoral degree programs, as well as the credit-for-experience, Portfolio Plan, which I pioneered in Kansas City’s Crown Center campus (details in accompanying essay). A very significant addition would be the computer-based courseware now available as well as the internet with its nearly unlimited research resources.
Briefly, the system would look like this:
Development of his/her IEP based on each student’s individual interests and guided by a personal academic advisor. “What do you know now? (Portfolio optional) What don’t you know now that you would like to know? How will you acquire that expertise? How will we measure what you have learned? (Thesis required.)” Content could be achieved at the student’s discretion by seminar, tutorial or digitally. Benchmark endorsements from faculty required.
Many details are left undeveloped here because of space limitations. However, I hope this vision will be achieved somewhere down the road as higher education continues to evolve.
The left calls for racial quotas in the name of progress
The American dream is that any citizen, regardless of sex, race, creed, or color, can rise on his determination and merit. History is littered with examples of the reformers who worked to realize that dream, pushing the most influential institutions in the country to prize talent and hard work over wealth and connections.
The introduction of standardized testing, accessible to all American teens, was part of that push. Harvard University began administering a standardized test to all applicants in 1905. Its effect was profound and immediate: historically a landing spot for the Protestant upper crust, the school began admitting far more public school kids, Catholics, and Jews.
The increasing number of Jewish students was a major concern for Harvard president and committed progressive A. Lawrence Lowell. He tried to implement a quota on Jews, then pivoted to an admissions process that used intangible factors such as “character” and “manliness.” It worked: Jewish applicants consistently fell short.
These sorts of hazy, intangible assessments are now championed by the left. In the name of racial equality, the woke now seek to dismantle meritocratic norms and return to the quota systems that practices like standardized testing were designed to relegate to the trash heap of history.
In a lawsuit likely headed for the Supreme Court, hundreds of would-be Asian admittees allege that Harvard caps their numbers with quotas based on “personality”—an eerie echo of Lowell’s method for keeping out Jews.
The New York Times’s classical music critic, Anthony Tommasini, is calling for the end of the blind symphony audition, which drove a tripling of women’s representation in the field, so that conductors can make race-based selections. The University of Connecticut School of Medicine, where merit is literally a matter of life or death, recently suspended admissions to its honor society because the GPA-based admissions criterion did not produce an honor society that, as Bill Clinton said, “looked like America.”
The SAT—which measures intellect better and more fairly than do intangible heuristics—is under fire. University of California president and former Obama official Janet Napolitano has joined the chorus of administrators at elite universities who complain that race-blind admissions aren’t producing the desired results.
Those calling for “progress” usually want to forfeit someone else’s job. Tommasini is a white man, as are all his listed colleagues at the Times’s “music” section. So are the L.A. Times’s Mark Swed and Washington Post music critic Michael Brodeur, who recently penned a news report about classical music’s “long overdue reckoning with racism.”
All are curiously quiet on the “racism” of their clique. None seem ready to give up their own position for indigenous or trans critics, who surely exist! Surely they are waiting somewhere for the call from the New York Times that their turn has come, merit be damned!
As the Times’s own Ross Douthat noted, those who stand to benefit most from this new attitude are the rich and powerful, who will be free to clear the way for their underachieving kids—the Varsity Blues scandal, legitimated by wokeness.
The new war on merit is the same as the old, and it marks regression rather than progress. It’s straight out of Lowell’s playbook: In the name of “equality,” tear down the only system we have that gives the talented a shot over the powerful.
We have to ignore the alarmists and get back to work
One of the ongoing controversies in recent days is the dispute over which should be the nation’s top priority: economic recovery or pandemic precautions? Both positions are framed in the same terms: no recovery will be successful if everybody is afraid of catching the virus; likewise, drastic prevention measures, if continued, will bring on the worst economic disaster since the Great Depression of the 1930s. The answer is that both positions are essentially correct.
We cannot afford either of these outcomes. Common sense tells us that we must resume full economic recovery as soon as possible, but we ignore the frightful prospect of an unchecked pandemic at our own peril. Each consideration has its own imperative: we must resume economic activity at its fullest capacity as soon as possible, and we must take all reasonable precautions at the same time.
So, the key question is: what are reasonable measures for protecting ourselves as a society?
The first answer to this question is what we should not do. We should not trust the public health officials’ solution to this problem. They speak from a very limited perspective, namely, the optimal methods for avoiding the disease altogether. Obviously, the surest way to avoid the disease is to cease all human contact entirely — “shelter in place”. There are several economic activities which can be executed alone, thanks to the internet and the telephone, such as writing, meeting, accounting, record-keeping, reporting, selling (some items), etc. The surge of some sectors of the economy, such as mail order and delivery services, shows the enhanced value of such activities.
Starting from avoiding all human contact as the best protection for individuals — which even public health experts realize is not doable for most people — the next step is simulating “personal quarantine”. Thus “social distancing” and masks. This practice is marginally practical, meaning it can be done successfully by people engaged in some economic activities, such as counseling and lecturing.
Most economic activities, however, require closer contact. Therein lies the problem. Since most manufacturing and service industries are not compatible with “social distancing”, and since the nation cannot survive economically without these major sources of income, and, further, since the pandemic is not going away any time soon – in view of all these factors, another solution has to be forthcoming.
What is that solution? It seems clear that the solution is to carry on our economic life, using as many precautions as are feasible, but not to the extent of continuing to suspend significant activities which do not lend themselves to such precautions. For example, the practice of taking the temperature of all entrants to a building and requiring masks to be worn while inside — as is already done in more and more venues — can be adopted by far more businesses, perhaps even on a mass scale, such as at ball games. Yes, it increases the cost of doing business, but that is better than no business at all. Imagination and creativity will be needed to cope with these issues. But those are characteristic attributes of Americans. The new question needs to be “How?” not “If.”
And how do we regain the confidence of the American public? How do we answer the inevitable charge that we are putting money ahead of saving lives?
The first thing we do is to stop measuring the success or failure of our efforts to contain the virus by the number of cases identified. This number is bound to increase as more and more people are tested every day. The proper metric is the death rate due to the virus. Even with the sloppy counting being used, the rate of COVID-19 deaths is actually going down. For example, the ratio of deaths to cases reported for July 11 was 1.3%. (Source: Johns Hopkins CSSE) Longer-term reports are equally encouraging.
What accounts for this statistic? In general, there are several reasons for this progress:
1) Therapeutics are increasingly effective – both human competence, which has improved with experience, and new medicines which have been developed specifically to treat COVID-19. Treatment can be expected only to improve with further development on both fronts. Also, vaccines are due to start becoming available by the end of 2020.
2) Hospitals are getting more efficient in their protocols and procedures. The driving concern behind the Administration’s early preparatory efforts was the fear of overwhelming the hospital capacity of the United States. While this is still a possibility on a local level, occupancy is currently under control.
3) As younger people start to constitute a larger percentage of the total tested population, mortality rates are expected to continue to decline, because the virus appears to be less lethal for the young. In fact, many youngsters who have been infected never suffer any symptoms at all. Indeed, their primary danger as a group seems to be their unwitting role as carriers of the disease to older contacts.
In general, America is learning to live with COVID-19 and to survive. It is now time to begin to flourish as we were before we were so rudely interrupted.
The brand of all cultural revolutions is untruth about the past and present in order to control the future. Why we have let this happen to our country is the only mystery left.
The current revolution is based on a series of lies, misrepresentations, and distortions, whose weight will soon sink it.
Unfortunately few in authority have been more wrong, and yet more self-righteously wrong, than the esteemed Dr. Anthony Fauci. Given his long service as the director of the National Institute of Allergy and Infectious Diseases and his stature during the AIDS crisis, he has rightly been held up by the media as the gold standard of coronavirus information. The media has constructed Fauci as a constant corrective of Trump’s supposed “lies” about the utility of travel bans, analogies with a bad flu year, and logical endorsement of hydroxychloroquine as a “what do you have to lose” possible therapy.
But the omnipresent Fauci himself unfortunately has now lost credibility. The reason is that he has offered authoritative advice about facts, which either were not known or could not have been known at the time of his declarations.
Since January, Fauci has variously advised the nation both that the coronavirus was unlikely to cause a major health crisis in the United States and later that it might yet kill 240,000 Americans. In January, he praised China for its transparent handling of the coronavirus epidemic; not much later, he conceded that perhaps it had done a poor job of that. He has cautioned that the virus poses both low risks and, later, high risks for Americans. Wearing masks, Fauci warned, was of little utility and yet, later, essential. Hydroxychloroquine, he huffed, had little utility; when studies showed that it did, he still kept mostly silent.
At various times, he emphasized that social distancing and avoiding optional activities were mandatory, but earlier that blind dating and going on cruise ships were permissible. Fauci weighed in on the inadvisability of restarting businesses prematurely, but he has displayed less certainty about the millions of demonstrators and rioters in the streets for a month violating quarantines. The point is not that he is human like all of us, but that in each of these cases he asserted such contradictions with near-divine certainty—and further confused the public in extremis.
In terms of how the United States “fared,” it is simply untrue that Europe embraced superior social policies in containing the virus. The only somewhat reliable assessment of viral lethality is COVID-19 deaths relative to population, although the death counts themselves are often in dispute.
By such rubrics, the United States, so far, has fared better than most of the major European countries—France, Italy, the United Kingdom, Spain, Sweden, and Belgium—in terms of deaths per million. Germany is the one major exception. But if blame is to be allotted to public officials for the United States having a higher fatality rate than Germany, then the cause is most likely governors of high-death, Eastern Seaboard states—New York, New Jersey, Massachusetts, and Connecticut in particular. They either sent the infected into rest homes, or failed early on to ensure that their mass transit systems were sanitized daily and that social distancing was practiced on them.
New York Governor Andrew Cuomo, more than any other regional or national leader, is culpable for decisions that doomed thousands of elderly patients. He did not just suggest that long-term-care facilities receive active COVID-19 patients; he ordered them to take them—knowing at the time that the disease in its lethal manifestations targeted the elderly, infirm, and bedridden.
Then in shameful fashion, after thousands died, Cuomo claimed that either the facilities themselves or Donald Trump were responsible for the deaths. In truth, in the United States, the coronavirus is largely a fatal disease in two senses: the vulnerable in just four states on the Eastern Seaboard that account for about 12 percent of the nation’s population but close to half of its total COVID-19 fatalities, and/or patients in rest homes or those over 65 years old with comorbidities.
Why are there currently spikes in cases among young people in warmer states and those of less population density in late June? No one is certain. But one likely reason is that millions of protestors for nearly a month crammed the nation’s cities, suburbs, and towns, shouting and screaming without masks, violating social distancing, and often without observant hand washing and sanitizing—most often with official exemption or media and political approval.
The period of exposure and incubation is over, and the resulting new cases—for the most part asymptomatic and clustered among the young—are thus no surprise. Still, what is inconvenient is the rise in these cases—given that the Left either had claimed its mass demonstrations would not spread the disease, or, if they would, that the resulting contagion was an affordable price to pay for the protests’ cry of the heart.
Perhaps, but the real cost of four weeks of protesting, rioting, and looting was to undermine the authority of state officials to punish blatant violations of the quarantine. Obviously, if some can march with impunity in phalanxes of screaming, shoulder-to-shoulder protestors, while others are jailed as individuals trying to restart a business, then the state has lost its credibility with the people, and they will simply ignore further edicts as they see fit. Now what adjudicates quarantines are the people’s own calibrations of their own safety.
Mismanagement of the virus? There have been four disastrous official policy decisions: sending patients into rest homes; allowing millions en masse for political reasons to violate state mandates on masks and social distancing; retroactively attempting to reissue quarantine standards that their advocates and authors had themselves earlier de facto destroyed; and consistently issuing pandemic alerts solely on the flawed basis of new positive cases, without distinguishing those who were asymptomatic, or who were infected and recovered without ever being tested, or who were asymptomatic and tested positive for antibodies, or who were only briefly ill, recovered, and by no means still a case-patient.
Black Lives Matter, Antifa, and other revolutionary groups hijacked the tragic death of George Floyd. Within hours they created a mythology of rampant white police lethal attacks on innocent black victims. But that trope, too, was without a factual basis.
Wrongful deaths of unarmed African-Americans in custody have been declining; they number far fewer than the police officers murdered per year, fewer than the white suspects killed, and, as a percentage of those arrested by police, proportionally fewer than for other racial groups.
In rare interracial violence, blacks are five times more likely to attack whites than vice versa. There is a tragic war against young, black males—over 7,000 murdered per year—but it is an urban genocide of sorts, perpetrated in liberal cities, governed by liberal mayors and governors, and overseen by liberal police chiefs. The shooters are overwhelmingly other black males.
Somehow those facts were distorted by the Left into a trope that George Floyd was typical of an epidemic of white-generated lethal racial hatred. One can certainly argue that systemic racism is a factor in all these asymmetries, but that is not what the rioters and their apologists have done in trafficking in accusations that have no data to support them.
There is no logic to statue toppling, name changing, or culture canceling other than the quest to assert power, humiliate authorities, and create crises where they do not exist in order to manufacture a faux state of emergency—in service of a political agenda. In some sense, whether any statues fall is contingent entirely on the lack of resistance.
We know this because the ignorant rioters and protestors cannot explain why monuments to Ulysses S. Grant, Cervantes, black Civil War veterans, or Abraham Lincoln need to be toppled and destroyed as much as a statue of Robert E. Lee. We are not told why the Woodrow Wilson School at Princeton is canceled out, but not the Wilson Center in Washington, or why a memorial to President Washington is targeted for defacement but not the hit play, “Hamilton,” about another founder who at one time owned slaves. And what or who, if any, exactly is to replace our fallen luminaries? Name the most iconic—Martin Luther King Jr., Malcolm X, or Che Guevara—and the current rules of perfection would disqualify them all.
The abettors of the madness—corporations, the Democratic National Committee, universities, and the media—are not so mad. Yale, named for a slave owner, is now mostly a brand name, not a certification of a first-class, disinterested, and classically liberal education.
Take the elite stamp away, and what replaces it might as well be an online degree mill—given that it is no longer so demonstrable that a Yale graduate learned more in his four years than did a graduate of Cal State Stanislaus.
So university presidents at Princeton, Yale, Stanford, and Columbia know that by the standards of BLM their brand names must be changed. But to do so is synonymous with multi-billion-dollar losses and the destruction of centuries-old brands. Perhaps that is why they pander to the mob the way a Roman would-be emperor outbid rivals seeking to win over the Praetorian Guard.
The truth is that the COVID-19 epidemic, the lockdown, and the rioting were seen by the Left, the media, and now the Democratic Party as a renewed effort in this election year to do what Robert Mueller, Ukraine, and impeachment had not—abort the presidency of Donald Trump, or make it impossible for him to be reelected.
So Trump was to be reconfigured as a racist responsible for the death of George Floyd. Then he was smeared as a Herbert Hoover who supposedly crashed the economy all on his own. And then he became a Typhoid Mary purveyor of death who sickened and killed tens of thousands of Americans at his rallies in a way millions at left-wing protests did not.
To that end, almost daily, entire fantasies were birthed, floated, crashed, and then were replaced by new hoaxes. The strategy was that while one lie might be refuted, the bigger and more numerous the lies, the more a continuous narrative could be fabricated.
Consequently, over the last two weeks, we were told in succession by the media that a noose was left in a NASCAR garage as a racist threat to NASCAR’s only major African-American driver, typical of Trump’s racist America; that Donald Trump, in dejection and self-incrimination, was soon to quit rather than face the humiliation of a landslide defeat in November; that the president knowingly rejected intelligence that the Russians were paying bounties on American soldiers in Afghanistan, as part of his obeisance to Vladimir Putin; and that Trump went to Mount Rushmore to honor racist presidents and dishonor sacred Native American land.
All were not just lies, but respectively unimaginative and banal successors to similarly long ago discredited lies—the Jussie Smollett hoax, the “Trump never wished to be president in the first place” hoax, the Russian “collusion” hoax, and the hoax that Trump’s presence turns once esteemed monuments that prior presidents, most recently Barack Obama, visited into racist dog whistles.
Then there was the monstrous lie that Joe Biden has no cognitive disabilities. That he does was the consensus of one in five polled Democratic voters, of many of his own primary rivals in numerous Democratic debates, of handlers who bragged that his basement quarantine need not end because it resulted in him outpolling Trump, of a scramble to turn the vice-presidential nomination into a veritable presidential bid, and in a litany of gaffes, blank outs, and tragic memory lapses of familiar names, places, and common referents.
Biden finally came out of his bunker to do some tele-fundraising and talk to a few preselected reporters. He almost immediately blasted a reporter as a “lying dog face.” In one of his next appearances, his opening statement began with “I am Joe Biden’s husband,” even as the liberal media insisted “Joe” was “Jill.” There is now a Biden-inspired cottage industry of arguing that what Biden is recorded as saying is not what he was saying—on the theory that he so poorly pronounces words that they can become almost anything you wish.
What is cruel is cynically using a cognitively challenged candidate for the purpose of winning an election and then replacing him with a far-left vice president who otherwise likely would never have been elected.
FDR and the Democratic Party did something similar in his successful fourth-term bid in 1944 because of FDR’s anticipated early death in office—but in matters of hiding physical rather than cognitive impairment. Moreover, at least that dishonest gambit was undertaken in order to prevent a socialist takeover of the United States by jettisoning the hard leftist, Vice President Henry Wallace.
In 2020, the effort is not to ensure that a socialist not be appointed president who otherwise would not have been elected, but rather to ensure that she will be.
The brand of all cultural revolutions is untruth about the past and present in order to control the future. Why we have let this happen to our country is the only mystery left.
To know what the Chinese are really up to, read the futuristic novels of Liu Cixin.
“We are in the foothills of a Cold War.” Those were the words of Henry Kissinger when I interviewed him at the Bloomberg New Economy Forum in Beijing last November.
The observation in itself was not wholly startling. It had seemed obvious to me since early last year that a new Cold War — between the U.S. and China — had begun. This insight wasn’t just based on interviews with elder statesmen. Counterintuitive as it may seem, I had picked up the idea from binge-reading Chinese science fiction.
First, the history. What had started out in early 2018 as a trade war over tariffs and intellectual property theft had by the end of the year metamorphosed into a technology war over the global dominance of the Chinese company Huawei Technologies Co. in 5G network telecommunications; an ideological confrontation in response to Beijing’s treatment of the Uighur minority in China’s Xinjiang region and the pro-democracy protesters in Hong Kong; and an escalation of old frictions over Taiwan and the South China Sea.
Nevertheless, for Kissinger, of all people, to acknowledge that we were in the opening phase of Cold War II was remarkable.
Since his first secret visit to Beijing in 1971, Kissinger has been the master-builder of that policy of U.S.-Chinese engagement which, for 45 years, was a leitmotif of U.S. foreign policy. It fundamentally altered the balance of power at the mid-point of the Cold War, to the disadvantage of the Soviet Union. It created the geopolitical conditions for China’s industrial revolution, the biggest and fastest in history. And it led, after China’s accession to the World Trade Organization, to that extraordinary financial symbiosis which Moritz Schularick and I christened “Chimerica” in 2007.
How did relations between Beijing and Washington sour so quickly that even Kissinger now speaks of Cold War?
The conventional answer to that question is that President Donald Trump has swung like a wrecking ball into the “liberal international order” and that Cold War II is only one of the adverse consequences of his “America First” strategy.
Yet that view attaches too much importance to the change in U.S. foreign policy since 2016, and not enough to the change in Chinese foreign policy that came four years earlier, when Xi Jinping became general secretary of the Chinese Communist Party. Future historians will discern that the decline and fall of Chimerica began in the wake of the global financial crisis, as a new Chinese leader drew the conclusion that there was no longer any need to hide the light of China’s ambition under the bushel that Deng Xiaoping had famously recommended.
When Middle America voted for Trump four years ago, it was partly a backlash against the asymmetric payoffs of engagement and its economic corollary, globalization. Not only had the economic benefits of Chimerica gone disproportionately to China, not only had its costs been borne disproportionately by working-class Americans, but now those same Americans saw that their elected leaders in Washington had acted as midwives at the birth of a new strategic superpower — a challenger for global predominance even more formidable, because economically stronger, than the Soviet Union.
It is not only Kissinger who recognizes that the relationship with Beijing has soured. Orville Schell, another long-time believer in engagement, recently conceded that the approach had foundered “because of the CCP’s deep ambivalence about the way engaging in a truly meaningful way might lead to demands for more reform and change and its ultimate demise.”
Conservative critics of engagement, meanwhile, are eager to dance on its grave, urging that the People’s Republic be economically “quarantined,” its role in global supply chains drastically reduced. There is a spring in the step of the more Sinophobic members of the Trump administration, notably Secretary of State Mike Pompeo, deputy National Security Adviser Matt Pottinger and trade adviser Peter Navarro. For the past three and a half years they have been arguing that the single most important thing about Trump’s presidency was that he had changed the course of U.S. policy towards China, a shift from engagement to competition spelled out in the 2017 National Security Strategy. The events of 2020 would seem to have vindicated them.
The Covid-19 pandemic has done more than intensify Cold War II. It has revealed its existence to those who last year doubted it. The Chinese Communist Party caused this disaster — first by covering up how dangerous the new virus SARS-CoV-2 was, then by delaying the measures that might have prevented its worldwide spread.
Yet now China wants to claim the credit for saving the world from the crisis it caused. Liberally exporting cheap and not wholly reliable ventilators, testing kits and face masks, the Chinese government has sought to snatch victory from the jaws of a defeat it inflicted. The deputy director of the Chinese Foreign Ministry’s information department has gone so far as to endorse a conspiracy theory that the coronavirus originated in the U.S. and retweet an article claiming that an American team had brought the virus with them when they participated in the World Military Games in Wuhan last October.
Just as implausible are Chinese claims that the U.S. is somehow behind the recurrent waves of pro-democracy protest in Hong Kong. The current confrontation over the former British colony’s status is unambiguously Made in China. As Pompeo has said, the new National Security Law Beijing imposed on Hong Kong last Tuesday effectively “destroys” the territory’s semi-autonomy and tears up the 1984 Sino-British joint declaration, which guaranteed that Hong Kong would retain its own legal system for 50 years after its handover to the People’s Republic in 1997.
In this context, it is not really surprising that American public sentiment towards China has become markedly more hawkish since 2017, especially among older voters. China is one of the few subjects these days about which there is a genuine bipartisan consensus. It is a sign of the times that Democratic presidential candidate Joe Biden’s campaign clearly intends to portray their man as more hawkish on China than Trump. (Former National Security Adviser John Bolton’s new memoir is grist to their mill.) On Hong Kong, Nancy Pelosi, the Democratic speaker of the House, is every bit as indignant as Pompeo.
I have argued that this new Cold War is both inevitable and desirable, not least because it has jolted the U.S. out of complacency and into an earnest effort not to be surpassed by China in artificial intelligence, quantum computing and other strategically crucial technologies. Yet there remains, in academia especially, significant resistance to my view that we should stop worrying and learn to love Cold War II.
At a forum last week on World Order after Covid-19, organized by the Kissinger Center for Global Affairs at Johns Hopkins University, a clear majority of speakers warned of the perils of a new Cold War.
Eric Schmidt, the former chairman of Google, argued instead for a “rivalry-partnership” model of “coop-etition,” in which the two nations would at once compete and cooperate in the way that Samsung and Apple have done for years.
Harvard’s Graham Allison, the author of the bestselling “Destined for War: Can America and China Escape Thucydides’s Trap?”, agreed, giving as another example the 11th-century “frenmity” between the Song Emperor of China and the Liao kingdom on China’s northern border. The pandemic, Allison argued, has made “incandescent the impossibility of identifying China clearly as either foe or friend. Rivalry-partnership may sound complicated, but life is complicated.”
“The establishment of a productive and predictable US/China relationship,” wrote John Lipsky, formerly of the International Monetary Fund, “is a sine qua non for strengthening the institutions of global governance.” The last Cold War had cast a “shadow of a global holocaust for decades,” observed James Steinberg, a former deputy secretary of state. “What can be done to create a context to limit the rivalry and create space for cooperation?”
Elizabeth Economy, my colleague at the Hoover Institution, had an answer: “The United States and China could … partner to address a global challenge,” namely climate change. Tom Wright of the Brookings Institution took a similar line: “Focusing only on great power competition while ignoring the need for cooperation will not actually give the United States an enduring strategic advantage over China.”
All this sounds eminently reasonable, apart from one thing. The Chinese Communist Party isn’t Samsung, much less the Liao kingdom. Rather — as was true in Cold War I, when (especially after 1968) academics tended to be doves rather than hawks — today’s proponents of “rivalry-partnership” are overlooking the possibility that the Chinese aren’t interested in being frenemies. They know full well this is a Cold War, because they started it.
To be sure, there are also Chinese scholars who lament the passing of engagement. The economist Yu Yongding recently joined Kevin Gallagher of Boston University to argue for reconciliation between Washington and Beijing. Yet that is no longer the official view in Beijing. When I first began talking publicly about Cold War II at conferences last year, I was surprised that no Chinese delegates contradicted me. In September, I asked one of them — the Chinese head of a major international institution — why that was. “Because I agree with you!” he replied with a smile.
As a visiting professor at Tsinghua University in Beijing, I have seen for myself the ideological turning of the tide under Xi. Academics who study taboo subjects such as the Cultural Revolution find themselves subject to investigations or worse. Those who take a more combative stance toward the West get promoted.
Yan Xuetong, dean of the Institute of International Relations at Tsinghua, recently argued that Cold War II, unlike Cold War I, will be a purely technological competition, without proxy wars and nuclear brinkmanship. Yao Yang, dean of the National School of Development at Peking University, was equally candid in an interview with the Beijing Cultural Review, published on April 28.
“To a certain degree we already find ourselves in the situation of a New Cold War,” he said. “There are two basic reasons for this. The first is the need for Western politicians to play the blame game” about the origins of the pandemic. “The next thing,” he added, “is that now Westerners want to make this into a ‘systems’ question, saying that the reason that China could carry out such drastic control measures [in Hubei province] is because China is not a democratic society, and this is where the power and capacity to do this came from.”
This, however, is weak beer compared with the hard stuff regularly served up on Twitter by the pack leader of the “wolf warrior” diplomats, Zhao Lijian. “The Hong Kong Autonomy Act passed by the US Senate is nothing but a piece of scrap paper,” he tweeted on Monday, in response to the congressional retaliation against China’s new Hong Kong security law. By his standards, this was understatement.
The tone of the official Chinese communiqué released after Pompeo’s June 17 meeting in Hawaii with Yang Jiechi, the director of the Communist Party’s Office of Foreign Affairs, was vintage Cold War. On the persecution of the Uighurs, for example, it called on “the US side to respect China’s counter-terrorism and de-radicalization efforts, stop applying double standards on counter-terrorism issues, and stop using Xinjiang-related issues as a pretext to interfere in China’s internal affairs.”
And this old shrillness, so reminiscent of the Mao Zedong era, is not reserved for the U.S. alone. The Chinese government lashes out at any country that has the temerity to criticize it, from Australia — “gum stuck to the bottom of China’s shoe” according to the editor of the Party-controlled Global Times — to India to the U.K.
Those who hope to revive engagement, or at least establish frenmity with Beijing, underestimate the influence of Wang Huning, a member since 2017 of the Standing Committee of the Politburo, the most powerful body in China, and Xi’s most influential adviser. Back in August 1988, Wang spent six months in the U.S. as a visiting scholar, traveling to more than 30 cities and nearly 20 universities. His account of that trip, “America against America,” (published in 1991) is a critique — in places scathing — of American democracy, capitalism and culture (racial division features prominently in the third chapter).
Yet the book that has done the most to educate me about how China views America and the world today is, as I said, not a political text, but a work of science fiction. “The Dark Forest” was Liu Cixin’s 2008 sequel to the hugely successful “Three-Body Problem.” It would be hard to overstate Liu’s influence in contemporary China: He is revered by the Shenzhen and Hangzhou tech companies, and was officially endorsed as one of the faces of 21st-century Chinese creativity by none other than … Wang Huning.
“The Dark Forest,” which continues the story of the invasion of Earth by the ruthless and technologically superior Trisolarans, introduces Liu’s three axioms of “cosmic sociology.”
First, “Survival is the primary need of civilization.” Second, “Civilization continuously grows and expands, but the total matter in the universe remains constant.” Third, “chains of suspicion” and the risk of a “technological explosion” in another civilization mean that in space there can only be the law of the jungle. In the words of the book’s hero, Luo Ji:
The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost … trying to tread without sound … The hunter has to be careful, because everywhere in the forest are stealthy hunters like him. If he finds other life — another hunter, an angel or a demon, a delicate infant or a tottering old man, a fairy or a demigod — there’s only one thing he can do: open fire and eliminate them. In this forest, hell is other people … any life that exposes its own existence will be swiftly wiped out.
Kissinger is often thought of (in my view, wrongly) as the supreme American exponent of Realpolitik. But this is something much harsher than realism. This is intergalactic Darwinism.
Of course, you may say, it’s just sci-fi. Yes, but “The Dark Forest” gives us an insight into something we think too little about: how Xi’s China thinks. It’s not up to us whether or not we have a Cold War with China, if China has already declared Cold War on us.
Not only are we already in the foothills of that new Cold War; those foothills are also impenetrably covered in a dark forest of China’s devising.
Editor’s Note: This is an edited excerpt, comprising the Introduction and Conclusion, from a longer essay by Mr. Atlas. Titled “The Costs Of Regulation And Centralization In Health Care,” it is published by the Hoover Institution as part of a new initiative, “Socialism and Free-Market Capitalism: The Human Prosperity Project.”
The overall goal of US health care reform is to broaden access for all Americans to high-quality medical care at lower cost. In response to a large uninsured population and increasing health care costs, the Affordable Care Act (ACA, or “Obamacare”) aimed first and foremost to increase the percentage of Americans with health insurance. It did so by broadening government insurance eligibility, adding extensive regulations and subsidies to health care delivery and payment, and imposing dozens of new taxes. The ACA was projected to spend approximately $2 trillion over the first decade on its two central components: expanding government insurance and subsidizing heavily regulated private insurance.
Through its extensive regulations on private insurance, including coverage mandates, payout requirements, co-payment limits, premium subsidies, and restrictions on medical savings accounts, the ACA counterproductively encouraged more widespread adoption of bloated insurance and furthered the construct that insurance should minimize out-of-pocket payment for all medical care. Patients in such plans do not perceive themselves as paying for these services, and neither do physicians and other providers. Because patients have little incentive to consider value, prices as well as quality indicators, such as doctor qualifications or hospital experience, remain invisible, and providers do not need to compete. The natural results are overuse of health care services and unrestrained costs.
In response to the failures of the ACA, superimposed on decades of misguided incentives in the system and the considerable health care challenges facing the country, US voters at the time of this writing are being presented with two fundamentally different visions of health care reform: (1) a single-payer, government-centralized system, including Medicare for All, the extreme model of government regulation and authority over health care and insurance, which is intended to broaden health care availability to everyone while eliminating patient concern for price; or (2) a competitive, consumer-driven system based on removing regulations that shield patients from considering price, increasing competition among providers, and empowering patients with control of the money. This model is intended to incentivize patients to consider price and value, in order to reduce the costs of medical care while enhancing its value, thereby providing broader availability of high-quality care.
Outside a discussion of the role of private versus public health insurance are two realities. First, America’s main government insurance programs, Medicare and Medicaid, are already unsustainable without reforms. The 2019 Medicare Trustees report projects that the Hospitalization Insurance Trust Fund will face depletion in 2026. Most hospitals, nursing facilities, and in-home providers lose money per Medicare patient. The Centers for Medicare and Medicaid Services already warn of the closure of hospitals and care provider practices because government insurance continues to pay for services below the cost of delivering them. Regardless of trust fund depletion, Medicare and Medicaid must compete with other spending in the federal budget. America’s national health expenditures now total more than $3.8 trillion per year, or 17.8 percent of gross domestic product (GDP), and they are projected to reach 19.4 percent of GDP by 2027. In 1965, at the start of Medicare, workers paying taxes for the program numbered 4.6 per beneficiary; that number will decline to 2.3 in 2030 with the aging of the baby boomer generation. Unless the current system is reformed, federal expenditures for health care and social security are projected to consume all federal revenues by 2049, eliminating the capacity for national defense, interest on the national debt, or any other domestic program.
Second, beyond the growing burden from lifestyle-induced diseases, including obesity and smoking, that will require medical care at an unprecedented level, America’s aging population means more heart disease, cancer, stroke, and dementia—diseases that depend most on specialists, complex technology, and innovative drugs for diagnosis and treatment. The current trajectory of the system is fiscally unsustainable, and millions are already excluded from the excellence of America’s medical care.
In most nations, heavy regulation of the supply of health care goods and services is coupled with marked centralization of payment for medical care. The United States has a far less centralized but still highly regulated system in which health expenditures are roughly equal from public and private insurance. The system is characterized by its unique private components: more than 200 million Americans, including most seniors on Medicare, use private insurance. The US system is the world’s most effective by literature-based, objective measures of access, quality, and innovation, but US health care demands reform. Health care costs are high and increasing, and the projected demand for medical care by an aging population and the future burden of lifestyle-related disease threaten the sustainability of the system.
Although the regulatory expansion under the Affordable Care Act reduced the uninsured population, it generated increased private insurance premiums, a withdrawal of insurers from the market, and sector-wide consolidation that is historically associated with higher prices and reduced choices of medical care. In its wake, American voters are now presented with two fundamentally different visions for reform that have a diametrically opposed reliance on regulation and centralization: (1) the Democrats’ single-payer proposals, including Medicare-for-All, based on the most extreme level of government regulation and authority over health care and health insurance; or (2) the Trump administration’s consumer-driven system, which relies on strategic deregulation to increase market-based competition among providers and to empower patients with control of the money. Both pathways are intended to contain overall expenditures on health care and broaden access.
Intuitively, a single-payer model of health care represents a simplification, but the reality is that such centralized systems impose overwhelming restrictions on both demand and supply. Government-centralized single-payer systems actively hold down health care expenditures mainly by sweeping restrictions on the utilization and payment for medical procedures, drugs, and technology under the single authority of the central government. The overall costs of this false simplification are enormous, creating societal costs that extend beyond calculated tax payments that are required to support such a system.
The alternative approach involves rule elimination and decentralization, that is, strategic deregulation, to induce competition for value-seeking patients. Reducing the price of health care by competition, instead of more regulation, generates lower insurance premiums, reduces outlays from government programs, and broadens access to quality care. Broadly available options for cheaper, high-deductible coverage less burdened by regulations; markedly expanded health savings accounts; and tax reforms to unleash consumer power are keys to achieving price sensitivity for health care. Reforms to increase the supply of medical care by breaking down long-standing anti-consumer barriers to competition, such as archaic certificates-of-need for technology, unnecessary state-based licensure of physicians, and overly regulated pathways to drug development, while facilitating transparency of price and quality among doctors and hospitals, would generate further competition and reduce the price of health care. Preliminary results from such deregulatory actions are promising and offer an evidence-based context for the broader discussion of the role and reach of government regulation in socialism compared with free-market systems.
Predicting the speed and strength of the United States' recovery from the current recession is extremely difficult. But what is clear is that policymakers must boost incentives to work in normal times when jobs are plentiful, while strengthening the safety net for when they are not and for those who are unable to work.
STANFORD – Like most of the world, the United States is attempting to overcome both the COVID-19 pandemic and a deep recession caused by the resulting government-ordered shutdown. At annual rates, the US economy shrank by 5% in the first quarter of 2020, and in the second quarter just ending, it could contract by 40% – the steepest decline since the Great Depression.
Moreover, tens of millions of workers have lost their jobs, causing the unemployment rate to soar to a post-Great Depression high of 14.7% in April. And although 70% of those laid off say they expect to be recalled to their jobs, not all will be, because many firms will fold, relocate, or reorganize.
True, the initial reopening of the economy has led to a sharp rebound that is projected to continue in the third quarter. Employment rose by 2.5 million in May, while high-frequency data from credit cards and mobility tracking for May and June show sizable bounce backs from April lows, with activity in a few sectors approaching or even exceeding year-earlier levels.
But the rebound varies by sector and region. Although Big Tech, home-improvement suppliers, and retail sales of alcoholic beverages have flourished, travel and leisure have collapsed and will take much longer to recover. And restaurants with drive-through service have fared much better than those able to serve only indoors.
Most forecasters therefore predict that the early “V-shaped” recovery will slow over the next few quarters, and instead come to resemble the Nike swoosh. But this plausible baseline forecast is subject to greater than normal uncertainty.
For starters, the shutdown of non-essential businesses in response to the pandemic led to a demand-side shock as well. So far, trillions of dollars in business grants and loans, cash payments to households, and unemployment insurance with federal bonus payments (enabling two-thirds of eligible workers to receive benefits that exceed their lost earnings) have provided a cushion to help the economy recover. The US Federal Reserve has pledged to keep its target interest rate near zero until the economy returns to full employment, and it continues to expand the scope of its asset purchases. And a fourth fiscal package expected next month should focus on reopening the economy, including by limiting firms’ legal liability and redirecting bonus payments to encourage employees to return to work.
How quickly the US recovers from its public-health and economic crises will also depend on how well other countries handle them, and vice versa. The World Bank expects 93% of countries to slide into recession in 2020, the highest share ever.
Although the recent spikes in new COVID-19 cases and hospitalizations in the US appear manageable for now, given adequate provision of hospital beds and equipment, a significant worsening could trigger new shutdowns or stall further reopening. That would slow the recovery, resulting in economic despair and related health and social problems for many Americans.
Moreover, America’s twin crises have revealed longer-term problems, starting with the country’s inadequate stockpiles of medical supplies. California, for example, never maintained the supplies then-Governor Arnold Schwarzenegger built up to combat the 2002-03 SARS epidemic, and had to repair hundreds of defective ventilators. And state governments’ antiquated computer systems for processing unemployment claims and dispensing benefits buckled under the pandemic-induced strain.
In addition, the COVID-19 shock has shown that too many individuals and firms lack the financial margin to weather even a few months of lost income or revenue. It has also both highlighted and worsened racial disparities in health, income, and vulnerability to economic and health shocks.
These crises elicited massive, rapid, and unprecedented interventionist responses. But government responses enacted under exigent circumstances must control costs better and restore private incentives in the longer term, because history shows that, once launched, public programs and interventions seldom end.
The economic and health recoveries also heavily depend on the actions of businesses, citizens, and schools, including whether they adhere to recommended precautions such as social distancing, frequent hand washing, and wearing face masks. It remains to be seen whether firms can survive with restrictions on employees and customers, and whether the accelerated digital transformation will be a net plus. The other danger, of course, is a large second wave of the virus that overwhelms hospitals and scares away employees, students, and customers.
One bright spot has been the rapid pace of adaptive innovation. Most US schools quickly continued teaching online following the shutdown, while telemedicine has boomed, helped by the relaxation of government pay restrictions and rules prohibiting inter-state medical consultations. And medical researchers quickly refocused on COVID-19 testing, therapeutics, and vaccines: human trials have started for several promising vaccines, and new tests may be deployed before winter. For the first time, vaccine production capacity will be ramped up simultaneously with testing, so that any safe and effective vaccine that emerges will become available far more quickly.
But the longer-term problems revealed by the pandemic and the recession will not disappear when these crises end. True, before COVID-19 struck, things finally had started looking up for lower-income workers. Minority unemployment was at an all-time low, and wages were rising most rapidly at the bottom of the pay scale. But while strong economic growth will be needed to ensure that these trends resume, there are pockets of people who have been left behind.
To address this requires reinvigorating policies to broaden school choice, bring private jobs and capital to depressed areas, and ensure better job training (including more apprenticeships and job matching), as well as taking a new approach to overlapping means-tested anti-poverty programs. US welfare recipients face extremely high implicit marginal tax rates in terms of the benefits they lose if they work, with many standing to earn less if they worked than if they remained on the several overlapping programs.
It is extremely difficult to predict the speed and strength of the US economic recovery with any certainty. What is clear, however, is that we must boost incentives to work in normal times when jobs are plentiful, while strengthening the safety net for when they are not and for those who are unable to work.
It’s been decades since the Democrats settled on a presidential nominee as weak as former Vice President Joe Biden. He’s not popular in his own party. In Tuesday’s Kentucky presidential primary, for example, he only won about 60 percent of the vote—and he’s already clinched the nomination.
Party leaders should be worried about this. It’s not as though the only Democrats to show up in the Bluegrass State on Tuesday were the fringes and the freaks. The suddenly competitive race between former congressional candidate Amy McGrath and State Senator Charles Booker for the nomination to go against Senate Majority Leader Mitch McConnell (R-KY) in the fall brought out Democrats of all stripes all over the state. Much of the party faithful, it’s clear, just doesn’t like the idea of a Biden presidency.
If things were any worse, the talk of replacing him at the top of the ticket might be at a fever pitch by now. Instead, while he hides in his basement—and perhaps because he does—Biden is ahead in every national poll, just about every poll in just about every swing state and is preferred by most voters to Donald Trump on every issue except the handling of the economy. And even if the polls are suspect, as Trump’s team and many Republicans say they are, the Democrats must be pleased with the sentiments those polls reveal.
For all intents and purposes, it is a strange election indeed. Which makes the decision to play the “Obama card” so early in the process curious.
Obama is the big gun. He’s Mr. Charisma. He’s the face of the Democratic Party everybody loves. Usually, you’d hold someone like him back for the fall election and then work him nearly to death, sending him to every targeted state, time and again, on the party’s behalf. If Hillary Clinton had been able to do that with husband Bill in 2016, she might have won, but—for reasons already discussed ad nauseam—Republicans like Roger Stone kept him pretty much on the sidelines. Yet rather than hold Obama in reserve to move the voters they need to win late in the election, the Biden people rolled him out earlier in order to raise money.
It makes some sense. Biden may be leading the polls, but he’s trailed way behind Trump in fundraising. That, oddly enough, may end up being what makes the difference in November. The free campaign being waged by the pundits, political reporters and news channels on Biden’s behalf has made the election a referendum on Trump.
That’s a hard race for anyone to win, let alone the current president—especially given the cynical nature of most American voters. No contemporary politician except possibly Ronald Reagan—who won 49 of 50 states in 1984—could run against his own record and win. No one is that good. No one is that beloved. Even Barack Obama, unlike Bill Clinton, got fewer votes running for re-election than he did in 2008.
If it were up to the people who establish the national campaign narratives in the newspapers and on TV screens, the campaign would remain a referendum on Trump. But they can’t control that. The president needs to change the conversation and make the election a choice between competing visions of what America should be—and force voters to make that choice.
Team Trump can do it. The campaign has more money in the bank than any of its predecessors, money that’s being used to establish communication channels all over social media. The campaign is even doing original programming to counter what’s airing on the networks. It’s a great leap forward, and Biden alone can’t raise the money to match. So in that sense, playing the Obama card now makes sense. The former vice president’s campaign needs the kind of money only someone of the former president’s stature can raise right now.
It may also be that Obama, while of great value to candidates down-ballot in the general election, won’t be able to help Biden on the stump much at all. The former president will always overshadow the would-be future president at every joint appearance. Appearing on his own, he’s a constant reminder to every Democrat and independent of how charisma- and vision-challenged Biden actually is.
Keeping Biden in the basement may not have been intended as a strategy, at first. It may have just been a response to the COVID-19 lockdowns. But now it looks like a blessing in disguise. The voters, at least right now, are showing a decisive preference for the candidate they can’t see over the one they can. It’s not clear if that is sustainable, but the Democrats will try to keep it going for as long as they can. Unfortunately for the down-ballot races, that may mean keeping Obama under wraps, too. They can’t risk giving Trump anything to play off other than himself.