The Case for Trump’s War Is the Case for Bush’s War

“It’s not 2003.” So say some fervent Donald Trump supporters who are desperate to distinguish the U.S. attack on Iran from the U.S. invasion of Iraq 23 years ago. And since it’s not 2003, “this is not a time for neocons to be spiking the football.” So said Heritage Foundation President Kevin Roberts and, by saying so, inadvertently made the case that the “neocons” he takes to have been responsible for the Iraq War do have excellent reason to spike the football over Iran. The Trump of Roberts’s imagination would never do anything like what the “neocons” of his wild and convenient imaginings cooked up for Iraq. Except that Trump just did.

The United States is now at war with a country whose leaders have been gathering mobs to chant “Death to America” since the Islamic revolution of 1979 and have made good on it ever since by killing Americans within reach of Iran’s power whenever practical. Let us set aside for now analysis of the curious need of so many to wrench Trump’s bold and surprising decision into alignment with their historically uninformed quest for Marvel-style neocon villains. While we’re at it, let’s set aside consideration of those on the other side of the aisle whose desire for Trump to fail seems more powerful than their desire for their country to succeed. We’re at war, and if it’s too much to ask that the nation unify around the American cause, that’s sad—but such are the times in which we live, and fortunately for all of us, the fight with Iran is now in the fearsome hands of the U.S. military.

In truth, it is 2003 again. History rhymes. An American president has had to decide, on the basis of information he has at hand, how to cope with a grave threat to American interests and values. And again, a president has chosen war. Then, it was Iraq. Now, it is Iran. The real surprise is that, geopolitically, the Iraq example has turned out to be a good one for Trump to follow. The dubious origin of that war a generation ago and the epic failure of American-style liberal values to take root in that country have obscured significant aspects of the positive outcome of the conflict. And confusion about the security issues that faced the United States then and that are facing it now is distorting public and elite perceptions of the Iran problem Trump has taken the country to war to address.

When a recent poll of American historians ranked the best and worst American foreign policy decisions of all time, it was a foregone conclusion that going to war in Iraq in 2003 would rank as the worst. And rank it the worst they duly did. Yet consider: 58,000 Americans died in Vietnam, and the American military nearly collapsed in its aftermath; estimates of direct civilian deaths in Vietnam ran as high as 2 million and nearly double that regionally; the United States lost Indochina to Communism, and Saigon fell ignominiously. Contrast that with Iraq, where we lost 4,500 killed, after which the U.S. military emerged more capable than ever; where estimates of direct civilian deaths run as high as 200,000, a tenth of the toll in Vietnam. We tried withdrawing in 2011 but had to go back after the emergence of ISIS, which killed perhaps 100,000 civilians before we and our allies destroyed it and saved Iraq and the world from the deadliest terrorist innovation of the 2010s. But there you are. It is difficult to talk about Iraq in polite company—on both left and right—in any way other than with acknowledgments of how disastrous, how calamitous, how ruinous it was.

And we all know why.

To review, the Bush administration erroneously believed that Saddam Hussein possessed large stocks of chemical and other weapons—and that he still harbored an intention to develop nuclear weapons, an intention that dated back to the construction of the Osirak nuclear reactor, which Israel destroyed in 1981. We should not waste time on those who claim the Bush administration was consciously lying when it led us into war. This groundless slander actually works to obscure a complex truth. Leaders can make decisions only on the basis of the information they have at the time. But events that follow those decisions compel us to pass judgment on them in light of additional information gleaned in their wake. “If you knew then what you know now” is an inevitable question in retrospect—but it is meaningless when it comes to real-time decision-making. For Bush, in the post-9/11 context, Saddam’s supposed capabilities and ambitions made him too dangerous going forward not to confront and depose.

The second Iraq problem was the failure to anticipate—and once it was underway, to acknowledge—the gathering insurgency inside the country working in opposition to the U.S. occupation and its efforts to install a democratic government. Vice President Dick Cheney said in 2005 the insurgency was in its “last throes.” Unfortunately, it was not, and the U.S. military was increasingly vexed by its inability to solve the lethal problem of improvised explosive devices on roadbeds. Those bombs accounted for about half of all American casualties.

Like many, I supported going to war in Iraq in 2003 on the grounds that Saddam, his weapons stocks, and his ambitions posed an unacceptable risk. Had we known then that the WMD fears were the product of American and allied intelligence failures, which Saddam could have dispelled but chose not to, most of us would have supported the continuation of the Clinton-administration policy of slapping Saddam back even as he probed the determination of the West to hold fast to the limitations on his actions and choices to which he had agreed as a condition of ending the 1991 U.S.-led war against him. But we would have worried about the long-term viability of keeping him in that box. Support at the UN Security Council for the sanctions regime had begun to erode only a few years after it had been imposed. After 9/11, it was also all too easy to imagine him making common cause with, or being an active participant in, the new and indeterminate terrorist threat to the United States (and of course there were those who believed he had been involved in some way with the attack). He certainly had more than sufficient motive.

At the time, many of us embraced the view that America and its allies would be liberating Iraq from Saddam, a vicious tyrant, and, once liberated, that America would have a responsibility to try to establish a decent government for Iraq’s people to replace the malevolent one we took down.

Critics on both left and right have claimed ever since that we went to war for the misbegotten purpose of bringing democracy to Iraq. This view—in part a result of the foolhardy ex post facto utopianism that flowed from the feckless pens of the Bush White House’s talented but overenthusiastic rhetoricians—gets the sequence of events wrong. In the absence of serious security concerns about the Saddam regime, there would have been no war and hence no “democracy promotion.” If there is no decision to topple the regime first, there are no questions about what to replace that regime with. Some hoped the Middle East was ready for a wave of change from autocracy to liberalization and democracy. It wasn’t. But the war aim of the military power the United States deployed was first to oust Saddam, not to democratize the region and the world.

The decision to go to war would have been forever vindicated had the U.S. military indeed turned up large stockpiles of chemical and biological weapons—though some might disagree in light of the ensuing insurgency and the cost it inflicted on our troops. In policy circles, a significant number of the war’s initial supporters were ready for a withdrawal by mid-decade as U.S. casualties mounted. They had no stomach by 2007 for Bush’s counterinsurgency “surge.” Yet the surge—in conjunction with the “Anbar awakening,” in which Sunni sheikhs starting in mid-2006 turned against the insurgents of al-Qaeda in Iraq and allied instead with U.S. forces—was a clear success by summer 2008. And the American combat mission ended two years later.

If you are George W. Bush, and you took the country to war on the basis of a mistake on the scale of the Iraq WMD intelligence failure, you cannot expect the judgment of history to be other than negative—even though you can honestly claim you made the decision on the basis of what you considered the best information available at the time. At the same time, this negative retrospective judgment offers no real counsel to presidents and policymakers assessing future dangers and making decisions about them. They, too, will have to make incredibly hard choices without perfect information. Trump just did.

And here is the key point when it comes to reassessing our fight in Iraq. Saying history’s judgment of a decision is negative is not the same as saying that nothing positive came of the decision. In Iraq, the United States sought militarily to establish with certainty that Saddam Hussein would no longer be a factor in global politics and that Iraq would have no chemical and biological weapons or the ambition or prospect of developing a nuclear weapon. To return to a notorious phrase, “mission accomplished.” Saddam was out, Iraq was free of WMD, and in the years since the surge ended the war, the government in Iraq has posed no threat of any kind to anyone outside its borders. It is a functioning state, though it is shot through with corruption and tribal tangles and internal squabbling that have so far prevented it from securing a bright future. But it’s off the map when it comes to geopolitical hot spots—following a 30-year period during which Iraq was one of the most destabilizing forces for evil on the globe.

It’s impossible to say what would have happened if the United States had left Saddam in place. Clearly, in the short run, he did not pose a threat as a supplier of dangerous weapons to terrorist actors, as we had feared, because he didn’t actually have those weapons. But that is something we came to know only as a result of toppling the regime; the perceived threat would still have been a huge preoccupation for American and Western leaders. That’s not enough to justify the war, but it adds to an honest understanding of why the war came to be.

Saddam in the longer run would have been an entirely different matter. Having played a malevolent role in Iraq and regionally for decades, he would certainly have sought to continue in it to the extent possible. In a grab for oil, he invaded Iran in 1980. He used chemical weapons extensively from 1983 to 1988 in the Iran–Iraq war, and he used them against his own Kurdish population during his Anfal campaign of 1988. In another grab for oil, he invaded and conquered Kuwait in 1990, and he threatened to use chemical weapons (but didn’t) against the U.S.-led coalition that ejected him in 1991. During that first Gulf War, he also launched dozens of Scud missiles against Israel and civilian targets in Saudi Arabia; because he had used chemical weapons before, Israelis spent the war donning gas masks in case he had loaded chemical warheads onto those missiles. The fact that, by 2003, Saddam had no stores of chemical or biological weapons was unknown to anyone but himself and whatever “inner circle” he had. The question, which had taken on a new coloration in the aftermath of 9/11, was whether he was too dangerous to ignore, especially if the Security Council allowed the sanctions imposed on him to lapse, giving him more resources.

Saddam was 65 years old when Baghdad fell to the United States. The problems he posed in international politics might have persisted for decades had Baghdad remained in his hands. Instead, those specific problems ended with his regime and his death by hanging three years later. Good riddance.

The Iraq War also had implications beyond Iraq. It was intended in part to scare others out of pursuing and possessing nuclear, chemical, and biological weapons. Under some circumstances, the United States has the military capability to prevent hostile states from acquiring especially dangerous military capabilities. And the question of whether such an adversary would actually use those capabilities never arises if the state never acquires them.

Did this added deterrent dimension of the Iraq War work? There’s evidence to suggest it did.

At the end of 2003—that is, after the fall of the Saddam regime—Libya’s Muammar Qaddafi made the decision to abandon his nuclear, chemical, and biological weapons programs as well as long-range missile development. International inspectors verified their termination in 2004.

Syria also had a nuclear program underway when war broke out. Construction had begun in 2001 on a facility at al-Kibar modeled on a reactor in North Korea that could produce enough plutonium for one or two nuclear weapons per year. Syria insisted the project was not a nuclear reactor at all but a conventional military facility. When reports of a contract with Russia to build a reactor in Syria surfaced in February 2003, during the buildup before the U.S. attack on Iraq, both Russia and Syria hastily denied any such arrangement—likely an indication of newfound caution in the wake of America’s declared determination to intervene as nuclear threats gathered rather than wait until it was too late. As the al-Kibar facility neared completion in 2007, Israel bombed and destroyed it. Subsequent inspections found incontrovertible evidence of al-Kibar’s nuclear nature. After its destruction, Syria’s nuclear ambitions went dark, perhaps in keeping with a sense of the heightened risk of proceeding. (Syria did use chemical weapons against its own people in 2013, crossing what Barack Obama had declared a “red line” requiring intervention—a line from which he hastily retreated when the test came.)

Meanwhile, Iran’s nuclear-weapons-development program underwent a shift in 2003. The mullahs dispersed its elements and moved it underground. A 2007 U.S. National Intelligence Estimate found that 2003 was a turning point—a conclusion subsequently confirmed by the release in 2008 of internal Iranian documents by the chief inspector of the International Atomic Energy Agency. Those documents offered information about Iran’s pre-2003 “Project Amad,” a detailed plan to develop nuclear weapons and configure them as missile warheads. In 2018, Israel’s Mossad seized another cache of nuclear records further describing the Amad project’s ambition to produce a small arsenal by the early 2000s. That didn’t happen. In short, while Iran by no means gave up its nuclear ambitions and programs after the U.S. took down Saddam’s regime, Iran’s leaders understood that their pursuits entailed greater risk in light of Bush’s determination to deal with the nuclear threats before they took full root. The shifts they felt they had to make likely slowed their progress.

Then there is the case of North Korea. The Kim regime’s pursuit of a nuclear arsenal had been underway for decades by 9/11, and Pyongyang was getting very close. The United States made that clear in 2002, when Washington openly announced we had been played for suckers—that a decade of Western bribes (the 1994 Agreed Framework) paid to North Korea to prevent nuclearization had failed. Given that fact, and the fact that in 2003, the United States had successfully ousted Saddam, North Korea did the opposite of Libya. It rushed ahead, and by 2005 openly announced it had achieved nuclear-weapons capability—then conducted a successful underground test in 2006. The Bush administration did not act. It had its hands full with Iraq. But there was a unique feature of the situation on the Korean peninsula: the thousands of artillery pieces and rockets the North has kept at the ready for decades, trained on South Korea’s capital, Seoul, which is less than 40 miles from North Korean territory. The United States was thus conventionally deterred from military action to halt or slow North Korea’s nuclear-weapons program.

The prospect of conventional weapons deterring the United States from attacking an aspiring nuclear-weapons state is a good vantage point from which to return to the Iran of 2026. Iran’s ability in the days following the U.S.-Israeli attack to fire off barrages of missiles and drones is an indication of where the problem of the Iranian nuclear-weapons program was headed: in the direction of a conventional Iranian deterrent to the ability of the United States or Israel to do anything about it. Both Secretary of State Marco Rubio and Prime Minister Benjamin Netanyahu specifically said that Israel had determined it had to strike when it did because Iran’s increasing conventional short-range missile capacity would soon make such an attack too dangerous—the North Korea problem.

The United States had considerably damaged the Iranian nuclear-weapons program with its June 2025 attack, in conjunction with Israel, on Fordow and other nuclear facilities. And the United States and Israel could perhaps have continued to strike as necessary while Iran built replacement facilities over time. But not indefinitely if Iranian conventional capabilities continued to increase rapidly. As of late 2025, the path to an Iranian nuclear arsenal no longer ran underground but through the skies, in the form of missiles and drones.

That Iran is pursuing nuclear weapons is not in doubt—on the strength of vastly more evidence than the intelligence case against Saddam Hussein. Iran’s single-mindedness in its quest is unique in international politics. Its threats to wipe out Israel have been nonstop, and they extend to the “Great Satan,” the United States. Through its direct and indirect actions against U.S. interests—whether supplying sophisticated roadside bombs to insurgents in Iraq or its support for numerous Middle East malefactors from Hamas to Hezbollah to the Houthis—the regime in Iran has conclusively demonstrated that it is as dangerous as a nonnuclear-weapons state can be, and there is every reason to doubt that an Iranian nuclear weapon would be useful to the regime only as a defensive deterrent. Given Iran’s embrace of Shiite millenarianism, it’s an open question whether the nuclear weapons Israel and the United States possess would deter Iranian use.

Trump has not been alone in saying Iran can’t have a nuclear weapon. The “international community” says so as well. But such declarations are largely performative in the absence of the power to back them up. This Trump commands. Once among the harshest critics of the decision to go to war in Iraq, Trump has found that the information he has at his disposal has obliged him to take the country to war over a threat from Iran—a threat that is analogous to, but far more serious than, the one George W. Bush perceived in Iraq.

No, “it’s not 2003.” It’s a generation later, and the problem of the worst weapons in the hands of the worst state actors persists. Iraq under Saddam Hussein was one. It is no longer. Donald Trump has made it clear he is determined to make sure the Islamic Republic follows the Baath regime into the dustbin of history. It’s likely that one person in America who is rooting him on, based on his own complex and rueful experience, is George W. Bush.

This article was originally published on March 17, 2026 in Commentary.

The Age of Trump: A Sobering Return to Reality

A decade after Donald Trump’s descent down an escalator in his New York City apartment building in 2015, it can no longer be denied, either by friend or foe, that we are living in the Age of Trump, and that his shadow will be cast over the first half of the 21st century for as long as historians write their chronicles. But what does this even mean? Trump makes it difficult to discern. We cannot tell what, for him, is a core conviction rather than a negotiating point. He pivots so rapidly between seemingly contradictory positions that his policy framework has become a Rorschach test for the various factions within his coalition. Nevertheless, as we enter the second decade of the Age of Trump, we can begin to define the fundamental values that are undergirding his administration, especially in the realm of foreign policy, even if it often seems as though Trump is allergic to any kind of core principle. But even that, if true, is a matter of values. It’s just a question of what he values and what he is willing to put on the line for it.

Trump began as a candidate in revolt against Democrats and Republicans and all the niceties and rituals that had been established to help mediate the spaces between the parties. Trump-era values are therefore, at least in part, a critique of the animating principles of the past—but how far back in the past?

The predecessor to the Age of Trump was the “post–Cold War era consensus,” and the critique Trump and his supporters make of it is, to put it mildly, robust. The collapse of the Soviet Union brought with it a generation of American hegemonic dominance across the globe that seemed, on balance, quite satisfactory to those involved in creating and perpetuating it. But it was unsatisfactory to Trump and many of those he represents. They rail against the consensus’s supposed preference for “endless wars” and against an economic order seen as favoring the interests of shareholders and great wealth over the concerns of the working class and Main Street.

But Trump’s doings and undoings are more than merely a reaction to the triumphalism of the period, including the notion that we had reached the “end of history.” The objections extend back to the basic elements of the post–World War II liberal order itself. Though this order was largely American in origin and a product of the unprecedented global dominance of the United States across all measures of power in the aftermath of World War II, for many it has become a euphemism for a system that allowed our allies a free ride on our defense dollar and entrenched trade rules that let foreign countries erect barriers to American-made products while the United States opened itself to a flood of imports grounded in cheap labor abroad. Even after the Cold War, the United States maintained a disproportionate security burden, while NATO allies shirked defense commitments to boost their domestic welfare programs. American-led interventions in Kuwait and the former Yugoslavia went off smoothly in the earliest post–Cold War years, but the failures in Iraq and Afghanistan created a crisis of confidence and fueled debates about American military presence abroad.

Meanwhile, the economic model that emerged at the end of the 1970s—with Margaret Thatcher’s ascendancy in the UK, the beginnings of U.S. deregulation in the late Carter administration, and finally the election of Ronald Reagan in 1980—is viewed with deep skepticism despite the fact that the American economy has grown sevenfold over the past four decades and remains the worldwide engine of innovation and productivity. The model, some in the Trump camp argue, led to American manufacturing moving offshore in pursuit of low-cost labor. That produced cheaper goods for American consumers but shuttered U.S. factories and thereby hollowed out middle- or working-class lifestyles across the country.

During that same time, they point out, American strength was being degraded from within. Progressive elites have grown increasingly committed to a worldview that rejects classical liberal and Judeo-Christian values in favor of a self-loathing disrespect toward Western heritage and culture. The very notion of “human rights” went from serving as an international bulwark against another Holocaust and a rallying cry against Communist totalitarian oppression to a weapon used to advance progressive policy preferences—from new forms of marriage to radical notions of gender identity, as well as twisted conceptions of “oppressed versus oppressors” used to justify or excuse anything from antiwhite bigotry to Pakistani grooming gangs in the UK to the heinous attacks of October 7. The values-based case for preserving the postwar liberal order rings hollow when Christians are arrested in the United Kingdom for praying silently outside abortion clinics at the same time that Islamists march freely down British streets chanting anti-Semitic and anti-Western hate, or when free speech is censored under the guise of fighting disinformation and “hate speech” as defined by leftist NGOs.

For these and other, less seemly reasons, more radical elements of the Trump coalition claim that anyone who speaks in favor of maintaining the “postwar foreign policy consensus” is just part of a shameful and entropic “uniparty”—members of a camp pushing for an international order determined to constrain U.S. freedom of action abroad and diminish American sovereignty in favor of the interests and values of a global and “globalist” class.

But even if, as its defenders argue, this order still manages to provide more benefits than any available alternative, it is hard to dispute that its returns have begun falling short relative to the investment of American blood and treasure. How did we manage to reach a point where the nation that established and has led this order is now seeing such diminishing returns? The answer lies in the underlying animating value at the heart of America’s grand strategy for the past century—and ultimately at the heart of the Age of Trump’s critique.

_____________

The United States has treated its role as a global superpower much differently than past hegemons treated theirs. For nearly a century, a fundamental assumption underpinning American grand strategy has been the belief that it was possible (and desirable) at some level to replicate on the international stage what the American experiment aims to do domestically—“to form a more perfect Union.”

For all of its very real triumphs, American foreign policy throughout much of the 20th century and into the 21st suffered from a misguided, idealistic hubris—certain that our American way was establishing the conditions for permanent peace and stability across the globe. It was within reach; we had only to pave the road. After we defeated existential threat after existential threat at significant cost, from Nazi Germany and Japan to the Soviet Bloc, our strategic priority in victory was not to secure our own sovereignty and enlightened self-interest but instead to look for ways to foster global cooperation and harmony. Rather than concentrating on identifying and preparing for the inevitable rise of the next great threat, our time and energy were spent trying to create a world in which new threats would not emerge.

Woodrow Wilson was the first to begin advancing this vision of a glorious future—believing that our World War I victory had created an opportunity to secure world peace by creating collective security arrangements grounded in binding multilateral commitments, with the aid of a new international body. Wilson hoped an elite expert class could help set international rules and standards to enable countries to transcend the messy notions of national interests and balances of power in the joint pursuit of the greater global good. Should any threat to this new order arise, each country was expected to jump to its defense, regardless of where the threat originated. Of course, Wilson and fellow idealists believed there would be little need for any such enforcement, because states would adhere to it, being rational actors who wanted good things. Wilson envisioned a self-sustaining order whose foundation lay in the power of institutions and law, rather than what we have come to call “hard power.”

In the end, Wilson’s vision was a resounding failure. Nations were not amenable to being told by an international bureaucratic elite working at his League of Nations what their interests should and should not be, nor were they interested in enforcing multilateral collective security commitments that did not take their concrete national interests into consideration. Wilson’s idealism was no match for the hard realities of power and conflict, and critics like Senate Majority Leader Henry Cabot Lodge were rightly skeptical of the proposition that open-ended universal commitments had an automatic claim on precious American blood and treasure.

Two decades later, as World War II was coming to its end, Franklin Roosevelt tried a different approach. Rather than trying to avoid the problem of national interests, Roosevelt bet that the victorious Allied powers would all see it was in their interest to maintain a stable, peaceful global order. Recognizing this required actual power, he came up with the “Four Policemen” idea, according to which four of the most powerful nations emerging from World War II—the United States, the Soviet Union, the United Kingdom, and China—would work together as enforcers, a concept later echoed in the formation of the United Nations Security Council.

The problem this time was that the United States and the Soviet Union had vastly different views on what that global order should look like, given their fundamentally incompatible ideologies and core values. It took the likes of Republican Senator Arthur Vandenberg—who as a young newspaper editor championed Lodge’s opposition to Wilson’s League of Nations idea—to find common ground with President Harry Truman in shifting American foreign policy to deal with the Soviets as the adversaries they were rather than the permanent allies Roosevelt naively hoped they could be.

By the time Francis Fukuyama put forth his “end of history” thesis in 1989, however, it did seem to many that this time could be different. With the United States emerging as the sole superpower, great-power competition seemed relegated to the past, and thanks to the triumph of democratic capitalism over Communism, it appeared there also was finally an answer as to how nations could organize their affairs in a universally satisfying manner, one capable of unlocking the full potential of the postwar liberal order.

Capitalism and free trade made it possible to envision the interests of nations playing out in a constant series of win-win interactions, fostering strong incentives for peace as a means of maintaining economic prosperity and encouraging the transition of the likes of Russia and China into liberal democracies and responsible global partners. And since, according to the “democratic peace” thesis, mature democratic states do not make war on each other, this would only further reinforce a permanent global peace. American post–Cold War strategy, then, was to ensure this progression continued apace. That it would happen was rarely questioned; the only real doubts were how quickly it would happen and how much work it would take to convince holdouts.

Yet just as with Wilson and Roosevelt, the post–Cold War promise of a universally accepted democratic capitalist system solidifying a permanent global peace came crashing down, in part due to the machinations of a radical Islamist terrorist and the 19 hijackers who brought the fantasy of universal democratic and Western consensus to a fiery end on a sunny September morning. The war that began in 2001 came to an ambiguous end two decades later with our pullout from Afghanistan, an event many think gave the Soviet Union’s dictatorial successor in Russia an implicit green light to launch a full-scale war of conquest on the European continent for the first time in nearly 80 years.

And then there’s China. For not only has the liberal order failed to meet the expectations of Trump and his supporters, but as was true in the aftermath of World War I and World War II, another great-power threat has emerged in Beijing from a nation with the desire and increasing capability to significantly harm our interests—ironically, and inexcusably, thanks in large part to our help.

In a misguided effort to push China toward political liberalization, the United States went to great lengths to bring China into the international economic system. But far from following the rules, China went to great lengths to cheat and steal to gain every economic and technological advantage possible. At the same time, it began conducting what is widely believed to be the largest peacetime military buildup in history, all while significantly ramping up information warfare and malign influence operations aimed at the United States and our allies. Underlying all of this is a desire not just to gain a competitive market advantage or achieve regional hegemony, but to recast the global order in Beijing’s favor—and to the detriment of American interests and values.

In spite of the best intentions of American politicians, history returned with a vengeance—great power confrontation in a fight for global dominance, wars of aggression, economic uncertainty, competition for critical resources. And with the return of history came the Age of Trump.

_____________

At its core, the Age of Trump’s foreign policy is in part a rebuke of the idea that history will end, that the universal principles that animate our nation will be universally accepted, and that peace and stability will be everlasting. The question is what to do about this reality. How should this dark and skeptical view inform American foreign policy and America’s place in the world going forward? And how, if at all, do our founding values fit into this future?

A small but vocal Trump faction seeks an Age of Trump that eliminates all vestiges of the postwar liberal order and looks instead to the isolationist, or at least anti-interventionist, spirit that existed prior to World War II. Given how history actually played out, it is easy to forget how strong that current of thought was. Even after having been attacked by Imperial Japan on December 7, 1941, with war in the Pacific a certainty, it was not clear until Hitler declared war on the United States a few days later that we would join the fight against Nazi Germany. From its foundation in 1940, the America First Committee, which claimed 850,000 members—and whose chairman, Robert Wood, was a former general and then-chairman of Sears, Roebuck and Co.—held large rallies against going to war in Europe. It dissolved the day Hitler declared war. The committee’s present heirs seem perfectly comfortable letting American power diminish if doing so furthers the cause of a new Age of Trump characterized by non-intervention.

However, there are two significant reasons why even attempting to make the Age of Trump an isolationism redux will fail. First, Trump’s own actions and policies have made clear at this point that, while he views nearly everything as negotiable, he is not an isolationist and is perfectly willing to use American power to intervene abroad in service of American interests. The narrative that the muscular foreign policy of his first term was just the product of secret Never Trumpers in his administration has been resoundingly crushed by his actions in the second term. One does not send stealth bombers to obliterate Iran’s nuclear facilities or conduct a major military operation to arrest Venezuela’s illegitimate dictator in his bed and bring him to trial in the United States on narco-terrorism charges if there is any squeamishness about the use of American power.

Second, and every bit as important given that Trump has just a few more years in power, his voters actually overwhelmingly reject a United States that has accepted decline and isolation. Polls consistently show that Trump voters are far more hawkish and supportive of a strong U.S. presence on the global stage than the isolationist faction has sought to delude us into believing. Trump voters, including those within his most loyal MAGA camp, have no problem recognizing China, Russia, and Iran as adversaries, and they continue to recognize the value of those allies like Israel who pull their own weight and provide a benefit to American security and prosperity. Trump supporters overwhelmingly prefer a United States willing to confront adversaries rather than a United States that has accepted a supposedly inevitable decline. And even as they may dislike elements of the current postwar order, they have no desire to see a Chinese global order take its place or see someone else’s values and principles dictate global norms. There is a reason why Trump campaigned on slogans like “Make America Great Again” and “peace through strength”—that’s what his voters actually want.

So two things appear to be simultaneously true. Yes, there is a real discontent in the Age of Trump with how the postwar order has evolved in the post–Cold War era. There is real frustration not only that the current order has required too much of the United States but also that many of its most influential thinkers are now advancing principles and values fundamentally contradictory to those upon which our nation was founded and that form the bedrock of Western civilization. At the same time, neither Trump nor the majority of his supporters wants to forswear U.S. global leadership in favor of a simplistic pre–World War II isolationism that meekly accepts the decline of American power.

In the end, what the Age of Trump’s protagonists seem to want is for the United States to start actually acting like a global power. That means ensuring that any global order we lead and sustain definitively serves the interests of the American people and reflects our founding values and principles. They have no problem with American intervention per se—they simply (and quite reasonably) want American power to be used successfully and in furtherance of America’s enlightened national interest. The goal is not retreating from the world or destroying all vestiges of the order that we helped build, but to remake it as necessary to ensure it is consistent with our national purposes. And if that is indeed the kind of foreign policy this era will pursue, the Founding Fathers provide a worthwhile blueprint for the future—and a bridge back to the moral core of our nation’s founding.

The truth is that the Founding Fathers would have felt far more at home in the rough-and-tumble Age of Trump than the heady early days of the post–Cold War period with all its unrealistic wishcasting. While they were animated by the belief that each person possesses unalienable rights flowing from an intrinsic, God-given dignity, the Founding Fathers did not share the impractical idealism of Wilson or Roosevelt. These 18th-century men refused to harbor unrealistic expectations about human beings and the way politics and power work. While they espoused principles universal in nature, the Founding Fathers were under no illusion that their principles would ever be universally accepted. They knew that their claims would meet resistance; most kings, including the colonists’ lawful sovereign, George III, had little use for dignity-grounded arguments that undermined the legitimacy of royal authority. The question of the vindication of the founding principles of the United States of America was therefore never separate from the need to defend them—and win them—by force.

The Declaration of Independence was not a suicide pact. Revolution is a risky business for those rebelling. Failure means a date with the hangman. But those who signed the Declaration had a plan. The Declaration was not merely a statement of principle and a catalogue of the abuses of the colonies by the crown. It was a strategic document as well.

The commander of the Continental Army, George Washington, had in mind a drawn-out war for independence, one that would avoid a decisive engagement between his force and the formidable British army and its German hireling auxiliaries, the Hessians. Washington sought to make use of the vast territory of the colonies to wear down the British to the point that they’d give up.

But that was not the only aspect of the American power-based strategy for independence. The United States needed, and through the Declaration sought and soon obtained, a willing ally capable of assisting with “boots on the ground” and substantial naval power, of which the United States had none.

France was the key. French strategists anticipated that the power balance in their long-running rivalry with Britain would tilt decisively in Britain’s favor if it retained its colonies in the New World. Assisting the colonies in their struggle for independence would have the short-term benefit of tying up British forces there and, in the long run, if successful, would prevent the British crown from making use of its assets and resources in America in the struggle for position in the Old World. For France, the future of Europe ran through the American Revolution.

The problem was that France couldn’t overtly support the Continentals in the sovereign territory of its British rival so long as the conflict remained at the stage of the tiny 1775 battles of Lexington and Concord. As the historian Larrie D. Ferreiro argues in his 2016 book Brothers at Arms, this was the problem the Declaration of Independence solved. Once the Continental Congress took decisive action, there was no turning back. The equation for France changed. Providing military aid to an independent country was a different proposition from interfering in internal disputes on someone else’s sovereign territory.

While France immediately began providing clandestine support to the Continental Army, the formal Franco-American alliance against Britain awaited proof of concept that the military endeavor was viable. That came in fall 1777, with the Battles of Saratoga in New York, which ended with the surrender of a surrounded and outnumbered British force of more than 5,000. The French would go on to play a critical role in the war’s final battle at Yorktown in 1781, where their naval forces deprived the British of their anticipated access to the Chesapeake Bay, and the Comte de Rochambeau led French troops alongside Washington’s Continental Army, in which the Marquis de Lafayette held senior command, to victory over British General Charles Cornwallis. His surrender effectively ended the war and vindicated the Declaration.

France was not acting altruistically in support of the American Revolution. It was deploying its power in pursuit of its interests, namely, a weakened British Empire humiliated by the loss of its American colonies. The Continental Army had something bigger to strive for, not only independence and survival but also the principles Jefferson set forth in the Declaration. Without the power to defend them by prevailing against the Crown, the principles by themselves might have lived on to inspire others to take them up and fight for them. But with power, they marked the beginning of the United States and its advance to the pinnacle of global power in support of ideas grounded in equal God-given human dignity and the rights that flow from it.

_____________

This combination of power and principle, present at the creation of the United States and continuing to animate its growth and vitality for 250 years and counting, remains a reliable guide for American leaders and policymakers in the Age of Trump and beyond. It’s a legacy Americans have made for themselves. The nature of politics is to produce ugly outcomes. What’s unusual is a good outcome, and the United States by 2026 has produced more of them than history has recorded for any other polity, not merely because of our values but also because of the way our power sustains them.

The Age of Trump’s protagonists are right to vehemently reject the voluntary and unnecessary erosion of American power. The challenge is whether they can build something positive—whether they can retain the needed emphasis on power to secure American interests while remaining true to the founding principles that have made and continue to make our nation great.

Doing so will require clarity on several fronts. First, the United States does not merely face strategic competitors, but enemies. These enemies do not need to be manufactured—they have made themselves and their intentions clear. China is leading an anti-American bloc that includes Russia, Iran, North Korea, and (at least until his arrest) Maduro’s Venezuela, all united around a single goal, which is to bring the United States to its knees. China is ultimately not interested in securing a better trade deal or being placated with a sphere of influence, as, ironically, Trump and some of his advisers seem to believe. China wants a Washington subservient to Beijing, and it knows it can count on its revanchist partners in a campaign to harm American interests and standing.

Second, while American foreign policy must be completely oriented toward denying and degrading the threat from this Chinese bloc, we must be realistic about what success means. While America’s 20th-century experiences with great-power clashes resulted in outright victories, history shows us that this is not necessarily the norm. We instead should expect decades, or even centuries, of the kind of long struggles seen throughout European history, where success more often looks like consistently tipping the scales in one’s favor rather than a decisive defeat that catapults us back into the status of uncontested global hegemon. This means steeling the American people and orienting our defense and economic policies on a timeline lasting decades while unabashedly employing hybrid-warfare tactics to weaken and undermine the enemy regimes—as they are doing to us now.

Relatedly, even if we did secure a more decisive victory reminiscent of World War II or the Cold War, we should not make the mistake of assuming that such a victory will be permanent. For every Japan that becomes a useful ally, there is the Soviet Union that simply morphs into the same adversary in a different form.

Third, our interests are best served when we both set and enforce the rules. The postwar order’s failures are lessons that must be learned and not repeated. We should not allow our adversaries into an order we lead. We should require even our allies to shoulder a fair burden, and we should hold them to account when they abandon shared values and principles. Preserving an order in which America remains predominant will require a lot of work. It will be far harder than throwing up our hands and walking away, as our enemies would like and as the isolationists among us dream of doing. But our order is far preferable to a world dominated by the Chinese Communist Party.

And fourth, the Age of Trump must be one that faces up to the “clash of civilizations” framing articulated by Fukuyama’s great antagonist, Samuel P. Huntington. It’s not just that our allies sometimes need cajoling to recommit to shared civilizational values; we also need to remind ourselves why we fight our enemies. Our national interests are morally superior to those of our adversaries because the values that inform them are morally superior. The principle animating our nation from the beginning is the unshakeable belief in the dignity of every human, and it is fundamentally incompatible with the values that animate the Chinese Communist Party, Putin, or any of our other adversaries. We know from history that our values will never be universally accepted but will always be under various forms of attack. Rather than running from this reality, the Age of Trump can and should use it as the glue that again marries principle with power.

We did not know it then, but Trump’s escalator entrance was the start of a sobering return to reality. History is clear: No peace is permanent, and human beings are incontrovertibly imperfectible. Conflict and war between states will never be relegated to the ash heap of history, and international relations will always be a nasty fight for supremacy, one in which the winner gets to shape the future according to its interests and values. The test for the Age of Trump is whether it ultimately will repeat past mistakes and abandon either principles or power (or both), or whether it will reconnect power to America’s founding values and lay to rest the dangerous delusion that power is unnecessary or self-sustaining.

This article was originally published on February 19, 2026 in Commentary by Tod Lindberg and Corban Teague.

The Invidious NVIDIA Deal

President Trump’s style is such that he would portray himself as a master of the foreign-policy game, like all other games, even in the absence of noteworthy successes in that realm. Yet both in his first term and in the first year of his second, he has put together a string of wins. One was his first-term “maximum pressure” reversal of Barack Obama’s dealmaking course on Iran, which culminated in Trump’s second-term destruction of Iran’s nuclear-weapons facility at Fordow. Related were the U.S.-brokered Abraham Accords improving relations between Israel and its Arab neighbors in a de facto alliance against Iranian regional influence. Another was the elimination of the Islamic State in Syria and Iraq. And how about his first-term decision to supply lethal aid, including Javelin antitank missiles and sniper rifles, to Ukraine—which, along with robust covert U.S. intelligence engagement with Kyiv, probably saved the government from collapse in the early days of the full-scale Russian invasion in 2022?

The most important among Trump’s successes, however, was to crystallize, from 2017 on, an emerging view in Washington of China as a strategic rival in a return to global great-power competition. The National Security Strategy released that year rightly described China as an aspiring peer competitor, aiming to erode U.S. influence not only in the Pacific but also globally: “For decades, U.S. policy was rooted in the belief that support for China’s rise and for its integration into the post-war international order would liberalize China.” Contrary to that hope, the strategy argued, “China seeks to displace the United States in the Indo-Pacific region, expand the reaches of its state-driven economic model, and reorder the region in its favor.”

Trump’s revised view of China had implications across a range of policy areas—from military requirements to global supply chains and technology transfer. But the key to unlocking necessary reform is, first, the recognition that the strategic context has changed. The complacent view of China as a peacefully rising power that would soon settle into the role of “responsible stakeholder” in the American-led global order—the dominant Washington forecast for China since the Clinton administration—crumbled under the reality of a Chinese Communist Party determined to use all the resources at its command to maintain its exclusive grip on political power domestically and to increase Chinese influence regionally and globally.

These observations about Trump’s foreign-policy successes will be deeply offensive to almost everyone whose inability to stand him is now entering its second decade. And they will meet fierce resistance from those whose biggest concern is the foreign-policy damage Trump has done, especially to relations with European allies. What’s more, it turns out that the fatigue that often begins to gather during the fifth year of a two-term presidency is a factor whether the terms are consecutive or not. Trump’s high-velocity second term is exhausting not only because of the pace and breadth of policy change but also because Trump’s approach to the actions he takes seems to be premised on the belief that he has vast popular support, which he doesn’t.

Nevertheless, a more realistic view of the “China Challenge,” as a State Department Policy Planning document from 2020 dubbed it, was Trump’s most significant course change, and the new perspective (though not Trump himself) has won substantial bipartisan support. Today, that phrase “China challenge” seems if anything too mild a description of the danger Beijing poses to American-led global order.

But if a foolish consistency is the hobgoblin of little minds, Trump’s mind is capacious enough to encompass inconsistencies great in range and grand in scale. So against the rare constancy of the Trumpian view of China, he presented a breathtaking contradiction in December 2025. He proclaimed his willingness to allow American chipmaker Nvidia to sell to China its high-end H200 GPU, potentially providing a boost to Beijing’s effort to catch and surpass U.S. companies in pursuit of artificial intelligence.

This accommodation seemed wildly at odds with pretty much everything Trump has done or said about China going back to his pre-presidential years. His announcement produced a broad-based “What the hell?” moment—well, the actual word being used is not “hell”—among all those who have spent a decade getting more and more concerned about China, if not at Trump’s behest then at least in seeming accordance with his sympathies.

Why the apparent reversal? The search for explanations for Trump’s actions often brings trouble down upon the seeker. In many cases, no sooner does a plausible-sounding explanation emerge than events, often generated by Trump himself, overtake and obviate it. Thus, for example, at first blush, overriding the ban on the sale of the H200 was a massive boon to Nvidia, which one might either applaud or abhor in accordance with one’s view of Big Tech in general, Nvidia itself, or the weight of its valuation in one’s 401(k). So perhaps the announced deal was the latest installment of Trump’s deal-making, pro-business streak. But the United States government also stands to benefit fiscally from Trump’s deal, whose terms apparently call for 25 percent of the billions in proceeds from chip sales to flow to the Treasury. The legal basis and policy soundness of the government’s taking a direct cut on the sale of a product seem dubious—in effect, an excise tax beyond the power of the president to impose without congressional authority. But in the Age of Trump, it’s always full speed ahead, since the Republican-controlled Congress provides no blowback and creates no friction.

That leaves the courts to act, but if they don’t like it, Nvidia could presumably just make a voluntary contribution to the Treasury anyway according to Trump’s formula. True, that would give the company the discretion to welsh on the deal, but we have also reached the point at which CEOs have good reason to be concerned about incurring the president’s wrath. Certainly, the Nvidia CEO, Jensen Huang, has been heavily courting Trump this year, including at a meeting on December 3, mere days before Trump’s December 8 announcement. So maybe the administration is operating squarely in the tradition of “the chief business of the American people is business,” in the words of Calvin Coolidge. Billionaire CEOs are people, too, including Huang, a man who has contributed hundreds of millions to such public-spirited projects as Trump’s inauguration and Trump’s White House ballroom.

But maybe the Nvidia go-ahead isn’t so much about the company and the Treasury as it is the latest gambit in Trump’s pursuit of a mega-deal on tariffs and other economic matters with Chinese dictator Xi Jinping. The zigs and zags of Trump’s tariff maneuvering are maddeningly difficult for outsiders to follow—as they apparently are even for senior administration officials. While the latter have more access to Trump, they aren’t mind readers, and even if they were, Trump’s mind changes with some frequency for reasons known at most to himself. To Trump stalwarts who thought they knew his mind on China, the Nvidia announcement must have come as an even greater shock than it did to the Trump-curious and neutral Trump-watchers who wish success upon his presidency for the sake of the country. Trump-despisers, for their part, gravitate toward the view that whenever Trump does something of which they disapprove, he reveals his true colors. Here they had the option of classifying the decision as Trump coming under the sway of domestic billionaires kowtowing to him, or as Trump reverting to his supposed affinity for foreign dictators or strongmen. Or both.

To view Trump’s move as a bargaining ploy is to put Trump back into a comprehensible Trumpian context. Selling our biggest adversary our excellent chips doesn’t sound like an element of making America great again, but if the real goal is to butter up Xi for a deal that rectifies all Trump’s trade grievances with China, that sounds more MAGA-compliant.

But maybe that’s not what’s going on either. Maybe—or so emerged another line of interpretation—the Nvidia green light was actually Trump setting a trap for China. Next-generation GPU chips such as Blackwell are already available from Nvidia, and still-more-powerful Rubin-generation GPUs are on the runway. So perhaps Trump was opening the way to get China hooked on an obsolescent chip. Widespread Chinese adoption of the H200 might lock in a chip gap with the United States in the lead. Easy access to the H200 would also slow the imperative for Chinese tech companies to develop competitive or possibly superior chip technology. In effect, Trump would be flooding China with American-made goods in the expectation that doing so would undermine China’s indigenous capacity to innovate and manufacture—a karmic high-tech turning of the tables on how China supposedly hollowed out ordinary American manufacturing by flooding the United States with goods produced by cheap Chinese labor.

We live in a golden age of speculative prognostication—not for its accuracy, of course, but for sheer volume and speed. It’s not quite right to say that the posters on X/Twitter foresee every possibility and every conceivable set of consequences flowing from each one. But there’s a lot bouncing around out there. So naturally, the possibility that Trump has set a trap for Xi has generated the second-order argument that Xi is on to him. In the end, the argument goes, China will buy very few H200 chips, precisely in order to avoid stunting the growth of Chinese chip development. Accordingly, the big deal will almost certainly be a bust, both for Nvidia and the Treasury. Or, in the telling of others, China will buy only enough H200s to reverse-engineer them and steal the tech, as it has with so many other innovative American products—although it’s rather fanciful to suppose, given the sophistication of Chinese espionage efforts in this area, that export controls have hitherto been successful in preventing China from obtaining sufficient H200s to steal the tech already. But the chip design by itself is not enough. Manufacturing copycats, we are reliably told, is also beyond China’s current capabilities.

Still more esoteric is the rumor making the rounds that a joint effort Google and Meta are about to unveil will undercut Nvidia’s chip dominance with a system that would allow other companies’ chips to easily run software written for CUDA, Nvidia’s currently exclusive software platform and now the standard for AI development. Though Nvidia is famous for chip-making, a huge component of its market valuation is a product of its software “moat” of exclusivity, which such a system would end. If the rumor is true, either Trump knows it or he doesn’t. The ensuing possibilities: He’s either supremely well-informed (because billionaires talk to billionaires in the 19th-century manner of the Cabots talking only to the Lodges), or he’s a complete ignoramus. Whichever is true, the China deal is an example of great dealmaking or supreme perfidy, depending on your prior outlook on him.

_____________

So to sum up, we don’t know and may never know why Trump made this decision. We don’t know whether it will go through in the end, and if it does, how many Nvidia GPUs will end up in China and with what effect on AI development there. And we don’t know how damaging the implications of such sales will be to U.S. national security. Though many claim otherwise, no one has a Magic 8 Ball. Few have seen the intelligence assessments of the effect of the sale of H200s; those who have aren’t talking (and may be wrong). And few of us are privy to the group-chat banter of the Billionaire Boys’ Club, for what that’s worth.

What we do know, with a high degree of confidence, is that if there is indeed a China challenge—and there is—a presidential directive clearing the way to provide Beijing a boost in its effort to outpace us on artificial intelligence is not part of the way to meet it.

The reason that’s true has less to do with the technological ins and outs of the H200 question than with questions related to American seriousness of purpose, moral clarity, and resolve on China more broadly. Since the end of the Cold War, the United States has had a relatively easy time presiding over what the Chinese have come to call “hegemonic civilization.” Credit the Chinese Communist Party for recognizing the reality of U.S. power—that’s the “hegemonic” element—as well as its ideational element, the “civilization” that we have used our power to preserve and expand through such means as encouraging indigenous democrats working to liberalize governments of varying degrees of authoritarianism; calling out human rights abuses such as China’s slow-rolling genocide of the Uyghur people; and entering security partnerships or alliances with countries menaced by their neighbors.

We have our values and the power to back them up. China has different values and the power to maintain its grip at home. Increasingly, China seeks to flex and extend its influence abroad, with emphasis at present on intimidation tactics directed against our Asian allies, including military provocations. What, in China’s view, should come after “hegemonic civilization”? At first, a global order in which China is the dominant power in Asia, with U.S. influence there drastically diminished. In the long run, perhaps a return to hegemonic civilization, the problem with which all along may have been that the United States, not China, is hegemon.

That places the desire of the United States to remain on top of the global order on a collision course with Chinese ambition. In many gray-zone areas, that clash is already underway. It’s important to note, for example, that China thinks it has every right to help itself to the fruits of technology developed in the United States and “the West,” broadly construed. That’s because of the supposed illegitimacy of the self-serving global order that “the West” has been imposing on the world since about 1500, and especially during the “Century of Humiliation” from the First Opium War in 1839 through Mao’s revolution in 1949. This outside imposition kept China down, an outrage against a nation with thousands of years of continuous history. China is catching up by all available means and is unlikely to stop at parity.

The George W. Bush administration’s 2005 National Defense Strategy declared that the United States would not allow a “peer competitor” to rise to rival it. Some critics called this vow hubristic. Democratic administrations since then have sought to manage the relative decline in American power through adroit navigation of international law and institutions they hoped would buttress a rules-based order with widespread buy-in, including from China. The result wasn’t good. Now we have the Trump 2025 National Security Strategy vowing, like Bush’s, to maintain U.S. military dominance in perpetuity. It states: “We want to recruit, train, equip, and field the world’s most powerful, lethal, and technologically advanced military to protect our interests, deter wars, and—if necessary—win them quickly and decisively, with the lowest possible casualties to our forces.” That’s fine, but making good on it is not solely an American question. China seems not to accept this American ambition, and Beijing gets a say in whether we achieve it and at what cost.

A third world war, this time primarily between the United States and China, is not inevitable. But protracted conflict with China is indeed inevitable, and managing it requires both strategic clarity and moral clarity. China is not our friend, nor is China going to become our friend, because our ambitions and our values clash. That doesn’t mean we can’t have mutually beneficial trade relations, in the ordinary comparative-advantage sense. We can welcome China’s ideological challenges to the superiority of our system as an opportunity to argue in its favor. We can hold to our view that our “China problem” lies not with the Chinese people but with the Chinese Communist Party. In Trumpian terms, we can acknowledge and welcome the desire in Beijing to make China great again insofar as it can be peacefully reconciled with great-again America. But our relations with China will also have a darker side. To pick a mild example, we need a covert capability to steal China’s technological advances in areas where they surpass us—if we don’t already have one, which would surprise me.

The strategic and moral clarity we need to be effective in maintaining our position as China continues to rise is not just a matter for policymakers and elected officials. It includes the American people as well. Some Trump acolytes have been doing their best to persuade Americans to turn wholly inward—or perhaps more accurately, to persuade American leaders that the people have turned inward. All the talk of “endless wars,” which claims to reflect public opinion, is more an attempt to influence elite opinion against exercising American leadership in the world. It’s having its moment, though Trump himself has had no qualms about bombing Iran, the Houthis, Venezuelan drug-runners, and Islamists in Nigeria—and has enjoyed substantial public support for such actions.

These are not sideshows, but China is the main problem. To navigate it, Trump and his successors will need support in American public opinion. With the proper framing, they will have it. That framing entails a clear articulation of the value to Americans of the American way of life and of the threat the Chinese government’s global ambition poses to it. The proper framing is “what we stand for versus what they stand for.”

Selling advanced American chips to China does not fit with that framing. It’s a rebuke to the proposition that our security interests and China’s are not aligned, a case of business as usual in an area where most Americans can plainly see the potential for peril. Whatever the American ambivalence, or worse, about the coming of AI, it is certain that Americans prefer American dominance in AI over Chinese dominance. The same is true for all other tech areas of consequence. Trump’s H200 decision arises in the context of this competition. It invites the conclusion that this tech competition is no big deal. The next time a proposed tech sale with national security implications arises, it invites the remark, Even Trump thought selling high-end GPUs to China was fine. China will cheerfully exploit this precedent, as will U.S. commercial interests when opportunities arise. Also, the argument that we need to buttress our military capabilities to counter China’s growing power at the same time as we’re selling them chips that can contribute to their growing power doesn’t exactly roll trippingly off the tongue. We’re not necessarily at the Cold War level of worry that “the capitalists will sell us the rope with which to hang them,” in the pithy statement misattributed to Lenin. But we shouldn’t act to compromise the proposition that we ought not sell our adversary the rope with which to hang us.

_____________

Late in 2025, I played a war game set from 2028 to 2032 involving an attempt by China to take Taiwan by force. The purpose of the game was not to find out how such a move would turn out, but rather to test the effect of a military capability China is developing that the United States currently has no plans to meet: a conventionally armed (not nuclear-tipped) intercontinental ballistic missile capable of striking anywhere in the United States. But the game was illuminating on the broader question nonetheless.

I played on the China team. This turned out to be a relatively straightforward proposition. China’s objectives, as our team articulated them, were clear. First, obtain Taiwan. Second, do so at the lowest possible cost militarily. Third, reduce U.S. influence in East Asia. The China team understood that achieving the third objective would flow by itself from achieving the first objective. The best path toward achieving the second objective was to do everything possible to avoid provoking the United States into a full-scale war over Taiwan. So no initial Chinese attack on U.S. bases, ships, and military personnel. China’s pretext, in the game, was that Beijing was resolving an internal Chinese dispute over “splittist” tendencies on Taiwan, which Beijing asserts is part of China, a claim the United States, as a diplomatic matter, does not dispute.

The China team observed no similar clarity of purpose from the team playing the United States. As the U.S. war-gaming team sent U.S. carriers steaming with uncertain purpose toward the conflict zone, American diplomats busied themselves seeking to reassure U.S. allies of the American commitment to their security. China’s diplomats were busy too, reminding U.S. allies that an internal Chinese dispute over Taiwan had nothing to do with them, and that they should stay out—not, by the way, that the United States was actually urging allies to mount a common defense of Taiwan. The U.S. team seemed to think Washington could thwart China’s third goal, reducing American influence in the region, while remaining equivocal about how far the U.S. should or could go to thwart China’s first goal, conquering Taiwan. It’s hard to reassure treaty allies while abandoning a de facto ally under attack. China would not hesitate to draw allies’ attention to this contradiction and the questions it raises about the U.S. commitment to them.

The problem is that “strategic ambiguity”—we just might defend Taiwan, our current declared intent—is a peacetime posture designed to deter. It’s not a policy that directs action if deterrence fails and shooting starts. I think China understands this. In coming years, the most effective deterrent to a Chinese military move may not be the prospect of the U.S. Navy riding the waves to Taiwan’s rescue, its Pacific allies sailing in its wake; it may be a (nonnuclear) Taiwanese capability to inflict harm on China within the power of Taipei to direct.

I would hate to think that the abandonment of the U.S. position in the Pacific, including our allies and our commitment to keep sea lines of communication open, began with Trump’s announcement about H200 sales to China. But such is the possibility that has arisen in his glaring departure from clarity on the China challenge.

This article was originally published on January 20, 2026 in Commentary.

The Disease of Presentism

Review of ‘Violent Saviors’ by William Easterly 

William Easterly’s Violent Saviors is a libertarian tract on global economic development and political economy. But as its subtitle—The West’s Conquest of the Rest—demonstrates, the moment and the angle could hardly be more opportune for such a polemic. Easterly presents Violent Saviors as an economic history, but it is equally a work of intellectual history. Violent Saviors tells the story of bad ideas running amok, and the good ideas that warred with the bad.

Easterly, a professor at New York University, begins with European powers and their colonies and early imperial conquests, including the destruction or removal of local populations typically described as “savages” by the newcomers. The slave trade of the 17th and 18th centuries looms prominently, as does slavery after the Revolution and Jim Crow after the Civil War. He recounts the Belgian King Leopold’s atrocities in late-19th-century Congo as harrowingly as anyone ever has, as well as other bad scenes from the British Empire in India and the Caribbean. The American victory over Spain in 1898 yielded the spoils of the Philippines, which the United States proceeded to despoil. He also recounts the coercive depredations of Lenin and Stalin as they remade Russia into the Soviet Union, resulting in the death of tens of millions, including in the Holodomor, the vast Stalin-induced starvation in Ukraine. Hitler, for his part, saw the conquest of the lands of the inferior “race” of Slavs to the east as essential to German development, and of course, the Jews had to die. The Communist revolution in China led to still more scores of millions of deaths, which Mao Zedong regarded as an acceptable price for the modernization of China.

Easterly connects the dots of this history by citing the recurring justifications in the words of the perpetrators and conquerors themselves, what he calls the “Development Right of Conquest.” The powerful and prosperous countries of Europe, eventually encompassing “the West,” with the United States in the lead, justified their expansionism either in the name of bringing development to the benighted locals or—if the benighted locals were unable or unwilling to advance—in the name of making better use of the land and resources of the territory in question. In all cases, the colonizers and conquerors proceeded entirely without the consent of local populations, especially over the question of whether they actually wished to develop. Often, these powers assigned themselves the role of civilizing the savages and spreading true religion. The same was true of the Soviet Union and China, which brought Communist ideology into the mix in pursuit of their own visions of progress. The Hitler regime proceeded on the basis of supposed Aryan racial superiority.

Human “agency” or “dignity” is precisely what those acting on the Development Right of Conquest denied to those in their way. Easterly notes that “extermination” used to have the additional meaning of “driving out.” This, the developers often did, though sometimes they resorted to enslavement (often rationalized as an improvement in the living conditions of those enslaved) or “extermination” in the modern sense of mass killing and genocide.

Easterly’s story is mostly one of bad actors. In the New World of North America, he starts with the Puritan John Winthrop, the first governor of the Massachusetts Bay Colony. Before setting sail for America in 1630, Winthrop set forth his justification to those who questioned the righteousness of his plans. Easterly writes:

Winthrop argued that the conquerors [had] a right to the land because of their ability to improve it. Surely, God had not intended “a whole Continent as fruitful and convenient for the use of man to lie waste without any improvement.” … The natives in New England had failed to fulfill God’s mission. “This savage people” did not develop the land or themselves. “They enclose no land, neither have they any settled habitation, nor any tame cattle to improve the land by.” … Winthrop in 1629 reassured his audience that the English seizure of Indian lands was actually beneficial for the Indians, because the English would teach them the arts of improvement…. A fateful us and them had entered the lexicon of progress. The idea of “us” conquering “them” for their own good—the imaginative and fateful mixture of coercion, paternalism, and superiority—was destined for a momentous career for the next four centuries.

Violent Saviors does not lack for additional examples along Winthrop’s lines, and Easterly relishes skewering the purveyors.

But there are heroes as well. They are the economists and other thinkers who upheld the essential elements of freedom and consent at the heart of classical economic thinking, starting with Adam Smith. Voluntary consent, not coercion, should be the basis on which human beings interact with one another, in the marketplace and in all other respects.

To good effect, Easterly juxtaposes Smith and the Marquis de Condorcet, the 18th-century French philosophe for whom political and economic decision-making should be the province of experts. The expert governance for which he advocated was for the good of those governed, whether they liked it or not. Most of the major problem areas Easterly explores—colonialism, slavery, forced migration, etc.—also produced, in opposition, classically liberal thinkers in the mold of Smith. These figures were willing to cut against the grain of their times to deplore the deplorable, even if they often lost their arguments to contemporaneous forces of coercion. The liberals would be vindicated in time—through such developments as the end of slavery and Jim Crow, the extension of property and voting rights to women, and self-determination or national liberation for the colonized (though independence often served to usher in a new crew of oppressors).

_____________

The word “libertarian” appears only twice in Violent Saviors, and only once as a description of the school of thought that informs its perspective on historical events. Easterly prefers the term “liberal” for those heroically aligned in their time with his ideal of noncoercive policy action that accords with equal human dignity by putting freedom or liberty first. And indeed, the individuals he elevates warrant that label. But they are not alone, and unfortunately, this is where the book loses its way—and finds its odd congruence with the “presentism” of our times. With just a few grudging asides to the contrary, Easterly joins the mighty chorus of dismissal of the past and its people as morally and intellectually indefensible—because their views are so out of sync with the wiser opinions of today.

Now, to be fair, many of the popularly held opinions of today are indeed wiser than those of yesteryear. The argument in favor of slavery was just as bad when slavery was a matter of current controversy as it would be if you could find anyone propounding it today. Yet one must note that Thomas Jefferson was a slave owner as well as the author of the Declaration of Independence. The fact that he didn’t personally exemplify the principles he espoused does not negate the validity of the principles, or their historical impact on the spread of liberty.

This is the general point that Easterly leaves out of Violent Saviors. If the only true liberals of the past were those whose views turned out to be sufficiently in accord with the views of the present, it’s hard to see how liberalism could have managed to attain its dominance in the modern world. Easterly in the end calls for a resolution of the “us-versus-them” problem through an expanding sense of “us.” That’s fine, but it ignores the extent to which the status of “us” has already expanded historically.

Perhaps Easterly’s answer as to why and how it expanded is that the truth of liberal principles, including those of neoclassical economics, is enduring. But to attain a purchase in the world, these principles must have a purchase on human beings—most of whom, like Jefferson, have additional and often contrary drivers behind the actions they take. Should Jefferson have abstained from the Louisiana Purchase because of its implications for Native Americans and the westward expansion of slavery? The implication of Easterly’s effort to put noncoercion first would seem to be yes. And indeed, in good libertarian fashion, he quotes John Quincy Adams on the foreign policy aims of the United States, which “goes not abroad in search of monsters to destroy. She is the well-wisher to the freedom and independence of all. She is the champion and vindicator only of her own.” Easterly then cuffs Adams for staying silent on forced Indian migration, thereby establishing a purity test no politician in history has ever passed.

The past is monstrous yet great, harrowing yet inspiring. In no sense is it merely the motion of ideas—though ideas both good and bad have animated those who made history. To the extent there has been real-world “progress” in economics or politics—and there has been—it has never been unsullied by wickedness.

On North Sentinel Island in the Bay of Bengal lives a tribe of 400 to 500 indigenous people who have had next to no contact with the wider world. They are among a small number of isolated tribes that have no record of violence or conquest (though the Sentinelese do not take well to visitors and in 2018 murdered a Christian missionary who had the effrontery to step onto the sand of their beach). With such possible exceptions, all the rest of us are the sons and daughters of conquerors who extinguished the bloodlines of the conquered. No one has a rightful claim to a smug superiority to history.

This article was originally published on November 24, 2025 in Commentary.

The Assassination Fan Base

Eras creep in and taper off without clear demarcation; only in retrospect can we classify a single event as the beginning of one or the end of another. With the two assassination attempts on Donald Trump as well as the successful hits on UnitedHealthcare CEO Brian Thompson and conservative activist Charlie Kirk, we must now ask whether a new era of assassinations is upon us, an era comparable to the one that gripped the country between 1963 and the early 1980s.

The assassination of JFK in November 1963 shocked America to its core. The America of 1963 did not need a “visual” to be shocked; it would be nearly 12 years before the public got a chance to see the “Zapruder film,” the grainy home movie of Kennedy’s last moments as his motorcade passed the Texas School Book Depository in Dallas and an assassin’s bullet tore through his skull. The mere notion that anyone might kill the president of the United States was itself borderline unthinkable—in a way, perhaps, even for those charged with the safety of the president. Riding in the back of a limo open to the air was as normal for presidents and politicians in its day as it has been unthinkable ever since.

That kind of weird innocence persisted in the immediate wake of the assassination. The authorities quickly located the assassin and arrested Lee Harvey Oswald. They could not imagine that the open way they disclosed plans about Oswald’s movements in custody would give a man with a gun and murderous intent an opportunity to get so close. Photographers were on hand to capture Jack Ruby firing a single shot at close range. The best-known image of Lee Harvey Oswald is the one in which he is already dying—a split second after being hit, a stunned expression on his face and his mouth slightly agape.

With a president and his assassin both dead, the conclusion of investigative commissions that Oswald was “a lone gunman acting alone” instantly had to vie with numerous other scenarios that emerged from elaborate chains of speculation. And does, to this day. We are used to writing off such speculation by invoking the term “conspiracy theory,” which is a way of dismissing those who challenge widely accepted accounts of the supposed facts of a situation. But throughout history, assassinations have more often than not been conspiracies. While some American killers—like “disappointed office seeker Charles Guiteau,” who shot President James Garfield because he didn’t get a patronage job—did the job themselves, John Wilkes Booth was not “acting alone” when he assassinated Lincoln, just as Brutus was the leader of a conspiracy to murder Julius Caesar.

Only 49 years before JFK was killed, numerous conspiring individuals with bombs and guns had stationed themselves on Archduke Franz Ferdinand’s path through Sarajevo in 1914 before Gavrilo Princip got him, setting World War I in train. Puerto Rican nationalists worked together to try to assassinate Harry Truman in 1950. Thus it was hardly irrational to inquire into the possibility of a conspiracy, especially since Oswald was a known Communist who had defected to the Soviet Union five years earlier before giving up and returning to the United States. Law enforcement always considers the possibility that more than one person is involved in a difficult-to-solve murder and sometimes finds a conspiracy at work. When the conclusion is otherwise, as it was with the Warren Commission’s finding in the Oswald case, it’s an easy leap for conspiracy-hunters to conclude that law enforcement must have been in on it.

The impact of the JFK assassination and its presence in our common cultural conversation did not wane over time, in part because assassinations and political violence started to become commonplace in its wake. It was the first in a series of high-profile murders or assassinations, or attempts thereof, that persisted for more than two decades.

The Kennedy assassination also marked the turn to a period of volatility in American politics, a bizarre confluence of the civil rights movement, campus protest, early feminism, a new intellectual radicalism, and the escalation of and mounting opposition to the war in Vietnam—as well as resistance to all these trends.

There had even been a prologue to the Kennedy assassination some months before in 1963: the assassination of civil rights activist Medgar Evers, the NAACP’s field secretary in Mississippi. Evidence pointed to a member of the Ku Klux Klan, who in 1964 was charged and brought to trial. Two trials before all-white juries ended in hung juries, letting him go free. (In a controversial retrial in 1994, a mixed-race jury convicted Byron De La Beckwith of the murder.)

After Kennedy, the next high-profile American assassination was that of the militant black nationalist Malcolm X, in 1965. This was indeed the product of a conspiracy. Multiple gunmen opened fire on him as he was about to give a speech. In this case, however, the deed was a product of an internecine struggle, since the perpetrators were members of the Nation of Islam, from which Malcolm X had grown increasingly estranged in his final years.

The impression of the 1960s as an assassination spree solidified with the slayings of civil rights giant Martin Luther King Jr. in April 1968 and, mere months later, President Kennedy’s brother and former Attorney General Robert F. Kennedy, then himself a presidential candidate.

James Earl Ray, whose racist views were unconcealed, shot King with a high-powered rifle from a building across from King’s Memphis motel room. King and his colleagues had stepped outside onto the walkway of their second-floor room. A photographer who was staying in a room nearby heard the shot and rushed onto the walkway, where he captured an image of the mortally wounded King collapsed on the floor as members of his retinue, arms outstretched, pointed in the direction from which the shot came.

Riots broke out across the country, wreaking devastation in urban areas. Ray, who fled the scene but was quickly identified as the prime suspect, was apprehended abroad, traveling on a counterfeit passport, in June 1968. He confessed and was sentenced to 99 years, though he later recanted and unpersuasively alleged a conspiracy. In 1975, however, Americans learned that J. Edgar Hoover’s FBI had been surveilling King as part of its COINTELPRO (Counterintelligence Program) activities, which let loose a fresh torrent of conspiratorial speculation.

Bobby Kennedy was a senator from New York and, by June 1968, a leading candidate for the 1968 Democratic presidential nomination. On June 4, he was at the Ambassador Hotel in Los Angeles celebrating that day’s primary victories in California and South Dakota. As Kennedy and his entourage made their way out of the hotel through its kitchen shortly after midnight, Sirhan Sirhan, 24 years old, rushed RFK, shooting the senator three times, including once at close range in the head. Sirhan wounded several others before he was subdued. Photographers captured iconic images of a busboy kneeling next to the fallen RFK trying to comfort him. Kennedy died in a hospital 26 hours later.

Sirhan was a Palestinian Christian who had emigrated with his family from Jordan to the United States after Israel’s War of Independence. He was blunt about his anti-Semitic motive. As Sirhan saw it, RFK’s support for Israel in the Six-Day War in 1967 and for sending Phantom fighter jets to the Jewish state in its aftermath warranted his murder. Convicted at trial, he received a sentence of death, later commuted to life in prison. Though eligible for parole, he has been denied every time, most recently by Governor Gavin Newsom in 2023. His repeated motions for a new trial, alleging that he had been drugged or brainwashed as part of a conspiracy, were likewise denied.

In May 1972, Alabama Governor George Wallace was on the presidential campaign trail in Laurel, Maryland. With television cameras rolling, Wallace took off his suit coat and began to work the crowd. Arthur Bremer, 21, stepped up and fired multiple times, gravely wounding Wallace, who survived but remained paralyzed from the waist down. The television footage, captured at close range, is graphic. Wallace falls to the blacktop on his back, and blood spreads on his white shirt. Bremer’s diary, which Harper’s published to substantial controversy as a self-portrait of a sociopath living in troubled times, claimed he had shot Wallace in pursuit of notoriety. Once again, conspiracy theories abounded, including one advanced by the left-wing literary provocateur Gore Vidal. He claimed the diary had been a plant by the Nixon White House. The jury rejected Bremer’s insanity defense, and he spent 35 years in prison.

Assassinations were only one part of the broader story of political violence in the United States and abroad in this period. U.S. troop deployment in Vietnam peaked at more than 530,000 in 1968, and protests began to accelerate. During the Democratic National Convention in 1968, the streets and parks of Chicago saw violent clashes between police and thousands of demonstrators protesting the war. The revolutionary Black Panther Party, which espoused a doctrine of armed resistance, was involved in shoot-outs with police in Oakland, Chicago, Los Angeles, and New Orleans. Members were also charged with plotting to plant bombs in public buildings. To “bring the war home,” the Weather Underground, a revolutionary spin-off of the left-wing Students for a Democratic Society, launched a bombing campaign targeting police stations and government buildings, including the Pentagon and the Capitol. Police who found themselves the target of rocks generally broke up protests with tear gas, but in the case of Kent State University in 1970, members of the National Guard opened fire on student protesters, killing four.

Nor was the United States alone in political violence. At the 1972 Summer Olympics in Munich, the Palestinian group Black September took Israeli athletes hostage and killed 11 with the world watching. “Bloody Friday” in Northern Ireland involved more than 20 separate bombings orchestrated in Belfast by the Irish Republican Army in little more than an hour. Prime ministers of Jordan and Spain were among the more prominent victims of assassins in 1971 and 1973, respectively. The first president of Bangladesh was slain alongside most of his family in a coup in 1975.

Meanwhile, in the course of less than three weeks in September 1975, there were two attempts on the life of President Gerald R. Ford. The first was by a follower of the notorious cult leader and convicted murderer Charles Manson. Lynette “Squeaky” Fromme pointed a gun at Ford but didn’t fire it. She said she wanted to draw attention to environmental causes. The second would-be assassin, Sara Jane Moore, who later said she sought to spark a violent revolution, got a shot off but missed. A man nearby grabbed her arm as she fired a second time, deflecting the shot, which wounded a bystander. Film crews captured both attempts, and the first impression the footage leaves, when viewed 50 years later, is of a sudden outburst of confusing motion. If one didn’t know what one was seeing, one wouldn’t. Fromme and Moore each received life sentences and won parole after serving more than 30 years. (Moore died in September at the age of 95.)

In the mid-to-late 1970s, the Red Army Faction in Germany murdered 34 people, among them politicians and industrialists, while the Red Brigades in Italy kidnapped and slaughtered the leading Italian politician Aldo Moro. In the United States, following the resignation of President Nixon, the brief Ford administration, and the 1976 election of Jimmy Carter, American history journeyed through a truly dismal period, one that prominently featured the assassination of San Francisco Mayor George Moscone by political rival Dan White in 1978. Moscone had won the election only with the support of a radical minister named Jim Jones, who later fled to Guyana along with nearly 1,000 members of his People’s Temple. When Representative Leo Ryan went to the Jones compound to make sure his constituents weren’t being held captive, he was murdered on Jones’s orders. Jones then coerced his flock into consuming a poisoned fruit drink—a mass murder-suicide that took more than 900 lives.

The sense that America had been spinning out of control helped put Ronald Reagan in the White House by a staggering margin of 10 points and 44 states in 1980. Though a victory of such magnitude indicated an electorate deeply fatigued by the period’s malaise, there would be no instantaneous exit. Barely three months after Reagan took office, John W. Hinckley shot Reagan as the president was leaving an event at the Washington Hilton. Network news cameras captured the shooting, and the footage aired within minutes. Reagan recovered, but his injuries were far more grave than initially reported. A jury found Hinckley not guilty by reason of insanity (he had committed the crime to attract the attention of the teenage actress Jodie Foster), and he was institutionalized at Saint Elizabeth’s Hospital in Washington and released in 2016. Federal law at the time of the shooting required the government to prove the defendant was compos mentis rather than requiring the defendant to prove he wasn’t. After the Hinckley verdict, lawmakers reversed the burden.

Less than two months later, Mehmet Ali Agca shot and critically wounded Pope John Paul II in Vatican City’s St. Peter’s Square. Video captured John Paul II collapsing in the open-air Popemobile as it sped off. Agca, a Turkish national, had previously been imprisoned for the 1979 murder of a Turkish newspaper editor. He then escaped. Agca told multiple conflicting stories about the motive behind the assassination attempt. Italian authorities quickly determined that Agca did not act alone. His lengthy stay in a luxury hotel in Sofia established a “Bulgarian connection” that pointed back through Bulgarian intelligence and perhaps the East German Stasi to the KGB—and thus to the highest levels of the Soviet Union. The danger the Polish pope posed to the Soviet bloc was undeniable, but Soviet apologists denied any such connection, of course, and the evidence was pooh-poohed or simply ignored by many on the grounds that it would aggravate U.S. relations with Moscow. The Pope, for his part, forgave Agca, met him in prison, and urged his release.

One more stop abroad will suffice in this account: In 1984, the Irish Republican Army set off a massive bomb targeting UK Prime Minister Margaret Thatcher in her hotel at a Tory party gathering in Brighton. It killed five people; Thatcher herself escaped by a narrow margin. Images of the hotel in the aftermath of the blast show a ragged V-shaped crater in the upper floors, just to the left of the center of the façade. Patrick Magee, the IRA bomber, had planted the bomb and its timer during a stay at the hotel four weeks before. In this case, neither the perpetrators nor their motive was in doubt: The IRA issued a statement claiming responsibility and promising to try again. Police arrested Magee and other IRA members in London in 1985.

_____________

And then the assassination era came to an end, after two decades in which it was one of the dominating facts of our common life. Of course, political violence didn’t end altogether, nor will it ever. Consider the anti-government bombing of the federal Murrah Building in Oklahoma City in 1995, which claimed 168 lives and injured hundreds more. Horrific it was, but thankfully, it proved to be a one-off. (The 9/11 attack six years later belongs in a separate category.)

The new source of recurring violent shock to the American psyche was the mass shooting, especially school shootings, which are distinctive not for high-profile victims but for the random ordinariness of the mise-en-scène. The Columbine High School shooting in Colorado in 1999 brought the matter home to the suburbs, where it remains. Anti-Semitic violence is a more recent recurring disruption.

Now, however, we are at least several attempts, some of them successful, into what may be a new era of assassinations. The dramatic near miss against Trump at a campaign rally in Butler, Pennsylvania, in July 2024 was Exhibit A. Next was a second, fortunately bullet-free, attempt on Trump at his golf course in Florida. Third was the slaying of UnitedHealthcare’s Thompson in midtown Manhattan in December 2024. Finally, and most dramatically, came the assassination of Charlie Kirk at a college campus event in Utah in September. Other noteworthy recent entries include the slaying of the Minnesota state house’s Democratic majority leader in June 2025, an aborted attempt on Justice Brett Kavanaugh in June 2022, and an arson attack in April 2025 on the governor’s mansion in Harrisburg, Pennsylvania, intended to kill the state’s governor, Josh Shapiro, as he and his family slept. At a further remove, mass shootings took place at a GOP congressional baseball practice in 2017 and at a constituent meeting in Arizona with Democratic Representative Gabby Giffords in 2011. Though several people were wounded in these events, Giffords herself grievously, the targeted lawmakers survived.

If a new era of assassinations is underway, it has not supplanted but rather overtaken the era of mass shootings. These have continued, with churches and Jews increasingly prominent among the targets.

But why assassinations then? And why now?

The potential victims of assassins haven’t changed. They are prominent individuals whom assassins have targeted specifically. (Political violence in the form of terrorism typically doesn’t have a particular individual as a target; its design is to terrify large populations.) Among the would-be assassins themselves, certain commonalities also emerge: a desire for notoriety, a wish to leave an otherwise unattainable mark on history, and a political agenda to pursue.

On the latter, it’s worth noting that animus on the part of the killer or killers toward the victim is about as close to an inescapable feature of assassination attempts as one gets. This is true of necessity in the case of a conspiracy. “Loners seeking notoriety” don’t work for groups operating secretly. But it must hold true for the loners as well. The prominence of the victim has specific qualities, and the murder, or attempt, can’t be separated from animus related to what has made the intended victim famous. Supposedly, John Hinckley was willing to try to kill Jimmy Carter, but he actually did try to kill Reagan. Bremer said he would kill Wallace or Nixon—but not George McGovern or Hubert Humphrey, the top two Democrats in the race for their party’s nomination. The efforts to deny the leftward orientation of the political motivation in the assassination of Charlie Kirk would be laughable were they not a symptom of our current era. In general, it’s hard to find a would-be assassin who professed undying love and support for the individual he was attempting to kill. The will to annihilate is specific—the target is not a president but this one.

If assassins are trying to change the course of history, which of course many are, they are attempting to do so by eliminating an obstacle that stands in the way of their vision, whatever it may be. The living JFK was an obstacle Oswald could and did overcome, leaving an indelible stamp. But how did history change? In ways we can never really know, and certainly not in ways that could be known in advance by an assassin. What if Lincoln or Kennedy had lived? The question invites those reflecting on it to project onto the past their current-day political preferences for how history might have been different. The deed may have been undertaken in pursuit of sweeping change, but in most cases, we are left with only the deed itself and the consequences that flow from it directly: better presidential security after JFK, the extension of Secret Service protection to presidential candidates after RFK, a national holiday and memorial on the National Mall for MLK. But would the Vietnam War or race relations have turned out differently? No one can know. The melodramatic assertion that the assassination of Franz Ferdinand caused World War I doesn’t survive the reality of a chain of decisions that could have gone differently after the assassination.

That political violence in the form of assassination has political motives, and that they are often wildly out of sync with what the assassination will achieve, are constants not just in the recent American experience but throughout history. The big difference between the late-20th-century era of assassinations and the present is that the former was largely a story of the targets and the perpetrators (whether an individual or a conspiracy). Now, however, the story is about the targets on one side—and the perpetrators (alone or in conspiracy) and the supporters of the perpetrators on the other.

Consider the JFK assassination. This is high history, an individual inserting himself indelibly into the nation’s story via the act of assassinating the president. The nation is an onlooker (which is the reason I made so much, in my brief catalogue of the previous period, of the visuals we have from these assassinations and attempts). We, the people, were not involved. We absorbed the information about events, and we responded accordingly, typically and normally with distress and outrage. Now, we mustn’t be naive. There were, no doubt, Americans whose black hearts welcomed the death of one or both Kennedys, and that’s likely all the truer in the case of King. But if so, they mostly kept it to themselves or articulated it only in the presence of intimates. You could say that the public square, notwithstanding the First Amendment and broader commitments to free speech, placed a cordon sanitaire around permissible opinion, keeping out such noxiousness as assassination celebration and consigning it to a fringe communicating through the mails with mimeograph sheets, and to private homes. A public culture of good manners also has the effect of cultivating well-mannered people, and perhaps a moral sensibility of actual decency as well.

In the previous era of assassinations, Americans also had at their disposal a social resource that went largely unappreciated at the time—the ability to ignore. If you were the craziest person out of a million Americans in the 1980s, when there were 250 million Americans, you were pretty socially isolated from the 250 or so people who were just as crazy as you. Or make it the craziest in 100,000: isolated from your 2,500 peers nationwide. The latter might have proved sufficient for a gathering in a windowless big-city room. But that’s not quite enough to make a revolution.

Now, through social media of all kinds, the 2,500 worst among us can easily find and interact with each other on a regular basis, exchanging views on whom to hate and perhaps who constitutes the gravest peril to the life they want to live. But now this is not a matter of just a single set of 1-in-100,000 sociopaths, nor is it obvious that sociopathy becomes dangerous only as it affects the 1-in-100,000 worst. Perhaps 8 million to 10 million people in America have been or are incarcerated for violent crimes. Out of 260 million adults, that’s at least 1 in 50. Meanwhile, there are multiple overlapping and non-overlapping sets of sociopathic individuals based on the particulars of the sociopathy. In addition, the term “sociopath” may not describe a fixed quality, in the sense that one either is or is not sociopathic—or evil. Someone on the fence can be cultivated by a sociopath to turn sociopathic. One can even imagine an individual who has no intention of personally killing a member of some specified “out group” nevertheless encouraging someone else to kill through the mere addition of a “like” click on social media. In the context of terrorism, this process is generally known as “radicalization.” In the context of American polarization and the ways in which we increasingly dehumanize those with whom we disagree, we might call this “sociopathization.” I think, given recent examples, these processes do produce would-be assassins, including successful ones. But I also think they have produced something of significantly broader importance—in fact, the defining characteristic of the new era.

It’s the assassination fan base.

The wounded Reagan quipped to the lead doctor on his trauma team, “I hope you’re all Republicans.” What made the quip amusing is that both Reagan and the team knew it mattered not in the least whether its members were Republicans. The doctor, a Democrat, amusingly but perhaps a bit solemnly replied, “Today, we’re all Republicans.”

I think most Americans would like to live in a world where such an exchange is still possible. I’m not sure it is.

A significant number of Americans took to Bluesky, TikTok, Reddit, and the streets to express their regret that Trump’s would-be assassins had been unsuccessful and to praise the assassins of Charlie Kirk and UnitedHealthcare’s Thompson. In the case of the latter two, many asked or offered their opinion on who should be next. (I won’t cite any examples. If you are at all online, you have seen them in abundance, and if not, you may want to spare yourself.)

At present, the assassination fan base is pretty much a left-wing subculture. So far, it has applauded attempts on the lives of a former president, a conservative activist, a corporate CEO, and a conservative Supreme Court justice. The closest thing on the right is the online coterie claiming that Trump supporters who stormed the Capitol on January 6, 2021, did nothing wrong, either because they were let in or were duped into entering by a government plot. But to speak up on behalf of J6 defendants, even to the point of alleging conspiracies, is not the same as celebrating the assassinations of Kirk and Thompson and lamenting the misses on Trump. I hope no comparable figure on the left becomes a target, thereby allowing us to ascertain whether there is a comparable fan base for assassination on the right.

We should also note that even “lone gunmen, acting alone” have to get their ideas about whom to target from somewhere. They, too, have social networks, which likely traffic in in-group suggestions about who in the out-group are the worst of the worst. So we are now living in a political culture in which a potential would-be assassin can count on a social network for inspiration and an outpouring of public support after the fact. This is fertile ground for evil, perhaps because assassins always believe they are doing good. And we may be cultivating more and more of them.

This article was originally published on October 21, 2025 in Commentary.

A Power-Driven Foreign Policy

The question is no longer whether we should have a “power-based foreign policy,” but rather how we most effectively assert our power.

Over the past eight months, it has become clear that on foreign policy, the biggest contrast between the Trump administration and the Biden administration is their respective attitudes toward American power. Note that the actual power capabilities of the United States did not change with the transition. Rather, it was the willingness to assert American power that increased dramatically—as was sorely needed.

The Biden administration’s liberal internationalist “values-driven foreign policy” saw the United States as playing a global leadership role in creating and preserving a “rules-based international order” with which US national interests need not conflict. Its values emphasized process and the pursuit of agreement about the rules needed to achieve the level of cooperation and multilateral engagement that, according to liberal internationalism, would lead to global stability. American power was at best a backdrop for these aspirational endeavors, and more often than not, it was seen as an inconvenience or even an outright impediment.

Ironically, this aversion to American power resulted in a world far less hospitable to the very rules the Biden administration wanted to see followed. More alarming, it put American interests squarely in jeopardy. The fiasco of the Afghanistan withdrawal was as flagrant a show of powerlessness as the United States has put on since the fall of Saigon, made all the more horrific by the senseless loss of 13 American servicemembers. This impotence undoubtedly emboldened Vladimir Putin in weighing his decision to invade Ukraine, after which the Biden administration’s support for Ukraine was timid, heavy on rhetoric while light on military value, for fear of Russian escalation. Thus, the brutal stalemate continues today.

While happy to convene “summits for democracy” and needlessly castigate allies for failing to embrace progressive policy preferences, the Biden administration appeased and coddled repressive regimes like Iran and Venezuela but spurned the Iranian and Venezuelan people who were desperately fighting for their unalienable rights. And rather than recognizing China as an adversary and taking sufficient actions to deter the threat, the Biden administration clung naively to hopes that it could convince Beijing to be a responsible global actor. Under the Biden administration, the United States, though not actually weak, consistently acted as if it were.

In stark contrast, the Trump administration has little reservation about flexing and deploying American power. In place of the chaotic images of Kabul, the world saw American B-2 stealth bombers fly untouched into the heart of Iran and deal a devastating blow to its nuclear program. After declaring, quite correctly, that Venezuelan strongman Nicolás Maduro was not only an illegitimate dictator but also the head of a narco-terrorist organization that partnered with China, Russia, and Iran to threaten American interests, the Trump administration deployed a guided missile cruiser, attack submarines, destroyers, and amphibious vessels reportedly carrying a Marine expeditionary unit just off the Venezuelan coast. After the Trump administration threatened to take back the Panama Canal in response to growing Chinese influence over this critical choke point for open sea lines of communication, Panama agreed to withdraw from China’s Belt and Road Initiative. President Donald Trump’s insistence that European allies spend more on defense paid off after Putin’s full-scale invasion of Ukraine, as our allies came to realize that European security must be their primary concern.

While not every flex has necessarily been productive—threatening to annex Greenland and failing to sufficiently distinguish allies from adversaries on things like tariffs come to mind—Trump rightly grasps that national power is the foundation upon which everything rests. Contrary to the wishes of Wilsonians, international “rules” and “norms” are not self-executing but require power to maintain.

Moving forward, the question is no longer whether we should have a “power-based foreign policy,” but rather how we most effectively assert our power. That said, this assessment need not and should not be strictly confined to cold realpolitik calculations. Every great power in history has sought to shape the world to reflect, or at least be compatible with, the values it views as most essential to its own identity—not in lieu of advancing its national interests but as a means to advance its interests.

To be compatible with our national character, American foreign policy should take into consideration how our power can further an enlightened view of our national interests, consistent with our founding principles—the belief in the God-given dignity of every person, a commitment to ordered liberty at home, and a strong preference for free and open spaces abroad. Our foreign policy should be centered on strength, on promoting good behavior and deterring bad behavior among international actors, and should be measured not against amorphous globalist ideals or progressive policy preferences but against the standard of what serves American interests as enlightened by those founding principles.

This shift to a power-based foreign policy is made all the more urgent by the reality that the United States has enemies. We do not have to invent them, nor do we seek them out. With some regularity, they announce themselves and their grievances against our way of life and the international order our power currently upholds. Though they are free to criticize as they please, when states or non-state actors take action, especially violent action, in opposition to US national interests, the game must change.

The United States should be unsparing in countering such challenges, meeting them in proportion to the danger they pose. We must deter those who identify themselves as our enemies as best we can and defeat their challenges when we must. We should cajole them when possible, threaten them when necessary, and punish them for bad actions that harm us.

In an era when adversaries like China, Russia, North Korea, and Iran are coordinating to undermine our values, interests, and leadership around the world, the World War II and Cold War-era moral distinction between good and evil, or at least between better and worse, is due for a comeback. This distinction should not serve as a set of handcuffs—America has worked with despots before, and will need to again now, in order to stave off greater threats—but rather as a needed rejection of any moral equivalence between the United States and our chief adversaries. While liberal internationalists will persist in hoping that these repressive regimes will become responsible global citizens if we can just find the right words to persuade them to reform, the people of Xinjiang, Ukraine, and Israel know far too well how naïve this is.

The irony is that a strong United States pursuing its national interest in furtherance of its founding principles is the best way to secure and advance a freer, more open world. But American power must remain the top priority, as there is no order without power to sustain it. This may be less sentimental or moralistic than what we have been used to, but there’s a good chance it will be more successful.

This article was originally published on September 5, 2025 in The National Interest by Tod Lindberg and Corban Teague.

The Good Books

Review of ’13 Novels Conservatives Will Love (but Probably Haven’t Read)’ by Christopher J. Scalia

Christopher J. Scalia has written a book that takes a valiant stand against the self-obsessed screen-culture spirit of our times. It’s called 13 Novels Conservatives Will Love (but Probably Haven’t Read). Scalia, now a senior fellow at the American Enterprise Institute, is a former English professor, and he has a deep and abiding love for literature as well as an evangelical streak that compels him to spread the joy. “Why read fiction?” he asks, and replies, “Simple: great fiction is a source of beauty, and beauty is good.”

This sentiment stands sharply in contrast to the milieu in which Scalia’s book appears. For years now, it seems that every day has brought a new story about how young people find it hard to read at book length, so thoroughly steeped are they in other media, especially short-form video content on their phones. While some enter the rejoinder that old people have been complaining about the declining literacy of youth for generations, this observation doesn’t clear the hurdle of the possibility that the literacy of youth actually has been declining for generations, with TikTok the impetus for the latest fall-off.

The arrival of generative AI in late 2022 has made matters worse. Now you can get a summary of any book you want out of ChatGPT or Grok (though there is substantial risk that the AI will tell you something that isn’t true). Beauty aside, whatever utility once came from reading a book—say, the ability to write an assigned paper—can now be fulfilled by more efficient means. Nowadays, one can also bypass the summary and prompt AI to write the whole paper.

Under such circumstances, what is the utility function of reading? This is the first problem Scalia is up against. The second, which paradoxically points toward a solution to the first, is our gaping political polarization. With regard to literature, the leftward extreme has little to no use for works from the past, the authors of which suffer from the fundamental deficiency of bad character in the view of their modern readers—actually, their modern nonreaders—who feel themselves to be undeniably superior in sense and sensibility. At best, the malignant dead offer passages that can be pressed into service in support of one’s position in current controversies—a use of literature that is hardly novel, though rarely admirable.

A recent example is an article in the New York Times that recasts Jane Austen’s Mansfield Park as an extended hidden polemic against slavery. I’d say the urge to read Mansfield Park as an abolitionist tract begins with the recognition that Austen’s genius is undeniable—and therefore deserving of a context, however stretched, in which it can resonate with today’s bien-pensant opinion. Although this reading, if true, would reduce Mansfield Park to a middlebrow problem novel—which it isn’t—the bright side here is that a new reader looking for a self-affirming anti-slavery allegory might make the pleasant discovery of something different and better.

The ransacking of the past in search of good material has at least the virtue of proceeding from a core hypothesis that literature has intentional meaning accessible to careful readers. The denial of this proposition is an even worse practice of the literary left. I have a higher degree of tolerance for critical theory than most in my political demographic—but not to the point at which authorial intention and meaning get dismissed as inaccessible and irrelevant as reader confronts text. Yet that view began its takeover of mainstream academia more than two generations ago, starting with Roland Barthes’s 1967 proclamation of “The Death of the Author.”

Within a decade, deconstruction and other fashions of theory had become so dominant that two UK novelists and professors, David Lodge and Malcolm Bradbury, could put forth a years-long tag-team procession of campus novels hilariously lampooning the phenomenon and its practitioners. Theory fully flowered as an object of satire with John L’Heureux’s 1996 The Handmaid of Desire, in which an ambitious professor at a school resembling Stanford (where L’Heureux taught) plots to replace the English Department with the “Department of Discourse and Theory”—even as he keeps locked in a desk drawer a copy of Austen’s Emma, to which he secretly repairs in times of stress.

The combination of screen culture, runaway presentism, and the triumph of theory over author gives Scalia the opportunity he has seized: making the case that conservatives should devote some of their time and intellectual energy to conserving the literary tradition of the novel. Ruling out already well-known candidates such as the Austen novels, 1984, and The Bonfire of the Vanities, Scalia has picked 13 entries for his list, offering for each a summary interpretive essay, including relevant biographical details, and consideration of how the work resonates with conservative sensibility. He spells out the elements of the conservative disposition he sees reflected in his selections as follows:

They include the preference for gradual social and political change over sudden innovation and revolution; the recognition of the imperfectability of mankind and the consequent dangers—and inevitable doom—of utopian projects; an inclination toward time-tested traditions over abstract theory and untested innovation; a respect for religious belief, particularly in the Judeo-Christian tradition; and an emphasis on the institutions of civil society, especially the family.

It’s off to the books, then, with chapters starting with Samuel Johnson’s 1759 Rasselas, Prince of Abyssinia, and proceeding chronologically to Christopher Beha’s 2020 The Index of Self-Destructive Acts. And here’s where a certain competitive streak, as well as a certain modesty, must kick in.

In Lodge’s 1975 novel Changing Places, which kicked off the duet with Bradbury, a young British professor of English, on exchange at a university resembling Berkeley, introduces his American colleagues to a party game from home. It’s called “Humiliation.” Players take turns naming a work they haven’t read, and they get one point for each player in the group who has read it. So to win, the humiliation one inflicts is upon oneself. In the Changing Places installment of the game, one assistant professor, seized by competitive spirit, blurts out “Hamlet!” No one believes him, but he swears an oath that he’s telling the truth, eventually storming out of the room over colleagues doubting his veracity. So, yes, a member of the English faculty who hasn’t read Hamlet. For his department colleagues, this is a bit much. He unexpectedly flunks his tenure review three days later and is driven into exile.

Scalia’s list immediately causes one to do a conservative literacy tally in line with “Humiliation.” To play this version, just give yourself a minus-one for each unread novel from the two above on Scalia’s list and the 11 following: Fanny Burney’s Evelina, Walter Scott’s Waverley, Hawthorne’s The Blithedale Romance, George Eliot’s Daniel Deronda, Willa Cather’s My Ántonia, Zora Neale Hurston’s Their Eyes Were Watching God, Evelyn Waugh’s Scoop, Muriel Spark’s The Girls of Slender Means, V.S. Naipaul’s A Bend in the River, P.D. James’s The Children of Men, and Leif Enger’s Peace Like a River.

I won’t tell you my score. But I will admit that I hadn’t read Johnson’s Rasselas or Beha’s Index before I agreed to review Scalia’s book. To check his work, I then did read them. It turns out that he’s a reliable and entertaining guide.

“Johnson wrote Rasselas over the course of a week in 1759 quite simply because he needed the money,” Scalia notes. And indeed, it reads like something written in a week by someone who needed the money—provided the someone in question was the towering literary figure of the eponymous Age of Johnson. In it, a young Abyssinian prince and his sister break out of their elegant captivity in the “Happy Valley” in the company of an older and wiser man, Imlac, who has traveled the world. He guides Rasselas in the search for his “choice of life.”

Rasselas is at times very funny. For example, Imlac tells the travelers about the time he once spent with one of the greatest astronomers in the world. The astronomer’s studies of the movement of celestial bodies have forced him to the conclusion that through his influence on them, he has the power to control the weather—though he has concluded it best not to do so. Seeking inner peace, the troubled astronomer solemnly transfers his unique power to Imlac. At the conclusion of Imlac’s tale, his auditors are amused to varying degrees by the madness of the astronomer. Imlac upbraids them: “Few can attain this man’s knowledge and few practice his virtues, but all may suffer his calamity. Of the uncertainties of our present state, the most dreadful and alarming is the uncertain continuance of reason.”

Scalia notes that the travelers’ journey becomes a stage for Johnson’s depiction of his ceaseless “belief in a universal human nature.” In the end, the journey is one toward an understanding of human potential and its perils, not a preparation for a culminating “choice of life.”

Of The Index of Self-Destructive Acts, Scalia notes that “Beha bristles when reviewers and interviewers compare him to Tom Wolfe.” But the family resemblance to Bonfire is unmistakable: money and influence in upper-crust New York City, ambition, selfishness, bad choices leading to the inexorable pressing in of fearsome consequences.

Though Wolfe’s powers of observation are keener, Beha has more psychological depth and wider intellectual range. That includes an exceptionally well-rendered character, Margo, a sometimes-aspiring poet whose interior monologues brim with unattributed passages from Wordsworth. She has set herself to the slow-moving task of seducing the novel’s married protagonist, Sam, for whom the affair is a close-run thing: “He was trying to do something impossible. He wanted to become someone else, but to do it while staying himself. He wanted to be the person who slept with Margo Doyle while remaining the person who was faithful to Lucy. It contradicted the foundational laws of Boolean logic.” Sam is a data journalist.

Scalia rightly calls Index “a novel about endings,” which he relates to George F. Will’s contention that the “foundational conservative insight” is that “nothing lasts.”

One could raise principled objections to Scalia’s project in its entirety. The appeal of great or even good literature is universal and should not be contingent on its consanguinity with the political preferences of today’s readers. We should read novels for their beauty and insight, not in search of affirmation of our pre-existing convictions. The problem is that while everybody used to think that, it’s now a view that many reject. Its remaining supporters are almost by definition culturally conservative.

Scalia didn’t pick this fight with progressive presentism, or with the threat screen culture poses to art. The fight began with an assault on the beauty and insight of the great “content creators” and “influencers” of the past. It’s ongoing, and Scalia is right to join it.

This article was originally published on August 26, 2025 in Commentary.

Reason to Believe

Review of ‘Believe’ by Ross Douthat 

The title of Ross Douthat’s new book, Believe, is a verb in command form referring to God. Yet the ambition of the author is hardly on par with that of a revivalist preacher or a biblical prophet warning of God’s coming justice. Rather than a command, Douthat offers an invitation—to set aside the modern secular prism through which most of our assessment of contemporary morals, manners, and politics refracts and to reopen our eyes to the possibility of reasonable belief that God created our world and ourselves, has intervened in it from time to time in the past, may be doing so now, and may again.

Discourse that isn’t avowedly religious these days is instead thoroughly secular. Douthat, a believer and a columnist for the New York Times whose work often takes up religious matters, is the exception who proves the rule. I wouldn’t proffer the claim that no one else at the Times believes in God. But if they do, they certainly don’t let it affect their work. The world the journalists describe is one of natural causes exclusively, straight back to the Big Bang or some other “forever.” They believe human existence itself, including the thoughts we harbor and the actions we choose to take, emerges from an evolutionary process that began with life far less than human. History is one thing after another, perhaps bending toward progress, but certainly not providential. The Times will report as needed on opinions human beings have about supernatural or God matters—the fact that people believe. But the truth or falsity of any such belief is a nullity with regard to explaining how the world works.

For example, as Mark Lilla remarks in a discussion of modern religious revivalism in his recent book Ignorance and Bliss, “A national poll in 2012 revealed that well over half of all Americans believe in the possibility of demonic possession.” He notes, “The exorcist for the Archdiocese of Indianapolis told a journalist in 2018 that he had received seventeen hundred phone or email requests for exorcisms in that year alone.” He adds: “Which is madness.” I have little doubt Lilla is mostly correct in his judgment. But entirely? It falls to Douthat to inquire whether one reason the numbers are rising is that demonic possession is real. He holds that possibility open—and even cites the profusion of such beliefs and cases in a supposedly secular age as evidence that we are missing something. That something would be the ongoing “enchantment” of a world supposedly “disenchanted,” that is, done with God and the supernatural.

_____________

Douthat asks us to conduct a thought experiment: Imagine living in a world in which pretty much everybody believed in God. This is not necessarily a closed-off and benighted Dark Ages in which all secular activity must defer to ecclesiastical authority. It’s also the predecessor to our world now. You could pray to God for a good harvest, but doing so would not relieve you of the responsibility of being a competent farmer. You could wonder at God’s creation while rigorously investigating how it works, from the movement of celestial objects to the workings of the human body—in order to more fully appreciate and give thanks for God’s handiwork.

His conclusions from this thought experiment are twofold. First, there is nothing essentially incompatible about a world of belief and a world in which science and technology proceed apace. Second, the conclusion that secularization is an irreversible process that has permanently supplanted belief in God among rational human beings is nonsense. God remains a distinct possibility.

The first chapters of Believe spell out Douthat’s case for why believing is reasonable and indeed preferable to nonbelief. What caused the Big Bang, that light in the void? What caused a lifeless world to sprout vegetation and, a couple of days or eons later, to teem with living creatures? How is it that humans have consciousness and minds, including free will? Douthat is neither a physicist nor a biologist nor a neurologist investigating the workings of the brain. Rather, he is a journalist of an endangered species, endowed with seemingly limitless skeptical curiosity to find out as much as he can about subjects that really matter. He has read widely and deeply enough to bring others up to date on the latest science and its compatibility with belief. In other words, he has updated the maps of the various cul-de-sacs in which science and reason find themselves in their search for a godless explanation for all that is—the final ruling out of the Almighty.

At times, however, Douthat wants to go further, inferring the necessity of a designer from the appearance of design in the makeup of the universe, life, and mind—that is, a rational human necessity to believe. Science has shown that the universe is constructed according to such rigorous specifications that if even one step in the instruction manual had varied infinitesimally, the whole thing would be impossible. Without an omniscient and omnipotent creator, there could be no creation or universe at all. This argument attempts to fill with a logical leap the gap between the limit of what science can aspire to know and a creator-god.

The problem here, as I see it, is that it makes no sense to speak of our universe in terms of its probability or improbability. It’s here, and we’re in it. Though those of us so inclined should investigate its workings as thoroughly as possible, its factuality is self-evident, requiring no explanation by us. William James, in The Varieties of Religious Experience, recounts how the 19th-century transcendentalist philosopher Margaret Fuller declared, “I accept the universe”—to which her contemporary Thomas Carlyle immortally responded, “Gad, she’d better.” Even the transcendentalists understood they had little choice in the matter. We may believe the universe exists because God created it, but knowledge in the sense of empirically verifiable science eludes us, and the universe rolls on.

_____________

Thus we have the broad contour of Douthat’s case that “the basic justifications for a religious worldview are readily accessible to a reasonable human being.” But his book also has a subtitle: Why Everyone Should Be Religious. That raises the thorny question of which, if any, religious tradition to embrace beyond one’s personal faith in God. Douthat is Roman Catholic, but he states in his introduction that “my aim for this book is to be useful to readers who might take many different religious paths.” He saves his Catholic apologetics for the last chapter and presents it as a case study of the broader faith formation for which he advocates. He allows for readers who will “choose to close the book just before that chapter.”

At this point, Believe becomes something of a self-help book. Its hypothetical audience comprises those readers Douthat has successfully persuaded of the reasonableness of belief to the point of their actual belief. What then? Douthat imagines a “Spirituality” section of a secular bookshop. Side by side are works on Judaism, Christianity, Islam, Buddhism, Hinduism, Mormonism, the occult, demonology, Wicca, astrology, and more. His advice for pilgrims in choosing a book is to look at those representing the biggest religions first—simply because their success in winning adherents suggests that they’re on to something. And because his case for God opens the door to demons that Mark Lilla keeps shut, Douthat warns off those embarked on his self-help journey from dabbling in the occult and other forms of asking for trouble.

Douthat explicitly acknowledges that he is at risk of “perennialism” here—that is, of judging all major religions to be converging on “permanent truths about God and the cosmos” and perhaps as well on a set of common moral and ethical teachings about the good life and how to lead it, quite apart from the particular revelations at the core of each tradition. His book is at its least attractive with this guidance for seekers: “If you find the general case for faith convincing but Islam’s traditional attitude toward women retrograde or the Catholic Church’s teaching on, say, masturbation ludicrous, then you should seek out the forms of religion that agree with you, build them up and let them try to build you up; become the change you seek in the religious world.”

Taking this statement more seriously than it deserves, I’d say it smacks a bit of reserving for man the right to show God who’s boss—which invites the divine rejoinder “Where were you when I laid the foundations of the earth?” And, of course, promoting belief in the premise underlying God’s question to Job is, after all, Douthat’s main point. He’s just too nice to say: Disobey God’s law and his prophets if you wish, but be prepared for the possibility of consequences.

As to where the foundations of the earth came from, that’s not something science or logic can tell us. But apropos of Douthat’s primary contention, it’s entirely reasonable to believe the answer is God.

This article was originally published on May 15, 2025 in Commentary.

Leading the Free World

The United States has led the free world for eight decades, helping to usher in an era of unprecedented human flourishing. For much of that period, democracies expanded in number and quality. Governments recognized and increasingly protected human rights.1 Americans identified the solidity of their own democracy with the support of freedom abroad, and those seeking to abridge fundamental rights appeared increasingly ineffectual and anachronistic.2 The United States, working with an expanding number of free nations, enjoyed greater security, prosperity, and liberty.

Developments over recent years have pulled in the opposite direction. Global democracy has contracted over the past decade, and autocracies such as China and Russia are both newly emboldened and working together.3 Populations doubt democracy’s efficacy to a greater degree than before, and some find attraction in the notion of strongman rule.4 Transnational repression and foreign malign influence have risen, along with the movement of illicit funds across borders.5 Americans increasingly question the quality of their own democracy, along with their traditional role in supporting rights and freedoms abroad.6

The next year could mark a turning point in the contest between freedom and authoritarianism. With a presidential election looming, now is the time for the United States to reassert global leadership on democracy and human rights. Doing so strengthens the U.S. position amid strategic competition with key autocracies and helps protect America’s own democratic way of life. Failure to do so would amount to unilateral disarmament in a defining contest of the 21st century. 

This short paper urges U.S. policymakers to seize the moment, recommit to a values-based foreign policy agenda, and combine their defense of U.S. democracy with an affirmative effort to support democracy and human rights abroad. The time to act is now.

For the United States, supporting democracy is a matter of both values and interests. It helps mobilize the American public around U.S. foreign policy. It provides purpose and direction to Washington’s international efforts, beyond narrowly construed national interests. And history demonstrates that democracy promotion has been a powerful way to advance global stability.7

The task is urgent today. Freedom declined across the world for an 18th consecutive year in 2023.8 Beijing and Moscow seek a global order conducive to their own forms of authoritarian governance, and they work increasingly with countries such as Iran and North Korea in the pursuit of their preferred norms. They see their assault on democracy as a pursuit of strategic advantage, enabling them to enhance their own power by eroding the internal cohesion of democracies and the solidarity of democratic alliances.9 They wish to show that pluralism fragments a population, leaving it unable to produce results or project power that can match the strongmen. The future shape of global and domestic politics—whether based on liberal order and universal values or autocracy and might-makes-right—will be determined in significant part by how Washington engages the contest.

U.S. leadership in that contest appears particularly ill-timed to some. America’s domestic maladies are obvious and include deep partisan divisions, political gridlock, declining respect for democratic institutions and processes, and even politically motivated violence.10 Some observers suggest that, given America’s difficulties, it simply lacks the credibility to stand up for democracy and freedom elsewhere.11 Others cite close U.S. ties with autocrats and wars of regime change to emphasize inconsistency and hypocrisy.12 Amid sharpened great power competition, still others argue that a values-based foreign policy agenda is a luxury better suited for less contested times.13

Yet promoting democracy abroad and addressing deficiencies at home are not mutually exclusive activities—they are, rather, reinforcing lines of effort. Threats to democracy, after all, do not respect borders. The United States is not unique in experiencing political violence, deep divisions, or eroded trust in democratic processes. These and other pressing challenges, including state-based political interference, are often best addressed in concert with partners and allies.14 And societies, including our own, ebb and flow, but genuine democracies retain fixed ideals. America should embrace its founding principles and expand the enjoyment of universal rights and liberties. To abandon the effort because of our own flaws would be unfaithful to the fundamental idea of America.

It would also undermine U.S. security. Fostering democratic values not only aligns with America’s deepest ideals, but also helps create a more secure, stable world in which the United States can advance its national interests.15 Democracies are unlikely to go to war with one another, the United States’ closest allies are democracies, and its most reliable trade and investment partners are liberal societies.16 Open, transparent governance abroad is good for U.S. diplomatic, defense, and commercial relationships. A world in which the institutions of liberal democracy are strong is safer for the United States than one in which autocracy is on the prowl.17

If U.S. policymakers decline to seize the values imperative, they risk a world shaped by autocratic preferences and dominated by dictatorships. Acting now is the best chance to defend democracy and champion a robust agenda for protecting and enlarging the free world. The current administration and the next should recommit to putting human rights and democracy at the center of U.S. foreign policy. 

The effort will need to go well beyond rhetorical exhortations. Washington must combine all instruments of national power to reinforce democracies, sustain them, and make them successful. Tradeoffs with other objectives will be inevitable, and an exhaustive list of activities goes beyond the scope of this analysis. We recommend, for a start, the actions below.

Expose human rights abuses and corruption. The U.S. government has effectively collected and declassified information about Russian depredations amid its war in Ukraine.18 Washington should do the same in cases of human rights violations. Doing so would employ public exposure to hold governments accountable for their actions—and possibly deter further abuses. At a minimum, such efforts could catalog human rights abuses and corruption for future efforts at accountability.

Counter corruption and hybrid threats. Washington should prioritize countering and building resilience to hybrid threats from authoritarian governments, including but not limited to weaponized corruption. Washington should identify and publicize instances of corruption by malign foreign actors. The Department of Justice can play an anticorruption role by boosting its efforts to monitor illicit commercial spyware and enforce antibribery laws such as the Foreign Corrupt Practices Act. The Treasury Department may need greater funding to monitor corruption and implement sanctions against foreign transgressors. 

Speak out. Senior policymakers and members of Congress should lend their voices and images to democratic dissidents and activists. This tool has fallen out of favor over the years but previously had been used to great effect by Republicans and Democrats alike.19 Meetings with political opposition figures, independent media actors, and others should be a regular feature of congressional delegations and administration trips abroad. Members of Congress and administration officials should direct speeches, opinion articles, resolutions, and statements at both the general human rights conditions of particular countries and individual actors who may be at risk.

Emphasize the role of Congress. Congress has historically served as a guardian of the democracy agenda, and it should continue to do so today. In 2019, for instance, Republicans and Democrats alike blocked the White House’s attempt to slash foreign aid by an estimated $4 billion.20 A bipartisan group introduced legislation earlier this year to hold Georgian officials accountable for corruption, human rights abuses, and antidemocratic efforts.21 Congressional initiative is essential to maintain focus on these issues across administrations and among competing White House priorities.

Protect individual privacy. Dictatorships today are more resilient in part because of how they harness technology—often of U.S. design and source—to influence and control their citizens.22 The United States should take better care to protect individual privacy as a basic right and foundation of democratic societies. The U.S. government has already taken steps to design an export and sanctions regime aimed at preventing the proliferation of hacking tools, facial recognition technology, and other surveillance technologies.23 These efforts should be expanded. Future administrations should, for instance, examine the export of advanced U.S. semiconductors that can train AI systems aimed at social repression. 

Partner with other democracies. The United States cannot defend its values without like-minded partners. In the past, for instance, each country subject to election interference responded individually and on an ad hoc basis.24 A coalition of key democracies, adopting a mechanism akin to NATO’s Article 5, should pledge a collective, nonmilitary response to election interference by foreign states, such as compromising voting machines or illicitly hacking campaigns.25 Democratic partners should also forge new issue-based multilateral groupings that collaborate in areas such as technology, foreign aid, and electoral assistance to ensure that each is infused with democratic values. 

Contest authoritarianism in international organizations. Key international bodies, ranging from the UN Human Rights Council to more tailored groupings such as the International Telecommunication Union, have emerged as venues of competition between democracies and their autocratic opponents.26 Washington should work with its allies to counter authoritarian influence in the multilateral system and promote liberal values in decision-making arenas. 

Combat transnational repression. Transnational repression (TNR)—actions by a government to reach beyond its borders to stifle dissent, most commonly by suppressing democracy and human rights advocates—is on the rise. Countries such as China have attempted to harm dissidents even inside the United States, infringing on rights intrinsic to U.S. democracy.27 A more vigorous approach is necessary to prevent autocratic waves from washing onto American shores. Washington should allocate increased funding to identify and combat such efforts, including through indictments, extraditions, sanctions, and other measures, and work more closely with allies to expose and arrest TNR elsewhere. 

To these ends, policymakers should consider an incentive framework to deter acts of transnational repression, perhaps one modeled on the Trafficking in Persons (TIP) framework. The TIP report, published annually by the U.S. government, places countries into one of three tiers based on their efforts to combat human trafficking.28 A similar framework could assess a government’s acts of transnational repression or its efforts to mitigate such acts. 

Publicizing tiered rankings based on TNR, as has been done with the TIP framework, could incentivize governments to act. If incentives prove insufficient, more punitive approaches should be considered. Engaging in acts of transnational repression on U.S. soil, for example, could incur reduced arms sales or diplomatic sanctions. U.S. officials should also take steps to better enforce existing laws, including by implementing Section 6 of the Arms Export Control Act.29 That provision enables the president to prohibit arms transfers to countries that habitually intimidate or harass individuals in the United States. 

Increase transparency. Military assistance to front-line states such as Ukraine is vital, as are sanctions and other punitive measures levied against Russia and other autocratic aggressors. Oversight of and transparency in such regimes would help ensure that Washington is aiding only those worthy of American support and punishing entities undermining key values. It may also enhance the domestic political sustainability of such efforts. Instead of neglecting congressional oversight for enormous defense aid packages, for instance, Washington should integrate oversight mechanisms into them across both the executive and legislative branches.30 Providing Congress with the details of sanctions regimes should receive similar attention.

Prioritize budgets. Effective work on democracy or human rights requires appropriate funding—some of which is wanting. The administration’s FY25 budget request allocates over $3 billion for bolstering global democracy—up $88 million compared with the FY23 enacted level, including investments in the Summit for Democracy.31 That funding is, however, spread across projects as varied as election assistance and new infrastructure in emerging democracies—making it a less impressive amount than at first glance.32 It also remains far outpaced by the increased global demand for democracy assistance. Individuals around the world risk their lives in the pursuit of democracy and liberal values, from soldiers fighting in Ukraine to Afghan women resisting social and political exclusion.33 Washington, together with its allies, should meet their needs with the urgency—and financing—they deserve.

Conclusion

The above actions are designed to be illustrative rather than exhaustive. More important than any one recommendation is adopting a sense of gravity and urgency. Democracy and human rights are declining around the world.34 Autocracies are on the march and working together to overturn key elements of liberal international order.35 The United States, for all its flaws and other priorities, remains the indispensable champion of values and rights abroad. Now is the time to fuse values with interests at the core of U.S. foreign policy and to go on the offense. The United States must contest the expansion of dictatorship, outcompete autocracies, and demonstrate that democracies work—individually and together—more effectively than strongmen ever can. 

The good news is that America is wholly equipped to answer such a call to arms. It possesses the population, the geography, the resources, the allies, and the experience to support democracy and fundamental rights everywhere. Doing so is no easy task, and it involves difficult tradeoffs and judgments. But the global situation has grown more dire. It requires a renewed American commitment today.

This paper is the product of a bipartisan task force that examined the role that democracy and human rights do and should play in U.S. foreign policy. While individual signatories may differ on particular points herein, all endorse the broad scope of the paper’s analysis and recommendations.

This article was originally published on October 28, 2024 in The Center for American Security by Richard Fontaine, Shanthi Kalathil, Tod Lindberg, Tom Malinowski, Sarah Margon, Gibbs McKinley, Derek Mitchell, Nicole Bibbins Sedaca, Corban Teague, and Daniel Twining.

A Theory of Rawls

Review of ‘Liberalism as a Way of Life’ by Alexandre Lefebvre

Alexandre Lefebvre’s Liberalism as a Way of Life belongs to the school of Anglo-American political philosophy whose defining figure was Harvard’s John Rawls, author of A Theory of Justice and Political Liberalism. The Rawls school views the pursuit of justice as the cornerstone of a liberal society. But Lefebvre’s insightful account is also something of a departure, an original and at times exciting contribution to our understanding of liberalism—in the classical as opposed to the partisan political sense.

A professor at the University of Sydney, Lefebvre describes himself as a “liberal all the way down.” For him, as for Rawls, in deciding on laws and social arrangements that can perpetuate a just society, we must place ourselves behind a “veil of ignorance” in “the original position” of a member without attributes of such a society—that is, without knowledge of one’s place in it, whether one is rich or poor, favorably endowed with genetic and environmental gifts or encumbered by their absence. From this position, Rawls argues, reasonable people—knowing nothing about where they would fall in such a society—would write its rules in such a way as to favor the least advantaged among them, because once the veil is lifted, they could find they occupy exactly the least advantaged position.

No society has ever been created from such a premise, of course, and from a certain angle, it could look as if Rawls was rejecting all claims of justice on behalf of societies—or nation-states—that fail his test of putting the least advantaged first. Thus, one could read Rawls—and many did—as calling for radical reform and repudiating the legitimacy of states organized according to other priorities. But that’s not how Rawls saw it. His liberal theory of justice didn’t encompass a corresponding theory of injustice, according to which all societies reflecting deviations from reasonable conclusions behind the veil of ignorance were so disconnected from justice as to warrant condemnation. He thought they could improve.

In any event, Lefebvre notes, Rawls himself became somewhat dissatisfied with A Theory of Justice—notwithstanding its colossal success in his field and its standing, ever since its publication in 1971, as perhaps the most influential work of political philosophy of our time—on the grounds that it was “unrealistic.” So he turned to another question in his later work, Political Liberalism (1993). “How is it possible,” Rawls asked, “for there to exist over time a just and stable society of free and equal citizens, who remain profoundly divided by reasonable religious, philosophical, and moral doctrines?”

The answer is that competing but reasonable “comprehensive doctrines” at work among people could yield to a “liberal political conception” in which big-picture doctrines would be respected insofar as they were reasonable. For Rawls, a comprehensive doctrine is anything that spells out the details of how to live a good life: as an Orthodox Jew, say, or an Opus Dei Catholic, or a Communist, or a cultivator of Aristotelian virtue. Political liberalism would never seek the status of a “comprehensive doctrine,” but it could be the organizing and limiting principle according to which adherents of various doctrines could live in a stable society of free and equal citizens.

Lefebvre writes about Rawls’s evolution as a thinker very well. But his distinctive achievement is to note that nowadays Rawls needs turning around. That’s because, he says, “decade by decade, year by year, and day by day, liberal ideals and sensibilities have spread to every nook and cranny of the background culture of liberal democracies.” Further, “so ubiquitous is liberalism that it has performed that special trick of disappearance achieved only by omnipresence: to have become invisible by infiltrating everything.” Lefebvre’s conclusion: “Love it or hate it, we all swim—we positively marinate—in liberal waters. And here is my critique: the firewall that political liberalism draws between comprehensive doctrines and a liberal political conception obscures this changed landscape.” Lefebvre doesn’t quite say that liberalism has become a “comprehensive doctrine,” indeed the defining comprehensive doctrine, of modern democracies. But he ought to have.

_____________

It is true that liberalism offers no single formula for how to live a good life of the sort that once characterized states and their gods or ideologies. In that sense, it is not “comprehensive,” leaving to individual judgment or conscience many important questions about how to live. But liberalism does include at least one doctrinal element that overrides any and all presumptions of any and all comprehensive doctrines that might contradict it. That is the “reasonableness standard.” Liberals insist that adherents of comprehensive doctrines, whether they count themselves liberal or not, be “reasonable” in their adherence. Indeed, the word is central to Rawls’s research question—so much so that one could say he slipped his answer into the question itself. An unreasonable “comprehensive doctrine”—that is, a coercive or violent doctrine—cannot be part of a “just and stable society of free and equal citizens.” It is up to these contending comprehensive doctrines, almost all of which have historical associations with coercion and violent propagation, to modify themselves as necessary to become “reasonable.” Whether their adherents profess allegiance or opposition to a liberal political conception, their behavior must conform to it, or there will be adverse consequences for them.

Lefebvre has the acuity to see that, generally speaking, the behavior of individuals and organized groups in modern democratic societies is liberalism-compliant. He addresses his book mainly to those who identify themselves as through-and-through liberals in the sense of both the Rawls of A Theory of Justice and the Rawls of Political Liberalism—people who are, if not “liberals all the way down” like himself, then most of the way. Unfortunately, this nudges him into two related observational errors. As Peter Berkowitz notes, Lefebvre is too stingy in recognizing the genuineness of the liberalism of people who, for whatever reason, don’t identify themselves as liberals. As an exercise, Lefebvre would like liberals to imagine themselves in the “original position” when thinking about the fair distribution of social goods. That might be sufficient for a certain subset of liberals of the left-progressive sort, but if those were the only people practicing “liberalism as a way of life,” there would be no book to write about a society imbued with liberalism. The point is that most conservatives, most Orthodox and other Jews, most devout conservative and liberal Catholics, most evangelical Protestants, and many other non-progressives in Western societies, are nevertheless practicing liberals in daily life. They follow their “comprehensive doctrines” in a reasonable way—which is to say, within the overriding noncoercive liberal comprehensive doctrine. Although it may be tough for those steeped in the ways of Anglo-American political philosophy to accept, even the vast majority of Trump supporters are functionally liberal—not those who stormed the Capitol or those who think storming capitols is a good idea, but most everyone else. It may also be tough for people who spend their lives theorizing politics to accept that many Americans and other denizens of modern society don’t care very much about politics at all, and that’s fine.

His second observational miscue lies in his characterization of the gap between liberalism as the pursuit of justice or fairness and the actuality of the liberal world we live in. Rawlsian justice is what “liberals all the way down” want; but “liberaldom,” in Lefebvre’s coinage, is what we (all of us, whether we want it or not) actually have. The liberalism of liberaldom falls far short of what sweet reason would yield behind the veil of ignorance. In his own characterization, Lefebvre’s liberaldom is to liberalism as Kierkegaard’s “Christendom” is to Christianity—a complacent world in which we are all failures unable to live up to our professed ideals. In his view, liberals in liberaldom have much to answer for. He admits that he and his wife “spend a lot of money to send our daughter to private school” for the additional opportunity it provides, even though that might betray the egalitarianism required by Rawls’s conception of justice. “Now it’s your turn” to start your self-help program by making your own necessary admissions, he admonishes his liberal readers.

That is refreshingly honest, even though what such admissions really serve to illuminate is a key problem with Rawls’s approach to the pursuit of justice—which is that we will always strive to do more for those we love and who are in our own personal care, and that there is nothing unjust or immoral in that. Lefebvre is accordingly a little too hard on himself and on liberalism as we live it.

This article was originally published on October 15, 2024 in Commentary.