The Conversation

Iran's regime was built for survival — and a long war is now likely

The joint US–Israel strikes on Iran, which killed the Iranian supreme leader, Ayatollah Ali Khamenei, and Tehran’s retaliatory strikes on Israel and neighbouring Arab countries have again plunged the Middle East into war.

US President Donald Trump and Israeli Prime Minister Benjamin Netanyahu said their aim is to bring about a favourable regime change in Iran. The implications of this for Iran, the region and beyond should not be underestimated.

Although Khamenei’s killing is a significant blow to the Islamic regime, it is not insurmountable. Many Iranian leaders have been killed in the past, including Qassem Soleimani, Tehran’s regional security architect, who was assassinated by the US in January 2020.

But they have been replaced relatively smoothly, and the Islamic regime has endured.

Khamenei’s departure is unlikely to mean the end of the Islamic regime in the short run. He anticipated this eventuality, and reportedly last week arranged a line of succession for his leadership and that of senior military, security and political leaders if they were “martyred”.

However, Khamenei was both a political and spiritual leader. He commanded followers not only among devout Shias in Iran but also among many Muslims across the wider region. His assassination will spur some of them to seek revenge, potentially sparking a wave of violent extremist actions in the region and beyond.

A regime built for survival

Under a constitutional provision of the Islamic Republic, the Assembly of Experts – the body responsible for appointing and dismissing a supreme leader – will now meet and appoint an interim or long-term leader, either from among their own ranks or outside.

There are three likely candidates to be his successor:

  • Gholam-Hossein Mohseni-Eje’i, the head of the judiciary
  • Ali Asghar Hejazi, Khamenei’s chief-of-staff
  • Hassan Khomeini, the grandson of the founder of the Islamic Republic, Ayatollah Rohullah Khomeini.

The regime has every incentive to do what it must to ensure its survival. Many regime enforcers and defenders, led by the Islamic Revolutionary Guard Corps (IRGC) and its subordinate paramilitary Basij group, stand ready across the country to suppress any domestic uprisings and fight for the regime’s endurance.

Their fortunes are intimately tied to the regime. So are a range of administrators and bureaucrats in the Iranian government, as well as regime sympathisers among ordinary Iranians. They are motivated by a blend of Shi’ism and fierce nationalism to remain loyal to the regime.

Trump and Netanyahu have called on the Iranian people – some 60% of whom are below the age of 30 – to topple the regime once the US-Israeli operations have crippled it.

Many are deeply aggrieved by the regime’s theocratic impositions and dire economic situation and took to the streets in protests in late 2025 and early 2026. The regime cracked down harshly then, killing thousands.

Could a public uprising happen now? So far, the coercive and administrative state apparatus seems to be solidly backing the regime. Without serious cracks appearing among these figures – particularly the IRGC – the regime can be expected to survive this crisis.

Global economic pain

The regime has also been able to respond very quickly to outside aggression. It has already hit back at Israel and US military bases across the Persian Gulf, using short-range and long-range advanced ballistic missiles and drones.

While many of the projectiles have been repelled, some have hit their targets, causing serious damage.

The IRGC has also set out to choke the Strait of Hormuz – the narrow strategic waterway that connects the Persian Gulf to the Gulf of Oman and Indian Ocean. Some 20% of the world’s oil and 25% of its liquefied natural gas flows through the strait every day.

The United States has vowed to keep the strait open, but the IRGC is potentially well placed to block traffic through it. A closure would have serious implications for the global energy supply and the broader economy.

Both sides in this conflict have crossed all of the previous red lines. They are now in open warfare, which is engulfing the entire region.

A prolonged war looks likely

If Washington and Jerusalem harboured any pretence that their attacks would not lead to a regional war, they were wrong. It is already happening.

Many countries that have close cooperation agreements with Iran, including China and Russia, have condemned the US-Israeli actions. The United Nations secretary-general, António Guterres, has also urgently called for de-escalation and a return to diplomatic negotiations, as have many others.

But the chances for this look very slim. The US and Iran were in the middle of a second round of talks over Tehran’s nuclear program when the attacks happened. The Omani foreign minister, who mediated between the two sides, publicly said just days ago that “peace was within reach”.

But this was not enough to convince Trump and Netanyahu to let the negotiations continue. They sensed now was the best time to strike the Islamic Republic to destroy not just its nuclear program but also its military capability after Israel degraded some of Tehran’s regional affiliates, such as Hamas and Hezbollah, and expanded its footprint in Lebanon and Syria over the last two and a half years.

While it is difficult to be definitive about where the war is likely to lead, the scene is set for a long conflict – one lasting not days but weeks. The US and Israel will accept nothing short of regime change, and the regime is determined to survive.

With this war, the Trump leadership is also signalling to its adversaries – China, in particular – that the US remains the preeminent global power, while Netanyahu is seeking to cement Israel’s position as the dominant regional actor.

Pity the Iranian people, the region and the world that have to endure the consequences of another war of choice in the Middle East for geopolitical gains in an already deeply troubled world.

Amin Saikal, Emeritus Professor of Middle Eastern Studies, Australian National University; The University of Western Australia; Victoria University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Stephen Colbert was right

Talk show host Stephen Colbert made headlines on Feb. 17, 2026, when he wrapped a network statement in a dog-waste bag and tossed it in the trash.

He did it live, while on air.

The move came after CBS lawyers reportedly told him he could not broadcast a scheduled interview with Democratic Texas Senate candidate James Talarico on his show, The Late Show with Stephen Colbert. According to Colbert, the network warned him that broadcasting the interview could trigger the Federal Communications Commission’s equal time rule, which requires broadcasters to allow political candidates equal access to the nation’s airwaves.

CBS said it gave Colbert “legal guidance” that airing the segment could raise equal time concerns and suggested other options.

Colbert countered that in decades of late-night television, he could not find a single example of the rule being enforced against a talk show interview. He ultimately posted his Talarico interview on YouTube instead, where broadcasting rules don’t apply.

As a media scholar, I believe Colbert is right about the law. Congress has deliberately protected editorial discretion to prevent equal time rules from chilling political speech. And the FCC has extended this privilege to shows like his.

To understand why, you have to go back to 1959 and to a forgotten fight over the role of broadcasting in a democratic society.

Amending ‘equal time’

Because the airwaves have been viewed as a scarce public resource, radio and television broadcasting have been regulated to balance the First Amendment rights of the press with public interest obligations. That includes the need to provide reasonable access to the airwaves for candidates for office – so citizens can hear what they have to say, whether in the form of paid advertising or unpaid news coverage.

After first appearing in the Radio Act of 1927, the equal time provision was codified in Section 315 of the Communications Act of 1934.

That law created the FCC and still governs the use of the nation’s airwaves today. It requires broadcast licensees to provide “equal opportunities” to legally qualified candidates in a given election if they allow one candidate to “use” their facilities. The requirement was intended to prevent broadcasters from favoring one candidate over another and to foster robust political debate that would serve the public interest.

But the statute did not clearly define what counted as a “use.”

That ambiguity was a known issue, but it came to a head in 1959, when Lar Daly, a fringe Chicago mayoral candidate, filed a complaint with the FCC. He argued that if stations aired news clips of his opponents – including the incumbent mayor – as part of their routine coverage, he was entitled to equal time on air.

The FCC agreed, ruling that even routine news coverage of a candidate could trigger equal time obligations.

Broadcasters immediately warned that the decision would make political journalism nearly impossible. If every news interview or campaign clip required providing comparable time to every rival – including minor or fringe candidates – stations would either have to book everyone or drastically scale back political coverage.

NBC president Robert Sarnoff issued a thinly veiled threat in a message that was not lost on politicians who would be affected by the change: “Unless the gag is lifted during the current session of the Congress, a major curtailment of television and radio political coverage in 1960 is inevitable.”

Later that year, Congress stepped in and amended Section 315 to create explicit exemptions for “bona fide” newscasts, news interviews, news documentaries and on-the-spot coverage of news events. As my colleague Tim P. Vos and I note in our research on the history of the amendment, Congress rejected calls to repeal equal time altogether.

Instead, lawmakers preserved the rule for candidate-sponsored advertising while shielding news programming. Persuaded by broadcasters, lawmakers determined that professional journalism, guided by norms of balance and fairness, would best serve democratic discourse.

In signing the 1959 legislation, President Dwight D. Eisenhower highlighted the “continuing obligation of broadcasters to operate in the public interest and to afford reasonable opportunity for the discussion of conflicting views on important public issues.”

Eisenhower concluded by appealing to the good intentions of the nation’s broadcasters: “There is no doubt in my mind that the American radio and television stations can be relied upon to carry out fairly and honestly the provisions of this Act without abuse or partiality to any individual, group, or party.”

The talk show exemption

Over the decades, the FCC has interpreted the 1959 exemptions broadly.

Programs ranging from Meet the Press to The Jerry Springer Show to The Tonight Show and other interview-based broadcasts have been treated as “bona fide news interviews,” even when hosted by comedians. That’s why Colbert’s claim that there is no enforcement history against late-night talk shows is accurate.

It’s important to remember that equal time still applies in other contexts. If a candidate purchases or receives airtime for an advertisement, opponents are entitled to comparable access.

Equal time also applies to non-exempt entertainment programming, such as Saturday Night Live. Donald Trump’s hosting gig on SNL in November 2015 triggered an equal time request from four opposing primary candidates. And NBC obliged by providing a comparable amount of airtime for their campaign messages.

FCC Chairman Brendan Carr recently signaled he was considering eliminating the talk-show exemption, arguing that some programs are “motivated by partisan purposes.”

As of now, no legal change has occurred. And it seems to me that CBS has acted out of caution, responding to political and regulatory pressure rather than to an actual rule change. That makes this episode unusual: The equal time rule was perhaps applied indirectly, through corporate self-censorship, not through direct FCC enforcement.

Why this moment matters

Either way, the Colbert incident highlights the growing restrictions on editorial independence during the second Trump administration, whether imposed by government threat or driven by corporate fear.

Whether through direct regulatory intervention or indirect corporate influence, this incident and others like it show an increased willingness to interfere with the editorial independence of media producers.

The dispute is part of what some critics view as an ongoing effort by the Trump administration to silence criticism. Trump is no fan of Colbert and has targeted comedians before.

CBS announced in 2025 that Colbert’s show would be canceled in May 2026, leading many to suggest the network was trying to appease Trump and his FCC, particularly ahead of a then-pending merger that required FCC approval.

The 1959 amendment that created the equal time exemption aimed to preserve editorial independence and protect free expression by limiting equal time claims and ensuring vibrant political discourse. The decision reflected a judgment that professional editorial discretion, not mandatory equivalence, best served citizens.

If the FCC alters the exemption, it would represent a major shift in U.S. media policy and would almost certainly face legal challenges. The government has an important role to play in promoting free expression and protecting free speech, but this is a good time to be wary of efforts to alter regulations to control content.

Seth Ashley, Professor of Communication, Boise State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How Britain's right wing is benefiting from the Epstein scandal

The arrest of Andrew Mountbatten-Windsor on suspicion of misconduct in public office will heap yet more pressure on the beleaguered government of Prime Minister Keir Starmer.

Mountbatten-Windsor’s arrest over allegations he passed government documents to sex offender Jeffrey Epstein comes directly on the heels of the resignation of Peter Mandelson, Starmer’s ambassador to the United States, due to his own alleged associations with Epstein.

The fallout from the scandal is hugely damaging to public trust in both the political establishment and institutions in the United Kingdom, including the royal family.

Trust in the royals already declining

It’s hard to separate the fate and popularity of the royal family from the institutions of British governance because they’re very much part of it.

The monarchy, specifically the Crown, is part of the British constitution. The monarch gives assent to all legislation that’s passed by parliament (in other words, he or she has to sign it for it to pass). While that might seem like a rubber-stamping exercise and that the monarch is a mere symbol in British politics, King Charles and, in slightly different ways, Queen Elizabeth II certainly have had their political preferences.

And despite the impression you get during royal occasions like weddings, funerals and coronations, the royals don’t enjoy unanimous support in Britain. In fact, public support has been declining in recent years, especially among the young.

In an Ipsos survey released this week, just 47% of Britons said they had a favourable opinion of the royal family on the whole (a seven-point decline from November). And just 28% of Britons believe the royal family has handled the allegations against Mountbatten-Windsor well, compared to 37% in November.

Importantly, there’s been a long-term trend of steady decline in support for the monarchy since 1983, when the British Social Attitudes survey first asked about this.

More broadly, and in common with many other liberal democracies, there is a pervasive sense that the Epstein scandal is further evidence of a self-serving, corrupt elite enriching itself and harming others, while many people in the “left behind” and “squeezed middle” of society are struggling.

Politically, this perception adds further fuel to the notion that the inequality between the rulers and the ruled has become unjustifiable. Something has to change.

Pressure mounting on Labour

Starmer’s Labour government was already deeply unpopular before Mandelson’s alleged ties to Epstein were revealed. Now, it has entered some sort of permanent crisis mode.

Mandelson was one of the key figures behind the so-called “New Labour” project associated with the leadership of Prime Minister Tony Blair from 1997 to 2007.

New Labour has a dual legacy in British politics. On one level, it was the most electorally successful Labour government ever. But that electoral success seemed to come at the expense of a clearly defined sense of what the Labour Party stood for. Key players like Mandelson courted wealthy backers and moved Labour to the centre of British politics to, not unreasonably, win elections.

As such, many Labour supporters started to drift away from the party and towards other, at times diametrically opposed, political parties. In Scotland, this benefited the pro-independence parties. In England, it benefited the radical-right Reform UK.

Reform has precious little governing experience, but that is its appeal. Its radical messages are finding traction with a large number of voters, many of whom formerly supported Conservative or Labour.

In this context, Starmer’s gamble of reinstating Mandelson, an already divisive figure, to a public role by naming him ambassador to the US – in the belief he could help manage President Donald Trump – backfired.

Reform could ultimately benefit

The British government’s travails represent another gilt-edged opportunity for Reform UK to capitalise on the unpopularity of Starmer, Labour and politics more broadly. But there is a risk for Reform, too.

Radical-right parties tend to place a great emphasis on the figure of the leader. For Reform UK, this is Nigel Farage.

Farage has had an incredible impact on British politics, especially since Brexit. But Farage, a former merchant banker, is also part of this global elite, despite pitching his politics at the “left behinds”. He has spent years courting Trump’s friendship. So, while there are no allegations against him related to Epstein, the public anger towards elites in general may eventually rebound on Farage, too.

Reform UK, however, is positioning itself successfully as an alternative to the two major parties in the UK, and could form a minority government at the next UK-wide elections in 2029.

The Conservative Party has shot its bolt as a result of its 14 years in government. And Labour came to power more as a rejection of the Conservatives than an endorsement of its policies. It has thus far excelled in failing to meet these low expectations, to Reform’s benefit.

Excluding a by-election in February, the first major political test will be local government elections in England, and elections to the Scottish Parliament and Welsh Senedd in May. A poor Labour showing will quite possibly lead to a leadership challenge against Starmer, whose government seems incapable of stemming the rise of support for an emboldened Reform.

A boost to republicanism

“Unprecedented” is an over-worn term. However, the arrest of a member of the royal family is the first in England since Charles I in 1647 (it didn’t end well for him).

Prince William is still very popular. But there could still be very serious consequences for support for the monarchy in the various nations of the United Kingdom.

There isn’t the same sort of support for republicanism in England as there is in Australia, where republicans can delegitimise the king as a “foreign” monarch. Although this argument is also made by republicans in Northern Ireland, English republicanism needs to be driven by some other sentiment.

And the Epstein crisis could be it, given it is drawing attention to gross inequality and damaging entitlement. It’s hard to see where exactly all this will end up, but it is quite possible this will give the greatest boost to anti-monarchical sentiment in England for some centuries.

It is important not to forget the women and girls who were victims of this rich man’s cabal. Yet, one great harm of the Epstein scandal in Britain is the further damage done to trust in institutions of governance and the boost it provides for the illiberal critics of what seems like a decaying order.

Ben Wellings, Associate Professor in Politics and International Relations, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Clementine Barnabet: The Black woman blamed for serial murders in the Jim Crow South

In April 1912, a young Black woman named Clementine Barnabet confessed to murdering four families in and around Lafayette, Louisiana. The widespread news coverage at the time effectively branded her a serial killer.

Her confession, however, did not align with the timeline of crimes that had gripped America’s rice belt region with fear. Even today, her guilt is debated.

From November 1909 until August 1912, an unknown assailant – or assailants – zigzagged across southwestern Louisiana and southeastern Texas. Many Black families were slaughtered in their homes under the cover of darkness. An ax – the telltale weapon – was almost always found in the bloody aftermath.

All but one of the scenes were located within a mile of the Southern Pacific Railroad’s Sunset Route. In each case, a mother and child were among the victims. Evidence of additional weapons was often found nearby, suggesting a deliberate cruelty to the carnage.

Dubbed the “axman”, the unknown assailant eluded the authorities and terrified local Black communities.

Today, when scholars and laypeople alike discuss Clementine Barnabet, they oscillate between two extremes: portraying her as a fear-inducing, cult-leading Black female serial killer, or as an innocent young Black woman caught in circumstances beyond her control.

In more than a decade of researching Clementine Barnabet, I’ve been struck by how print media created overtly sensationalized accounts of the mythology of the axman and, by extension, the axwoman. Whether Barnabet committed the crimes she said she did – or any of the axman murders, for that matter – is irrelevant to the primary motive the media constructed for her fatal violence: religion.

Diverse faith traditions

In Jim Crow Louisiana, various expressions of faith were possible. The state’s history as a French colony – one that also practiced slavery – meant it was home to the largest percentage of Black Catholics in the United States.

At the same time, religions like Voodoo, which originated in West Africa, reached the region on slave ships. Voodoo was not necessarily at odds with Catholicism; enslaved practitioners creatively adapted their ancestral faith to that of their enslavers.

Some displays of faith were not organized religions at all, but folkways. Hoodoo, for example, has West African origins, though it also draws upon European and Native American elements. Hoodoo practitioners – sometimes called doctors – and their clients often practice a religion, yet they also seek comfort in the supernatural possibilities of their craft.

This craft involves the physical manipulation of earthly elements such as graveyard dirt or plants like John the Conqueror root to achieve magical ends, often resulting in conjures – or ritual objects – needed to bring about desired goals. Conjures are believed to help people protect themselves, harm one’s adversaries, alter one’s circumstances, intervene in one’s relationships and more.

In their most powerful form, believers contend that conjures can bring about a person’s death.

For some believers, elements of Catholicism, Voodoo, Protestantism and hoodoo combine into syncretic faith practices. Incorporating multiple systems of beliefs has been an aspect of many Louisianans’ identities for generations. Most of the time, this blending of practices, ideologies and communities is depicted as a quirky – even “backward” – way to make sense of the world.

Yet during the axman’s reign in the early 1900s, a Black woman’s confession to murder was interpreted through the lens of religious deviance rather than diversity.

A timeline of events

When Barnabet confessed in April 1912, it was technically the second time she had done so. The first time was in November 1911, in the aftermath of the Randall family murder. Five members of the Randall family and their overnight guest had been brutally slaughtered in Lafayette, Louisiana, at the end of that month.

According to regional newspapers, Barnabet was in the crowd that had gathered near the Randall family’s home after the murders were discovered. Reportedly, she caught the attention of the local sheriff. Not only did she live near the slain, but, according to a New Orleans daily, the authorities found “her room saturated with blood and covered with human brains.”

Barnabet was given a “third degree” examination – meaning she was tortured – by the New Orleans Police Department, and then supposedly confessed that she had killed the Randalls because, according to a Midwestern newspaper, they “disobeyed the orders of the church.” That church would become a topic of scrutiny and sensationalism by regional lawmen and news outlets alike throughout much of 1912.

At that time, Barnabet is also said to have confessed to killing another family in Lafayette.

Thus, Barnabet had already been in jail for over four months before her springtime confession. Between January and March 1912, four more families had been axed to death between Crowley, Louisiana, and Glidden, Texas. In April, when Barnabet re-confessed, she added two more families to her victim roster.

In aggregate, the four families Barnabet confessed to killing had been slain between November 1909 and November 1911. Four more families had been murdered between her arrest and second confession, meaning she was in jail when they occurred. After her second confession and while she was still in custody, another three families were attacked with an ax, though for the first time, people survived the axman.

This convoluted timeline, in which more than half of the axman murders occurred after Barnabet had been apprehended, presented a challenge for investigators. They generally believed the crimes were related. Yet Barnabet could not have physically carried out the attacks in 1912.

To explain the continuation of the killings despite Barnabet’s incarceration, local lawmen leveraged the young woman’s own statements that had landed her in jail in the first place: that religion compelled her to murder.

It was this November 1911 confession that gave investigators the motive of religious fanaticism to attach to the axman crimes. Then, in January 1912, when the Broussards – another Black family – were murdered with an ax in Lake Charles, Louisiana, the local police found a Bible verse scrawled on their front door. This overtly religious symbol appeared roughly two months after Barnabet’s first confession and seemed to confirm her claims.

By April 1912, the idea of religiously motivated serial murder had been circulating in the rice belt region for months.

Hoodoo, conjures, and sensationalism

Barnabet’s confession was transcribed by R. H. Broussard (no relation to the victims), a newspaper reporter for the “New Orleans Item,” in April 1912.

According to the report, Barnabet claimed that she and four friends purchased conjures from a local hoodoo doctor one evening while socializing. They paid the practitioner for his services. Supposedly, the group then used the charms to move about undetected while committing murder.

In both her November 1911 and April 1912 confessions, Barnabet offered faith-based motives, albeit different ones. In the first case, it was the victims who reportedly erred in their religious duties. In the second, it was Barnabet’s own belief in hoodoo that facilitated such carnage. White media outlets did not interpret either of these statements as evidence of the region’s deep history of diverse faith expressions.

Instead, they labeled Barnabet “a black borgia,” “the directing head of a fanatical cult,” and the “Priestess of [a] Colored Human Sacrifice Cult.”

Moreover, sensationalized news coverage labeled the church Barnabet mentioned as the “Sacrifice Church.” Not surprisingly, the press depicted it as a cult-like organization, portraying Barnabet as either a low-level member or the “high priestess.” Sometimes, news reports also conflated the Sacrifice Church with Voodoo, thereby criminalizing a legitimate West African-derived religion as a cult.

According to unsubstantiated media accounts, the so-called Sacrifice Church promoted human sacrifice to gain immortality. Simultaneously, newspapers treated the conjure Barnabet possessed as proof of her fanaticism, reporting her claim that the only reason she confessed was because she had lost her charm.

Combined, these selective – and sensational – interpretations of Barnabet’s supposed religious beliefs ignored the diverse spiritual practices that enriched life in the rice belt region.

Jim Crow and Black faith

I have yet to find evidence the Sacrifice Church existed. My research suggests the white press conflated the word “sacrifice” with the word “sanctified.” This might have been due, in part, to both sensationalism and ignorance.

Pentecostalism, a branch of evangelical Christianity that emphasizes baptism by the Holy Spirit and direct communication from God, started growing in popularity in the U.S. in the early 1900s. Many Pentecostal denominations call their adherents saints and their churches sanctified. Since sanctified churches were relatively new to Louisiana and some Pentecostal teachings – like speaking in tongues – challenged more mainstream Protestant doctrine, Pentecostalism might have contributed to the media’s reporting.

Although the Sacrifice Church may have simply been a linguistic error in reference to any number of sanctified churches in the rice belt, it is possible that Barnabet did indeed possess a conjure. The hoodoo doctor she accused of selling her and her comrades their charms was arrested and questioned by the Lafayette authorities. The statements he gave to the police aligned with hoodoo practices even as he denied knowing Barnabet or being involved in such folkways.

Given the variety of faith practices in Jim Crow Louisiana, it is possible both that Barnabet believed in her conjure and that sanctified churches were growing in popularity in the region. Whether she ever attended one is hard to know, just as the legitimacy of either confession is difficult to determine.

What is clear is that faith anchored the statements Barnabet made to the authorities. The other anchor, however, was murder. The consequences of how these events aligned reverberate in how Barnabet has been depicted.

Barnabet was front-page news in 1912. People knew her name, even as they debated her guilt. When she was convicted of murder, she was sentenced to life at the Louisiana State Penitentiary. A little over a decade later, she was released and disappeared from public view.

Today, however, no Black female serial killer occupies a similar place in America’s collective memory.

In recent years, there have been calls for a more serious acceptance of Black women’s experiences, knowledge and beliefs within the dominant culture. This shift also invites, I believe, a fresh look at Barnabet’s confessions and the crimes that were attributed to her.

Lauren Nicole Henley, Assistant Professor of Leadership Studies, University of Richmond

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Why ‘The West Wing’ went from a bipartisan hit to a polarized streaming comfort watch

When the early 2000s hit series “The West Wing” returned on Netflix in December 2025, it spurred conversation about how the idealistic political drama would play in Donald Trump’s second term.

The series features a Democratic presidential administration led by President Josiah “Jed” Bartlet, played by Martin Sheen, and his loyal White House staff negotiating political challenges with character, competence and a fair bit of humor.

It continued to spark cultural commentary long after its original run ended in 2005.

In 2016, The Guardian’s Brian Moylan asserted that “The West Wing” was appealing because it portrayed “a world where the political system works. It reminds us of a time, not too long ago, when people in political office took their jobs very seriously and wanted to actually govern this country rather than settle scores and appeal to their respective bases.”

In 2025, Vanity Fair’s Savannah Walsh mused that “The West Wing” might be dismissed by younger audiences as a “form of science fiction” or lauded by the demographic currently watching “Jed Bartlet fancams scored to Taylor Swift’s ‘Father Figure’” on TikTok.

Audiences have been comfort-streaming “The West Wing” since Trump’s first term. Interest in the series spiked after Trump’s election in 2016, and it served as an escape from the contentious 2020 campaign.

When the cast reunited at the 2024 Emmy awards, the Daily Beast’s Catherine L. Hensley remarked that the series’ “sense of optimism about how American government actually functions … rang hollow, almost like watching a show from another planet.”

Nonetheless, Collider’s Rachel LaBonte hailed its Netflix return in late 2025 as a “balm for these confusing times.”

“The West Wing’s” transition from broadcast television behemoth to “bittersweet comfort watch” in today’s streaming era reveals a lot about how much our media and political landscapes have changed in the past 25 years.

As professors of media studies and political communication, we study the fracturing of our media and political environments.

The shifting appeal of “The West Wing” during the past quarter century raises a sobering question: Are political competence and an idealized respect for democratic norms losing popularity in 2026? Or does the new political reality demand engagement with the seamier side of politics?

‘The West Wing’s’ optimistic big tent

“The West Wing” premiered on NBC in the fall of 1999, blending political intrigue with workplace drama in a formula audiences found irresistible. The show surged in viewership in its second and third seasons, as it imagined responses from a Democratic administration to the values and ideology of the newly installed Republican President George W. Bush.

But the series was undergirded by an ethic of political cooperation, reinforcing the idea that, according to Walsh, “we’re all a lot more aligned than we realize.” In 2020, Sheen observed in an interview that writer “Aaron Sorkin never trashed the opposition,” choosing instead to depict “people with differences of opinion trying to serve.”

In 2019, The New York Times observed that “The West Wing” presented “opposition Republicans, for the most part, as equally honorable,” and noted that the show earned fan mail from viewers across the political spectrum.

At its height of popularity, episodes of “The West Wing” garnered 25 million viewers. Such numbers are reserved today only for live, mass culture events like Sunday night football.

Of course, “The West Wing” aired in a radically different television environment from today.

Despite competition from cable, that era’s free, over-the-airwaves broadcasters like NBC accounted for roughly half of all television viewing in the 2001-02 season. Currently, they account for only about 20%.

Gone are the days of television’s ability to create the “big tents” of diverse audiences. Instead, since “The West Wing’s” original airing, television gathers smaller segments of viewers based on political ideology and ultraspecific demographic markers.

Darker, more polarized media environment

The fracturing of the television audience parallels the schisms in America’s political culture, with viewers and voters increasingly sheltering in partisan echo chambers. Taylor Sheridan has replaced Sorkin as this decade’s showrunner, pumping out conservatively aligned hits such as “Yellowstone” and “Landman.”

Liberals, conversely, now see “West Wing” alumni recast in dystopian critiques of contemporary conservatism. Bradley Whitford morphed from President Bartlet’s political strategist to a calculating racist in Jordan Peele’s “Get Out,” and a commander in “The Handmaid’s Tale’s” misogynist army.

Allison Janney, who played “The West Wing’s” earnest and scrupulous press secretary, is now a duplicitous and potentially treasonous U.S. president in “The Diplomat,” whose creator in fact got her start on “The West Wing.”

Even Sheen has been demoted from serving as America’s favorite fictional president to playing J. Edgar Hoover in the film “Judas and the Black Messiah,” whom Sheen described as “a wretched man” and “one of the worst villains imaginable.”

Television as equipment for living

Philosopher Kenneth Burke argued that stories function as “equipment for living.” Novels, films, songs, video games and television series are important because they not only reveal our cultural predilections, they shape them, providing us with strategies for navigating the world around us.

Films and series like “Get Out,” “The Handmaid’s Tale,” “The Diplomat” and “Judas and the Black Messiah” urge audiences to confront the racism and sexism ever-present in media and politics. That includes, as some scholars and viewers have noted, the often casual misogyny and second-string roles for some women and Black men in “The West Wing.”

As U.S. citizens protest authoritarianism in the streets from Portland, Oregon, to Portland, Maine, a comfort binge of a series in which the White House press secretary, as Vanity Fair said, “dorkily performs ‘The Jackal’ and doesn’t dream of restricting West Wing access – even on the administration’s worst press days” is appealing.

But indulging an appetite for what one critic has called “junk-food nostalgia for a time that maybe never even existed” may leave audience members less equipped to build the healthy democracy for which the characters on “The West Wing” always strived. Or it may invigorate them.

Karrin Vasby Anderson, Professor of Communication Studies, Colorado State University and Nick Marx, Professor of Film and Media Studies, Colorado State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Pharaohs in Dixieland: How 19th-century America reimagined Egypt to justify slavery

When Napoleon embarked upon a military expedition into Egypt in 1798, he brought with him a team of scholars, scientists and artists. Together, they produced the monumental “Description de l’Égypte,” a massive, multivolume work about Egyptian geography, history and culture.

At the time, the United States was a young nation with big aspirations, and Americans often viewed their country as an heir to the great civilizations of the past. The tales of ancient Egypt that emerged from Napoleon’s travels became a source of fascination to Americans, though in different ways.

In the slaveholding South, ancient Egypt and its pharaohs became a way to justify slavery. For abolitionists and African Americans, biblical Egypt served as a symbol of bondage and liberation.

As a historian, I study how 19th-century Americans – from Southern intellectuals to Black abolitionists – used ancient Egypt to debate questions of race, civilization and national identity. My research traces how a distorted image of ancient Egypt shaped competing visions of freedom and hierarchy in a deeply divided nation.

Egypt inspires the pro-slavery South

In 1819, when lawyer John Overton, military officer James Winchester and future president Andrew Jackson founded a city in Tennessee along the Mississippi River, they christened it Memphis, after the ancient Egyptian capital.

While promoting the new city, Overton declared of the Mississippi River that ran alongside it: “This noble river may, with propriety, be denominated the American Nile.”

“Who can tell that she may not, in time, rival … her ancient namesake, of Egypt in classic elegance and art?” The Arkansas Banner excitedly reported.

In the region’s fertile soil, Chancellor William Harper, a jurist and pro-slavery theorist from South Carolina, saw the promise of an agricultural empire built on slavery, one “capable of being made a far greater Egypt.”

There was a reason pro-slavery businessmen and thinkers were energized by the prospect of an American Egypt: Many Southern planters imagined themselves as guardians of a hierarchical and aristocratic system, one grounded in landownership, tradition and honor. As Alabama newspaper editor William Falconer put it, he and his fellow white Southerners belonged to a race that “had established law, order and government over the earth.”

To them, Egypt represented the archetype of a great hierarchical civilization. Older than Athens or Rome, Egypt conferred a special legitimacy. And just like the pharaohs, the white elites of the South saw themselves as the stewards of a prosperous society sustained by enslaved labor.

Leading pro-slavery thinkers like Virginia social theorist George Fitzhugh, South Carolina lawyer and U.S. Senator Robert Barnwell Rhett and Georgia lawyer and politician Thomas R.R. Cobb all invoked Egypt as an example to follow.

“These [Egyptian] monuments show negro slaves in Egypt at least 1,600 years before Christ,” Cobb wrote in 1858. “That they were the same happy negroes of this day is proven by their being represented in a dance 1,300 years before Christ.”

A distorted view of history

But their view of history didn’t exactly square with reality. Slavery did exist in ancient Egypt, but most slaves had been originally captured as prisoners of war.

The country never developed a system of slavery comparable to that of Greece or Rome, and servitude was neither race-based nor tied to a plantation economy. The mistaken notion that Egypt’s great monuments were built by slaves largely stems from ancient authors and the biblical account of the Hebrews. Later, popular culture – especially Hollywood epics – would continue to advance this misconception.

Nonetheless, 19th-century Southern intellectuals drew on this imagined Egypt to legitimize slavery as an ancient and divinely sanctioned institution.

Even after the Civil War, which ended in 1865, nostalgia for these myths of ancient Egypt endured. In the 1870s, former Confederate officer Edward Fontaine noted how “Veritable specimens of black, woolyheaded negroes are represented by the old Egyptian artists in chains, as slaves, and even singing and dancing, as we have seen them on Southern plantations in the present century.”

Turning Egypt white

But to claim their place among the world’s great civilizations, Southerners had to reconcile a troubling fact: Egypt was located in Africa, the ancestral land of those enslaved in the U.S.

In response, an intellectual movement called the American School of Ethnology – which promoted the idea that races had separate, unequal origins to justify Black inferiority and slavery – set out to “whiten” Egypt.

In a series of texts and lectures, they portrayed Egypt as a slaveholding civilization dominated by whites. They pointed to Egyptian monuments as proof of the greatness that a slave society could achieve. And they also promoted a scientifically discredited theory called “polygenesis,” which argued that Black people did not descend from the Bible’s Adam, but from some other source.

Richard Colfax, the author of the 1833 pamphlet “Evidence Against the Views of the Abolitionists,” insisted that “the Egyptians were decidedly of the Caucasian variety of men.” Most mummies, he added, “bear not the most distant resemblance to the negro race.”

Physician Samuel George Morton’s “Crania Aegyptiaca,” an 1844 study of Egyptian skulls, reinforced this view. He argued that the skulls mirrored those of Europeans in size and shape. In doing so, noted the Charleston Medical Journal in 1851, Morton established “the Negro his true position as an inferior race.”

Physician Josiah C. Nott, Egyptologist George Gliddon and physician and propagandist John H. Van Evrie formed an effective triumvirate: Through press releases and public lectures featuring the skulls of mummies, they turned Egyptology into a tool of pro-slavery propaganda.

“The Negro question was the one I wished to bring out,” Nott wrote, adding that he “embalmed it in Egyptian ethnography.”

Nott and Gliddon’s 1854 bestseller “Types of Mankind” fused pseudoscience with Egyptology to both “prove” Black inferiority and advance the idea that their beloved African civilization was populated by a white Egyptian elite.

“Negroes were numerous in Egypt,” they write, “but their social position in ancient times was the same that it now is, that of servants and slaves.”

Denouncing America’s pharaohs

This distorted vision of Egypt, however, wasn’t the only one to take hold in the U.S., and abolitionists saw this history through a decidedly different lens.

In the Bible, Egypt occupies a central place, mentioned repeatedly as a land of refuge – notably for Joseph – but also as a nation of idolatry and as the cradle of slavery.

The episode of the Exodus is perhaps the most famous reference. The Hebrews, enslaved under an oppressive pharaoh, are freed by Moses, who leads them to the Promised Land, Canaan. This biblical image of Egypt as a land of bondage deeply shaped 19th-century moral and political debates: For many abolitionists, it represented the ultimate symbol of tyranny and human oppression.

When the Emancipation Proclamation went into effect on Jan. 1, 1863, Black people could be heard singing in front of the White House, “Go down Moses, way down in Egypt Land … Tell Jeff Davis to let my people go.”

Black Americans seized upon this biblical parallel. Confederate President Jefferson Davis was a contemporary pharaoh, with Moses still the prophet of liberation.

African American writers and activists like Phillis Wheatley and Sojourner Truth also invoked Egypt as a tool of emancipation.

“In every human breast, God has implanted a principle, which we call love of freedom,” Wheatley wrote in a 1774 letter. “It is impatient of oppression and pants for deliverance; and by the leave of our modern Egyptians, I will assert that the same principle lives in us.”

Yet the South’s infatuation with Egypt shows how antiquity can always be recast to serve the powerful. And it’s a reminder that the past is far from neutral terrain – that there is rarely, if ever, a ceasefire in wars over history and memory.

This article has been updated to correctly attribute Samuel George Morton as the author of “Crania Aegyptiaca,” not as the author of the Charleston Medical Journal article. Quoted texts from Phillis Wheatley and William Falconer have also been slightly amended for accuracy.

Charles Vanthournout, Ph.D. Student in Ancient History, Université de Lorraine

This article is republished from The Conversation under a Creative Commons license. Read the original article.

‘Which Side Are You On?’ American protest songs have emboldened social movements for generations

The presence of Department of Homeland Security agents in Minnesota compelled many people there to use songs, drawn from secular as well as religious traditions, as a means of protest.

On Jan. 8, 2026, the day after Immigration and Customs Enforcement agent Jonathan Ross killed Minneapolis resident Renée Good on Portland Avenue, an anonymous post appeared on Reddit that featured an uncredited text clearly adapted from the lyrics of a Depression-era protest song from Appalachia, “Which Side Are You On?” The Reddit text criticized the recent federal presence in Minnesota and implored Minnesotans to take a stand.

In our town of Minneapolis,
There’s no neutrals here at home.
You’re either marching in the streets
or you kill for Kristi Noem
Which side are you on,
Oh which side are you on?
Which side are you on,
Oh which side are you on?
ICE is a bunch of killers
who hide behind a mask.
How do they get away with this?
That’s what you have to ask.
Which side are you on …

For centuries, songs have served as vehicles for expressing community responses to sociopolitical crises, whether government repression or corporate exploitation. “Which Side Are You On?” resonated with Minnesotans, in part because it has been recorded by numerous artists over the decades.

The song dates back to another societal struggle that occurred in another part of the United States during another crisis moment in American history. “Which Side Are You On?” has consoled and empowered countless people for generations during struggles in red as well as blue states. It has also inspired people to write new protest songs in the face of new crises.

Birth of a protest anthem

“Which Side Are You On?” was composed in 1931 as one woman’s spontaneous response to a coal company’s effort to prevent miners in Harlan County, Kentucky, from joining the United Mine Workers of America. Those miners hoped the labor union would improve their working conditions and overturn imposed reductions to their wages.

In support of the coal company, sheriff J. H. Blair and armed deputies broke into the house of union organizer Sam Reece to apprehend him and locate evidence of union activity. Reece was in hiding elsewhere, but his wife, Florence, and their children were present. After ransacking the house, the sheriff and deputies left.

Florence tore a page out of a calendar and jotted down lyrics for an impromptu song, which she recalled setting to the melody of a Baptist hymn “I’m gonna land on the shore.” Others have observed that the melody in Florence’s song was similar to that of the traditional British ballad “Jack Monroe,” which features the haunting refrain “Lay the Lily Low.”

Woody Guthrie, one of America’s most celebrated folk singers of the 20th century, sang many protest songs. Al Aumuller, via the Library of Congress


“Which Side Are You On?” channeled Florence’s reaction to that traumatic experience. Throughout the 1930s, she and others sang the song during labor strikes in the Appalachian coalfields, and the lyrics were included in union songbooks. Then, in 1941, the Almanac Singers, a folk supergroup featuring Woody Guthrie and Pete Seeger, recorded the song, and it reached many people beyond Appalachia.

Since then, a range of musicians – including Charlie Byrd; Peter, Paul and Mary; the Dropkick Murphys; Natalie Merchant; Ani DiFranco; and the Kronos Quartet – performed “Which Side Are You On?” in concert settings and for recordings. A solo live performance with a concert audience joining the chorus was a focal point of Seeger’s “Greatest Hits” album in 1967.

The Academy Award-winning documentary film “Harlan County U.S.A.” (1976) included a clip of Florence Reece singing her song during a 1973 strike. “Which Side Are You On?” was translated into other languages – a testament to its universal theme of solidarity among people confronting authoritarian power.

Florence Reece sings ‘Which side are you on?’ four decades after she wrote the song.


Protest songs of the modern era

While the American protest song tradition can be traced back to the origins of the nation, “Which Side Are You On?” served as a prototype for the modern-era protest song because of its lyrical directness. Many memorable, risk-taking protest songs were composed in the wake of, and in the spirit of, “Which Side Are You On?”

Noteworthy are numerous protest classics in the folk vein, epitomized by a sizable part of Guthrie’s repertoire, by early Bob Dylan songs like “Masters of War” (1963), “The Times They Are a-Changin’” (1964) and “Only A Pawn in Their Game” (1964), and by Phil Ochs’ mid-1960s songs of political critique, such as “Here’s to the State of Mississippi” (1965).

But protest songs have hailed from all music genres. Rock and rhythm and blues, for instance, have spawned many iconic recordings of protest music: Sam Cooke’s “A Change Is Gonna Come” (1964), Buffalo Springfield’s “For What It’s Worth” (1966), Creedence Clearwater Revival’s “Fortunate Son” (1969), Edwin Starr’s “War” (1970) and Crosby, Stills, Nash and Young’s “Ohio” (1970) among many others.

Blues, country, reggae and hip-hop have spawned broadly inspirational protest songs, and jazz too has yielded classic protest recordings, such as Abel Meeropol’s “Strange Fruit” (1939), popularized by Billie Holiday, and Gil Scott-Heron’s 1971 recording of the jazz-poem “The Revolution Will Not Be Televised.”

Indeed, there are so many enduring contributions to the American protest song canon that a list like Rolling Stone’s recent “100 Best Protest Songs of All Time” is only the tip of the iceberg. Regardless of the genre, effective protest songs retain their power to move and motivate people today despite having been composed in response to past situations or circumstances. And protest songs from the past are often adapted to help people more effectively respond to the crisis of the moment.

Songs for this moment

“Which Side Are You On?” was sung – and its theme invoked – in Minnesota throughout January 2026. On Jan. 24, shortly after Border Patrol agents killed Alex Pretti on Nicollet Avenue, Minneapolis Mayor Jacob Frey referred to the song’s title during a public address to his constituents: “Stand up for America. Recognize that your children will ask you what side you were on.” That same day, the grassroots organization 50501: Minnesota posted online an appeal to those in power: “[E]very politician and person in uniform must ask themselves one question – which side are you on?”

The next day, Minnesota Gov. Tim Walz acknowledged divisions in the U.S. during a televised briefing, urging citizens in his state and across the nation to consider the choice before them: “I’ve got a question for all of you. What side do you want to be on?”

People protesting ICE and Customs and Border Protection actions in Minnesota and elsewhere have been singing “Which Side Are You On?” and other well-known protest songs, but musicians have also been writing new protest songs about the crisis. On Jan. 8, the Dropkick Murphys posted on social media a clip of “Citizen I.C.E.,” a revamped version of the group’s 2005 song “Citizen C.I.A.,” augmented by video of the Jan. 7 fatal shooting of Renée Good. On Jan. 27, British musician Billy Bragg released “City of Heroes,” which he composed in tribute to the Minneapolis protesters.

Following suit was Bruce Springsteen, a longtime champion of the protest song legacy. On Jan. 28, Springsteen released online his newly composed and recorded “Streets of Minneapolis.” Millions of people around the world heard the song and saw its accompanying video.

On Jan. 30, Springsteen made a surprise appearance at the Minneapolis club First Avenue, performing his new song at the “Defend Minnesota” benefit concert, organized by musician Tom Morello to raise funds for the families of Good and Pretti.

Bruce Springsteen’s ‘Streets of Minneapolis’ rages against the killings of Renée Good and Alex Pretti.


Making a difference

On the day Pretti was shot dead, hundreds of Minneapolis protesters attended a special service at Minneapolis’ Hennepin Avenue United Methodist Church. Pastor Elizabeth MacAuley, in a televised interview with CNN’s Anderson Cooper, reflected on the role of song in helping people cope: “It’s been a time when it is pretty tempting to feel so disempowered. … [T]he singing resistance movement … brought out the hope and the grief and the rage and the beauty.”

Cooper asked: “Do you think song makes a difference?” MacAuley replied: “I know song makes a difference.”

Ted Olson, Professor of Appalachian Studies and Bluegrass, Old-Time and Roots Music Studies, East Tennessee State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

This right-wing social network isn't just biased – it's actively radicalizing users

A new study published today in Nature has found that X’s algorithm – the hidden system or “recipe” that governs which posts appear in your feed and in which order – shifts users’ political opinions in a more conservative direction.

Led by Germain Gauthier from Bocconi University in Italy, it is a rare, real-world randomised experimental study on a major social media platform. And it builds on a growing body of research that shows how these platforms can shape people’s political attitudes.

Two different algorithms

The researchers randomly assigned 4,965 active US-based X users to one of two groups.

The first group used X’s default “For You” feed. This features an algorithm that selects and ranks posts it thinks users will be more likely to engage with, including posts from accounts that they don’t necessarily follow.

The second group used a chronological feed. This only shows posts from accounts users follow, displayed in the order they were posted. The experiment ran for seven weeks during 2023.

Users who switched from the chronological feed to the “For You” feed were 4.7 percentage points more likely to prioritise policy issues favoured by US Republicans (for example, crime, inflation and immigration). They were also more likely to view the criminal investigation into US President Donald Trump as unacceptable.

They also shifted in a more pro-Russia direction on the war in Ukraine. For example, these users became 7.4 percentage points less likely to view Ukrainian President Volodymyr Zelenskyy positively, and scored slightly higher on a pro-Russian attitude index overall.

The researchers also examined how the algorithm produced these effects.

They found evidence that the algorithm increased the share of right-leaning content by 2.9 percentage points overall (and 2.5 points among political posts), compared with the chronological feed.

It also significantly reduced the share of posts from traditional news organisations’ accounts, while boosting posts from political activists.

One of the most concerning findings of the study is the longer-term effects of X’s algorithmic feed. The study showed the algorithm nudged users towards following more right-leaning accounts, and that the new following patterns endured even after switching back to the chronological feed.

In other words, turning the algorithm off didn’t simply “reset” what people see. It had a longer-lasting impact beyond its day-to-day effects.

One piece of a much bigger picture

This new study supports findings of similar studies.

For example, a study in 2022, before Elon Musk had bought Twitter and rebranded it as X, found the platform’s algorithmic systems amplified content from the mainstream political right more than the left in six of the seven countries studied.

An experimental study from 2025 re-ranked X feeds to reduce exposure to content expressing antidemocratic attitudes and partisan animosity. The researchers found this shifted participants’ feelings towards their political opponents by more than two points on a 0–100 “feeling thermometer”. This is a shift the authors argued would normally have taken about three years to occur organically in the general population.

My own research offers another piece of evidence to this picture of algorithmic bias on X. Along with my colleague Mark Andrejevic, I analysed engagement data (such as likes and reposts) from prominent political accounts during the final stages of the 2024 US election.

We found a sudden and unusual spike in engagement with Musk’s account after his endorsement of Trump on July 13 – the day of the assassination attempt on Trump. Views on Musk’s posts surged by 138%, retweets by 238%, and likes by 186%. This far outstripped increases on other accounts.

After July 13, right-leaning accounts on X gained significantly greater visibility than progressive ones. The “playing field” for attention and engagement on the platform was tilted thereafter towards right-leaning accounts – a trend that continued for the remainder of the time period we analysed in that study.

Not a niche product

This matters because we are not talking about a niche product.

X has more than 400 million users globally. It has become embedded as infrastructure – a key source of political and social communication. And once technical systems become infrastructure, they can become invisible – like background objects that we barely think about, but which shape society at its foundations and can be exploited under our noses.

Think of the overpass bridges Robert Moses designed in New York in the 1930s. These seemed like inert objects. But they were designed to be very low, to exclude people of colour from taking buses to recreation areas in Long Island.

Similar to this, the design and governance of social media platforms also has real consequences.

The point is that X’s algorithms are not neutral tools. They are an editorial force, shaping what people know, whom they pay attention to, who the outgroup is and what “we” should do about or to them – and, as this new study shows, what people come to believe.

The age of taking platform companies at their word about the design and effects of their own algorithms must come to an end. Governments around the world – including in Australia where the eSafety Commissioner has powers to drive “algorithmic transparency and accountability” and require that platforms report on how their algorithms contribute to or reduce harms – need to mandate genuine transparency over how these systems work.

When infrastructure becomes harmful or unsafe, nobody bats an eye when governments step in to protect us. The same needs to happen urgently for social media infrastructures.

Timothy Graham, Associate Professor in Digital Media, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Astrologers think Donald Trump's destiny is tied to the eclipse

The Moon crossed the Sun’s path on February 17, causing what is known as an annular solar eclipse. The Sun was not covered completely, but the Moon blocked enough of its light to leave a fiery ring. Unless you’re deep in the southern hemisphere, you won’t have noticed.

However, astrologically speaking, eclipses have effects regardless of who is watching. In astrology, an ancient tradition that lacks scientific grounding, eclipses are regarded as being powerful and politically significant celestial events. They are traditionally associated with the destiny of rulers – and some astrologers think Donald Trump is no exception.

Astrologers interpret the meaning of eclipses through horoscopes, celestial maps that locate the Sun, Moon and planets within the 12 signs of the Zodiac that encircle our solar system. During the eclipse, the Sun and Moon were at the edges of the sign Aquarius, a position astrologers associate with endings and shakeups.

This, alongside various other factors including Trump being born during a lunar eclipse in 1946, has led some astrologers to suggest that the eclipse could mark the start of a severe crisis for the US president – even his death.

Predictions like this come around fairly often, and Trump has outlasted many of them before. But these extreme forecasts follow a very old script. For thousands of years, eclipses have been treated as political events, read as omens about kingdoms and their rulers.

Bad omens

Eclipses have been connected with the fate of rulers since at least ancient Mesopotamia, around 4,000 years ago. Keen observers there, in what is now modern-day Iraq, kept lists of phenomena they believed were linked to specific outcomes.

“If a lizard gives birth in the walkway of a house, the household will fall” and “if a white partridge is seen in the city, commercial activity will diminish” are two examples. But one omen has long outlived the others: “if there is an eclipse, the king will die”.

With such high stakes, ancient astronomers invested in systematic observation, record-keeping and calculation to predict eclipses with ever-greater accuracy. This enabled the so-called “substitute king” ritual, where royals tried to avoid their fate by temporarily making someone else king until an eclipse passed.

The link between eclipses and the death of kings spread widely in the ancient world. Egyptian papyri show evidence of this belief, and Greek and Roman history is full of stories connecting eclipses with prominent deaths.

Roman historian Cassius Dio recorded a solar eclipse around the death of the first Roman emperor, Augustus, in AD 14, during which “most of the sky seemed to be on fire”. In the gospels of Matthew, Mark and Luke, the death of Jesus is also marked by a darkened Sun.

In the medieval period, when Arabic chroniclers recorded eclipses, they usually noted concurrent deaths of rulers. And in Europe, a solar eclipse in 1133 was so closely associated with the 1135 death of King Henry I of England that it became known as “King Henry’s Eclipse”.

Premodern rulers often hired astrologers to interpret their birth charts – the horoscope cast for the moment they were born. Ideally, the astrologer would pick out an aspect of the chart they could say justified the ruler’s leadership and foretold a long and prosperous reign. This was useful astrological propaganda.

But rulers were less happy when astrologers did this without authorisation – especially if they forecast illness or death. Astrologers were expelled from ancient Rome on numerous occasions for doing just that.

In his book, Lives of the Caesars, Roman historian Suetonius recounted the fate of an astrologer called Ascletarion (or Ascletario). Ascletarion’s predictions of the Emperor Domitian’s imminent downfall in the first century AD prompted the angry emperor to order his execution.

More than 1,400 years later, an astrologer in Oxford was executed for predicting the death of the reigning English monarch, Edward IV. And in 1581, Queen Elizabeth I of England made it a felony to use horoscopes to predict her death or her successor.

Similarly in France, royal pronouncements in 1560, 1579 and 1628 prohibited astrological predictions about princes, states and public affairs. Around the same time, astrologers in Italy got into serious trouble for predicting the deaths of popes.

This was not just a matter of anxiety on the part of rulers. It was also a question of maintaining public order and political stability. State powers were concerned with the ability of astrological predictions to cause general chaos and even prompt protests and rebellions.

They were right to worry. In a time when astrology was taken very seriously, predictions could cause collective panic. During the so-called wars of the three kingdoms, a series of conflicts fought between 1639 and 1653 in England, Scotland and Ireland, astrologers’ radical political predictions about the fate of the English monarchy fed revolutionary sentiment.

One of these astrologers, Nicholas Culpeper, published predictions of the downfall of all European monarchies on the basis of a solar eclipse in 1652.

Astrology left the world of universities and political courts in the 17th century, but astrologers did not stop making political predictions. In 1790s London, an astrologer called William Gilbert predicted the death of King Gustav III of Sweden. His prophecy was fulfilled a few months later.

And after his attempted assassination in 1981, the then-US president, Ronald Reagan, asked astrologer Joan Quigley whether she could have predicted it. She said yes. Quigley worked for the Reagans for many years, and claimed that she provided advice not just on personal affairs but also on matters of state, including the best timing to make political announcements.

Although astrology is no longer counted as a science, it remains a player in contemporary politics. Whether or not eclipse predictions come to pass is almost beside the point. Historically, what made eclipses politically dangerous was the speculation often attached to them.

Michelle Pfeffer, Research Fellow in Early Modern History, University of Oxford

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How Jesse Jackson embodied Southern politics − and changed American elections

Editor's Note: Rev. Jesse Jackson, the legendary civil rights activist and two-time presidential candidate who fundamentally reshaped American politics and inspired generations of African Americans to seek elected office, has died. He was 83.

Jackson's passing marks the end of an era in American political and social history. From his emergence as a leader in the civil rights movement in the early 1960s to his groundbreaking presidential campaigns in 1984 and 1988, Jackson's life was defined by an unwavering commitment to social and economic justice.

The article that follows, originally published by The Conversation last year, examines how Jackson's Southern identity shaped his life's work and his enduring influence on American politics. It is reprinted here as a tribute to his legacy.

Holding hands with other prominent Black leaders, the Rev. Jesse Jackson crossed the Edmund Pettus bridge in Selma, Alabama, on March 9, 2025, to commemorate the 60th anniversary of “Bloody Sunday.” Like several survivors of that violent day in 1965, when police brutally attacked civil rights protesters, Jackson crossed the bridge in a wheelchair.

Jesse Louis Jackson was born Oct. 8, 1941, in Greenville, South Carolina, a town firmly entrenched in the racially segregated Deep South. This time and place aren’t footnotes to Jackson’s life, but rather key facts that shaped his civil rights activism and historic runs for the U.S. presidency.

Growing up in the segregated South shaped Jackson’s attitudes, opinions and outlook in ways that remain apparent today. While he lived in Chicago for most of his adult life, he remained a Southerner. And other Southerners viewed him as such.

Jackson biographer David Masciotra said the South gave Jackson “a sense of the oppression and the persecution that he wanted to fight.”

As scholars of Southern politics, we see Jackson’s Southern identity as essential to understanding his life. Southerners often identify with the region, even after leaving the geographic South. As sociologist John Shelton Reed once wrote, Southernness has more to do with attitude than latitude.

A segregated childhood

In the South Carolina of Jackson’s youth, water fountains, bathrooms, swimming pools and lunch counters were all segregated. While white people his age attended Greenville High School, Jackson attended the all-Black Sterling High School, where he was a star quarterback and class president.

His experience of segregation shaped how Jackson viewed his life.

“I keep thinking about the odds,” Jackson told his biographer and fellow South Carolinian Marshall Frady in 1988, marveling at the “responsibility I have now against what I was expected then to be doing at this stage of life.”

“Even mean ole segregation couldn’t break in on me and steal my soul,” he later told Frady.

If Jackson had been white, a star student like him might have enrolled at Clemson University or the University of South Carolina. Or he might have said yes when he was offered a contract to play professional baseball.

Instead, Jackson rejected the contract because the pay would have been roughly one-sixth of a white player’s, and went North, to the University of Illinois.

He did not find a more welcoming atmosphere in Champaign, Illinois. According to biographer Barbara Reynolds, the segregation that he thought he had left behind “cropped up in Illinois to convince him that was not the place to be.”

In the fall of 1960, Jackson transferred to North Carolina Agricultural and Technical State University, a historically Black college in Greensboro, North Carolina, to complete his sociology degree.

His return to the South marked Jackson’s emergence as a leader in the growing Civil Rights Movement.

Greensboro was a center of this struggle, with large, regular demonstrations, often led by local students of color. Six months prior to his arrival in Greensboro, four Black students from North Carolina A&T refused to leave the whites-only Woolworth lunch counter, launching a sit-in movement that soon drew national attention.

Jackson himself led protests to integrate Greensboro businesses. After one pivotal student march on City Hall, he was arrested and charged with inciting a riot. In jail, Jackson wrote a “Letter From a Greensboro Jail,” a rhetorical tip of the hat to Martin Luther King Jr.’s “Letter from a Birmingham Jail.”

A move north

Jackson’s second move north, in 1964, stuck.

Like so many other Black Southerners who participated in what later became known as the “second great migration,” Jackson went to Chicago. He attended Chicago Theological Seminary, inspired not by a deep love of scripture but by what Jackson perceived as the church’s ability to do good on this earth.

As North Carolina A&T’s president, Dr. Sam Proctor, advised Jackson, “You don’t have to enter the ministry because you want to save people from a burning hell. It may be because you want to see his kingdom come on earth as it is in heaven.”

Jackson thought his time in Chicago “would be quiet and peaceful and I could reflect.”

It was anything but. Following the path of King and other religiously inspired civil rights activists, Jackson continued his civil rights organizing, leading Operation Breadbasket, an initiative of King’s to boycott businesses that did not employ Black workers.

Presidential aspirations

Over the next few years, Jackson took on ever more high-profile organizing, patterned after the life and work of King – another Southerner. As the former King aide Bernard Lafayette once said, “I mean, he cloned himself out of Martin Luther King.”

In 1984, Jackson turned to politics. He became the second African American to run for the nation’s highest office, following in the footsteps of Shirley Chisholm and her 1972 candidacy.

Announcing his bid, Jackson pledged to “help restore a moral tone, a redemptive spirit, and a sensitivity to the poor and dispossessed of this nation.”

But the campaign always represented more than a policy platform. Jackson wanted to mobilize more Americans to vote and to run for office, especially the “voiceless and the downtrodden.”

Jackson finished third in the 1984 Democratic primary but with a remarkably strong showing, taking 18% of all primary votes. He performed especially well south of the Mason-Dixon Line, winning both Louisiana and the District of Columbia. He also performed well in the Mississippi and South Carolina Democratic caucuses.

This surprising success inspired Jackson to run for president again. In 1988, he did even better, winning nearly 7 million votes and 11 contests, and sweeping the South during the primary season.

He won the South Carolina caucuses and the Super Tuesday states of Alabama, Georgia, Louisiana, Mississippi and Virginia. In his second run, Jackson more than doubled his share of the white vote, from 5% in 1984 to 12% in 1988.

Jackson finished second in the Democratic primary to Massachusetts Gov. Michael Dukakis, who would go on to lose the 1988 presidential election to George H.W. Bush. But Jackson’s strong results solidified his position as a major figure in American politics and a power broker in the Democratic Party.

A towering figure in American politics

Jesse Jackson’s two presidential runs fundamentally altered the U.S. political landscape.

Beyond being the first Black candidate to win a state primary contest, Jackson also helped end the primary system by which the winner of a state would receive all the state’s delegates. Jackson claimed the system hurt Black and minority candidates and advocated to implement reforms that had been first recommended following the 1968 Democratic primary.

Back then, the party had pushed for a system in which delegates could be allocated based on the proportion of the vote won by each candidate, but it wasn’t adopted in every state.

Starting in 1992, following Jackson’s intervention, candidates receiving at least 15% of the vote officially received a proportion of the delegates. These reforms opened up the possibility that a minority candidate could secure the Democratic nomination through a more proportional allocation of delegates.
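The 15% threshold and proportional allocation described above can be sketched in a few lines. This is a simplified illustration with invented vote shares and delegate counts, using a largest-remainder rounding rule as one plausible way to resolve fractional delegates; actual party rules allocate delegates separately by district and statewide.

```python
# Illustrative sketch (hypothetical numbers): proportional delegate
# allocation with the Democratic Party's 15% qualifying threshold.

def allocate_delegates(vote_shares, total_delegates, threshold=0.15):
    """Split delegates proportionally among candidates at or above the threshold."""
    qualified = {c: s for c, s in vote_shares.items() if s >= threshold}
    qualified_total = sum(qualified.values())
    # Largest-remainder method: floor each quota, then hand out leftovers
    # to the candidates with the largest fractional remainders.
    quotas = {c: total_delegates * s / qualified_total for c, s in qualified.items()}
    seats = {c: int(q) for c, q in quotas.items()}
    leftover = total_delegates - sum(seats.values())
    for c in sorted(quotas, key=lambda c: quotas[c] - seats[c], reverse=True)[:leftover]:
        seats[c] += 1
    return seats

# A 40%/35%/14%/11% field: the 14% and 11% candidates fall below the
# threshold and win nothing; the other two split the delegates proportionally.
print(allocate_delegates({"A": 0.40, "B": 0.35, "C": 0.14, "D": 0.11}, 20))
```

Under winner-take-all rules, candidate A would have swept all 20 delegates; under the post-1992 rules sketched here, B’s 35% still earns a substantial share.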

Jackson’s background also reinforced the importance of the Black church in Black political mobilization.

Perhaps most importantly, Jackson expanded the size and diversity of the electorate and inspired a generation of African Americans to seek office.

“It is because people like Jesse ran that I have this opportunity to run for president today,” said Barack Obama in 2007.

The long Southern strategy

Jackson’s political rise coincided with and likely encouraged the exodus of racially conservative white voters out of the Democratic Party.

The Republican Party’s Long Southern Strategy – an opportunistic plan to cultivate Southern white voters by capitalizing on “white racial angst” and conservative social values – had been underway before Jackson’s presidential bids. But his focus on social and economic justice undoubtedly helped drive conservative Southern whites to the GOP.

Today, some political thinkers question whether a distinct “Southern politics” continues to exist.

The life and career of Jesse Jackson reflect that place still matters – even for people who have left the region for colder pastures.

Gibbs Knotts, Professor of Political Science, Coastal Carolina University and Christopher A. Cooper, Professor of Political Science & Public Affairs, Western Carolina University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Multiple warning signs suggest trouble ahead for Trump — and the GOP

On Feb. 7, 2026, Chasity Verret Martinez won a special election to fill a vacant seat in the Louisiana House. That’s an outcome that might not mean very much to people outside of the state or even outside her Baton Rouge-area district.

But Martinez is a Democrat who took 62% of the vote in a district that had given Donald Trump a 13-percentage-point victory in the 2024 presidential race. And her win came a week after Democrats seized a Texas Senate district that had supported Trump even more strongly – a result that immediately triggered concern in Republican circles.

Because fewer people turn out for special elections, they’re considered an early predictor of partisan enthusiasm heading into regularly scheduled elections. And with the 2026 midterm elections less than nine months away, analysts are already scrambling for indications of the likely outcome.

As a political scientist who studies congressional elections, I’m interested in the question of whether special elections can really tell us which way the political winds are currently blowing.

Democrats, of course, are hoping for a “blue wave” like they rode in 2018, when they picked up 40 House seats and won a majority in that chamber, while Republicans want to hang on to the very slim margins they have in both the House and Senate.

In the 2026 election cycle, as in previous ones, prognosticators and political professionals are looking to the outcomes of these intermittent races at various levels of government as a gauge of how voters are feeling about the two parties. And the results from the first 15 months of the second Trump administration appear to spell very bad news for the Republicans.

Setting a baseline

Since Election Day 2024, 88 special elections featuring candidates from both major parties have taken place for institutions including state legislatures and the U.S. House.

When analyzing the results of these races, it’s important to have figures to compare them to. After all, a Democrat just barely squeaking by in a state legislative race may not look very impressive on its face – but if that race took place in the rural heart of a red state, it could raise hackles among Republicans.

Most political analysts agree that the best available comparison point for special elections is the results of the most recent presidential election in that same district. There are a few reasons for this.

First, the nationalization of party politics means there are few members of Congress representing states or districts that voted for the other party for president. So the best comparison is to the only truly national election in the U.S.

Second, using presidential results creates the same baseline for all races. By comparing special election results to the prior election environment, all the special election results get compared to the same standard.

Finally, and perhaps most importantly, recent midterm elections have typically served as a referendum on the party in power, particularly the president. In trying to measure how voters are reacting to Trump’s second term, it makes sense to measure their behavior against the last time Trump was on the ballot.

Are special elections predictive?

With this baseline in mind, it’s easy to compare the results of special elections in particular districts to the results of the last presidential election in that same district.

In the 2022 cycle, for example, Democrats running in special elections underperformed President Joe Biden’s 2020 results in their districts by about 4 percentage points on average, which translated into a 3-percentage-point loss nationwide in U.S. House races in the November 2022 midterms and the loss of their majority in the chamber.

Conversely, in 2018 – like this year, a midterm following a Trump election – Democrats bested Republicans by 8 percentage points in November, after overperforming Hillary Clinton’s 2016 margins in special elections throughout the previous two years by 9 percentage points on average.

The 2024 cycle is a clear exception to this pattern of regular elections closely following special election results: Prior to the presidential election, Democrats outperformed in special elections by an average of 4 percentage points but ended up losing nationally by 3 percentage points in November.

Like special elections, midterm contests tend to turn out fewer but more engaged voters than presidential years. Therefore, it may be that special elections are more predictive of midterm results than of presidential outcomes. At any rate, if previous midterm outcomes are any guide, the numbers being posted by Democrats in special elections so far in the 2026 cycle are impossible to ignore.

On average, they’re running ahead of Harris’s 2024 margins by a whopping 13 percentage points. That’s better than they did in 2018, when they ultimately picked up 40 seats in the House and seven governorships across the country.

What’s different about specials?

Democrats, however, may not want to pop the champagne corks just yet. Many roadblocks remain in their quest to take back control of Congress. For one thing, the U.S. Senate map remains a difficult one for Democrats. Even if they end up creating a 2018-like election environment with an unpopular president, many Senate contests are taking place in solidly red states.

It’s also always worth bearing in mind that there’s no telling how the events of the next nine months might reshape public opinion.

And special elections, while useful metrics, are far from perfect barometers of public opinion. They take place at different times, and could be just as reflective of hyperlocal factors, such as flawed candidates, as they are of nationalized partisan conditions.

Special elections tend to have far lower turnout than regular midterm or presidential contests. It’s also difficult to tell whether overperformance is due to highly motivated partisans or persuasion of independents and voters from the other party.

Using all the tools available

Still, special elections do have key advantages over traditional polling. Although polls do their best to approximate voters’ political attitudes, elections reveal these attitudes through voters’ actual, observed behavior – exactly the type of behavior that analysts are trying to predict in November.

Generally, this is preferable to asking a hypothetical in opinion polls, which are getting more difficult than ever to do well.

In the end, special elections are just one piece of the prediction puzzle. But the other puzzle pieces are also spelling out potential bad news for the GOP.

The generic ballot, a standard polling question that asks voters’ intent to vote for one party or the other in November without naming specific candidates, has the GOP about 6 percentage points behind the Democrats. Trump’s approval rating, meanwhile, continues to hover below 40%.

There’s no telling for sure whether these indicators will turn out to be truly predictive until November. But all of them should be sounding alarm bells for Republicans.The Conversation

Charlie Hunt, Associate Professor of Political Science, Boise State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How a Trump campaign contractor learned how to read your mind

The dealings that have been revealed between Cambridge Analytica and Facebook have all the trappings of a Hollywood thriller: a Bond villain-style CEO, a reclusive billionaire, a naïve and conflicted whistle-blower, a hipster data scientist turned politico, an academic with seemingly questionable ethics, and of course a triumphant president and his influential family.

Much of the discussion has been on how Cambridge Analytica was able to obtain data on more than 50m Facebook users – and how it allegedly failed to delete this data when told to do so. But there is also the matter of what Cambridge Analytica actually did with the data. In fact, the data-crunching company’s approach represents a step change in how analytics can today be used as a tool to generate insights – and to exert influence.

For example, pollsters have long used segmentation to target particular groups of voters, such as through categorising audiences by gender, age, income, education and family size. Segments can also be created around political affiliation or purchase preferences. The data analytics machine that presidential candidate Hillary Clinton used in her 2016 campaign – named Ada after the 19th-century mathematician and early computing pioneer – used state-of-the-art segmentation techniques to target groups of eligible voters in the same way that Barack Obama had done four years previously.

Cambridge Analytica was contracted to the Trump campaign and provided an entirely new weapon for the election machine. While it also used demographic segments to identify groups of voters, as Clinton’s campaign had, Cambridge Analytica also segmented using psychographics. As definitions of class, education, employment, age and so on, demographics are informational. Psychographics are behavioural – a means to segment by personality.

This makes a lot of sense. It’s obvious that two people with the same demographic profile (for example, white, middle-aged, employed, married men) can have markedly different personalities and opinions. We also know that adapting a message to a person’s personality – whether they are open, introverted, argumentative, and so on – goes a long way to help getting that message across.

Understanding people better

There have traditionally been two routes to ascertaining someone’s personality. You can either get to know them really well – usually over an extended time. Or you can get them to take a personality test and ask them to share it with you. Neither of these methods is realistically open to pollsters. Cambridge Analytica found a third way with the assistance of University of Cambridge academic Aleksandr Kogan.

Kogan sold Cambridge Analytica access to 270,000 personality tests completed by Facebook users through an online app he had created for research purposes. Providing the data to Cambridge Analytica was, it seems, against Facebook’s internal code of conduct, but only now in March 2018 has Kogan been banned by Facebook from the platform. Kogan’s data also came with a bonus: he had reportedly collected Facebook data from the test-takers’ friends – and, at an average of 200 friends per person, that added up to some 50m people.

While not all of these people had provided personality test responses, it is possible to reverse-engineer a personality profile from Facebook activity. Decades of psychological research have formed around the lexical hypothesis: the idea that personality traits can be inferred by studying a subject’s use of language. Facebook patented a process to do just this in 2012, as part of its commercial aims to provide more targeted advertising, by mapping the contents of posts and likes against the “Big Five” model of psychological traits, sometimes known as OCEAN (openness, conscientiousness, extroversion, agreeableness, neuroticism). Whether you choose to like pictures of sunsets, puppies or people apparently says a lot about your personality: a 2015 study by other academics from the Cambridge psychology lab found that a model predicting personality traits from Facebook data needed just 300 likes to profile someone as accurately as their spouse could.
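As a rough illustration of how likes can be mapped onto Big Five scores, here is a toy scoring sketch. The actual research fit statistical models to millions of real like/questionnaire pairs; the page categories and trait weights below are entirely invented, chosen only to show the shape of the approach.

```python
# Toy illustration only: the published studies trained regression models on
# real like/personality-test pairs. The categories and weights here are
# invented placeholders, not actual findings.

# Hypothetical weights mapping a liked page's category to Big Five traits.
TRAIT_WEIGHTS = {
    "philosophy_page": {"openness": 0.8, "extroversion": -0.2},
    "party_photos":    {"extroversion": 0.9, "conscientiousness": -0.1},
    "planner_app":     {"conscientiousness": 0.7},
}

def score_profile(likes):
    """Average the trait weights of everything a user has liked."""
    traits = {"openness": 0.0, "conscientiousness": 0.0,
              "extroversion": 0.0, "agreeableness": 0.0, "neuroticism": 0.0}
    for like in likes:
        for trait, weight in TRAIT_WEIGHTS.get(like, {}).items():
            traits[trait] += weight
    n = max(len(likes), 1)
    return {trait: total / n for trait, total in traits.items()}

profile = score_profile(["philosophy_page", "planner_app"])
```

With 300 likes instead of two, and weights learned from data rather than invented, an aggregate like this becomes a surprisingly stable personality estimate – which is the finding the 2015 study reported.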

Kogan developed his own model along the same lines and cut a deal with Cambridge Analytica. Armed with this bounty – and combined with additional data gleaned from elsewhere – Cambridge Analytica built personality profiles for more than 100m registered US voters. It’s claimed the company then used these profiles for targeted advertising.

Imagine for example that you could identify a segment of voters that is high in conscientiousness and neuroticism, and another segment that is high in extroversion but low in openness. Clearly, people in each segment would respond differently to the same political ad. But on Facebook they do not need to see the same ad at all – each will see an individually tailored ad designed to elicit the desired response, whether that is voting for a candidate, not voting for a candidate, or donating funds.

Cambridge Analytica worked hard to develop dozens of ad variations on different political themes such as immigration, the economy and gun rights, all tailored to different personality profiles. There is no evidence at all that Clinton’s election machine had the same ability.

Behavioural analytics and psychographic profiling are here to stay, no matter what becomes of Cambridge Analytica – which has robustly criticised what it calls “false allegations in the media”. In a way it industrialises what good salespeople have always done, by adjusting their message and delivery to the personality of their customers. This approach to electioneering – and indeed to marketing – will be Cambridge Analytica’s ultimate legacy.

Updated: This piece was amended on 13 Feb 2026 to make clear that while Michal Kosinski and David Stillwell’s research had demonstrated the effectiveness of using Facebook data to generate personality profiles, they were not involved with Cambridge Analytica and their work was not used by Cambridge Analytica.

Michael Wade, Professor of Innovation and Strategy, Cisco Chair in Digital Business Transformation, International Institute for Management Development (IMD)

This article is republished from The Conversation under a Creative Commons license. Read the original article.

A forgotten Supreme Court case can help prevent Trump's takeover of elections

The recent FBI search of the Fulton County, Georgia, elections facility and the seizure of election-related materials pursuant to a warrant has attracted concern for what it might mean for future elections.

What if a determined executive branch used federal law enforcement to seize election materials to sow distrust in the results of the 2026 midterm congressional elections?

Courts and states should be wary when an investigation risks commandeering the evidence needed to ascertain election results. That is where a largely forgotten Supreme Court case from the 1970s matters, a case about an Indiana recount that sets important guardrails to prevent post-election chaos in federal elections.

Congress’s constitutionally-delegated role

The case known as Roudebush v. Hartke arose from a razor-thin U.S. Senate race in Indiana in 1970. The ballots were cast on Election Day, and the state counted and verified the results, a process known as the “canvass.” The state certified R. Vance Hartke as the winner. Typically, the certified winner presents himself to Congress, which accepts his certificate of election and seats the member to Congress.

The losing candidate, Richard L. Roudebush, invoked Indiana’s recount procedures. Hartke then sued to stop the recount. He argued that a state recount would intrude on the power of each chamber, the Senate or the House of Representatives, to judge its own elections under Article I, Section 5 of the U.S. Constitution. That clause gives each chamber the sole right to judge elections. No one else can interfere with that power.

Hartke worried that a recount might result in ballots that could be altered or destroyed, which would diminish the ability of the Senate to engage in a meaningful examination of the ballots if an election contest arose.

But the Supreme Court rejected that argument.

It held that a state recount does not “usurp” the Senate’s authority because the Senate remains free to make the ultimate judgment of who won the election. The recount can be understood as producing new information – in this case, an additional set of tabulated results – without stripping the Senate of its final say.

Furthermore, there was no evidence that a recount board would be “less honest or conscientious in the performance of its duties” than the original precinct boards that tabulated the election results the first time around, the court said.

A state recount, then, is perfectly acceptable, as long as it does not impair the power of Congress.

In the Roudebush decision, the court recognized that states run the mechanics of congressional elections as part of their power under Article I, Section 4 of the U.S. Constitution to set the “Times, Places and Manner of holding Elections for Senators and Representatives,” subject to Congress’s own regulation.

At the same time, each chamber of Congress judges its own elections, and courts and states should not casually interfere with that core constitutional function. They cannot engage in behaviors that usurp Congress’s constitutionally delegated role in elections.

Evidence can be power

The Fulton County episode is legally and politically fraught not because federal agents executed a warrant – courts authorize warrants all the time – but because of what was seized: ballots, voting machines, tabulation equipment and related records.

Those items are not just evidence. They are also the raw materials for the canvassing of votes and certification of winners. They provide the foundation for audits and recounts. And, importantly, they are necessary for any later inquiry by Congress if a House or Senate race becomes contested.

That overlap creates a structural problem: If a federal investigation seizes, damages, or destroys election materials, it can affect who has the power to assess the election. It can also inject uncertainty into the chain of custody: As ballots are removed from absentee envelopes or transferred from Election Day precincts to county election storage facilities, states must ensure that only the ballots cast on Election Day are tabulated, and that ballots are not lost or destroyed in the process.

Disrupting this chain of custody by seizing ballots, however, can increase, rather than decrease, doubts about the reliability of election results.

That is the modern version of “usurpation.”

From my perspective as an election law scholar, Roudebush is a reminder that courts should be skeptical of executive actions that shift decisive control over election proof away from the institutions the Constitution expects to do the judging.

There is another institutional reason courts should be cautious about federal actions that seize or compromise election materials: The House already has a long-running capacity to observe state election administration in close congressional races.

The Committee on House Administration maintains an Election Observer Program. That program deploys credentialed House staff to be on-site at local election facilities in “close or difficult” House elections. That staff observes casting, processing, tabulating and canvassing procedures.

The program exists for a straightforward reason: If the House may be called upon to judge a contested election under Article I, Section 5, it has an institutional interest in understanding how the election was administered and how records were handled.

That observation function is not hypothetical. The committee has publicly announced deployments of congressional observers to watch recount processes in tight House races throughout the country.

I saw it take place first-hand in 2020. The House deployed election observers in Iowa’s 2nd Congressional District to oversee a recount of a congressional election that was ultimately certified by a margin of just six votes.

Democratic and Republican observers from the House politely observed, asked questions, and kept records – but never interfered with the state election apparatus or attempted to lay hands on election equipment or ballots.

Congress has not rejected a state’s election results since 1984, and for good reason. States now have meticulous recordkeeping, robust chain-of-custody procedures for ballots, and multiple avenues of verifying the accuracy of results. And with Congress watching, state results are even more trustworthy.

When federal investigations collide with election materials

Evidence seizures can adversely affect election administration. So courts and states ought to be vigilant, enforcing guardrails that help respect institutional boundaries.

To start, any executive branch effort to unilaterally inject itself into a state election apparatus should face meaningful scrutiny. Unlike the Fulton County warrant, which targeted an election nearly six years old, warrants that interrupt ongoing state processes in an election threaten to usurp the constitutional role of Congress. And executive action cannot proceed if it impinges upon the ultimate ability of Congress to judge the election of its members.

In the exceedingly unlikely event that a court does issue such a warrant, it should not permit seizure of election equipment and ballots during a state’s ordinary post-election canvass. Instead, inspection of items, provision of copies of election materials, or orders to preserve evidence are more tailored means of accomplishing the same objectives. And courts should establish clear chain-of-custody procedures whenever evidence must be preserved for a federal investigation.

The fear driving much public commentary about the danger to midterm elections is not merely that election officials will be investigated or that evidence will be seized. It is that investigations could be used as a pretext to manage or, worse, disrupt elections – chilling administrators, disorganizing record keeping or manufacturing doubt by disrupting custody of ballots and systems.

Roudebush provides a constitutional posture that courts should adopt, a recognition that some acts can usurp the power of Congress to judge elections. That will provide a meaningful constraint on the executive ahead of the 2026 election and reduce the risk of intervention in an ongoing election.

Derek T. Muller, Professor of Law, University of Notre Dame

This article is republished from The Conversation under a Creative Commons license. Read the original article.

'Unprecedented': Expert condemns Trump admin's credibility crisis with judges

The word “unprecedented” is getting a workout after a grand jury in Washington on Feb. 10, 2026, rebuffed an attempt by federal prosecutors to get an indictment against perceived enemies of President Donald Trump.

It began with an unprecedented video in November 2025 featuring six Democratic lawmakers alerting military and intelligence community members that they had the duty to disobey illegal orders. That enraged Trump, who in an unprecedented move said the lawmakers were guilty of sedition, which is punishable by death. The U.S. attorney for the District of Columbia, Jeanine Pirro, made the unprecedented attempt to indict the lawmakers. The final element in this drama – the federal grand jury’s rejection of Pirro’s request – wasn’t itself unprecedented. That’s because it’s only the latest in an unprecedented string of losses for the Trump administration before grand juries.

Dickinson College President John E. Jones III, a former federal judge, spoke with The Conversation politics editor Naomi Schalit about the role of grand juries, why a grand jury would not indict someone – and how all of this is a reflection of the administration’s remarkable loss of credibility with judges and the citizens who make up grand juries.

How does the grand jury process work?

The grand jury really dates back to before the Bill of Rights, but for our purposes it’s memorialized in the Fifth Amendment within the Bill of Rights. It is meant to be a mechanism that screens cases brought by prosecutors.

Ordinary citizens, no fewer than 16 nor more than 23, have the facts presented to them by a United States attorney or assistant United States attorney. They must determine whether there is probable cause to believe that a crime has been committed. It is not the purview of grand jurors to determine guilt or innocence, but merely to determine whether there is probable cause sufficient to indict.

So that means that a prosecutor will come to a grand jury and present them with the facts that they have chosen to present them with. There’s no defense at that point, and the grand jury then, relatively routinely, says OK, “Indict that person,” or “Indict those people”?

That’s correct. It’s a very one-sided process. There are no defense attorneys present. There’s a court reporter, the grand jury, the United States attorney, and such witnesses as the United States attorney decides to call. While the target of a grand jury can endeavor to present witnesses, including themselves, that almost never happens because of the danger of self-incrimination. The grand jurors can ask questions of the witnesses, but the United States attorney can choose the evidence they want to present to the grand jury, and typically they present only such evidence as is necessary to establish probable cause that a crime has been committed.

Does the public know what is presented in a grand jury room by the prosecutor?

The grand jury proceedings are absolutely secret and they remain that way, unless a federal judge authorizes that they be unsealed. So in the case involving the six lawmakers, we don’t know what the prosecutor presented to the grand jury. We just know that the grand jury refused to return an indictment. As far as I know, we don’t even know what crimes were put before the grand jury, let alone what testimony was presented. What we do know is that in all six cases, the grand jury refused to vote in favor of the indictment that was requested by the United States attorney.

Why would a grand jury refuse to give the prosecutor what they want?

It’s unprecedented, although we now see a wave of grand juries pushing back against the government. I don’t recall a single instance, during the almost 20 years I served as a U.S. District judge, when a grand jury refused to return a true bill, an indictment. It just is completely aberrational. The grand jury would have to totally reject the whole premise of the case that’s being presented to them by the United States attorney because, remember, there are typically no witnesses appearing before the grand jury to dispute the facts. The grand jury is clearly saying, “Even accepting the facts you’re putting before us as true, we don’t think under these circumstances this case is worthy of a federal indictment.”

Can a prosecutor just try again?

They can return to the well, so to speak, and they did that in Virginia in the case of Letitia James. But it’s pretty perilous because, bluntly, it’s a way that a prosecutor can get their head handed to them twice.

Originally, as set out in the Fifth Amendment to the Constitution, the grand jury was supposed to be a vigorous and robust check against prosecutors simply charging people with crimes. But over time, it’s become far less than that. And there is the famous quote by Judge Sol Wachtler in New York that a grand jury can be made to “indict a ham sandwich.”

So to see a grand jury fail to return true bills multiple times over the past couple of months is remarkable and unprecedented. It occurs to me that what is happening here is kind of parallel to what’s taking place with the administration and federal judges. I think we now have entered a world where the Department of Justice has lost its credibility with the judiciary.

We’re seeing that time and again in appearances in court where judges simply don’t believe what U.S. attorneys are telling them, based on past demonstrable falsehoods that have been stated in open court. And now we see grand juries that are also doubting the credibility of federal prosecutors. And these grand jurors are not blind to what is taking place in the world around them.

I think that this is further polluted by the fact that the president of the United States, for example, in the case of the six defendants from Congress and the Senate, said that they had committed seditious acts – which is punishable by death.

Obviously, this tilts the scales and is fundamentally unfair because it is destroying the concept of due process of law. People notice what the president says, and I am happy to see that the average citizen serving on a grand jury has retained what I think is a fundamental sense of fairness, even in the face of a pretty stacked deck.

What does it mean if you have a court system, judges and the grand juries who do not have faith in the administration and its legal claims?

It’s a complete drag on our system of justice. For all of the time that I sat on the federal bench, I had great respect for the Department of Justice, and the department had tremendous credibility. They were straight shooters. The prosecutors who appeared in front of me were professionals. I didn’t always agree with their arguments, of course, nor did I agree with a few of their charging decisions, but I can tell you that not once did I see a federal prosecution in front of me that I felt strongly should never have been brought at its inception.

But we now have a system where, because of the whims of the president, the Department of Justice has become utterly weaponized against his perceived enemies, and that’s a gross misuse of our prosecutorial power at the federal level.

Also, if, for example, these members of Congress had been indicted, they’d have to lawyer up, they’d have to fight their way out. That would take a lot of resources.

So, yes, the judiciary can be a bulwark against improvident prosecutions. But that comes at a cost to the defendant, and it’s been said that the process itself is the punishment. I suspect that’s what the president wants; it’s the trauma that you put somebody through that can be almost as bad as being convicted. And, of course, there’s the reputational harm as well.

John E. Jones III, President, Dickinson College

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Newly released Epstein files suggest secret intel operations involving wealthy elites

For obvious reasons, the secretive world of intelligence agencies and the people who revolve in its orbit remains opaque. So much so that some of those people may not even be aware of any involvement in the secret world.

The Epstein papers have thrown up speculation about whether the late financier and sex offender might have performed services for one or another of the big intelligence agencies. And in the wake of that speculation, it has been noted that the father of Epstein’s one-time girlfriend, Ghislaine Maxwell, was the late Robert Maxwell, well-known as a larger than life publisher and newspaper proprietor in the UK from the 1950s to the early 90s. He, too, was the subject of much speculation that he might have been involved in intelligence work.

Epstein is now better known for his sex trafficking network and Maxwell for stealing from his employees’ pension funds. But their examples point to how intelligence, high finance and influence intertwine.

Generally speaking there are three main classes of people involved in state intelligence gathering. “Officers” are full-time employees of state intelligence agencies such as MI6. They run their groups of “agents”, who are not formally employed by the state but who deliberately and knowingly gather intelligence and perform tasks for intelligence officers. And there are what are known as “intelligencers” (or sometimes assets), who may not even know they are providing information to a spy agency.

The currency of human intelligence is access, knowledge and often the ability to compromise officials and influential people.

We often think that intelligence agencies and their agent runners seek to directly recruit people with the access and motivation to pass on state secrets. While this is undeniably the case – and the examples of the American Aldrich Ames and the Briton Melita Norwood provide good evidence of this – intelligence agencies are equally interested in recruiting what’s known as “access agents”.

Access agents

The value of an access agent is not the secrets they have access to, but the social and professional access they provide to people who do. People in high-end society, scientific research, banking, politics and culture make excellent targets for access agents. And from an agency’s point of view, the best thing is that these agents are deniable and under the radar.

Intelligence officers and their operatives require funding, mobility and a credible back story (known as a legend). Businessmen like Robert Maxwell and Jeffrey Epstein had plenty of all three, making them excellent candidates to theoretically serve the needs of intelligence agencies.

But rather than indulging in speculation about Epstein and Maxwell, which is unlikely ever to be conclusively confirmed or denied, it’s more instructive to look at what we know about access agents. They are often business people, sometimes academics or journalists with a reason to travel and the opportunity to meet people in influential circles in the course of their legitimate business.

It’s worth remembering that Kim Philby, the most notorious of the Cambridge spy ring, cut his teeth as a reporter in Spain during the civil war, before embarking on a career as an MI6 officer (and Soviet double agent). Australian journalist, Richard Hughes – who appeared lightly disguised in novels by Ian Fleming and John le Carre – was believed by many to be an agent for British intelligence, working in southeast Asia during the upheavals of the 1960s and 1970s.

Perhaps the most famous businessman-agent was Cyril Bertram Mills who combined being the director of the Bertram Mills Circus with a four-decade career spanning the years before and after the second world war with British intelligence. Travelling widely in Europe, ostensibly to seek out circus acts, he provided his spymasters with evidence of German rearmament in the 1930s. He also recruited Garbo, one of the most successful double agents, who was instrumental in convincing Germany that the D-Day landings would be in Calais, not Normandy.

An access agent is trained “to be the friend the informant doesn’t have”. They can provide what their contact needs and cannot get hold of: whether that’s useful inside information of some kind, an introduction to someone important, a sexual partner or finance for one of their ventures.

MI5 is quite open about this on its website: “Agents operate by exploiting trusted relationships and positions to obtain sensitive information. They may also look for vulnerabilities among those handling secrets.”

Secrets and lies

Determining truth in intelligence is complicated. Very rarely do we see a single piece of incontrovertible evidence that proves someone’s intelligence status or the ethics or efficacy of their actions. But then as we know, all of this is shrouded in secrecy and supposition.

In Maxwell’s case, historical scholarship and TV documentaries have provided unverified hints. In Epstein’s we have indicators such as the claim by former US attorney Alexander Acosta that he was told Epstein “belonged to intelligence” when he negotiated his plea deal. But it’s unlikely we’ll ever know the truth about either.

Robert Dover, Professor of Intelligence and National Security & Dean of Faculty, University of Hull

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Americans are asking too much of their dogs

Americans love dogs.

Nearly half of U.S. households have one, and practically all owners see pets as part of the family – 51% say pets belong “as much as a human member.” The pet industry keeps generating more and more jobs, from vets to trainers to influencers. Schools cannot keep up with the demand for veterinarians.

It all seems part of what Mark Cushing, a lawyer and lobbyist for veterinary issues, calls “the pet revolution”: the more and more privileged place that pets occupy in American society. In his 2020 book “Pet Nation,” he argues that the internet has caused people to become more lonely, and this has made them focus more intensely on their pets – filling in for human relationships.

I would argue that something different is happening, however, particularly since the COVID-19 lockdown: Loving dogs has become an expression not of loneliness but of how unhappy many Americans are with society and other people.

In my own book, “Rescue Me,” I explore how today’s dog culture is more a symptom of our suffering as a society than a cure for it. Dogs aren’t just being used as a substitute for people. As a philosopher who studies the relationships between animals, humans and the environment, I believe Americans are turning to dogs to alleviate the erosion of social life itself. For some owners, dogs simply offer more satisfying relationships than other people do.

And I am no different. I live with three dogs, and my love for them has driven me to research the culture of dog ownership in an effort to understand myself and other humans better. By nature, dogs are masters of social life who can communicate beyond the boundaries of their species. But I believe many Americans are expecting their pets to address problems that they cannot fix.

Dogs over people

During the pandemic, people often struggled with the monotony of spending too much time cooped up with other humans – children, romantic partners, roommates. Meanwhile, relationships with their dogs seemed to flourish.

Rescuing shelter animals grew in popularity, and on social media people celebrated being at home with their pets. Dog content on Instagram and Pinterest now commonly includes hashtags like #DogsAreBetterThanPeople and #IPreferDogsToPeople.

“The more I learn about people, the more I like my dog” appears on merchandise all over e-commerce sites such as Etsy, Amazon and Redbubble.

One 2025 study found that dog owners tend to rate their pets more highly than their human loved ones in several areas, such as companionship and support. They also experienced fewer negative interactions with their dogs than with the closest people in their lives, including children, romantic partners and relatives.

The late primatologist Jane Goodall celebrated her 90th birthday with 90 dogs. She stated in an interview with Stephen Colbert that she preferred dogs to chimps, because chimps were too much like people.

Fraying fabric

This passion for dogs seems to be growing as America’s social fabric unravels – which began long before the pandemic.

In 1972, 46% of Americans said “most people can be trusted.” By 2018, that percentage dropped to 34%. Americans report seeing their friends less than they used to, a phenomenon called the “friendship recession,” and avoid having conversations with strangers because they expect the conversation to go badly. People are spending more time at home.

Today, millennials make up the largest percentage of pet owners. Some cultural commentators argue dogs are especially important for this generation because other traditional markers of stability and adulthood – a mortgage, a child – feel out of reach or simply undesirable. According to the Harris Poll, a marketing research firm, 43% of Americans would prefer a pet to a child.

Amid those pressures, many people turn to the comfort of a pet – but the expectations for what dogs can bring to our lives are becoming increasingly unreasonable.

For some people, dogs are a way to feel loved, to relieve pressures to have kids, to fight the drudgery of their job, to reduce the stress of the rat race and to connect with the outdoors. Some expect pet ownership to improve their physical and mental health.

And it works, to a degree. Studies have found dog people to be “warmer” and happier than cat people. Interacting with pets can improve your health and may even offer some protection against cognitive decline. Dog-training programs in prisons appear to reduce recidivism rates.

Unreasonable expectations

But expecting that dogs will fill the social and emotional gaps in our lives is actually an obstacle to dogs’ flourishing, and human flourishing as well.

In philosophical terms, we could call this an extractive relationship: Humans are using dogs for their emotional labor, extracting things from them that they cannot get elsewhere or simply no longer wish to. Just like natural resource extraction, extractive relationships eventually become unsustainable.

The late cultural theorist Lauren Berlant argued that the present stage of capitalism creates a dynamic called “slow death,” a cycle in which “life building and the attrition of life are indistinguishable.” Keeping up is so exhausting that, in order to maintain that life, we need to do things that result in our slow degradation: Work becomes drudgery under unsustainable workloads, and the experience of dating suffers under the unhealthy pressure to have a partner.

Similarly, today’s dog culture is leading to unhealthy and unsustainable dynamics. Veterinarians are concerned that the rise of the “fur baby” lifestyle, in which people treat pets like human children, can harm animals, as owners seek unnecessary veterinary care, tests and medications. Pets staying at home alone while owners work suffer from boredom, which can cause chronic psychological distress and health problems. And as the number of pets goes up, many people wind up giving up their animal, overcrowding shelters.

So what should be done? Some philosophers and activists advocate for pet abolition, arguing that treating any animals as property is ethically indefensible.

This is a hard case to make – especially with dog lovers. Dogs were the first animal that humans domesticated. They have evolved beside us for as long as 40,000 years, and are a central piece of the human story. Some scientists argue that dogs made us human, not the other way around.

Perhaps we can reconfigure aspects of home, family and society to be better for dogs and humans alike – more accessible health care and higher-quality food, for example. A world more focused on human thriving would be more focused on pets’ thriving, too. But that would make for a very different America than this one.

Margret Grebowicz, Distinguished Professor of the Humanities, Missouri University of Science and Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

US gets a wake-up call as evidence points to a 'Trump slump'

With an upcoming FIFA World Cup being staged across the nation, 2026 was supposed to be a bumper year for tourism to the United States, driven in part by hordes of arriving soccer fans.

And yet, the U.S. tourism industry is worried. While the rest of the world saw a travel bump in 2025, with global international arrivals up 4%, the U.S. saw a downturn. The number of foreign tourists who came to the United States fell by 5.4% during the year – a sharper decline than the one experienced in 2017-18, the last time, outside the height of the COVID-19 pandemic, that the industry was gripped by fears of a travel slump.

Policy stances from the Trump administration on everything from immigration to tariffs, along with currency swings and stricter border controls, have seemingly proved a turnoff to travelers from other countries, especially Canadians – the single largest source of foreign tourists for the United States. Canadian travel to the U.S. fell by close to 30% in 2025. But it is not just visitors from Canada who are choosing to avoid the United States. Travel from Australia, India and Western Europe, among others, has also shrunk.

We are experts in tourism. And while we don’t possess a crystal ball, we believe that the tourism decline of 2025 could well continue through 2026. The evidence appears clear: Washington’s ongoing policies are putting off would-be travelers. In other words, the tourism industry is in the midst of a “Trump slump.”

Fewer Canadians heading south

The impact of Donald Trump’s policies is perhaps most pronounced when looking north of the U.S. border. According to the U.S. Travel Association, Canadian visitors generated approximately 20.4 million visits and roughly US$20.5 billion in visitor spending in 2024, supporting about 140,000 American jobs.

The economic impact of fewer Canadian visitors in 2025 affects mostly border states that depend heavily on people driving across the border for retail, restaurants, casinos and short-stay hotels.

The sharp drop in return trips by car to Canada is a direct indication that border economies might be facing stress. This has led elected officials and tourism professionals to woo Canadians in recent months, sometimes with “Canadian-only deals.”

And it isn’t just border states. In Las Vegas, some hotels are now offering currency rate parity between Canadian and U.S. dollars for rooms and gambling vouchers in a bid to attract customers.

Winter-sun states, such as Florida, Arizona and California, are facing both fewer short-stay arrivals and an emerging drop-off in Canadian “snowbirds.” Reports indicate a noticeable increase in Canadians listing U.S. properties in Florida and Arizona for sale and canceling seasonal plans, threatening lodging, health care spending and property tax revenue.

Economic and safety concerns

Economic policies pursued by the Trump administration appear to be among the main reasons visitors are staying away from the U.S. Multiple tariff announcements – pushing tariffs to the highest levels since 1935 – along with tougher border-related rhetoric and an aggressive foreign policy have contributed to a negative perception of the U.S. among would-be tourists.

Many foreigners report feeling unwelcome or uncertain about travel to the U.S., and some public leaders from Canada and Europe have urged citizens to spend domestically, instead. This significantly reduced intent to travel to the U.S. in 2025.

Meanwhile, exchange rates and inflation have further affected some aspiring travelers, especially Canadians. The Canadian dollar weakened in 2025, making U.S. trips more expensive. This disproportionately affected day-trip and shopping-driven border crossings.

Travelers are also staying away from the U.S. because of safety concerns. Several countries have posted travel advisories about the risks of traveling to the U.S., with Germany being the latest. Although most worries are related to increased border controls, recent aggressive tactics by immigration agents have added to potential visitors’ decisions to avoid the U.S.

A wake-up call for the US

The current tourism outlook is reason for concern. Julia Simpson, president and CEO of the industry association World Travel and Tourism Council, has described the situation as a “wake-up call” for the U.S. government.

“The world’s biggest travel and tourism economy is heading in the wrong direction,” she said in May 2025. “While other nations are rolling out the welcome mat, the U.S. government is putting up the ‘closed’ sign.”

According to estimates, the U.S. stood to lose about $30 billion in international tourism in 2025 as travelers chose to travel elsewhere.

The disappointing figures for U.S. tourism follow a longer trend. The share of global international travel heading to the U.S. fell from 8.4% in 1996 to 4.9% in 2024 and was expected to drop to 4.8% in 2025. Meanwhile, arrivals to other top tourism destinations, including France, Greece, Mexico and Italy, are set to increase.

The decline is also being felt by the business tourism sector, with every major global region sending fewer people to the U.S. for work.

A World Cup bump?

So what does that mean for the upcoming FIFA World Cup, with 75% of the soccer matches being hosted across the United States? Traditionally, host nations benefit from sports events, although impacts are often overestimated. After a disappointing year, the U.S. tourism sector expects the World Cup to boost visits and revenue.

But Trump’s foreign policy may undermine those expectations.

A new visa integrity fee of $250 and plans for social media screening of some visitors make travel to the U.S. less attractive. And there are growing calls for a boycott of the U.S. following some of Trump’s policies, including his aggressive stance about Greenland.

Former FIFA President Sepp Blatter has suggested that fans avoid going to the U.S. for the World Cup.

It remains to be seen whether fans will follow his call. Bookings for flights and hotels were up after the dates and venues of games were announced in December.

But current political rhetoric is affecting travel decisions, especially given that fans from some specific countries may not be able to get visas. The U.S. government has imposed travel bans on Senegal, Ivory Coast, Iran and Haiti, all of which have qualified for the World Cup.

European soccer leaders have even discussed the possibility of a boycott, although such an action is unlikely to happen, given the revenue at stake for national teams and football associations.

Will the ‘Trump slump’ continue?

White House policies look unlikely to drastically change in the next few months. And this causes concern for tourism professionals, although most have remained silent about the recent immigration crackdown.

To make matters worse, federal funding for Brand USA, the national destination marketing organization, was cut deeply in mid-2025, leading to staff shortages that have reduced the country’s capacity to counter negative sentiment through positive promotion.

Soccer fans tend to be passionate about following their national side. And this could offset some of the impact of the Trump travel slump.

Yet, with sky-high match ticket prices and the international reputation of the U.S. as a tourism destination damaged, we believe it is unlikely that the tourism industry will recover in 2026. It will take a long time and good strategies to repair the serious damage done to the nation’s image among travelers in the rest of the world.

Frédéric Dimanche, Professor and former Director (2015-2025), Ted Rogers School of Hospitality and Tourism Management, Toronto Metropolitan University and Kelley A. McClinchey, Teaching Faculty, Geography and Environmental Studies, Wilfrid Laurier University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trying to predict what Trump will do next is bad for your brain — according to science

Donald Trump can change the temperature of a room with a sentence. One minute he is certain, the next he is backtracking. One day he is threatening, the next he is hinting at a deal. Even before anything concrete happens, people brace for his next turn.

That reaction is not just political. It is what unpredictability does to any system that requires stability. To act at all, you need some working sense of what is happening and what is likely to happen next.

One influential framework in brain science called predictive processing suggests the mind does not wait passively for events. It constantly guesses what will happen, checks those guesses against reality, and adjusts.

A brain that predicts can prepare, even when what it prepares for is uncertainty.

The gap between what you expect and what actually happens is known as a prediction error. These gaps are not mistakes but the basis of learning. When they resolve, the brain updates its picture of the world and moves on.

This is not about what anyone intends, but about what unpredictability does to systems that need some stability to work. Trouble starts when mismatches do not resolve because the source keeps changing. People are told one thing, then the opposite, then told the evidence was never real.

The brain may struggle to settle on what to trust, so uncertainty stays high. In this view, attention is how the brain weighs up what counts as best evidence, and turns the volume up on some signals and down on others.

Uncertainty can be worse than bad news

When this keeps happening, it’s hard to get closure. Effort is spent checking and second guessing. That is one reason why uncertainty can feel worse than bad news. Bad news closes the question, uncertainty keeps it open. When expectations will not stabilise, the body stays on standby, prepared for many possible futures at once.

One idea from this theory is that there are two broad ways to deal with persistent mismatch. One is to change your expectations by getting better information and revising your view. The other is to change the situation so that outcomes become more predictable. You either update the model, or you act to make the world easier to deal with.

On the world stage, flattery can be a crude version of the second route, an attempt to make a volatile person briefly easier to predict. Everyday life shows the same pattern, such as unpredictable workplaces. When priorities change without warning, people cannot anticipate what is required. Extra effort may go into reducing uncertainty rather than doing the job.

Research links this kind of unpredictability to higher daily stress and poorer wellbeing.

The same pattern shows up in close relationships. When someone is unpredictable, people scan tone and try to guess whether today brings warmth or conflict. It can look obsessive, but it is often an attempt to avoid the wrong move.

Studies link unpredictable early environments to poorer emotional control and more strained relationships later in life.

The strain does not stay in thought alone. The brain does a lot more than thinking. A big part of its work is regulating the body, such as the heart rate, energy use and the meaning of bodily sensations.

It does this by anticipating what the body will need next. When those anticipations cannot settle, regulation becomes costly.

Words matter here in a literal sense. Language does not just convey information. It shapes expectations, which changes how the body feels.

Trump can do this at a distance. A few words about a situation can raise or lower the stakes for people, whether in Minneapolis or Iran. The point is that signals from powerful, volatile sources force others to revise their models and prepare their bodies for what might come next.

Communication is a form of regulation. Clarity and consistency help other people settle. Volatility and contradiction keep them on edge.

When a single voice can repeatedly unsettle expectations across millions of people, unpredictability stops being a personal stress and becomes a collective regulatory problem.

How to deal with unpredictability

So what helps when unpredictability keeps pulling your attention? Check for new information only if it would change your next step or plan; otherwise, checking just keeps the uncertainty alive.

When a source keeps changing, reduce the effort spent trying to decode it. Switch to action. Set a rule that makes the next step predictable. For example, read the news at 8am, then stop and get on with your day.

Learn where not to look. When messages keep reversing, the problem is not a lack of information, it is an unreliable source.

Biological systems survive by limiting wasted predictions. Sometimes that means changing your expectations; sometimes it means changing the situation. And sometimes it means accepting that when Donald Trump is talking, the safest move is to stop trying to predict what comes next.

Robin Bailey, Assistant Professor in Clinical Psychology, University of Cambridge

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Aztec empire's collapse shows why ruling through coercion and force fails

When Aztec emissaries arrived in 1520 at Tzintzuntzan, the capital of the Tarascan Kingdom in what is now the Mexican state of Michoacán, they carried a warning from the Aztec emperor, Cuauhtémoc.

They cautioned that strange foreigners – the Spaniards – had invaded the land and posed a grave threat. The emissaries requested an audience with the Tarascan ruler, known as the Cazonci, King Zuanga. But Zuanga had recently died, most likely from smallpox brought by the Spaniards.

Relations between the two empires had long been tense. They had clashed on the western frontier since 1476, fighting major battles and fortifying their borders. The Tarascans viewed the Aztecs as deceitful and dangerous – a threat to their very existence.

So, when the emissaries arrived to speak with a king who was already dead, they were sacrificed and granted audience with him in the afterlife. In that moment, the fate of the Aztecs was sealed in blood.

The Aztec empire did not fall because it lacked capability. It collapsed because it accumulated too many adversaries who resented its dominance. This is a historical episode the US president, Donald Trump, should take notice of as his rift with traditional US allies deepens.

Carl von Clausewitz and other philosophers of war have distinguished the concepts of force and power in relation to statecraft. In the broadest sense, power is ideological capital, predicated on military strength and influence in the global political sphere. In contrast, force is the exertion of military might to coerce other nations to your political will.

While power can be sustained through a strong economy, alliances and moral influence, force is expended. It drains resources and can erode internal political capital as well as global influence if it is used in a way that is perceived as arrogant or imperialistic.

The Aztec empire formed in 1428 as a triple alliance between the city-states of Tenochtitlan, Texcoco and Tlacopan, with Tenochtitlan eventually dominating the political structure. The empire exerted force through seasonal military campaigns and balanced this with a power dynamic of sacrificial display, threat, tribute and a culture of racial superiority.

In both its use of force and power, the Aztec empire was coercive and depended on fear to rule. Those subjugated by the empire, and those engaged in what seemed perpetual war, held great animosity and distrust of the Aztecs. The empire was thus built on conquered people and enemies waiting for the right opportunity to overthrow their overlords.

Hernán Cortés, the Spanish conquistador who ultimately brought large parts of what is now Mexico under the rule of Spain, exploited this hostility. He forged alliances with Tlaxcala and other former Aztec subjects, augmenting his small Spanish force with thousands of indigenous warriors.

Cortés led this Spanish-indigenous force against the Aztecs and besieged them in Tenochtitlan. The Aztecs had only one hope: to persuade the other great power in Mexico, the Tarascan empire to the west, to join forces with them. Their first emissaries met an ill fate. So, they tried again.

In 1521, Aztec envoys arrived once more in Tzintzuntzan and this time met with the new lord, Tangáxuan II. They brought captured steel weapons, a crossbow and armour to demonstrate the military threat they faced.

The Tarascan king paid attention. He sent an exploratory mission to the frontier to determine whether this was Aztec trickery or truth. As they arrived at the frontier, they met a group of Chichimecs – semi-nomadic warrior people who often worked for empires to patrol borders.

When told the mission was heading to Tenochtitlan to scout the situation, the Chichimecs replied that they were too late. It was only a city of death now, and they were on their way to the Tarascan king to offer their services. Tangáxuan submitted to the Spanish as a tributary kingdom the following year before being burned to death in 1530 by Spaniards trying to find where he had hidden gold.

Had the Tarascans maintained normal political relations with the Aztecs, they might have investigated the report of the first emissaries. One can imagine how history would be different if, during the siege of Tenochtitlan, 40,000 Tarascan warriors – renowned archers – had descended from the mountains to the west. It is unlikely that Cortés and his army could have prevailed.

American foreign policy

The failings of the Aztec empire were not due to a lack of courage or military prowess. During their battles with the Spanish, the Aztecs repeatedly demonstrated adaptability, learning how to fight against horses and cannon-laden ships.

The failing was a fundamental flaw in the political strategy of the empire – it was built on coercion and fear, leaving a ready force to challenge its authority when it was most vulnerable.

The foreign policy of the US since 2025, when Trump entered office for his second term, has emulated this model. The Trump administration has been deploying coercive power to pursue its ambitions for wealth and notoriety, and to assert American exceptionalism and manifest superiority.

This has manifested in threats or the exercise of limited force, such as tariffs or military attacks in Iran, Syria, Nigeria and Venezuela. Increasingly, other nations are challenging the effectiveness of this power. Colombia, Panama, Mexico and Canada, for example, have largely ignored the threat of coercive power.

As Trump uses American power to demand Greenland, his threats are becoming more feeble. Nato nations are abiding by their longstanding pact with economic and military resolve, with their leaders saying they will not give in to Trump’s pressure. The US is being pushed towards a position where it will have to switch from coercive power to coercive force.

If this course persists, military engagements, animosity from neighbours and vulnerabilities arising from the strength of other militaries, economic disruptions and environmental catastrophes may well leave the world’s most powerful nation exposed with no allies.

Jay Silverstein, Senior Lecturer in the Department of Chemistry and Forensics, Nottingham Trent University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Blaming 'deluded wine moms' for violence is an old trick with a new spin

Following the recent shooting of Renee Good by an agent for Immigration and Customs Enforcement (ICE) in the United States, the Donald Trump administration’s latest narrative suggests that “deluded wine moms” are to blame for the violence in ICE-related demonstrations in Minneapolis and across the country.

This mother-blaming is nothing more than an old trick with a new spin.

Organized gangs of ‘wine moms’

A Fox News columnist recently wrote that “organized gangs of wine moms” are using “antifa tactics” to “harass and impede” ICE activity. In the opinion piece, he claimed that “confusion” over what constitutes civil disobedience is what “got 37-year-old Renee Good killed.”

Similarly, Vice-President J.D. Vance called Good a “deranged leftist” while a new acronym, AWFUL — Affluent White Female Urban Liberal — has appeared on social media.

In framing protesters like Good, a mother of three, as confused, aggressive and “delusional,” this narrative delegitimizes and pathologizes maternal activism.

This strategy aims to divert blame from the U.S. government and its heavy-handed approach to immigration — which has resulted in yet another slaying of a protester by ICE agents in Minneapolis, this time a male nurse who was reportedly coming to the aid of a female demonstrator — while also drawing on a centuries-old strategy of blaming mothers for social problems.

What makes a ‘wine mom’?

The term “wine mom” emerged over the last two decades as a cultural symbol of the contemporary white, suburban mother who turns to a nightly glass of wine (or two) to cope with the stresses of daily life.

The archetype goes back much further, reflected in literature, film and television characters, such as the wily Lucille Bluth of Arrested Development.

Yet, this motif is less light-hearted than assumed: a recent systematic review reveals a strong link between maternal drinking and stress, especially for working mothers.

While it would be easy to view problematic drinking as another example of maternal failure, it is important not to. Here’s why.

Mother-blame in history

Throughout history, mothers have found themselves in the midst of what American sociologist Linda Blum calls a “mother-valor/mother-blame binary.”

When behaving in accordance with socially acceptable and desirable parameters — that is with warmth, femininity and selflessness — mothers are viewed as “good.” When mothers violate these norms, whether by choice, circumstance or by virtue of their race or class position, they’re “bad mothers.”

Mother-blame ultimately reflects the belief that mothers are solely responsible for their children’s behaviour and outcomes, along with the cultural tendency to blame them when things go wrong. Yet, as Blum points out, “mother-blame also serves as a metaphor for a range of political fears.”

Perhaps the most striking example of this is the suffrage movement, which represented a direct challenge to patriarchal notions that women belonged in the domestic sphere and lacked the intelligence to engage in political discourse.

Suffragettes in the United Kingdom — many of them mothers — occasionally used extreme tactics, such as window-smashing and arson, while women in the U.S. obstructed traffic and waged hunger strikes.

These activists were framed as threatening to not only the establishment, but also to families and the moral fabric of society.

Ironically, despite the fact that women’s entry into politics led to increased spending and improved outcomes related to women, children, families and health care, scholars have found that mother-blaming was as common after the women’s movement as it was before.

Contemporary mother-blame

Beyond political matters, contemporary mother-blame is rampant in other domains.

Mothers have been blamed for a wide variety of their children’s psychological problems, including anxiety, depression and inherited trauma. In media and literature, mothers are often blamed for criminality and violence, reflecting the notion that “mothers make monsters.”

When children struggle in school, educators and administrators may blame the mother. Mothers risk being called “too passive” if they don’t advocate for their children or “too aggressive” when they do.

Similarly, the “crazy woman” or “hysterical mother” is a well-known trope in custody law, and mothers may be blamed even when their children are abused by others. Mass shootings? Mom’s failure. The list goes on.

By setting up mothering as a high-stakes endeavour, the cultural norm of mother-blame also serves to “divide and conquer.”

In my sociology research, I found that mothers on Facebook worked to align themselves with like-minded “superior” mothers, while distancing themselves from perceived “inferior” mothers. This feeds into the cultural norm of “combative mothering,” which pits mothers against each other.

An old trick with a new spin

The “wine mom” narrative builds on this historical pattern of mother-blame. It is meant to trivialize, delegitimize, divide and denigrate mothers who are, in fact, well-organized and motivated activists concerned for their communities.

While there are legitimate concerns around maternal drinking as a coping mechanism, the “wine mom” label has begun to represent something different. Mothers are reclaiming the title to expand their cause.

As @sara_wiles, promoting the activist group @redwineblueusa, stated on Instagram: “They meant to scare us back into the kitchen, but our actual response is, ‘Oh, I want to join!’”

We should acknowledge that rather than causing societal problems, mothers have a long history of trying to fix them, even if imperfectly. Mothers like Renee Good are no exception.

Darryn DiFrancesco, Assistant Professor, School of Nursing, Faculty of Human and Health Sciences, University of Northern British Columbia

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How a 'dirtbag' billionaire chose to do capitalism differently

Few people globally have influenced business, sport, the environment and philanthropy like Patagonia founder Yvon Chouinard.

Chouinard’s inventive approach across these spheres makes the recent biography Dirtbag Billionaire by The New York Times journalist David Gelles an intriguing read.

The anti-authoritarian entrepreneur started out making basic rock-climbing equipment. He then built a business reputation based on ethical commerce, and eventually gave away his company, promising all profits to fighting the climate crisis.

From an Australian perspective, there are lessons to learn given growing environmental and climate concerns, while both corporate giving and corporate distrust have surged in the past decade.

The wild early years

Chouinard prefers the “dirtbag” label to that of businessman or billionaire. It’s a reference to his 1960s lifestyle: a term for someone who sleeps rough, roams widely and disdains material possessions.

As a young climber chasing adventures with friends on rock faces, rivers and waves, Chouinard lived frugally. He ate cat food, squirrels and porcupines.

In these years, the inventive Chouinard revolutionised climbing. Using a junkyard forge, he hand-crafted innovative, reusable, softer metal spikes to drive into rock faces. At first selling from his car boot, he built up a US and international customer base.

But, faithful to his environmental values, Chouinard then risked the company by ditching his original top-selling metal spike, which damaged rock faces, for an alternative that did far less harm.

Along the way he employed many fellow climbing, surfing and kayaking enthusiasts, prioritising employee wellbeing and engagement in the business. This was decades before employees were seen as stakeholders, or internal culture was considered important in a business.

A clash of values

However, with the success of his Patagonia clothing business formed in 1973, Chouinard the conservationist had entered a highly capitalistic sector. The retail market was based on trend-driven overconsumption and exploitative labour and environmental practices.

His quest to do capitalism differently is instructive.

Despite higher costs, Chouinard moved the company into organic cotton use and encouraged regenerative topsoil practices. The principled actions built customer trust and loyalty.

His approach also inspired others, who saw that decisions putting environmental considerations above profit were good business all round.

As Patagonia grew into a billion-dollar company, he maintained a policy of donating 1% of sales (not just profit) to the environment, no matter how tight the times.

Chouinard co-established 1% for the Planet in 2001 as an accrediting body to encourage companies worldwide to donate 1% of their sales to environmental organisations. Since its founding, over 11,000 companies in 110 countries have donated a total of US$823 million (A$1.2 billion).

Chouinard also actively called out corporate greenwashing, and Patagonia was a corporate activist on multiple issues. This included suing US President Donald Trump in 2017 to keep wilderness reserves safe from oil and gas exploration and land development.

One of the first B Corps

In another leadership move, Patagonia in 2012 became the first California company to become a certified Benefit Corporation, better known as a B Corp.

This is a legally binding, transparently measured commitment to act sustainably, live up to independent performance standards and consider worker, society and environmental interests.

Then, aged 83 in 2022, Chouinard established a pioneering succession trust structure and nonprofit collective for the business. This would see Patagonia continue as an independent, environment-led activist company rather than be floated or sold and have its values and foundations diluted.

This organisational restructure supercharged Chouinard’s philanthropy.

The family retains a voice, while giving away 100% of their estimated US$3 billion and all of Patagonia’s future profits that are not reinvested in the business (US$100 million in 2022).

Even the legendary industrialist and philanthropist Andrew Carnegie only gave away 90% of his fortune.

Lessons for future philanthropists

My previous research records the top five motivations for Australian philanthropists as:

  • making a difference
  • giving back to the community
  • personal satisfaction
  • aligning with moral or philosophical beliefs, and
  • setting an example.

Chouinard’s philanthropy touches on all of these.

US philanthropy researcher Paul Schervish uses the phrase “hyperagency” to capture the character and capacity that some individuals have to achieve the outcomes they deem important for society.

Schervish suggests such changemakers build their own world rather than staying within the constraints of traditional approaches.

Chouinard built his own version of capitalism. He continues to argue the Earth is the only resource base for business, and is therefore the prime business stakeholder. Without it, there are no customers, shareholders, employees or business.

Patagonia’s core mission became: “We’re in business to save our home planet”. The company established Earth as its major shareholder.

A message in Dirtbag Billionaire for givers small and large, individual and corporate, is that authentic giving is about values.

Such authentic giving across a lifetime using money, time, voice, networks, workplaces and ethical principles is rarely so well on display as in the life of Yvon Chouinard.

Wendy Scaife, Adjunct Associate Professor and Director, Australian Centre for Philanthropy and Nonprofit Studies, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

'We want you arrested because we said so': Former federal judge blasts Trump admin

Immigration and Customs Enforcement, or ICE, agents have continued to use aggressive and sometimes violent methods to make arrests in the agency’s mass deportation campaign, including breaking down doors in Minneapolis homes. Against that backdrop, a bombshell report from the Associated Press on Jan. 21, 2026, said that an internal ICE memo, acquired via a whistleblower, asserted that immigration officers could enter a home without a judge’s warrant. That policy, the report said, constituted “a sharp reversal of longstanding guidance meant to respect constitutional limits on government searches.”

Those limits have long been found in the Fourth Amendment to the U.S. Constitution. Politics editor Naomi Schalit interviewed Dickinson College President John E. Jones III, a former federal judge appointed by President George W. Bush and confirmed unanimously by the U.S. Senate in 2002, for a primer on the Fourth Amendment, and what the changes in the ICE memo mean.

Okay, I’m going to read the Fourth Amendment – and then you’re going to explain it to us, please! Here goes:

“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” Can you help us understand what that means?

Since the beginning of the republic, it has been uncontested that in order to invade someone’s home, you need to have a warrant that was considered, and signed off on, by a judicial officer. This mandate is right within the Fourth Amendment; it is a core protection.

In addition to that, through jurisprudence that has evolved since the adoption of the Fourth Amendment, it is settled law that it applies to everyone. That would include noncitizens as well.

What I see in this directive that ICE put out, apparently quite some time ago and somewhat secretly, is something that, to my mind, turns the Fourth Amendment on its head.

What does the Fourth Amendment aim to protect someone from?

In the context of the ICE search, it means that a person’s home, as they say, really is their castle. Historically, it was meant to remedy something that was true in England, where the colonists came from, which was that the king or those empowered by the king could invade people’s homes at will. The Fourth Amendment was meant to establish a sort of zone of privacy for people, so that their papers, their property, their persons would be safe from intrusion without cause.

So it’s essentially a protection against abuse of the government’s power.

That’s precisely what it is.

Has the accepted interpretation of the Fourth Amendment changed over the centuries?

It hasn’t. But Fourth Amendment law has evolved because the framers, for example, didn’t envision that there would be cellphones. They couldn’t understand or anticipate that there would be things like cellphones and electronic surveillance. All those modalities have come into the sphere of Fourth Amendment protection. The law has evolved in a way that actually has made Fourth Amendment protections greater and more wide-ranging, simply because of technology and other developments such as the use of automobiles and other means of transportation. So there are greater protected zones of privacy than just a person’s home.

ICE says it only needs an administrative warrant, not a judicial warrant, to enter a home and arrest someone. Can you briefly describe the difference and what it means in this situation?

It’s absolutely central to the question here. In this context, an administrative warrant is nothing more than the folks at ICE headquarters writing something up and directing their agents to go arrest somebody. That’s all. It’s a piece of paper that says ‘We want you arrested because we said so.’ At bottom that’s what an administrative warrant is, and of course it hasn’t been approved by a judge.

This authorized use of administrative warrants to circumvent the Fourth Amendment flies in the face of their limited use prior to the ICE directive.

A judicially approved warrant, on the other hand, has by definition been reviewed by a judge. In this case, it would be either a U.S. magistrate judge or U.S. district judge. That means that it would have to be supported by probable cause to enter someone’s residence to arrest them.

So the key distinction is that there’s a neutral arbiter. In this case, a federal judge who evaluates whether or not there’s sufficient cause to – as is stated clearly in the Fourth Amendment – be empowered to enter someone’s home. An administrative warrant has no such protection. It is not much more than a piece of paper generated in a self-serving way by ICE, free of review to substantiate what is stated in it.

Have there been other kinds of situations, historically, where the government has successfully proposed working around the Fourth Amendment?

There are a few, such as consent searches and exigent circumstances where someone is in danger or evidence is about to be destroyed. But generally it’s really the opposite and cases point to greater protections. For example, in the 1960s the Supreme Court had to confront warrantless wiretapping; it was very difficult for judges in that age who were not tech-savvy to apply the Fourth Amendment to this technology, and they struggled to find a remedy when there was no actual intrusion into a structure. In the end, the court found that intrusion was not necessary and that people’s expectation of privacy included their phone conversations. This of course has been extended to various other means of technology including GPS tracking and cellphone use generally.

What’s the direction this could go in at this point?

What I fear here – and I think ICE probably knows this – is that more often than not, a person who may not have legal standing to be in the country, notwithstanding the fact that there was a Fourth Amendment violation by ICE, may ultimately be out of luck. You could say that the arrest was illegal, and you go back to square one, but at the same time you’ve apprehended the person. So I’m struggling to figure out how you remedy this.

John E. Jones III, President, Dickinson College

This article is republished from The Conversation under a Creative Commons license. Read the original article.

More Americans are shocked by ICE's tactics

Over the past year, images of masked, heavily armed Immigration and Customs Enforcement agents arresting men, women and children – outside of courts, at schools and homes – have become common across the United States.

The video of an ICE agent shooting and killing Renee Nicole Good – a U.S. citizen – in Minnesota on Jan. 7, 2026, is one example of the brazen, sometimes deadly tactics that the agency employs.

Part of the reason why recent ICE tactics have shocked Americans is because most people haven’t seen them before. Historically, the country’s militarized immigration enforcement practices have played out closer to the U.S.-Mexico border. And for decades, agents with Customs and Border Protection have carried out most deportations near the border, not ICE.

From 2010 to 2020, nearly 80% of all deportations were initiated at or near the U.S.-Mexico border. During the COVID-19 pandemic, that number jumped to 98%, as both the Trump and Biden administrations utilized Title 42, a public health statute that allowed the government to rapidly deport recently arrived migrants.

But Trump during his second presidency has greatly shifted immigration enforcement north into the interior of the U.S. And ICE has played a central role.

As international migration and human rights scholars, we have examined recent federal immigration policy to determine why ICE has become the main agency detaining and deporting migrants as far away from the southern border as snowy Minnesota.

And we have also explored how the transition in immigration control from the southern border to more Americans’ front lawns could be shifting the public’s views on deportation tactics.

Migration as a threat

ICE is a relatively new agency. The 2002 Homeland Security Act, passed in the wake of the Sept. 11, 2001, terrorist attacks, created the Department of Homeland Security, known as DHS, by merging the U.S. Customs Service – previously under Treasury Department control – and the Immigration and Naturalization Service, formerly under the Justice Department.

DHS has 22 agencies, including three that focus on immigration: Customs and Border Protection, ICE and U.S. Citizenship and Immigration Services, which manages legal immigration and naturalization.

There is no inherent reason that immigration enforcement should fall under homeland security. But immigration was deemed a national security matter by the George W. Bush administration after 9/11.

In a 2002 presidential briefing justifying DHS’s creation, Bush said, “The changing nature of the threats facing America requires a new government structure to protect against invisible enemies that can strike with a wide variety of weapons.”

The U.S. government has viewed immigration from this national security perspective ever since.

The full impact of the deportations

The Trump administration in early 2025 set a goal of deporting 1 million people during its first year.

But with so few crossings, and thus deportations, at the U.S.-Mexico border, the administration instead has focused its efforts on the U.S. interior.

Trump’s 2025 tax and budget bill reflected this reprioritization, allocating US$170 billion over four years to immigration enforcement, compared to approximately $30 billion allocated in 2024.

Roughly $67 billion goes toward immigration enforcement at the border, including border wall construction. But the largest percentage of the bill’s immigration funding – at least $75 billion – goes toward arresting, detaining and deporting immigrants already living in the U.S.

The Trump administration did not initiate deportations from the U.S. interior. They have formed part of other administrations’ policies, both Democratic and Republican.

Interior border enforcement increased under President Bill Clinton in the 1990s with the introduction of the 1996 Illegal Immigration Reform and Immigrant Responsibility Act, which widened the criteria for deportations. And former President Barack Obama was referred to as the “Deporter in Chief” after his administration carried out more than 3 million deportations over his two terms, with roughly 69% of deportations occurring at the border.

But the astronomical growth of government funding toward migration control – at the border and in the U.S. – got the country to where it is today.

Between fiscal years 2003 and 2024, for example, Congress allocated approximately $24 toward immigration enforcement carried out by ICE and CBP for every $1 spent on the immigration court system that handles asylum claims.

The new money allocated under the 2025 budget bill, and the reprioritization of immigration enforcement from the border to the interior, partly explains why Americans are now seeing the long-term consequences of border militarization play out directly in their communities.

Americans may not know about the experiences of migrants who are quickly deported near the border, but it is harder to ignore recent images of people snatched up within their own neighborhoods.

Now the visible targets of border enforcement are increasingly immigrants who have built their lives in the U.S. – neighbors, friends, co-workers – as well as anyone who opposes ICE’s tactics, like Renee Good.

Changing political attitudes

In fact, the violence of Trump’s mass deportation campaign may be changing how Americans view immigration.

Just before the 2024 presidential election, a Gallup Poll found that 28% of Americans believed that immigration was the most important problem facing the nation – the highest percentage since Gallup began tracking the topic in 1981. This number dropped to 19% in December 2025, reflecting how more Americans see immigration as a routine issue that the government can manage rather than a crisis that needs to be dealt with.

This is supported in the academic literature. Migration scholars have shown that voters often support strict immigration policies in the voting booth but resist and protest when governments attempt to implement those policies in organized immigrant communities.

In 2002, for example, migration scholar Antje Ellermann documented that immigration officers reported it was more difficult to detain and deport people in Miami – because of resistance by a politicized immigrant community – compared to relatively conservative and less organized communities in San Diego.

But in both places, Republican and Democratic lawmakers were influential in intervening in individual cases to prevent deportations. This is because senior immigration officials, Ellermann noted, were influenced by media attention and pressure by members of Congress to grant relief.

Support for Trump’s handling of immigration is trending downward. Only 41% of Americans approved of Trump’s approach to immigration as of early January 2026, compared to 51% in March of last year, according to CNN polling.

This declining support for Trump’s tactics comes as Republican senators such as Thom Tillis of North Carolina, Lisa Murkowski of Alaska and Joni Ernst of Iowa have criticized ICE and its operations in Minnesota.

Kelsey Norman, Fellow for the Middle East, Baker Institute for Public Policy, Rice University and Nicholas R. Micinski, Assistant Professor of Human Rights and Cultural Relations, American University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Signs Trump’s policies are alienating 3 surprising parts of his MAGA base

As Donald Trump’s second term unfolds, the contradictions at the heart of his “America First” agenda are increasingly apparent. What began as a populist revolt against elite globalism appears to have morphed into policies that alienate the very rural and small-town constituencies that backed him in 2016, 2020 and 2024.

These rust-belt and rural counties were drawn to his promises of economic revival, border security and non-interventionism. Yet, emerging signs of fracture in this MAGA base suggest a potential backlash in the upcoming midterms.

The administration’s domestic policies, coupled with aggressive foreign postures, are accelerating disillusionment among Trump’s core supporters.

Domestically, Trump’s intensified immigration enforcement has backfired. Ramped-up ICE raids were sold as fulfilling pledges of mass deportations targeting “criminals”. But these operations have swept up undocumented workers essential to rural economies. Small family farms and businesses in states including California, Idaho and Pennsylvania are reliant on immigrant labour for harvesting crops, dairy operations, and meatpacking. They now face acute shortages.

Agricultural employment dropped by 155,000 workers between March and July 2025, reversing prior growth trends. Farmers in Ventura County, California, for example, denounced raids that targeted routes frequented by agricultural workers. Fields lie unharvested, signalling financial ruin for some operations. Family-run farms struggle to find replacements. Low wages and gruelling conditions simply fail to attract American-born labourers.

This labour crisis exacerbates a broader sense of betrayal. Rural voters supported Trump for his anti-elite rhetoric, expecting protection for their livelihoods. Instead, the administration’s actions have hollowed out local workforces without viable alternatives.

The H-2A visa programme, meant to provide temporary foreign workers, has been streamlined – but remains insufficient amid ongoing raids, which deter even legal migrants. These disruptions ripple through small-town economies, where agriculture underpins community stability. Democrats, sensing opportunity, are investing in rural outreach, emphasising economic populism to woo disillusioned voters who feel abandoned by Trump’s enforcement zeal.

Compounding these woes are the ongoing tariff disruptions. Trump touts his tariffs as tools to “make America great”, but in fact they have driven up costs for the same rural groups. Between January and September 2025, tariffs on imports from China, Canada, Mexico and others surged, collecting US$125 billion – a figure that may be even higher, according to experts.

But while the administration claims these taxes punish foreign adversaries, the burden falls squarely on American importers and consumers. Small businesses, which account for around 30% of imports, faced an average of US$151,000 in extra costs from April to September 2025, translating to $25,000 monthly hikes. Farmers, already squeezed by low grain prices, pay more for necessities, such as fertilisers (hit by 44% effective tariffs on Indian imports) and machinery parts.

Midwest producers of soybeans, corn, and pork – key US exports – suffer doubly from retaliatory tariffs abroad, which reduce demand and depress revenues. In Tennessee and Pennsylvania, builders report 2.5% rises in material costs, while food prices climb due to duties on beef, tomatoes and coffee.

Trump, meanwhile, is perceived as profiting personally. His properties and branding deals benefit from economic nationalism, even as family farms teeter on the verge of bankruptcy. This disparity fuels resentment. Polls show Trump’s approval slipping in swing counties, with economic anxiety eroding the loyalty that once overlooked his character flaws.

Foreign policy compounds domestic fractures

These domestic fractures are mirrored in foreign policy, where Trump’s interventionism starkly contradicts his campaign pledge of “America First” restraint. Having promised no new wars, he has instead pursued aggressive postures that many Republicans view as unnecessary. The most emblematic is his renewed bid to acquire Greenland, apparently by negotiation or by force, which swiftly followed the US raid on Venezuela in the first week of January and was accompanied by threats against other Latin American countries, including Cuba and Colombia.

The US president has justified demands for control over the Arctic island – citing threats from Russia and China – as a strategic necessity. But Nato allies such as Denmark – of which Greenland is a constituent part – have rebuked it as a potentially alliance-shattering move. Congressional Republicans, including Mitch McConnell and Thom Tillis, have broken ranks, warning that force would obliterate Nato and tarnish US influence.

Such dissent highlights broader paradoxes. Trump’s populist realism prioritises tough rhetoric for domestic consumption but yields aggressive, even reckless actions abroad. His administration is effectively dismantling post-1945 institutions while embracing 19th-century spheres-of-influence and outright colonialist thinking, including invoking an updated version of the 1823 Monroe doctrine.

Rural voters, weary of endless wars, supported his non-interventionist promises. Now they see echoes of past entanglements in Trump’s suggestion that the US could intervene in Iran. This cognitive dissonance is accelerating disillusionment with his presidency.

These self-inflicted but inherent contradictions are hastening a pivotal reckoning for Trumpism. In many counties that have thrice backed him – and especially in swing counties – economic hardship and policy betrayals erode the cultural ties binding rural America to the Republican party. Democrats, through programmes such as the Rural Urban Bridge Initiative, are betting on this “betrayal” narrative, spotlighting farmers’ plights to flip seats in November 2026.

Polls show Latinos and independents souring on Trump, with the US president’s base turnout potentially waning as the midterm elections approach in November. If Republicans suffer larger-than-expected losses in those elections, it could mark the decline of Trumpism’s grip by exposing its elite-serving underbelly beneath populist veneer.

Yet, without a compelling alternative vision, Democrats risk squandering this opening. For now, the fractures signal that Trump’s “America First” policies may ultimately leave its rural and rust belt champions behind. Whether Trumpism proves resilient or begins a long decline may well be decided not in Washington and Mar-a-Lago, but in the county seats and small towns that once formed its unbreakable base.

Inderjeet Parmar, Professor in International Politics, City St George's, University of London

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trump plan for Greenland now on much shakier ground

Looking at headlines around the world, it seemed like United States President Donald Trump’s annexation of Greenland was imminent. Buoyed by the success of his military operation to oust Venezuelan President Nicolás Maduro, Trump has ratcheted up his rhetoric and is now threatening tariffs on any nation that opposes him.

Adding insult to injury, he’s openly mocked European leaders by posting their private messages and sharing an AI-generated image of himself raising the American flag over Greenland.

But behind these headlines a different story is emerging.

Trump’s military threats have toxic polling numbers with the American public. His Republican allies have openly threatened to revolt. European countries are rapidly sending reinforcements, raising the costs of any invasion. And Europeans are starting to think about what economic retaliation might look like.

Far from being inevitable, Trump’s Greenland gambit appears to be on increasingly shaky ground.

No good options

Trump has three options to take control of Greenland: diplomacy, money and military force. The latest diplomatic talks collapsed as Greenland and Denmark’s foreign ministers left the White House in “fundamental disagreement” over the future of the territory.

Simply buying the territory is a non-starter. Greenlanders have already said the territory is not for sale, and U.S. Congress is unwilling to foot the bill. That’s left military force, the worst possible option.

It’s difficult to convey in words just how stunningly unpopular this option is with Americans. A recent Ipsos poll found that just four per cent of Americans believe using military force to take Greenland is a good idea.

To put that in perspective, here are some policies that are more popular:

If your official foreign policy is less popular than pardoning drug traffickers, then your foreign policy might be in trouble.

Sensing this unpopularity, Trump has already begun to walk back his military threats. Using his platform at Davos, he claimed “I don’t have to use force. I don’t want to use force. I won’t use force.”

It is too early to tell whether Trump’s claims are sincere. Not long after claiming to be the “president of peace,” he was invading Venezuela and bombing Iran.

The broader point is that if diplomacy has failed, money is a non-starter, and now military action is ostensibly being taken off the table, then Trump has no good options.

The danger of defections

Trump’s political coalition, in fact, is increasingly fragile and in danger of defections. The Republican House majority has shrunk to a razor-thin margin, and Republicans are already signalling a loud break with Trump over Greenland.

Nebraska congressman Don Bacon recently told USA Today: “There’s so many Republicans mad about this … If he went through with the threats, I think it would be the end of his presidency.”

The situation in the Senate looks even worse. Multiple Republican senators have pledged to oppose any annexation, with Thom Tillis and Lisa Murkowski visiting Copenhagen to reassure the Danish government. With enough defections, Congress could sharply curtail Trump’s plans and force a humiliating climb-down.

There’s yet another danger of defection. Senior military officers can resign, retire or object to the legality of orders to attack America’s NATO allies. Just last year, Adm. Alvin Holsey, the leader of U.S. Southern Command, abruptly retired less than a year into what is typically a multi-year posting.

Holsey’s departure came amid reports that he was questioning the legality of U.S. boat strikes in the Caribbean. Americans still have a high level of confidence in the military, so when senior officers suddenly leave, it can set off alarm bells.

Creating a tripwire

In recent days, Denmark and its European allies have rushed to send military reinforcements to Greenland. These forces, however, have no hope of defeating a committed American invasion. So why are they there?

In strategic studies, we call this a “tripwire force.” The reasoning is that any attack on this force will create strong pressures at home for governments to respond. Once Danes and Swedes — and other Europeans for that matter — see their soldiers being captured or killed, this will force their governments to escalate the conflict and retaliate against the United States.

The Trump administration would like to seize Greenland, face no European forces and suffer no consequences. But the entire point of a tripwire force is to deny easy wins and to signal that any attack would be met with costly escalation. It attaches a price to invading Greenland for an administration that rarely wants to pay for anything.

The B-word

Amid the Trump administration’s threats, people are forced to grapple with what comes next. European governments are already quietly debating retaliation, including diplomatic, military and economic responses.

Chief among these is the European Union’s Anti-Coercion Instrument, colloquially known as the “trade bazooka,” that could significantly curb America’s access to the EU market.

But for ordinary Europeans a different B-word will come to mind: boycott.

Some Europeans began boycotting U.S. goods last year amid Trump’s trade threats — but never to the same level as Canadians. That could quickly change if the U.S. engages in a stunning betrayal of its European allies. Fresh anger and outrage could see Europeans follow Canada’s lead.

Trump repeatedly threatened Canada with annexation, and it triggered a transformation of Canadian consumer habits. Canadians travel to the U.S. less, buy less American food and alcohol and look for more home-grown alternatives. Despite Canada’s small population, these boycotts have caused pain for U.S. industries.

Now imagine a similar scenario with the EU. In 2024, the U.S. exported almost US$665 billion in goods and services to the EU. It’s one of the largest export markets for the U.S., fuelling thousands of jobs and businesses.

The real danger for American companies, however, is when consumer pressure moves upwards to governments and corporations. European governments and corporations that buy from American giants like Microsoft, Google and Boeing will start to see public pressure to buy European – or at least not American. America’s most valuable corporate brands risk being contaminated by the stigma of the U.S. government.

Will he, won’t he?

None of this will stop the Trump administration from trying. Trump’s own words – that there is “no going back” on his plans for Greenland – ensure he’s backed himself into a corner.

The more likely scenario seems to be starting to play out – Trump will try and then fail. His threats to annex Greenland will likely be remembered next to “90 trade deals in 90 days” and “repeal and replace” in the pantheon of failed Trump policies.

The tragedy here is not simply a Trump administration with desires that consistently exceed its grasp. It’s that the stain of betraying America’s closest allies will linger long after this administration is gone.

Eric Van Rythoven, Instructor in Political Science, Carleton University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

One venue, two speeches: How Mark Carney left Donald Trump in the dust in Davos

The meeting and venue were the same, but the style and tone of the two most anticipated keynote speeches at the World Economic Forum in the Swiss town of Davos could not have been more different. On Tuesday, January 20, Canadian prime minister Mark Carney addressed the assembled political and business leaders as one of them: a national leader with deep expertise in finance.

He spoke about a “rupture” in the world order and the duty of nations to come together through appropriate coalitions for the benefit of all. It was a paean to multilateralism, but one that recognised that the US would no longer provide the glue to hold alliances together. Carney never mentioned the US by name in his speech, instead talking of “great powers” and “hegemons”.

Carney’s quiet, measured and evocative case-making demonstrated his ability to be the leader France’s Emmanuel Macron would like to be and the UK’s Keir Starmer is too cautious to be. He was clear, unequivocal and unafraid of the bully below his southern border. In standing up to the US president, Donald Trump, he appeared every inch the statesperson.

Then, on January 21, Trump took the stage. There was none of Carney’s self-awareness, nor did he read the room by recognising the strengths, talents and economic power of the audience. Trump started with humour, noting he was talking to “friends and a few enemies”.

But he quickly shifted to a riff on the greatest hits of the first year of Trump 2.0 with the usual weaving away from his script down the rabbit holes of his perceived need for vengeance. Joe Biden still takes up far too much of Trump’s head space, but the next hour could be summed up as: “Trump great: everyone else bad.”

The president is the most amazing hype man for his own greatness, but it’s a zero-sum game. For him to win, others must lose, whether that’s the UK, Macron or the unnamed female prime minister of Switzerland whom he mocked for the poverty of her tariff negotiation skills. It’s worth noting Switzerland has no prime minister and its current president is a man.

While Carney was at pains to connect with his audience of allies, Trump exists happily in his own world where support – and sovereign territory – can be bought, and fealty trumps all. As ever, Trump played fast and loose with facts, wrapping real successes, aspirations and his unique view of the truth into a paean to himself.

He actually returned to his script to make the case for taking Greenland. The case is built on a notional need for “national and international security”, underscored by pointing out the territory is “in our hemisphere”. As so many commentators have said, collective security will do the job Trump insists that only the US can – and won’t require Denmark to cede territory. But Trump is sounding ever-less the rational actor.

Contrasting visions

The coming year is one of inflection for Trump’s presidency. His Republican party may well lose control of the House and possibly the Senate in the November midterms, which would severely curtail his ability to impose his will unfettered.

Trump is focused on his legacy and demands he’s up there with former US presidents Thomas Jefferson, James Monroe, James Polk and William McKinley, expanding the American empire and its physical footprint. This may be a step too far, even for a president with such vast economic and military power.

Carney’s speech played well both at home and around the world. His line, “If we’re not at the table, we’re on the menu,” clearly resonated with his fellow western leaders. His vision for how “the power of legitimacy, integrity and rules will remain strong if we choose to wield them together”, also offered a positive vision in a dark time.

Trump told the audience that he would not use “excessive strength of force” to acquire Greenland. But, ever the real estate developer, he demanded “right, title and ownership” with an ominous threat: “You can say no – we will remember.”

As Trump laid out his grand vision of protecting and cherishing the rich and aligning nations to do America’s bidding, it was in stark contrast to Carney. The hyperbole and self-aggrandising, the insults and threats, and the singular vision of seeing the world only through the personal impact it has on him mark the US president out as remarkable, even exceptional.

But is this the exceptionalism the US wants? Is America about more than the strongman politics of economic and military coercion?

The immediate reaction in the US was relief, jumping on the line that Trump won’t take Greenland by force. It will be telling to look at the commentary as the country reflects on the president’s aim of lifting America up, seemingly by dragging the rest of the world down.

One leader donned the cloak of statesmanship at Davos this week. It wasn’t Donald Trump.

Mark Shanahan, Associate Professor of Political Engagement, University of Surrey

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Trump's Greenland rationale doesn't hold up – but his tactics reveal another plan

In 2019, during his first term, U.S. President Donald Trump expressed a desire to buy Greenland, which has been a part of Denmark for some 300 years. Danes and Greenlanders quickly rebuffed the offer at the time.

During Trump’s second term, those offers have turned to threats.

Trump said on his social media platform Truth Social in late December 2024 that, for purposes of national security, U.S. control over Greenland was a necessity. The president has continued to insist on the national security rationale into January 2026. And he has refused to rule out the use of military force to control Greenland.

From my perspective as an international relations scholar focused on Europe, Trump’s national security rationale doesn’t make sense. Greenland, like the U.S., is a member of NATO, which provides a collective defense pact, meaning member nations will respond to an attack on any alliance member. And because of a 1951 defense agreement between the U.S. and Denmark, the U.S. can already build military installations in Greenland to protect the region.

Trump’s 2025 National Security Strategy, which stresses control of the Western Hemisphere and keeping China out of the region, provides insight into Trump’s thinking.

US interests in Greenland

The United States has tried to acquire Greenland several times.

In 1867, Secretary of State William Seward commissioned a survey of Greenland. Impressed with the abundance of natural resources on the island, he pushed to acquire Greenland and Iceland for US$5.5 million – roughly $125 million today.

But Congress was still concerned about the purchase of Alaska that year, which Seward had engineered. It had seen Alaska as too cold and too distant from the rest of the U.S. to justify spending $7.2 million – roughly $164 million today – although Congress ultimately agreed to do it. There was not enough national support for another frozen land.

In 1910, the U.S. ambassador to Denmark proposed a complex trade involving Germany, Denmark and the United States. Denmark would give the U.S. Greenland, and the U.S. would give Denmark islands in the Philippines. Denmark would then give those islands to Germany, and Germany would return Schleswig-Holstein – Germany’s northernmost state – to Denmark.

But the U.S. quickly dismissed the proposed trade as too audacious.

During World War II, Nazi Germany occupied Denmark, and the U.S. assumed the role of protector of both Greenland and Iceland, both of which belonged to Denmark at the time. The U.S. built airstrips, weather stations and radar and communications stations – five on Greenland’s east coast and nine on the west coast.

The U.S. used Greenland and Iceland as bases for bombers that attacked Germany and German-occupied areas. Greenland had a high value for military strategists because of its location in the North Atlantic – to counter Nazi threats to Allied shipping lanes and protect transatlantic routes, and because it was a midpoint for refueling U.S. aircraft. Greenland’s importance also rested on its deposits of cryolite, useful for making aluminum.

In 1946, the Truman administration offered to buy Greenland for $100 million, as U.S. military leaders thought it would play a critical role in the Cold War.

The secret U.S. project Operation Blue Jay at the beginning of the Cold War resulted in the construction of Thule Air Base in northwestern Greenland, which allowed U.S. bombers to be closer to the Soviet Union. Renamed Pituffik Space Base, today it provides a 24/7 missile warning and space surveillance facility that is critical to NATO and U.S. security strategy.

At the end of World War II, Denmark recognized Greenland as one of its territories. In 1953, Greenland gained constitutional rights and became a country within the Kingdom of Denmark. Greenland was granted home rule in 1979, and by 2009 it became a self-governing country, still within the Kingdom of Denmark, which includes Denmark, Greenland and the Faroe Islands.

Denmark recognizes the government of Greenland as an equal partner and recently gave it a more significant role as the first voice for Denmark in the Arctic Council, which promotes cooperation in the Arctic.

What the US may want

The Trump administration’s 2025 National Security Strategy identifies three threats in the Western Hemisphere: migration, drugs and crime, and China’s increasing influence.

Two of those threats are irrelevant when considering Greenland. Greenlandic people are not migrating to the U.S., and they are not drug traffickers. However, Greenland is rich in rare earth minerals, including neodymium, dysprosium, graphite, copper and lithium.

Additionally, China seeks to establish mining interests in Greenland and the Arctic as part of its Polar Silk Road initiative. China had offered to build infrastructure for Greenland, including improving the airport, until Denmark stepped in and offered airport funding. And China has worked with Australian companies to secure mining opportunities on the island.

Those rare earth minerals appeal to the European Union, too. The EU lists some 30 raw materials that are essential for their economies. Twenty-five are in Greenland.

The Trump administration has made it clear that controlling these minerals is a national security issue, and the president wants to keep them away from China.

Figures vary, but it is estimated that over 60% of rare earth elements or minerals are currently mined in China. China also refines some 90% of rare earths. This gives China tremendous leverage in trade talks. And it results in a dangerous vulnerability for the U.S. and other nation states seeking to modernize their economies. With few suppliers of these rare earth elements, the political and economic costs of securing them are high.

Greenland has only two operating mines. One is the Tanbreez project in southern Greenland. It produces 17 metals, including terbium and neodymium, which are used in the high-strength magnets found in many green technologies and in aircraft manufacturing, including for the F-35 fighter plane.

Consider for a moment the possibility that Trump is not actually interested in owning Greenland.

Instead, he is using this threatening position to secure promises from the Greenlandic government to make economic deals with the U.S. and not China. Thus, Trump’s threats could be less about national security and much more about eliminating competition from China and securing wealth for U.S. interests.

This form of coercive diplomacy threatens the political and economic development of not only Greenland but Europe. In recent interviews, Trump has made it clear that he does not respect international law or the sovereignty of countries. His position, I believe, undermines the international order and removes the U.S. as a responsible leader of the framework established after World War II.

Steven Lamy, Professor Emeritus of Political Science and International Relations and Spatial Sciences, USC Dornsife College of Letters, Arts and Sciences

This article is republished from The Conversation under a Creative Commons license. Read the original article.


A pattern that makes the 2026 congressional outlook clear

Now that the 2026 midterm elections are less than a year away, public interest in where things stand is on the rise. Of course, in a democracy no one knows the outcome of an election before it takes place, despite what the pollsters may predict.

Nevertheless, it is common for commentators and citizens to revisit old elections to learn what might be coming in the ones that lie ahead.

The historical lessons from modern midterm congressional elections are not favorable for Republicans today.

Most of the students I taught in American government classes for over 40 years knew that the party in control of the White House was likely to encounter setbacks in midterms. They usually did not know just how settled and solid that pattern was.

Since 1946, there have been 20 midterm elections. In 18 of them, the president’s party lost seats in the House of Representatives. That’s 90% of the midterm elections in the past 80 years.

Measured against that pattern, the odds that the Republicans will hold their slim House majority in 2026 are small. Another factor makes them smaller. When the sitting president is “underwater” – below 50% – in job approval polls, the likelihood of a bad midterm election result becomes a certainty. All the presidents since Harry S. Truman whose job approval was below 50% in the month before a midterm election lost seats in the House. All of them.

Even popular presidents – Dwight D. Eisenhower, in both of his terms; John F. Kennedy; Richard Nixon; Gerald Ford; Ronald Reagan in 1986; and George H. W. Bush – lost seats in midterm elections.

The list of unpopular presidents who lost House seats is even longer – Truman in 1946 and 1950, Lyndon B. Johnson in 1966, Jimmy Carter in 1978, Reagan in 1982, Bill Clinton in 1994, George W. Bush in 2006, Barack Obama in both 2010 and 2014, Donald Trump in 2018 and Joe Biden in 2022.

Exceptions are rare

There are only two cases in the past 80 years where the party of a sitting president won midterm seats in the House. Both involved special circumstances.

In 1998, Clinton was in the sixth year of his presidency and had good numbers for economic growth, declining interest rates and low unemployment. His average approval rating, according to Gallup, in his second term was 60.6%, the highest average achieved by any second-term president from Truman to Biden.

Moreover, the 1998 midterm elections took place in the midst of Clinton’s impeachment, when most Americans were simultaneously critical of the president’s personal behavior and convinced that that behavior did not merit removal from office. Good economic metrics and widespread concern that Republican impeachers were going too far led to modest gains for the Democrats in the 1998 midterm elections. The Democrats picked up five House seats.

The other exception to the rule of thumb that presidents suffer midterm losses was George W. Bush in 2002. Bush, narrowly elected in 2000, had a dramatic rise in popularity after the Sept. 11 attacks on the World Trade Center and the Pentagon. The nation rallied around the flag and the president, and Republicans won eight House seats in the 2002 midterm elections.

Those were the rare cases when a popular sitting president got positive House results in a midterm election. And the positive results were small.

Midterms matter

In the 20 midterm elections between 1946 and 2022, small changes in the House – a shift of less than 10 seats – occurred six times. Modest changes – between 11 and 39 seats – took place seven times. Big changes, so-called “wave elections” involving more than 40 seats, have happened seven times.

In every midterm election since 1946, at least five seats flipped from one party to the other. If the net result of the midterm elections in 2026 moved five seats from Republicans to Democrats, that would be enough to make Democrats the majority in the House.

In an era of close elections and narrow margins on Capitol Hill, midterms make a difference. The past five presidents – Clinton, Bush, Obama, Trump and Biden – entered office with their party in control of both houses of Congress. All five lost their party majority in the House or the Senate in their first two years in office.

Will that happen again in 2026?

The obvious prediction would be yes. But nothing in politics is set in stone. Between now and November 2026, redistricting will move the boundaries of a yet-to-be-determined number of congressional districts. That could make it harder to predict the likely results in 2026.

Unexpected events, or good performance in office, could move Trump’s job approval numbers above 50%. Republicans would still be likely to lose House seats in the 2026 midterms, but a popular president would raise the chances that they could hold their narrow majority.

And there are other possibilities. Perhaps 2026 will involve issues like those in recent presidential elections.

Close results could be followed by raucous recounts and court controversies of the kind that made Florida the focal point in the 2000 presidential election. Prominent public challenges to voting tallies and procedures, like those that followed Trump’s unsubstantiated claims of victory in 2020, would make matters worse.

The forthcoming midterms may not be like anything seen in recent congressional election cycles.

Democracy is never easy, and elections matter more than ever. Examining long-established patterns in midterm party performance makes citizens clear-eyed about what is likely to happen in the 2026 congressional elections. Thinking ahead about unusual challenges that might arise in close and consequential contests makes everyone better prepared for the hard work of maintaining a healthy democratic republic.

Robert A. Strong, Senior Fellow, Miller Center, University of Virginia

This article is republished from The Conversation under a Creative Commons license. Read the original article.

One word defines Trump's second term

As Donald Trump celebrates the anniversary of his second inauguration as president of the United States and begins his sixth year in office, his greatest asset is power. He covets absolute power.

The greatest threat to how Trump completes his term is how he wields his power.

Indeed, in the most foolish foreign policy act of Trump’s presidency, he has threatened punitive tariffs on Denmark and seven other NATO allies in Europe to force the sale of Greenland to the United States. They are outraged. This is a ridiculous ploy that will not deliver Greenland to Trump.

Trump’s escalation with Denmark has already strengthened Putin’s iron resolve to seize as much of Ukraine as he can. Prospects for ending the war in Ukraine are now near zero.

On top of Trump’s pending tariffs on Europe, if Trump seizes Greenland, the consequences will shake the world – including Australia. NATO will be terminated. Australia will face an existential question of whether, under those circumstances, it must terminate its alliance with the US.

A raft of polls at the one-year mark of Trump’s second term shows voters across the country expressing growing disquiet about his management of the economy and the affordability of housing and groceries, about the raids by ICE agents seizing and deporting migrants, as seen last week in Minneapolis, and about his foreign adventurism in the Americas and with Iran.

Trump is exercising this power because he can. This will jolt Republicans in Congress to break with Trump on this issue – the first such rift between Trump and his party since his re-election.

Welcome to Trump’s year six.

Trumpism in his second term

Following his election victory in 2024, Trump has been faithful to three of four pillars of Trumpism that made his base a movement that has changed America:

  • nativism (favouring US-born citizens over immigrants)
  • protectionism and tariffs
  • America First nationalism (“Make America Great Again”).

To those ends, Trump is acting aggressively, with immigration agents arresting and deporting tens of thousands, and threats to deploy US troops in American cities to enforce these policies. Trump has imposed punitive tariffs against every trading partner – including Australia, which has a significant trade deficit with the United States. Trump demands foreign companies invest in the United States and build new factories.

But on the fourth Trumpism pillar – America-First isolationism as a driver of America’s foreign policy – Trump has redefined his foreign policy settings with grander ambitions.

Trump has rejected the history of the US waging wars to project American values: protecting Asia from communism in Korea and Vietnam; turning back brutal aggression in Kuwait; punishing the export of radical Islamic terrorism in Afghanistan and Iraq.

Trump has applied these lessons to Iran – so far. It is one thing to take out Iran’s nuclear capability. It is another to pursue regime change – a bridge too far, back toward the “forever wars” Trump despises.

Trump has buried America’s posture of globalism. He has withdrawn the US from virtually all the architecture, save the United Nations itself, erected after the second world war to ensure global security, stability and prosperity. He has ordered the US out of global organisations, and has cut billions in foreign aid.

The US attack on Venezuela was about much larger goals than arresting its leader. It was about power – controlling power over critical resources in the Americas, from Venezuela to Greenland and everything in between, from Mexico to Cuba to Canada.

Politics at home

Trump is paying a high price at home for his activism in wielding power abroad. Every day Trump spends projecting power outside the United States means he is not paying attention to the American people.

A recent poll shows 56% of US adults believe Trump has gone too far on Venezuela, and 57% do not want the US to strike Iran. Even before Trump’s tariff announcement on Greenland, only 17% approved of his desire to acquire the island, and 71% rejected using military force to do it.

Trump’s overall polls are bad. His approval rating is 40% – nearly 10 points down since his inauguration – and disapproval is at 60%. AP-NORC also finds that “Trump hasn’t convinced the Americans that the economy is in good shape.”

CNN polling reports that 55% of those surveyed believe Trump’s policies “have hurt the economy” and that Trump is not doing enough to lower prices. Grocery prices are up sharply. The latest Wall Street Journal poll shows Trump is underwater by double digits on handling inflation, and that he is not focusing enough on the economy.

On immigration, the unrest in Minneapolis and other cities from the harsh methods employed by ICE agents is also taking a toll, with Trump’s approval on that issue lagging below 40%.

But even with all these red flags and warnings from the field, Trump is undeterred. He believes that as president, he can do anything he wants to do. Guardrails that have for decades protected America’s democracy have been cast aside.

Trump has not been blocked – yet – by an ultra-conservative Supreme Court or the pliant Republican Congress for the tariffs he is imposing, the government agencies he has shut down, the monies appropriated by Congress he has terminated, the hundreds of thousands of government employees he has fired, the military strikes he has ordered without advising, much less getting approval from, Congress.

Trump is seeking more control over the economy by seeking to prosecute the chair of the Federal Reserve, the independent agency that sets monetary policy, and to pack its board with loyalists who will bow to Trump’s demands that interest rates be lowered.

Since his inauguration, Trump has instructed the Justice Department to prosecute those who attempted to bring him to justice in courtrooms and impeachment proceedings in Congress.

Trump’s musings on power

As Trump consolidates his power, his musings become imperatives. After months of expressing a desire to own Greenland, Trump is now acting aggressively to conquer it.

At home, Trump is now also musing – twice so far this month – over whether the US midterm elections will be cancelled. Trump knows the likelihood of the Democrats taking back control of the House of Representatives is high. That is precisely what he suffered in the 2018 congressional elections in his first term.

Trump told Reuters last week, “We shouldn’t even have an election,” because of all his great successes.

In January, Trump told Republicans in the House, “I won’t say cancel the election, they should cancel the election, because the fake news would say, ‘He wants the elections cancelled. He’s a dictator.’ They always call me a dictator.” He told them that if the Democrats take the House back they will “find a reason to impeach” him.

Any steps taken – such as declaring martial law to suspend the midterm elections – will be catastrophic. And that is an understatement.

Based on Trump’s restless mind and command of what he believes is absolute power, at stake this year are the future of democracy at home and alliances abroad.

Bruce Wolpe, Non-resident Senior Fellow, United States Study Centre, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How 1984 predicted the global power shifts happening now

There’s nothing new about calling George Orwell’s most influential novel prescient. But the focus has usually been on his portrayal of the oppressive aspects of life in Oceania, the superstate in which Nineteen Eighty-Four is set.

Today, however, a different feature – which, as recently as 2019, some critics dismissed as “obsolete” – is getting more attention: its vision of a world divided into three spheres, controlled by autocratic governments that constantly form and then break alliances.

In 2022, Vladimir Putin initiated Russia’s full-on invasion of Ukraine. This year began with the US mounting a raid on Venezuela and snatching its president, while Donald Trump speculated about US actions against various other countries in Latin America and Greenland. Meanwhile, Xi Jinping regularly repeats China’s intention to “reunify” with Taiwan – by force if necessary.

“Orwell-as-prophet” commentators began showing more interest in the superstate idea early in the decade, often leading with references to Putin’s imperial ambitions. This trend became more pronounced when Trump’s second term began.

Last year, American historian Alfred McCoy led with a tripolar reference in his Foreign Policy essay: “Is 2025 the New 1984?” A Bloomberg report on the Trump-Putin summit in Alaska last August was headlined: “It Looks Like a Trump-Putin-Xi World, But It’s Really Orwell’s”. The article described Nineteen Eighty-Four’s fictional model of global affairs as “prophetic”.

Many observers now see Big Brother-like leaders wielding power in Washington, as well as in Moscow and Beijing. In her first essay of 2026, Anne Applebaum wrote in The Atlantic that: “Orwell’s world is fiction, but some want it to become reality.”

The American journalist and historian noted a dangerous desire of some for “an Asia dominated by China, a Europe dominated by Russia, and a Western Hemisphere dominated by the United States”. Social media is awash with comments and maps in the same vein.

Orwell’s influences

Analysts have claimed that elements of Orwell’s portrayal of politics inside Oceania paralleled various parts of dystopian novels written before Nineteen Eighty-Four. They cite, in particular, the potential influence of Jack London’s The Iron Heel (1908) and Aldous Huxley’s Brave New World (1932) – works Orwell discussed in a 1940 essay.

Then there’s Yevgeny Zamyatin’s novel We (1921), which Orwell wrote about in 1946, and Arthur Koestler’s Darkness at Noon (1940), which he wrote about in 1941. Both inspired him with their criticism of the real Soviet Union.

Could these or other utopian and dystopian texts – such as Ayn Rand’s Anthem (1938), Sinclair Lewis’s It Can’t Happen Here (1935), and Noël Coward’s play Peace in Our Time (1946) – have given him ideas about future geopolitics?

In fact, most of the works mentioned downplay or ignore international issues. Koestler focuses on one unnamed totalitarian country, Zamyatin and Huxley on a single world-state, London and Lewis on an America transformed by a domestic tyrannical movement, and Coward on a Britain conquered by Hitler.

Two other novels provide partial precedents. The first is The War in the Air (1908) by H.G. Wells, an author Orwell read throughout his life. It has a tripolar dimension, depicting a war between Germany, the US and Britain, and a combined Chinese and Japanese force. The second is Swastika Night by Katharine Burdekin (writing as Murray Constantine).

Orwell never referred to Swastika Night in any publication, and his most prominent biographer, D.J. Taylor, has claimed there is no definitive evidence that he read it. However, as it was a Left Book Club selection and he was a Left Book Club author, Orwell would at least have known about it. The novel describes a world divided into two rival camps, not three, but portrays allies becoming rivals. The competing superstates are Nazi Germany and imperial Japan, who were on the same side when the book was written.

In his own words

The most satisfying place to look for inspiration for Nineteen Eighty-Four’s geopolitical vision, though, is in Orwell’s own experiences and non-fiction reading. Before the 1940s, Orwell spent a lot of time learning and writing critically about three oppressive systems: capitalism, fascism and Soviet communism.

In terms of capitalism, working as a colonial police officer in Burma in the 1920s left him disgusted with what he called the “dirty work of empire”. Living in England later led him to write works on class injustices such as The Road to Wigan Pier (1937).

In terms of fascism, he wrote scathingly about Hitler and Franco. Orwell was also appalled by accounts of repression under Stalin. His time fighting in Spain reinforced his dark view of Moscow and he saw erstwhile allies become arch-enemies as the anti-Franco coalition broke down, and the Soviets began treating groups that had been part of it as villains.

Second world war news stories had an impact as well. In 1939 and 1941 respectively, newspapers were full of reports of Moscow and Berlin signing a non-aggression pact, and then of Moscow switching sides to join the Allies.

And in a 1945 essay, Orwell mocked news of many people on the left embracing the fervently anti-Communist Chinese Nationalist Party leader, Chiang Kai-shek, once he was with the Allies – seemingly having forgotten their earlier disdain for Chiang’s brutal effort to exterminate the Chinese Communist Party.

But perhaps the most notable 1940s news story of all relating to Nineteen Eighty-Four’s geopolitics has been flagged by Taylor as one that broke in 1943. He notes that Orwell sometimes claimed that a key inspiration for his final novel was the reports of Roosevelt, Stalin and Churchill talking at the 1943 Tehran conference about carving up the post-war world into three spheres.

Nineteen Eighty-Four has had extraordinary longevity as a go-to text for political commentary. There are many explanations for its staying power, but right now a key one may be its relevance to thinking about repression of dissent and Newspeak-style propaganda in many individual countries – and about the unsettling geopolitical tensions in the world at large.

Emrah Atasoy, Associate Fellow of English and Comparative Literary Studies & Honorary Research Fellow of IAS, the University of Warwick and Upcoming IASH Postdoctoral Research Fellow, the University of Edinburgh, University of Warwick and Jeffrey Wasserstrom, Professor of Chinese and World History, University of California, Irvine

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Two acclaimed American films reveal the failures of leftwing revolutionary politics

Donald Trump’s victory in November 2024 led to considerable soul-searching among those on the left of US politics. Having failed to defeat a convicted criminal they beat once before, the Democrats spent most of 2025 licking their wounds as Trump launched what they saw as a full-frontal assault on US democracy.

This new year has begun with fresh outrages at home and abroad, with the administration acting with increasingly horrifying impunity.

Coupled with the continued rise of rightwing populism and authoritarianism the world over, Trump 2.0 has felt like an existential crisis for the left.

The country has been here before. Leftwing protest movements in the 1960s in the US contributed to great legislative change – particularly in the area of civil rights – but they were often caricatured as unpatriotic, particularly in relation to the war in Vietnam. The feeling that the country was coming apart at the hands of young, violent radicals led the conservative “silent majority” to deliver Richard Nixon’s 1968 election victory.

Since then, mainstream leftwing politics in the US has recoiled from the idealism of the 1960s and instead offered change mostly in small increments. But this has arguably not proven a particularly successful strategy either over the past half century or more.

In the context of yet another defeat and the latest round of introspection, it seems appropriate, then, that two films concerned with the failures of leftwing revolutionary politics of the 1960s and 1970s should emerge almost simultaneously with Trump’s resurgence.

Exploring leftwing activism

Though very different in style and tone, Paul Thomas Anderson’s One Battle After Another (2025) and Kelly Reichardt’s The Mastermind (2025) both critique what they see as the strategic inadequacy and self-indulgence of leftwing activism, as well as explore its personal cost.

One Battle After Another sees former revolutionary Pat Calhoun, aka “Bob” (Leonardo DiCaprio), trying to rescue his daughter Willa (Chase Infiniti) from the clutches of a psychopathic white supremacist colonel, Lockjaw (Sean Penn). Though Bob had in a previous life resisted the federal government’s cruel, racist immigration policies through a series of daring raids on detention centres, fatherhood and excessive cannabis use have dulled his revolutionary edge.

Instead, Bob is now a somewhat incompetent buffoon. The film mines, for comedic purposes, his shambolic attempts to communicate with the “French 75” – the revolutionary army of which he was once part, modelled on real-life revolutionary groups of the 1960s and 1970s like the Weathermen.

Stumbling around in his bathrobe, he has forgotten all the codes and conventions necessary to navigate this world. From passwords to pronouns, Bob is out of step with the times.

However, the film finds room to poke fun at the sanctimony of the left too. As Bob grows increasingly aggressive when unable to secure information regarding a crucial rendezvous point, the thin-skinned radical to whom he is speaking on the phone informs him that the language Bob is using is having a detrimental impact on his wellbeing. If Bob lacks the competence to support the revolution, the people in charge of it are too fragile to achieve one either.

By contrast, The Mastermind follows J.B. Mooney (Josh O’Connor) in his attempts to evade the clutches of the authorities after he orchestrates the theft of four artworks from a suburban museum. Husband, father, and the son of a judge, Mooney is privileged, directionless, disorganised, selfish and, it seems, oblivious to the impact of the war in Vietnam as conflict rages all around him.

His disorganisation is obvious from the moment he realises his children’s school is closed for teacher training on the day of the heist. His privilege is clear when all he has to do is mention his father’s name when first questioned by police to get them off his back.

Even his attempt to convince his wife, Terri (Alana Haim), that he did this for her and their kids is inadequate, as he stumbles into admitting he also did it for himself.

While on the run from the authorities, Mooney appears ignorant of what is really going on around him, from the young Black men who discuss their imminent deployment to Vietnam, to the news broadcast of the realities of the war. Without spoiling anything, Mooney is, in the end, unable to avoid the effects of Vietnam on US society altogether.

Telling moments in both films also suggest the wavering commitment to revolution among its former acolytes. In The Mastermind, Mooney hides out at the home of Fred (John Magaro) and Maude (Gaby Hoffmann), a couple with whom he attended art college.

Despite her activist past, Maude refuses to let him stay for longer than one night for fear of unwanted attention from the authorities. In One Battle After Another, Bob’s willingness to take risks with his safety and freedom declines when he becomes a parent, and he is – rather problematically – quick to judge Willa’s mother, Perfidia (Teyana Taylor), for continuing to do so.

Political cinema of the 1970s

Both films can’t help but recall the similarly political work produced in US cinema in the late 1960s and early 1970s, such as Five Easy Pieces (1970), Two-Lane Blacktop (1971) and Chinatown (1974). In the midst of the Nixon-era backlash to the radicalism of the 1960s, these films have a tone of defeatist resignation, featuring directionless protagonists and unhappy endings.

The Mastermind’s ending is comparable to these earlier examples: its final scenes see the police at a Vietnam protest, patting each other on the back, having rounded up another bunch of protesters and sent them to the can.

Though One Battle After Another is considerably more effervescent in its style, it too sees leftwing revolutionary politics as something of a dead end. Smaller scale victories are possible, with Sergio (Benicio Del Toro) continuing to fight the good fight for undocumented immigrants, and Willa running off to join a Black Lives Matter protest at the film’s end.

But watching both films from the perspective of a new year in which the Trump administration threatens violent upheaval at home and abroad, I think of Captain America’s (Peter Fonda) mournful lament towards the end of counterculture classic Easy Rider (1969): “We blew it.”
