This week people from around the state and the entire country will gather in Ward Valley, California to remember and celebrate the 25th anniversary of an historic people’s victory – the shutdown of a proposed nuclear waste dump project that would have endangered the water supply of Arizona, Southern California and Northern Mexico.
It was a victory won through more than a decade of persistent non-violent resistance by a coalition of Native American tribes and numerous other activist organizations that joined forces in an inspiring example of the impact united people power can have. That power was again shown in the successful campaign leading to the shutdown of the San Onofre Nuclear Generating Station (SONGS) in 2013.
In his book Doing Democracy: The MAP Model for Organizing Social Movements, the late social movement theorist Bill Moyer [not the TV guy] stressed how important it is for activists to remember and celebrate their victories.
In 1989, a small group of Californians – including Phil Klasky, Ward Young, Rachel Johnson, Pam Dake and EON Co-Director Mary Beth Brangan – joined the Fort Mojave Indian Tribe and a few residents of Needles, California to help begin a movement to stop a planned nuclear waste dump at Ward Valley in the Mojave Desert near the Colorado River. Diane D’Arrigo of NIRS gave expert organizing assistance, and Dan Hirsch of Committee to Bridge the Gap and Roger Herried of Abalone Alliance provided technical and procedural help.
Considered by many a hopeless cause at the outset, the movement grew over time to include scientists, environmentalists and the region’s many Native American tribes. After a ten-year battle, a peaceful occupation at the proposed site and the powerful involvement of Native American tribal organizers, a judge’s ruling in 1999 brought an end to the planned dump.
This film – produced thirty years ago and re-mastered from an archival copy – tells the story of that successful movement’s beginning. It portrays many of the now-fallen peaceful warriors who played important roles in the campaign and whose memories will be honored at the Ward Valley gathering.
The film’s analysis of radioactive waste issues is as relevant today as when it was first released.
“Autonomous nuclear weapons introduce new risks of error and opportunities for bad actors to manipulate systems. Current AI is not only brittle; it’s easy to fool. A single pixel change is enough to convince an AI a stealth bomber is a dog.” – Zachary Kallenborn, Bulletin of the Atomic Scientists
By James Heddle – EON
Welcome to CyberWonderland
ChatGPT is being hyped as a cutting-edge new ‘helper bot’ by the Elon Musk-backed tech firm OpenAI. Sott.net reports that “Microsoft on Monday announced a new multiyear, multibillion-dollar investment with ChatGPT-maker OpenAI.”
According to the New York Post, “This superhuman tech can do a variety of complicated tasks on the fly, from composing complex dissertations on Thomas Locke to drafting interior design schemes and even allowing people to converse with their younger selves.”
Wow! Do you suppose this wondrous technology could maybe get weaponized with malicious intent?
You bet it can…And it is.
In 2021, Henry A. Kissinger, Eric Schmidt and Daniel Huttenlocher co-authored a book titled The Age of AI: And Our Human Future. As you might expect, these guys are arch AI boosters. Critics pointed out that,
“Its title alone—The Age of AI: And Our Human Future—declares an epoch and aspires to speak on behalf of everyone. It presents AI as an entity, as superhuman, and as inevitable—while erasing a history of scholarship and critique of AI technologies that demonstrates their limits and inherent risks, the irreducible labor required to sustain them, and the financial incentives of tech companies that produce and profit from them.”
The reviewers objected that the three authors present the military’s adoption of AI as an inevitability, rather than as an active policy choice involving ethical complexities and moral trade-offs.
Now, just months later, the war in Ukraine has brought those complexities and trade-offs front and center.
The Exposé reports that, “On 30 June 2022, NATO announced it is creating a $1 billion innovation fund that will invest in early-stage start-ups and venture capital funds developing “priority” technologies such as artificial intelligence, big-data processing, and automation.”
The story by Rhoda Wilson also notes that “The US Department of Defense requested $874 million for artificial intelligence for 2022.” Of course, European countries, China – and no doubt Russia – are rushing to keep up. A warbot race among nuclear-armed countries puts the nuclear arms race on steroids: multiple contending NukeBot forces – systems that can mistake a dog for a stealth bomber – making nanosecond decisions based on a single pixel. Armageddon Man has sprouted another head.
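To make Kallenborn’s point about brittleness concrete – that a pixel-level tweak can flip a classifier’s verdict – here is a minimal, illustrative sketch of a fast-gradient-sign (FGSM) adversarial perturbation against an off-the-shelf image model. The model choice (torchvision’s resnet18), the input file name aircraft.jpg and the perturbation size are assumptions made for demonstration only; nothing here comes from the article or from any military system.

```python
import torch
import torch.nn.functional as F
from torchvision import models
from PIL import Image

# Load a pretrained classifier (resnet18 is an illustrative assumption).
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

# Standard preprocessing bundled with the chosen weights.
preprocess = weights.transforms()

# "aircraft.jpg" is a hypothetical input image used only for illustration.
image = preprocess(Image.open("aircraft.jpg")).unsqueeze(0)
image.requires_grad_(True)

# The model's original prediction.
logits = model(image)
orig_class = logits.argmax(dim=1)

# Fast Gradient Sign Method (Goodfellow et al., 2014): one small step in the
# direction that increases the loss on the original prediction.
loss = F.cross_entropy(logits, orig_class)
loss.backward()
epsilon = 2.0 / 255.0  # a very small step in the model's normalized input space
adversarial = (image + epsilon * image.grad.sign()).detach()

with torch.no_grad():
    adv_class = model(adversarial).argmax(dim=1)

print("original class index: ", orig_class.item())
print("perturbed class index:", adv_class.item())  # often differs despite a near-invisible change
```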
This new race for autonomous nukes is a potential windfall not only for Big Tech giants like Peter Thiel’s Palantir, but also for aspiring newcomers to Silicon Valley.
She points out that, “Ultimately, the new era of military AI raises a slew of difficult ethical questions that we don’t have answers to yet.”
She interviews Kenneth Payne, who leads defense studies research at King’s College London and is the author of the book I, Warbot: The Dawn of Artificially Intelligent Conflict. He says that a key concept in designing AI weapons systems is that humans must always retain control. But Payne believes that will be impossible as the technology evolves.
“The whole point of an autonomous [system] is to allow it to make a decision faster and more accurately than a human could do and at a scale that a human can’t do,” he says. “You’re effectively hamstringing yourself if you say ‘No, we’re going to lawyer each and every decision.’”
If It’s AI, It’s Hackable – Self-Driving Nukes?
Award-winning reporter Eric Schlosser’s 2013 book Command and Control and the Oscar-shortlisted documentary of the same name based on it, directed by Robert Kenner, showed that the history of the U.S. nuclear arsenal is studded with cases in which serious human error repeatedly put the world at risk of thermonuclear destruction, and courageous interventions by individual human intelligence repeatedly saved it. That was then and this is now, when displacing humans with AI algorithms is under serious (and insane) consideration.
Mikko Hypponen is a Finnish global cyber security expert whose thirty-year career has coincided with the criminalization of the internet. In his recent book, If It’s Smart, It’s Vulnerable, he offers a flyover of the developmental stages of cybercrime, from viruses to worms to malware to ransomware to Stuxnet and beyond.
“Question: How many of the Fortune 500 are hacked right now?
“Answer: 500”
That’s the way Hypponen sets up his basic contention from a lifetime of cyber security sleuthing: “If a company network is large enough, it will always have vulnerabilities, and there will always be something odd going on…” making it possible for the system’s security measures to be “…breached by attackers.”
With that as background, the prospect of giving AI warbots the codes to the world’s nuclear weapons arsenals is clearly just one more suicidal societal concession to Armageddon Man.