
Inequality, Revolution, and Drones That Kill


#1

Inequality, Revolution, and Drones That Kill

Paul Buchheit
The too-rapid evolution of intelligent machines, with the ability to make decisions that can impact human life, is bringing us closer to a man-made epidemic that we won't be able to control.

#2

“A modern-day rogue’s gallery is taking shape. For average Americans who feel oppressed in the 21st century, anonymity may turn out to be a blessing.”

I have been told by others for years that if I don’t keep my mouth shut I’ll disappear. If so, let it be. I refuse to be anonymous.


#3

Citizens of this country are in far more danger of AI drone attacks from their own government than from any terrorist organization. Never forget for a moment that there is a very thin, and all too easily crossed, line between a foreign policy that justifies using drones to assassinate, with no oversight and no regard for the innocents killed in the process, and a domestic policy that, so far as we know, does not. The compartmentalization of foreign policy and domestic policy is a dangerous illusion.


#4

All of this technological murder can be avoided with a few dog whistles. Never underestimate the power of tribalism.


#5

I agree. What the Fascists have done for a long, long time in foreign countries, they will do here in the homeland if the need arises.


#6

I feel the same way. I only have a cat to provide for in my absence, if the state so deems. So I put all kinds of subversive, genuinely left ideology in my writings. If I hear my door breaking down, I'll end my life quickly with the Smith & Wesson I keep under my PC desk. A revolution or insurrection would be suppressed in short order by the state. Better to usher in a financial collapse and rebuild a democratic socialist system from the bottom up after dismantling all the levers of power (the military). I'm too old for that now, but I feel that's the best chance for long-term survival.


#7

I wonder if folk have forgotten when police in Texas used a robot to blow somebody up (https://motherboard.vice.com/en_us/article/4xan9n/dallas-pd-using-a-bomb-robot-to-kill-a-suspect-is-an-unprecedented-shift-in-policing).

Lethal Autonomous Weapons Systems (LAWS) are already here - from swarm vehicles (http://nationalinterest.org/blog/the-buzz/us-military-successfully-tested-its-latest-super-weapon-‘the-19002), to autonomous drones (https://www.cbsnews.com/news/60-minutes-autonomous-drones-set-to-revolutionize-military-technology-2/), to biometric targeting (https://books.google.com/books?id=WFJIDgAAQBAJ&pg=PT89&lpg=PT89&dq=biometric+targeting+weapons&source=bl&ots=yAGiy4M-Pj&sig=mjleB94O2famKqNIXKUqMbHh5mE&hl=en&sa=X&ved=0ahUKEwjuieTOu-zYAhWQTd8KHQOCDrEQ6AEILzAB#v=onepage&q=biometric%20targeting%20weapons&f=false, http://www.israeldefense.co.il/en/node/28881)

Many folk in the field of machine learning and weapons research are champing at the bit for large-scale use of such weapons.

What I find extremely disturbing is the growing consensus, among researchers and developers of machine learning, that Lethal Autonomous Weapons Systems need to be deployed for ‘humanitarian’ reasons - i.e.

  • Machines are less prone to error than humans.
  • Machines don’t act out of malice like humans.

This ‘humanitarian’ argument ignores the reality that LAWS will enable, and inevitably result in, an exponential escalation in the scale and geographic range of mass killing.


#8

Way to go, Paul! Now all we need are instructions on how to put one together with a jam-proof electronic system, then program it and release it, and it has to be doable on a shoestring budget.

The kicker is, you can’t be anonymous if you have an online presence! Facebook profile? Forget it!