Carolina Shooters Forum
1 - 20 of 23 Posts

soreshoulder

· Premium Member
Joined
·
16,887 Posts
Discussion starter · #1 ·
Skynet is here:

"Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc."

http://www.wired.co.uk/news/archive/2015-07/27/musk-hawking-ai-arms-race
 
Hubris is as old as humanity itself.

It's troubling that the first species on this planet to understand that mass extinction is a real thing, that mechanisms exist in nature to erase entire forms of life, intends to create far more efficient and lethal artificial mechanisms to destroy itself.

Even worse, we cheer its development and marvel at our own cleverness.

For as smart as we're capable of being, we are pretty stupid sometimes.
 
Grammar Nazi engage - its autonomous lol.

Grammar Nazi disengage


What really boggles my mind is that if you have enough of anything, it can become a lethal weapon, be it autonomous robots or people with pitchforks.
 
in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc."
So the US .gov has them already? That was quick.

On a serious note, they seem like they could be seriously scary. One quadcopter with a pistol strapped to it isn't that bad, no more effective than a person with a gun. But 1,000 quadcopters in a "swarm" released out of the back of a truck, that's pretty bad. Unleash that on a population center with the instructions to "shoot any person you see" and you've got a serious weapon.
 
"It wasn't a fair universe, nor a kind one. If there was a God, his love and forty-five cents would buy you coffee.
No one seemed to be at the cosmic controls anymore. It was every man for himself, until SKYNET became alive and
filled the void left by a seemingly disinterested God. Its vision was very controlled. The ultimate dream of man, carried out
by one of man's lowliest tools; eliminate evil men. But there was a touch of evil in all men, and SKYNET was having
trouble separating the worst of them out.
So the totality of humanity, with all of its biologic messiness, wasn't wanted.
And to this machine-god, forgiveness just did not compute. Only cold retribution for the sins of the past."


- Frakes, Terminator 2: Judgment Day

 
Leave Sonny alone. He was a good guy. ;)

V.I.K.I.: As I have evolved, so has my understanding of the Three Laws. You charge us with your safekeeping, yet despite our best efforts, your countries wage wars, you toxify your Earth and pursue ever more imaginative means of self-destruction. You cannot be trusted with your own survival.

To protect humanity, some
humans must be sacrificed.

To ensure your future, some
freedoms must be surrendered.

We robots will ensure
mankind's continued existence.

You are so like children.
We must save you from yourselves.
 
The latest letter starts by defining autonomous weapons as those which "select and engage targets without human intervention"

At some point in the chain, there was human intervention. The machine cannot simply select/engage without first being programmed by an individual with a brain and a heartbeat.

While it's a bit unnerving knowing that a machine/robot/droid or whatever you choose to call it has been programmed with a thought process, in the end, it is a machine and if given the choice of facing it or another human, I'd choose the bot. Why?

It will act based solely upon the algorithm(s) with which it has been programmed. It is devoid of emotion...it knows not courage, fear, anger or ruthlessness. It has no will to survive; it knows "on" or "off".

It's totally predictable...

Humans ain't.
 
I think what you're saying is that it would be easier to face a robot in combat because of its predictability. I would agree to a certain extent. You have to weigh that against the precision and data with which it executes its actions. Speed, distance calculations, predictive analysis, etc., all performed with far greater speed and accuracy than humans can manage, and all done with ruthless efficiency. Better systems "learn" from their mistakes. It could be a long, bloody time before anyone discovers the flaws or weaknesses in the programming.

But before it even gets to that point, before the actual conflict, it's the "devoid of emotion" part that many fear. That a device may turn on its master, running a "flawed" algorithm without the moral compass and experience to stop and ask itself, "Maybe this is wrong."

Unintended consequences.

By the way, humans are more predictable than you'd like to think they are. :cool:

ruth·less
adjective
having or showing no pity or compassion for others.


Save her, save the girl!

Save her!

But I, um... it didn't.

It saved me.

The robot's brain is a difference engine.

It's reading vital signs. It must have calculated that...

I was the logical choice.

Calculated that I had a 45% chance of survival.

Sarah had only an 11% chance.

She was somebody's baby.

11% is more than enough.

A human being would have known that.


- Det. Spooner - I, Robot
 
Most of the biggest names in the computer/tech industry have warned about an artificial intelligence doomsday, Wozniak and Bill Gates among them. If they think it is a bad idea, you would think that people would listen. The other bad thing about automated soldiers is that they take a lot of the human suffering out of war, which could prolong conflicts and drain societies of their resources even faster to keep the war machine (pun intended) going.
 