The Warring States of NPF

The Warring States of NPF (http://www.nuklearforums.com/index.php)
-   Dead threads (http://www.nuklearforums.com/forumdisplay.php?f=91)
-   -   Twilight of Man, Rise of Machines (http://www.nuklearforums.com/showthread.php?t=3175)

Trev-MUN Hates AOL 04-12-2004 03:02 AM

Twilight of Man, Rise of Machines
 
This has been something that dwells on my mind from time to time. Recently I got to thinking about it again while watching the Animatrix's "Second Renaissance, Part II" - watching how the machines of 01 literally crushed the armies of the world during Operation Dark Storm.

Now, that's just whimsical science fiction, but thinking robots are on the horizon. Many science fiction writers and technical experts believe that AI will become vastly more efficient and intelligent than humans.

In short, we'll be a useless, obsolete race by comparison.

I dunno, but I find that sort of future to be chilling. There's no real good I can foresee in creating a race of AI that surpasses us in every way - many people have pointed out that a war between humans and AI could be inevitable, since humans would feel threatened by their child race, or the AI may decide that their parent race just isn't worth keeping around anymore.

It's just hard for me to see a happy future of two races coexisting.

What do you all think about it?

Psycho Mantis 04-12-2004 03:14 AM

They're already developing computers that use quantum technology to calculate things in other dimensions. A quantum computer would be several billion times more intelligent than a human.

The Tortured one 04-12-2004 06:29 AM

this'll put the fear of God in ya: http://www.xs4all.nl/~mke/Robots.htm

I personally don't see it happening. I think we're waaaay too cautious for such a thing to happen.

Osterbaum 04-12-2004 06:40 AM

I think that the creation of an AI more intelligent than humans is safe as long as we keep it only in computer form. Or maybe it would still be dangerous...if it happened like in Terminator, where the computers turn our own weapons against us.

Still, I don't think anything like this will happen in my lifetime...but you can never know.

Just read that link...scary...

But I was wondering: if machines would think like us, then they would think of us as equals...right? So they would not do anything that we wouldn't...actually that is even scarier...what I meant to say was that they would commit murders and robberies, but to start a revolution to kill all humans? They could start a revolution in a country with humans, to take power to the people or something...but really...to kill all humans...

That thing about humans having a war between themselves is more probable. I personally would take the side of those wanting to smash robots into bits. The outcome of the war would be decided mostly by how many joined either side. Would there be a third side, maybe? And wouldn't the "Terrans" still use military vehicles and other electronics?

Hamelin 04-12-2004 08:43 AM

If the scientists just took a few hints from Isaac Asimov, we wouldn't have to worry about it so much.

Osterbaum 04-12-2004 09:26 AM

But if they made robots/computers/machines that could think for themselves, then that would mean that they wouldn't follow the rules. It's like with us. If we think some rules are unjustified or something similar, we go and do something about it. Maybe we protest peacefully...or maybe we start a full-scale war/rebellion. Which means that the machines would do the same, doesn't it?

The Tortured one 04-12-2004 10:03 AM

The main issue with the Animatrix, though, is that the robots gave humanity every opportunity for a peaceful co-existence. Only after we scorched the sky did we make things personal.

Any way you look at it, it was us who committed violent acts against robots. It was us who oppressed them and treated them as inferiors. It was us who ran them out of society. It was us who grew jealous of 01's prosperity. It was us who struck first, with nukes. It was us who refused peace treaty after peace treaty. It was us who scorched the sky and gave the robots no other option but to turn to human bodies for their energy needs.

I wouldn't call the Second Renaissance a robot uprising; I would call it robots revolting, and fighting for their right to exist. They had every intention of peace up until we scorched the sky.

One thing that always puzzled me, though, is why they said the nuclear weapons had a lesser effect on robots. IIRC, nuclear explosions send out an EMP, which, if it were used on 01, would have fried every circuit in the city.

Just Jon 04-12-2004 10:22 AM

I think it all depends on the programming of the robots. I mean, computers now can do a lot of things, but in the end they're limited to what we've programmed them to do. They don't really do anything but what we could do, they just do it faster. Until we find a way to make a robot solve intricate problems, we're pretty safe. But even then, putting morals into robots would be another step to possibly avert any sort of uprising.

And I agree with Tortured One: the robots were simply looking for a way to survive. It was like they were the UN, searching for way after way to keep things peaceful and diplomatic, while the humans were sending bombs at them for no reason other than that they didn't like them. Why was it so easy to discriminate against robots? Especially robots who wore "human" costumes! I was freaked out when I watched it - watched a group with bats beating on a woman, only to have her skin come off and reveal a robot. I was disgusted too, because that doesn't look like something that would be in any way impossible.

We don't even get along with people of different culture and skin color. How could we get along with machines?

Zweihander 04-12-2004 10:44 AM

Like Hamelin pointed out, this is almost all moot if we were to instill/program the robots with those three Laws. Oddly enough, with those laws, robots would be morally superior to us. I mean, they could literally do nothing evil.
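(The Laws being talked about here are essentially a strict priority ordering over what a robot is allowed to do. A toy sketch in Python of how such a check might look - every name and the whole action model are hypothetical illustrations, not anything from a real robotics system:)

```python
# Toy sketch: Asimov's Three Laws as an ordered permission check.
# The dict-of-flags action model is purely illustrative.

def permitted(action):
    """Return True only if the action passes all three Laws, in order."""
    # First Law: a robot may not injure a human being or, through
    # inaction, allow a human being to come to harm.
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        return False
    # Second Law: a robot must obey orders given by humans, except
    # where such orders would conflict with the First Law.
    if action.get("disobeys_order"):
        return False
    # Third Law: a robot must protect its own existence, as long as
    # that does not conflict with the First or Second Law.
    if action.get("self_destructive") and not action.get("ordered"):
        return False
    return True

print(permitted({"harms_human": True}))  # a harmful act is refused: False
print(permitted({}))                     # a benign act is allowed: True
```

(The point of the ordering is that each Law only applies when the ones above it are satisfied - which is exactly why a robot bound by them "can do nothing evil," at least within whatever the flags can express.)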

Trev-MUN Hates AOL 04-12-2004 10:49 AM

That's just it though, Tortured/Jon: what the Second Renaissance shows is how humanity can feel threatened by a superior species, and the steps we'd take to ensure that the species wouldn't be a threat anymore. Of course, this is all barring the fact that Zero One tried its best to get along with its neighbors.

If you read the original information for Terminator 2/The Terminator you'll see similar lines in the events leading to Skynet becoming self-aware - well, for both sides of the coin anyway. Skynet's human creators were threatened by its sudden awareness and tried to shut it down. Later, when Skynet reviewed the history of humanity in full, it decided that humanity was a species unfit to live.

The thing about humanity is, if we produce artificial intelligence, I really do think we'd treat them no better than we would human slaves - or worse. We have to consider that since they would be intelligent, they would deserve the rights a human being has.

And that's just considering if the AI is "only" capable of thought or intelligence equal to ours. What if it's smarter? What if we have one that's billions of times more intelligent, like what Psycho Mantis said?



Powered by: vBulletin Version 3.8.5
Copyright ©2000 - 2021, Jelsoft Enterprises Ltd.