I'm going to be way too old for this. Are there going to be robots that have sex with the robots for you?
|
God forbid they make the robots for reproduction.
Goddamned wannabe human robots with their fancy baby-making crockpots. |
Quote:
|
Adrian Barbo-bot with chainsaw hands! Zzzzzzzzzzz!!!
|
Sparkimus PRIME.
|
Yes, I am Pro Robots!
Quote:
All of our laws will need to be redefined (again!) to not just include all self-sentient creatures but also self-sentient constructs. Which is fine by me. I'm looking forward to following the political and societal outcomes when this becomes a hot issue. It'll be great just to watch how it unfolds.

As for all the worries about robots reproducing - what's the problem?? If they're really going to be built to think like us, the program design will probably mimic the design of our brains. Meaning they'll probably have to learn from scratch, intuitively, just like us. So even if they built a giant robot army, it would still have to be 'raised', taught, and trained in combat before it actually became active - never mind the time spent building it in the first place. By comparison, we already have several active armies. Giant missile launchers and other deadly robot weaponry? Not an issue - there's nothing stopping us from using those too.

Seriously, if we're so afraid of them making war, killing, taking over the rights of other species... say, that sounds a lot like what WE do! We usually do it with reasons. Not necessarily good reasons, but the things we do certainly don't just happen out of the blue. Assuming the robots have to learn like we do (as above), there will be a stage where moral development must occur, otherwise they will not mesh with human society. If they can manage that, they won't be too likely to "kill all humans". Why? Our society increasingly teaches values of tolerance. Most robots brought up in such an environment won't paint our race with such a broad brush, and will probably look for a more socially acceptable means of getting their message out.

Humans and sentient robots? Sure there's a difference. One's a machine. That's, like, it. Actually, I consider us machines as well; it's just that we need a constant energy supply because we're made out of materials that quickly decompose without one.
If we ever go offline, the fine balance that makes up our system is rapidly lost, and we are unable to restore the balance and come back online. We call it death! Of course, the organic approach to our design is what allows us to be so complicated, since it works on a smaller scale than wires. |
Khael! Your theory about the blank slate is slightly wrong, because they can easily copy data.
I mean, it'd be like The Matrix, where they learn martial arts and how to pilot helicopters by downloading the information. |
Yes, maybe for the robots. But that's assuming they can learn that fast. I could see them learning at an advanced rate, but... Okay, I can't help it. Tangent time!
The Matrix installation idea is crap! When people learn, they develop neural connections and pathways that allow the knowledge to be retained. Humans could never grow such paths that quickly, because the brain isn't set up to do that. Cells don't grow that fast! It would use an awful lot of energy and likely create excess heat at the rate depicted in the movie.

And as we get older, brain cells often forge new connections by giving up old ones that aren't used anymore. (I may be wrong; let me know if I am.) So in learning kung-fu, you'd probably forget some of that stupid algebra/helicopter flying/programming/hacking/sword fighting/gunslinging/motorcycle stuntman skill set, or whatever else you haven't done in a while. There's no way Neo would retain that much skill variety all at the same time, because while the brain has no known upper limit on knowledge capacity, it still naturally prunes unneeded connections.

Okay, back to the first point. I realize I'm assuming robot minds will mimic our own in design. I think it's a pretty good assumption - it would be easier to mimic something we already know works than to come up with a completely different method for thought processes to occur. In order for it to learn like a human, the robot brain would need to go through similar path building and pruning as ours does. This means the robot would be prone to actually forgetting things. But more than that, the same limit on learning speed may apply to some extent, because it takes time to physically build new pathways.

Forcing new content into the head of such a robot would mean manually forcing the neural paths to rearrange themselves. That could result in data loss, or in unforeseen changes to the robot's personality, since the new 'experiences' would alter the knowledge base from which its personality developed in the first place.
Any errors in data transfer (due to improperly formed/grown connections, or warping from external factors such as excess heat or sudden jolts) would result in something a lot like mental retardation. |
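The "use it or lose it" pruning described in the post above can be sketched as a toy simulation. To be clear, this is just an illustration of the idea, not a model of any real brain or robot: the skill names, the boost/decay rates, and the pruning threshold are all made-up numbers.

```python
# Toy "use it or lose it" model: practiced skills strengthen, idle ones
# decay, and connections below a threshold are pruned entirely.
# All constants are illustrative, not drawn from neuroscience.

def train(strengths, practiced, boost=0.3, decay=0.1, prune_below=0.05):
    """One round of learning over a dict of skill -> connection strength."""
    updated = {}
    for skill, s in strengths.items():
        # Strengthen practiced skills (capped at 1.0); decay the rest.
        s = min(1.0, s + boost) if skill in practiced else s * (1 - decay)
        # Drop connections that have decayed below the pruning threshold.
        if s >= prune_below:
            updated[skill] = s
    return updated

skills = {"kung-fu": 0.2, "algebra": 0.5, "helicopter": 0.06}
for _ in range(20):  # twenty rounds of practicing only kung-fu
    skills = train(skills, practiced={"kung-fu"})

print(skills)  # kung-fu saturates at 1.0; the weak helicopter skill is pruned
```

After enough rounds, the rarely-used "helicopter" connection falls below the threshold and vanishes, while "algebra" merely fades - which is roughly the forgetting-by-pruning behavior the post argues a human-like robot brain would share.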
AI would run on code. Code runs on data. Data transfers crazy fast.
Why is that a hard concept? |
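The contrast the two sides are arguing - bulk-copying data versus learning it step by step - can be sketched in a few lines. This is a hypothetical illustration, not any real AI system: `learn_from_scratch`, the learning rate, and the weight values are all invented for the example.

```python
# Copying a trained "skill" is a single bulk transfer; learning the same
# parameters from scratch takes many small iterative steps.
import copy

def learn_from_scratch(target, lr=0.1, steps=1000):
    """Crude iterative learning: nudge weights toward a target, one step at a time."""
    w = [0.0] * len(target)
    for _ in range(steps):
        w = [wi + lr * (ti - wi) for wi, ti in zip(w, target)]
    return w

trained = [0.7, -0.2, 1.5]        # an already-trained robot's "skill" as weights
clone = copy.deepcopy(trained)    # a second robot copies it in one transfer
relearned = learn_from_scratch(trained)  # a human-style learner needs 1000 steps

print(clone)      # exact copy, instantly
print(relearned)  # approximately equal, but only after many iterations
```

Whether a robot can actually do the instant-copy path is exactly the open question in the thread: it works if knowledge is plain data, but not if, as the previous post argues, it has to be grown into pathway-like structure.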