The Warring States of NPF  

03-11-2010, 09:16 AM   #1
Ryanderman
Beard of Leadership
Join Date: Apr 2006
Posts: 827

Re the OP: You said that:
Quote:
It does still allow for a definition of consciousness as any system in which there is a certain level of difficulty in erasing information to the point of reverting a wave function. Anything with a difficulty level above a certain threshold is conscious, and anything below is not. Finding that threshold might be hard and might exclude or include some things that are problematic.
And also that:
Quote:
In either case, instead of subjecting an AI to a Turing test, you could incorporate it into a delayed-choice quantum eraser experiment and then delete all its memory before anyone looks at it or the experimental data. If the interference reappears, your AI is not conscious; if it doesn't, then it is conscious.
Maybe I'm missing it, but what sets the threshold of difficulty for determining consciousness above the difficulty of erasing an AI's memory and below the difficulty of erasing a human's memory?
__________________
~Your robot reminds me of you. You tell it to stop, it turns. You tell it to turn, it stops. You tell it to take out the trash, it watches reruns of Firefly.~
03-11-2010, 09:34 AM   #2
Sithdarth
Friendly Neighborhood Quantum Hobo
Join Date: Mar 2004
Location: Outside the M-brane look'n in
Posts: 5,403

Quote:
Maybe I'm missing it, but what sets the threshold of difficulty for determining consciousness above the difficulty of erasing an AI's memory and below the difficulty of erasing a human's memory?
Nothing. The test doesn't tell you whether the machine is not conscious because it's not complex enough, or because machines can't be conscious at all. (Unless you can somehow prove it's a matter of complexity, and prove your machine has enough complexity but still is not conscious.) For all I know, the threshold could be at plants, or it could be that only biological things can be conscious. Basically, the test in and of itself only tells you whether something is conscious, not why or why not.
03-11-2010, 10:38 AM   #3
Ryanderman
Beard of Leadership
Join Date: Apr 2006
Posts: 827

Quote:
Originally Posted by Sithdarth View Post
Nothing. The test doesn't tell you whether the machine is not conscious because it's not complex enough, or because machines can't be conscious at all. (Unless you can somehow prove it's a matter of complexity, and prove your machine has enough complexity but still is not conscious.) For all I know, the threshold could be at plants, or it could be that only biological things can be conscious. Basically, the test in and of itself only tells you whether something is conscious, not why or why not.
That's not quite where my confusion lies. How does the test tell that the AI is not conscious? You suggested in your scenario #1 that if the information could be erased from a human's mind, it could uncollapse the wave function. And your scenario #2 suggests that there is some significance to consciousness that would cause the wave function to remain collapsed even if the memory were erased. But then you said that in either case, if deleting the AI's memory caused the wave function to uncollapse, it would prove it was not conscious. I guess I'm failing to see how that result is any different from the conscious human in scenario #1.

Sorry to pick at this detail, but I felt like I understood most of what you said, so I want to clear up this sticking point in my comprehension.
03-11-2010, 11:46 AM   #4
Sithdarth
Friendly Neighborhood Quantum Hobo
Join Date: Mar 2004
Location: Outside the M-brane look'n in
Posts: 5,403

Ah, in case one it'd be all about how difficult it is to erase the memory. For instance, if the machine is complex enough, you could erase the area where the information is stored in the AI's brain without actually removing the information from the universe. It's kind of a strange thing to think about, but information is essentially negative entropy. If in the first case deleting the computer's memory doesn't revert the wave function, then it is obviously conscious, and erasing things from the human mind would probably have the same result.

In the case where even erasing human memory reverts the wave function, you're essentially screwed, in that you now have to arbitrarily choose a point where the difficulty of erasing memory indicates consciousness. Then you perform the test on your AI and try everything up to that level of difficulty. For example, you might be able to revert a wave function by killing everyone who observed it (or the consequences of it) and incinerating their corpses. That would be a very difficult erasing procedure. So you might choose having to destroy the system to erase the data as the point of consciousness. If that is the only way to revert the wave function after the AI looked, then it's conscious; if hitting the delete key is enough, then it's not. It's a bit of a subtlety, but it still works, just a bit more arbitrarily.

The alternative in case one is to accept there is nothing special about consciousness and making the distinction is pointless. But that's just no fun really.
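(Editor's aside: the physics behind this whole exchange — interference vanishing when a which-path record exists and returning when that record is genuinely erased — can be sketched in a few lines of numpy. This is a toy illustration, not the real optical setup; the `marker_overlap` parameter and the linear phase model are simplifications standing in for the overlap of the marker states.)

```python
import numpy as np

def screen_pattern(x, marker_overlap):
    """Two-slit intensity at screen positions x for equal-amplitude paths.

    marker_overlap plays the role of <m2|m1>, the overlap of the
    which-path marker states: 1.0 means no record of the path exists,
    0.0 means a perfect (orthogonal) which-path record was kept.
    """
    phi = np.pi * x  # relative phase between the two paths, arbitrary units
    # The interference cross term survives only to the extent that the
    # marker states overlap.
    return 1.0 + np.real(marker_overlap * np.exp(1j * phi))

def visibility(intensity):
    """Fringe visibility: 1 for full interference, 0 for none."""
    return (intensity.max() - intensity.min()) / (intensity.max() + intensity.min())

x = np.linspace(-4, 4, 801)

no_record = screen_pattern(x, 1.0)  # no which-path info: full fringes
recorded = screen_pattern(x, 0.0)   # which-path record kept: fringes gone
# "Erasing" the record amounts to projecting the marker onto
# (|m1> + |m2>)/sqrt(2); the post-selected sub-ensemble has overlap 1
# again, so its fringes reappear.
erased = screen_pattern(x, 1.0)
```

The point of the sketch is that merely overwriting a copy of the record changes nothing: only an operation that returns the marker overlap to 1 restores the fringes, which is the "difficulty of erasure" the proposed test would probe.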
03-11-2010, 09:46 AM   #5
Azisien
wat
Join Date: Jan 2005
Posts: 7,177

I love Sith threads. People post saying they don't understand some of the physics terminology, and his explanatory post involves more physics terminology.

I need to start reading up on QM again, though; I feel rusty after reading this thread.