Welcome to European Tribune. It's gone a bit quiet around here these days, but it's still going.
A hardware AI would have the advantage over a wetware AI that we know how to upgrade hardware.

Though it is of course possible that the technology required to build a hardware AI would also enable us to build Ghost in the Shell style cyberbrains to enhance our wetware processors.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Fri Aug 26th, 2011 at 02:01:18 PM EST
The last hardware problem was solved when terabyte disk drives became affordable. Simply put, doing the wrong things at the wrong times even faster does not get you to doing the right things at the right times.


She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Fri Aug 26th, 2011 at 09:52:14 PM EST
The fly in the ointment is the role that emotion plays in all of our mental processes. The eat-or-flee response is very close to the base of all animal intelligence, and emotional responses are the basis for judging almost all things. We have developed methods for suspending judgement, and we can attempt to account for emotion in our decisions, but it is very tricky. In order to create an AI that truly resembles that of a human, it may be necessary for the development process to functionally recapitulate the evolutionary sequence of human beings.

The problem this poses is amplified by the active disrespect so many give to the role of emotions in our lives. Even the suggestion that a truly human AI would have to have the equivalent of human emotions would/will likely be received with disdain by many of those best able to conceive of the necessary programming. I would like to see special purpose AI utilized much more extensively in known critical areas of human endeavor, such as medical diagnostics, which is so often a disaster when performed by humans.

"It is not necessary to have hope in order to persevere."

by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Sat Aug 27th, 2011 at 09:48:22 PM EST
It was after I read Damasio, Sapolsky, et al. that I came to realize just how much our emotions (limbic system, mesolimbic pathways, etc.) underlie our cognitive processing. So much so that if our emotions are neurologically unable to function properly, we simultaneously lose our executive decision-making.

This realization made me understand that attempting to build a "truly" human intelligence isn't worth the effort. A "truly" human intelligence would be subject to developing the psychological, neurological, emotional, and cognitive dysfunctions humans express, and if it isn't, it's not a "truly" human intelligence.

QED

:-)

Which supports TBG's contention, which I share, that we should drop the "AI" thing, as such, and instead work on building a "General Modeling Machine."

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Sun Aug 28th, 2011 at 02:42:44 PM EST
We apparently agree. My caveat wrt a General Modeling Machine is to keep it away from making executive decisions. We are sufficiently "inhuman" all by ourselves and have no need of "artificial inhumanity".

"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Mon Aug 29th, 2011 at 12:38:44 PM EST
GMMs won't be allowed to make executive decisions. The executive decisions will be hardcoded into them by programmers who simply apply the state of the art in anthropology, psychology and economics, without understanding that these disciplines exist in large part to justify particular forms of executive decisions.

The Serious People will then pretend that the GMM is making the executive decisions, because this gives the decisions an air of inevitability and truthiness.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Mon Aug 29th, 2011 at 03:17:30 PM EST
Then we will need the Butlerian Jihad.

"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Mon Aug 29th, 2011 at 05:16:12 PM EST