Recommended Posts

Posted

A neural Turing machine is a natural development. Among other things, the marriage will allow interfacing with a variety of sensors and actuators. Someone will attach a programmable calculator and teach the neural net to program. Others will interface with a variety of Computer-Aided Software Engineering (CASE) tools, and other things. The neural net must be trained to use each sensor, each actuator, and the interactions between each sensor and one or more actuators, wherever applicable.
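As a rough sketch of the idea (assuming Python with NumPy; the memory size, sharpening factor beta, and function names are illustrative, not any particular implementation), this is the kind of content-based memory read a neural Turing machine uses to couple a neural controller to an external, addressable store:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta=5.0):
    # Cosine similarity between the read key and each memory row,
    # sharpened by beta and normalised into soft attention weights.
    sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    w = softmax(beta * sims)
    return w @ memory            # weighted blend of the memory rows

memory = np.random.randn(8, 4)                 # 8 memory slots holding 4 values each
key = memory[3] + 0.1 * np.random.randn(4)     # noisy query resembling slot 3
print(content_read(memory, key))               # returns roughly the contents of slot 3
```

Because the read is differentiable, the controller can be trained end to end to decide what to store and retrieve, which is the property that would let it learn to drive attached tools such as the calculator mentioned above.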

I cannot think of a primal reaction that a person cannot at least modify. The leg movement from the knee-jerk reflex test is difficult to affect, but with training one might gain some control of it. I believe a conscious, sentient AI would have an amount of actuator control similar to people's. An AI that is not conscious would have similar control, except its decisions would be of lower quality.

Posted (edited)
4 hours ago, T. McGrath said:

Deep Learning is a programming methodology.  It isn't even a program itself.  Wikipedia defines Deep Learning as "part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms."  Hence, Deep Learning can do none of the things I listed, and it certainly can't pass a Turing test.

  1. Deep Learning can do all the things you listed, whether or not you admit it. (E.g. AlphaZero, or One Model To Learn Them All.)
  2. Deep Learning is yet another program/piece of software, one that learns to build very complicated programs that humans have not been observed to be able to write! (See the sketch after this list.)
  3. Reference A: "Self-taught artificial intelligence beats doctors at predicting heart attacks"
  4. Reference B: "AI learns and recreates Nobel-winning physics experiment"
  5. Reference C: "AI Uses Titan Supercomputer to Create Deep Neural Nets in Less Than a Day"
  6. Passing the Turing test is not a requirement to put millions of people out of work, which AI is already starting to do!
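To make the quoted "learning data representations, as opposed to task-specific algorithms" point concrete, here is a minimal sketch (assuming Python with NumPy; the XOR task, network width, and learning rate are arbitrary choices for illustration) of a tiny deep net that learns XOR purely from examples, with no XOR-specific rule coded anywhere:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)      # XOR: not linearly separable

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)        # hidden layer learns a representation
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)       # learned representation of the inputs
    p = sigmoid(h @ W2 + b2)       # prediction made from that representation
    dp = (p - y) / len(X)          # cross-entropy gradient at the output
    dW2, db2 = h.T @ dp, dp.sum(0)
    dh = (dp @ W2.T) * (1 - h**2)  # backpropagate through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p, 2))              # approaches [0, 1, 1, 0]
```

The behaviour comes from the learned weights rather than any hand-written rule, which is what the quoted Wikipedia definition is pointing at.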
Edited by thoughtfuhk
Posted
21 hours ago, EdEarl said:

I am convinced we are conscious and sentient, but not convinced we know how to build an AI that is conscious and sentient. In fact, I'm pretty sure we will not know until we do it. 

 

I largely agree but with the caveat that, as we gain more knowledge of both ourselves and computers, we will likely recognise/learn the path to achieving it before we get there.

21 hours ago, EdEarl said:

Are we qualitatively different from other beings with brains, for example chimps, mice, and ants? Or are we only quantitatively different, in that we just have a larger brain? Elephants don't seem to have our capabilities; is that a false impression? They have larger brains than we do. Moreover, women have slightly smaller brains than men, by about 11%, yet there is no difference in IQ scores on average.


Yes and no; brain architecture in mammals seems to be the same, but we are demonstrably (given the studies) better at communication and intelligence. It's possibly a myth, but I've read that the brain-size-to-body-size ratio is a good rule of thumb between species, not within a species.

21 hours ago, EdEarl said:

No one is sure of what that quality is. This ignorance makes me less convinced than you about our ability to control AI. Someone may build an AGI with consciousness and sentience without knowing.

 

That is possible, but as science progresses and that ignorance is replaced with understanding, the chances decrease.

 

22 hours ago, EdEarl said:

Since we don't know exactly how we are conscious and sentient, I doubt that we can compel an AI. What you claim seems plausible, but I'm not as convinced as you are.

My point is that only a sentient machine could change its basic programming or algorithm, if it's set up carefully.

5 hours ago, thoughtfuhk said:
Passing the Turing test is not a requirement to put millions of people out of work, which AI is already starting to do!
 

They took our jobs

Posted
2 hours ago, dimreepr said:

I largely agree but with the caveat that, as we gain more knowledge of both ourselves and computers, we will likely recognise/learn the path to achieving it before we get there.

Yes and no; brain architecture in mammals seems to be the same, but we are demonstrably (given the studies) better at communication and intelligence. It's possibly a myth, but I've read that the brain-size-to-body-size ratio is a good rule of thumb between species, not within a species.

That is possible, but as science progresses and that ignorance is replaced with understanding, the chances decrease.

 

My point is that only a sentient machine could change its basic programming or algorithm, if it's set up carefully.

They took our jobs

I essentially agree. However, reprogramming neurons might invalidate the neural net's training, potentially losing its knowledge. The AI would probably be able to translate the neural data from the older to the newer format, but rounding errors might occur. If so, neuron reprogramming may be impractical.
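As a rough way to check whether that concern matters in practice, here is a minimal sketch (assuming Python with NumPy; the float64-to-float16 conversion and the matrix sizes are arbitrary stand-ins for whatever format change the AI would actually perform) that measures how far rounding moves the weights and one layer's output:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(scale=0.5, size=(256, 256))      # "trained" weights stored in the old format (float64)
x = rng.normal(size=256)                        # one input vector

W_new = W.astype(np.float16).astype(np.float64) # translate to a coarser format and back
weight_err = np.abs(W - W_new).max()            # worst per-weight rounding error
output_err = np.abs(W @ x - W_new @ x).max()    # how much one layer's output drifts

print(f"max weight rounding error: {weight_err:.2e}")
print(f"max output drift:          {output_err:.2e}")
```

If the output drift stays well below the noise the network already tolerates, the translation is probably harmless; if not, some retraining, or a higher-precision target format, would be needed.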

Posted
On 04/01/2018 at 4:11 PM, EdEarl said:

I essentially agree. However, reprogramming neurons might invalidate the neural net's training, potentially losing its knowledge. The AI would probably be able to translate the neural data from the older to the newer format, but rounding errors might occur. If so, neuron reprogramming may be impractical.

Indeed, we may never be free of unintended consequences, but that doesn't mean we shouldn't try.

Posted
2 minutes ago, dimreepr said:

Indeed, we may never be free of unintended consequences, but that doesn't mean we shouldn't try.

True, and some kinds of change would not need data translations.
