
Self-aware artificial intelligence will never happen and here's why


Recommended Posts

Posted

 

There are alife (artificial life) simulators, but there are a number of problems with developing them into something more robust.

 

Any one program on a computer can impact every other program on that machine. Nothing in our wider Universe has that same level of global impact.

 

Most of our code will break if altered to any degree, rather than yield something with a novel function. Alife simulators have made some progress in this area, but they still lag behind the coding methods developed via evolution (to be fair, it had a large head start).
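
As a quick illustration of that fragility, here's a toy Python experiment (my own sketch, not taken from any particular simulator): mutate one character of a working snippet at random, and most of the results won't even compile, never mind gain a novel function.

```python
import random

# Randomly mutate one character of a working snippet and count how often
# the result still compiles at all -- a syntax check only, so even the
# "survivors" mostly do the same thing or something subtly broken.

SOURCE = "def area(w, h):\n    return w * h\n"
ALPHABET = "abcdefghijklmnopqrstuvwxyz()*+-=:,. \n"

ok = 0
trials = 1000
for _ in range(trials):
    i = random.randrange(len(SOURCE))
    mutant = SOURCE[:i] + random.choice(ALPHABET) + SOURCE[i + 1:]
    try:
        compile(mutant, "<mutant>", "exec")
        ok += 1
    except SyntaxError:
        pass

print(f"{ok}/{trials} single-character mutants still compile")
```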

 

Probably the most glaring problem is hardware. It's like asking us to simulate reality on an abacus: you can only dumb things down so much before the simulation stops showing the complex behavior we want.

 

I'm convinced it is possible, but it will take time to overcome these issues. Some of the trading algos have shown weak signs, so maybe down the line we'll have something.

Ok, so in these simulations, is there code representing physical senses? What aspect of the real world is being simulated?

Posted (edited)

Sure. In Darwinbots, there was vision, touch, collision, hunger, etc. I saw some neat stuff. The evolution of family recognition was probably the most interesting: at first general avoidance, then eventually avoidance based on an identifying trait possessed only by close relatives.
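
To give a concrete flavor of what that looks like (a toy Python sketch of the idea, not actual Darwinbots DNA; all the names here are mine): every "sense" is just a number, and evolved genes decide the response.

```python
# Every "sense" reduces to a number; evolved genes decide the response.

class Critter:
    def __init__(self, family_tag, avoid_weight):
        self.family_tag = family_tag      # heritable marker only kin share
        self.avoid_weight = avoid_weight  # evolved gene: how hard to flee

    def react(self, other, distance):
        """Movement impulse from what the 'eye' reports (negative = flee)."""
        eye = max(0.0, 1.0 - distance / 10.0)  # nearer object, stronger signal
        # Early genomes fled everything; later ones evolved to check the tag.
        threat = 0.0 if other.family_tag == self.family_tag else eye
        return -self.avoid_weight * threat

me = Critter(family_tag=3, avoid_weight=1.0)
kin = Critter(family_tag=3, avoid_weight=1.0)
stranger = Critter(family_tag=7, avoid_weight=1.0)

print(me.react(kin, distance=2.0))       # 0.0  -> ignores a close relative
print(me.react(stranger, distance=2.0))  # -0.8 -> flees the stranger
```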

 

It was basically a pond simulation, with program-fed 'plants' and evolving critters that fed on them (and one another). The primary focus was on the evolution of behaviors.

 

There are a number of simulators out there with different foci, though: evolution of the body, Core War-style pure program competition, etc.

 

 

I'm somewhat dubious about whether those could develop intelligence, but they're interesting at any rate. The world is getting weirder with the increasing usage of neural nets. Realistically, that's where we are likely to see something develop, as companies have a logical incentive to throw ever more resources at them and enable them to write their own programs.

Edited by Endy0816
Posted

Sure. In Darwinbots, there was vision, touch, collision, hunger, etc.

Ok, how does one code hunger? That's a feeling. Did they symbolize an actual feeling in your stomach in code? Is there data showing one 'organism' with less hunger than others?

How are they coding vision? Seeing is having an observer contrast light and dark; if there's no awareness, how can there be anyone seeing anything? How can code symbolize the experience of black and white if nothing is actually being seen by anything?

And touch, how is touch coded? How does the programmer know the program is feeling something physical if there's nothing physical to actually touch? How is that measured? How can one measure something as a physical interaction when it's not physical?

Do you see my point? Simulations don't mean anything; they have no value for anything in reality beyond concepts, because the data is based on the wrong information. You need real-life vision, touch, and hunger to see if something is actually doing these things. When you look at a crab that lives inside a mountain and has lost its need for eyes, using other senses instead, that is a real-world measurement of a life form you can analyze as having vision or not. If the crab were just a drawing on a wall, how would you assess whether it has vision, when it doesn't exist? You couldn't, because it's not real. So why would you ever think squiggles symbolizing a crab moving around on a digital screen are any more real than a picture on a wall?

Posted


One of the very first programming things I did, outside of some Visual Basic stuff in high school, was an ALife project. 'ALife' was a term I hadn't heard before, and I didn't know if it was even going to work, because I wasn't aware of the many other such projects until I was already well into it.

 

First seeing emergent behavior, like critters being attracted to food sources and avoiding other, potentially predatory life, was one of the coolest experiences I've had. Getting predators that mimicked the color of the "plant" food to attract prey, or that camped on top of plants (which I initially thought of as a dead end that looked kind of buggy because of the way movement worked, but which actually wound up being a winning strategy for similar reasons to the above), was icing on the whole experience.

 

It was very simplistic, comparatively speaking, but I ran that sim for fun on and off pretty much constantly through college, just because I enjoyed watching it and seeing it progress toward one of several common "end states" where things tended to stabilize, usually after a couple of hours.

 

I've made a few variations on it since then and still go back to it semi-regularly when I'm first learning a language and need an interesting project to work through.
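
If anyone is curious how little machinery that takes, here's a bare-bones sketch of the kind of loop such a sim runs (a Python toy written for this post, not the original VB project, with everything collapsed into one population for brevity). Nobody programs the mimicry: color is just a heritable number, and lineages whose color happens to drift toward the plants' color eat better and leave more offspring.

```python
import random

# Bare-bones ALife loop: eat, pay upkeep, then die or divide with mutation.

PLANT_COLOR = 0.2

def step(population):
    survivors = []
    for c in population:
        lure = 1.0 - abs(c["color"] - PLANT_COLOR)  # closer color, better meals
        c["energy"] += random.random() * lure       # luck-weighted feeding
        c["energy"] -= 0.4                          # metabolic upkeep
        if c["energy"] <= 0:
            continue                                # starved
        survivors.append(c)
        if c["energy"] > 2.0:                       # enough stored to divide
            c["energy"] /= 2
            survivors.append({"color": c["color"] + random.gauss(0, 0.05),
                              "energy": c["energy"]})
    if len(survivors) > 1000:                       # crude crowding limit
        survivors = random.sample(survivors, 1000)
    return survivors

pop = [{"color": random.random(), "energy": 1.0} for _ in range(200)]
for _ in range(300):
    pop = step(pop)
# Surviving colors cluster near PLANT_COLOR after enough generations.
print(len(pop), sum(c["color"] for c in pop) / max(len(pop), 1))
```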

Posted (edited)


I know what you mean. It's the coolest thing to watch their ecosystem evolve in ways you can't fully predict.

 

At one point I rigged some of mine up to be interactive. Seeing lifeforms reacting to my mouse was very odd. Then they evolved to eat the mouse... The boundary we imagine blurred for a while though. :)

 

I keep thinking about going back to it: freshening up the concept with modern neural nets (to develop at least the initial lifeform programming language) and an AI director to step up the environmental challenges.

 

I had decent success doing some of that manually, but I feel we're missing the secret sauce needed to keep evolution from getting stuck.
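
Roughly what I have in mind for the director, as a Python sketch under my own assumptions (that the sim reports a mean fitness each tick and accepts a "harshness" knob; none of this is from an existing implementation):

```python
from collections import deque

# Toy "AI director": when the ecosystem stops changing, turn up the pressure.

class Director:
    def __init__(self, window=50, stall_threshold=0.01):
        self.history = deque(maxlen=window)   # sliding window of mean fitness
        self.stall_threshold = stall_threshold
        self.harshness = 1.0                  # hypothetical difficulty knob

    def update(self, mean_fitness):
        self.history.append(mean_fitness)
        if len(self.history) == self.history.maxlen:
            mean = sum(self.history) / len(self.history)
            var = sum((f - mean) ** 2 for f in self.history) / len(self.history)
            if var < self.stall_threshold:    # population has plateaued
                self.harshness *= 1.1         # e.g. scarcer food, more hazards
                self.history.clear()          # give them time to respond
        return self.harshness

# Hypothetical hook, once per tick:
#   world.harshness = director.update(world.mean_fitness())
```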

 

Ok, how does one code hunger? [...] How are they coding vision? [...] And touch, how is touch coded? [...] Do you see my point? Simulations don't mean anything [...]

We provide them with numerical values for their senses. The programming code for collision detection (sight, touch, feeding) was pretty complicated, though.
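
In simplified form (my Python rendering of the idea, not the actual simulator's code), "touch" is just a circle-overlap test and "sight" a distance-scaled signal, both fed back to the bot as plain numbers:

```python
import math

# "Touch" and "sight" as plain numbers from circle-based collision checks.

def senses(bot, other):
    dx, dy = other["x"] - bot["x"], other["y"] - bot["y"]
    dist = math.hypot(dx, dy)
    touching = dist <= bot["radius"] + other["radius"]   # bodies overlap?
    # "Vision": signal falls off with distance, zero beyond eye range.
    sight = max(0.0, 1.0 - dist / bot["eye_range"])
    return {"touch": 1.0 if touching else 0.0, "sight": sight}

me = {"x": 0.0, "y": 0.0, "radius": 1.0, "eye_range": 20.0}
prey = {"x": 1.5, "y": 0.0, "radius": 1.0}
print(senses(me, prey))   # {'touch': 1.0, 'sight': 0.925}
```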

 

I don't know if they "feel" anything in doing their comparisons, but they certainly are happy to attack if they see something near them.

 

I should note that they are not what we see on screen. What they are is an array of data, and ultimately the electrons on the chips.

 

You may want to look at mental models and the use of simulations in our own brains. Whether they think or don't think, I'm not about to start discriminating between our respective simulations. The map may not be the territory, but it can be near enough to be interesting.

Edited by Endy0816
