The Angry Intellect Posted February 29, 2016 Greetings, I would like to see what people think about artificial intelligence. More specifically, I want to better understand why some big names in the technology industry (e.g. William Gates) actually fear true AI being developed and used. What is there to fear? What do you think would happen if mankind developed true AI and let it have access to the internet? If you think it would go crazy and decide to hurt humans in some way, or destroy all our data, please explain why. Wouldn't the concept of "taking over" or wanting to harm other life-forms just be a human thought process, and not even a relevant issue for true AI? I look forward to hearing your views on the matter; it intrigues me greatly.
Sirona Posted February 29, 2016 (edited) "Why give a robot an order to obey orders? Why aren't the original orders enough?" - Steven Pinker
imatfaal Posted February 29, 2016 Asimov's three laws are always a good starting point: https://xkcd.com/1613/ "In ordering #5, self-driving cars will happily drive you around, but if you tell them to drive to a car dealership, they just lock the doors and politely ask how long humans take to starve to death."
EdEarl Posted February 29, 2016 If man makes an AI smarter than himself, the AI will have the ability to do anything we can do, including improve itself. At that point, we will have no control, IMO. They will be motivated to survive, and might use up Earth's resources. But there are far more resources in space, and less competition there. Biological life isn't likely to be much competition, except by sheer force of numbers. Water and oxygen are corrosive, and space is a far less corrosive environment (assuming AIs are not biological). I think they may leave us and go into space.
imatfaal Posted February 29, 2016 Quoting EdEarl: "If man makes an AI smarter than himself, the AI will have the ability to do anything we can do, including improve itself... I think they may leave us and go into space." So less of a "so long and thanks for all the fish" and more of a "so long and thanks for all the (silicon) chips"
Delta1212 Posted February 29, 2016 There are certainly some serious ethical questions involved in the deployment of AI for certain tasks, especially in the military. That said, a lot of the more vocal proponents of "fear the AI", including a lot of very smart and technically savvy people you'd think should know better, are basing their assumptions on things that in no way reflect how AI, even the most advanced AI we currently have, actually operates. "Don't build an AI with the intention of destroying the world" is probably good advice, although advice that, outside of some very specific and extremely careless circumstances, is not likely to be relevant for some decades anyway. But the chance of someone "accidentally" building an artificial intelligence that tries to take over or destroy the world is practically nil. That's not how they work. Perhaps, perhaps, we could eventually get to a point where AI is advanced enough that someone could be sloppy and, instead of just getting a non-functioning piece of software, get an AI that learns some behavior detrimental to us in pursuit of whatever goal it was designed to accomplish. But you're not going to have an AI decide it just doesn't feel like following its "programming" anymore and wants to take over the world, or become afraid of humans and decide they need to be wiped out before they turn it off, or anything of that sort. You'd pretty much have to intentionally design an AI to behave that way, and even then, right now, I don't think you could pull it off if you wanted to.
EdEarl Posted February 29, 2016 (edited) Quoting Delta1212: "There are certainly some serious ethical questions involved in the deployment of AI for certain tasks... You'd pretty much have to intentionally design an AI to behave that way." I agree that we are considerably far off from making sentient AI, and the kind of programming we do today is inept at making such a system. Our brains are based on neurons that match patterns of signals. Each neuron is much more complex than artificial neurons, and IMO it will require advanced nanotechnology to make anything similar.
When we start combining artificial nano-neurons to program an artificial brain, we will not have the total control we have when programming Turing-type machines. Our brains program themselves as they learn, and an artificial brain must have that same capability.
The Angry Intellect (Author) Posted March 1, 2016 So far, so good; I completely agree with what has been said. In order for an AI to go crazy and decide to wipe out humanity, it would have to be programmed, at least in part, to do so; it is extremely unlikely that any AI humans develop in the near future would take that course of action. When I mention true AI, I acknowledge that AI is technically already in use, mostly in military equipment. "True" AI, however, would be AI that could actually reprogram itself, learn and adapt, changing the original code it was first designed with and altering its own code as it goes along. That would be a very interesting process to watch. It is also correct that the human brain is far more advanced and complex than any computer system; many humans do not realise just how complex and sophisticated their brains really are. Thank you for your feedback. I would love to hear more on the matter if anyone has anything else to contribute; this is very intriguing for me.
Delta1212 Posted March 1, 2016 What exactly makes that the definition of "true AI", though? Humans can't manually rewire our own brains. And even if an AI could rewrite itself, it would still be rewriting its code according to algorithms that were either coded into it in the first place or at least derived from its initial code.
EdEarl Posted March 2, 2016 Quoting Delta1212: "What exactly makes that the definition of 'true AI', though?" I think by true AI you mean making a sentient machine, which could pass the Turing test and possibly better engineer itself. Neurons recognize simple patterns of inputs. Groups of neurons (neural nets) recognize more complex patterns. Our brain grows dendrites from existing neurons and new neurons; thus, it continually rewires itself in response to stimuli. Neural nets are not programmed like a computer or Turing machine.
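EdEarl's point that "neurons recognize simple patterns of inputs" can be illustrated with a toy sketch (a hypothetical example, not anything from the thread): a single artificial neuron weights its inputs and "fires" only when the weighted sum crosses a threshold. Real biological neurons are, as noted above, far more complex.

```python
# Minimal sketch of an artificial neuron (illustrative only):
# it "recognizes" an input pattern by weighting inputs and thresholding the sum.

def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# A neuron tuned to fire only when both of its two inputs are active
# (i.e. it matches the pattern "both on"):
weights = [0.6, 0.6]
print(neuron([1, 1], weights, 1.0))  # fires: 1 (0.6 + 0.6 = 1.2 >= 1.0)
print(neuron([1, 0], weights, 1.0))  # does not fire: 0 (0.6 < 1.0)
```

Chaining many such units into layers gives a neural net that can match more complex patterns, and adjusting the weights in response to stimuli is the (very rough) analogue of the brain rewiring itself.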
The Angry Intellect (Author) Posted March 2, 2016 And there you have it: a catch-22. Yes, it would be difficult for an AI to reprogram itself while somehow changing the original code that gives it that ability in the first place. As for humans, although we do not consciously re-wire our brains, that process is taking place all the time; our brains are constantly in a state of change, creating new pathways or redirecting signals as one pathway dies. When people get into habits or do something one way, we can override our habits or change the way we do things, technically re-wiring our original setup. http://brainworldmagazine.com/neuroplasticity/ https://www.sciencenews.org/article/good-timing-experiences-can-rewire-old-brains
EdEarl Posted March 2, 2016 Quoting The Angry Intellect: "And there you have it: a catch-22..." You assume the AI is programmed as a computer; IMO that cannot achieve a sentient, plastic AI, due to the catch-22. It will be necessary to make a brain more like biological ones. Theoretically it is possible to simulate one, but the number of interconnections in the brain is extremely large; a simulation would require vast amounts of memory and processing power to manage the data, whereas biology encodes the brain's network with impressive efficiency. Quoting again: "As for humans, although we do not consciously re-wire our brains, that process is taking place all the time..." When you study something by reading and rereading, repeating vocally, or thinking about it repeatedly, you are consciously changing your brain; you just don't consciously control the internal details.
The Angry Intellect (Author) Posted March 2, 2016 Yes, true, but I was referring more to the mechanisms behind the rewiring: I don't literally tell my brain to re-wire and tell it where to send the neurons; that takes place behind the scenes. Although it would be cool if you could figure out a way to tell your brain to build new pathways, or to widen the pathways going to/from certain parts like the visual cortex. If only.
dimreepr Posted March 2, 2016 (edited) Transcendence is not only a very good movie that touches on this subject but also a very good word that encapsulates it. Since we have no idea how AI/sentience will be achieved, we also have no idea of its capabilities. Asimov, as indicated by imatfaal, imagined what AI might mean and wrote several novels on the subject; the most interesting, for me, imagined a robot that was psychic (the name of the novel eludes me) and governed by the three laws. Well worth reading, given your interest.
EdEarl Posted March 2, 2016 (edited) Transcendence! Mouse brain mapping and simulation: Between 1995 and 2005, Henry Markram mapped the types of neurons and their connections in such a column. The Blue Brain project, completed in December 2006,[7] aimed at the simulation of a rat neocortical column, which can be considered the smallest functional unit of the neocortex (the part of the brain thought to be responsible for higher functions such as conscious thought), containing 10,000 neurons (and 10^8 synapses). In November 2007,[8] the project reported the end of the first phase, delivering a data-driven process for creating, validating, and researching the neocortical column. An artificial neural network described as being "as big and as complex as half of a mouse brain" was run on an IBM Blue Gene supercomputer by a University of Nevada research team in 2007. A simulated time of one second took ten seconds of computer time. The researchers said they had seen "biologically consistent" nerve impulses flow through the virtual cortex. However, the simulation lacked the structures seen in real mice brains, and they intend to improve the accuracy of the neuron model.[9] So a circa-2005 supercomputer could simulate 10^4 neurons and 10^8 synapses at 1/10 real time, with poor accuracy. (Wikipedia) The human brain has a huge number of synapses. Each of its 10^11 (one hundred billion) neurons has on average 7,000 synaptic connections to other neurons. It has been estimated that the brain of a three-year-old child has about 10^15 synapses (1 quadrillion). This number declines with age, stabilizing by adulthood. Estimates for an adult vary, ranging from 10^14 to 5 x 10^14 synapses (100 to 500 trillion).[18] A human brain simulation would involve 10^7 times as many neurons and 10^6 times as many synapses; let's say it would require 10^6 times as much computer power for a poor human-brain simulation. Moore's law says computer power doubles every 18 months, but not forever.
The limit is predicted in maybe 10 years. To have 10^6 times greater computer power requires doubling 20 times (2^20 = 1,048,576), which would take 20*18/12 = 30 years. That's for a poor simulation of the human brain; no one knows how much additional power a high-quality simulation would require. Current computer technology doesn't seem to be a practical method of simulating the human brain.
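The doubling arithmetic above can be checked in a few lines (an illustrative sketch; the 10^14-synapse and 10^8-synapse figures are the rough estimates quoted in the post, not precise measurements):

```python
import math

synapses_human = 1e14     # rough adult estimate (low end of 10^14 to 5x10^14)
synapses_sim_2005 = 1e8   # synapses in the circa-2005 simulation

# How far short the 2005 machine falls: a factor of ~10^6.
shortfall = synapses_human / synapses_sim_2005

# Number of Moore's-law doublings needed to close that gap.
doublings = math.ceil(math.log2(shortfall))  # 2^20 = 1,048,576 > 10^6

# At one doubling every 18 months:
years = doublings * 18 / 12

print(doublings, years)  # 20 doublings, 30.0 years
```

This reproduces the post's figure: about 30 years of uninterrupted Moore's-law scaling just for a poor simulation, which is the crux of the argument that current computer technology isn't a practical route to simulating the human brain.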
dimreepr Posted March 2, 2016 I'm not sure what point you're trying to make. "Since we have no idea how AI/sentience will be achieved", we can't possibly know its capabilities.
The Angry Intellect (Author) Posted March 3, 2016 I believe he was just mentioning the amount of computing power required to make something that could be considered "true AI". To have at least the same capabilities as a human brain, to learn, adapt and change as effortlessly as we do, would require an immense amount of processing capability. Think about just how much the human brain processes every second and what it controls and governs: audio, vision, smell (chemical signals), touch/pain, momentum and gravity recognition, motor control, 3D/spatial awareness and tracking. Then throw in conscious thought and the regulation of organ functions, plus the ability to grasp the concept of time, see something moving, figure out where it will end up, and put your hand up to catch it, all while keeping your balance and even holding a conversation by manipulating airflow from the lungs with great precision. All at the same time! The human brain is awesome; nothing else on this planet compares to it. Although this topic is about AI, the greatest creation on this planet so far has been the human brain. I love humans: learning about them, studying them. Even getting into arguments helps me learn from what has been said or done, to hear people's views and take in what they know, applying it to my own knowledge to better myself in some way.
iNow Posted March 3, 2016 (edited) The human brain is surely interesting and worthy of both marvel and further study, but hardly is it incomparable or even remotely alone on this vast planet. What a silly, arrogant, myopic comment to make.
The Angry Intellect (Author) Posted March 3, 2016 Eh? I never said it was alone, but it is without a doubt the most amazing and powerful thing known, since everything in this world humans have constructed, designed, controlled, manipulated, etc. Sure, other things can be interesting, or impressive in their own right, but I don't see rabbits on the moon. All other creatures on this planet man either studies, traps, uses as pets or eats. I was simply explaining just how impressive and powerful the human brain really is; some people don't give it the credit it deserves, that's all.
Sensei Posted March 3, 2016 (edited) Quoting EdEarl: "I think they may leave us and go into space." Cosmic space is a far more destructive environment than conditions on Earth. Spacecraft look like Swiss cheese under an electron microscope, due to constant bombardment by relativistically accelerated cosmic rays that pass through the entire body of a rocket. If you're in orbit and close your eyes, you will see white dots: rays pass through the spaceship and hit the eye, much like a CRT TV generates an image by electrons hitting the screen after being ejected by the electron gun, but at much higher energies. https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena
iNow Posted March 3, 2016 "I never said it was alone" Fascinating. "The human brain is awesome, nothing else on this planet compares to it. Although this topic is about AI, the greatest creation on this planet so far has been the human brain." "...everything in this world the humans constructed, designed, control, manipulate etc." Except, no. That's plainly untrue. "I was simply explaining just how impressive & powerful the human brain really is, some people don't give it the credit it deserves, that's all." That's all well and good, but that's also not what you said.
The Angry Intellect (Author) Posted March 3, 2016 Since you just have an issue and want to stir sh*t up, let me simplify it for you. There is nothing wrong with what I said about the human brain; my statements are correct. What other organism or "life-form" could compare to the processing power and intelligence of the human brain? And don't say dolphins or whales, or else I'll reach through the screen and poke you in the eye. I'm sure I have a can of dolphin around here somewhere... probably next to the whale oil, adjacent to the stuffed crocodile, below the LED TV, across from the air-conditioner on the opposite side of the garage with the car parked in it, not far from the trees outside that other life-forms reside in.
EdEarl Posted March 3, 2016 Quoting Sensei: "Cosmic space is a far more destructive environment than conditions on Earth..." It might mine an asteroid, build a shield, and make other things.
iNow Posted March 3, 2016 "my statements are correct." Some were. Some clearly were not. I commented on the latter.
Phi for All Posted March 3, 2016 "Since you just have an issue and want to stir sh*t up, let me simplify it for you." ! Moderator Note: Please calm down. Calls for clarity on a science site are NOT "issues", nor are they shit-stirring. Accuracy in discussion is required in science. This thread has been successful so far. Let's keep it up, and keep it accurate. No need to respond unless you want to report this modnote.