ydoaPs Posted May 23, 2009 If a machine believed it were human -- if it felt human -- should we treat it as a human?
iNow Posted May 23, 2009 No. For much the same reason that I wouldn't treat a man who thought that he was god AS god. *Disclaimer: I fully recognize the huge number of subtle complexities I've just brushed over.
ydoaPs Posted May 23, 2009 Author No. For much the same reason that I wouldn't treat a man who thought that he was god AS god. *Disclaimer: I fully recognize the huge number of subtle complexities I've just brushed over. If having the capacity to feel human is not enough to treat a machine as a person, what is? Can machines ever achieve personhood? Is there some innate characteristic of our chemical composition that machines necessarily lack which causes them to forever fall out of the realm of moral concern?
mooeypoo Posted May 23, 2009 Aren't we all but complex biological machines? If a machine is indistinguishable from a human being -- if it has consciousness and thought processes and believes itself to be a human being -- then we might not CALL it human, but we would probably consider treating it as human, or at least giving it some rights. Perhaps call it a person, instead? Otherwise, well.. we would have to explain what's so special about a biological human as opposed to a non-biological entity with consciousness.
ydoaPs Posted May 23, 2009 Author Aren't we all but complex biological machines? If a machine is indistinguishable from a human being -- if it has consciousness and thought processes and believes itself to be a human being -- then we might not CALL it human, but we would probably consider treating it as human, or at least giving it some rights. Perhaps call it a person, instead? Otherwise, well.. we would have to explain what's so special about a biological human as opposed to a non-biological entity with consciousness. If it feels human, we should probably give it equal rights for our protection, if nothing else. All this has happened before and will happen again.
mooeypoo Posted May 23, 2009 I don't think "feels human" is the only rule, though. If we program it to "feel human" but it has nothing other than that, then what? Is it still human? I'm not sure. There needs to be a bit more than that. My idea is that it needs consciousness. How do you test it? Well.. that's.. a different (harder) question.
ydoaPs Posted May 23, 2009 Author I don't think "feels human" is the only rule, though. If we program it to "feel human" but it has nothing other than that, then what? Is it still human? I'm not sure. There needs to be a bit more than that. My idea is that it needs consciousness. How do you test it? Well.. that's.. a different (harder) question. Can unconscious things feel?
mooeypoo Posted May 23, 2009 Sure. Feeling is a reaction, too. There are comatose patients who respond to pain. They "feel", but they're not conscious.
ydoaPs Posted May 23, 2009 Author Sure. Feeling is a reaction, too. There are comatose patients who respond to pain. They "feel", but they're not conscious. Blegh. Too many fuzzy terms and concepts. If I remember, I'll come back tomorrow and make it more concrete.
mooeypoo Posted May 23, 2009 Yeah, it's always about definitions with these things, that's the point. What is life? What is consciousness? How do we figure out if something is conscious or if it is alive? Or if it's conscious enough to be close enough to a human being to "earn" rights? All about definitions. I suspect we'll have to deal with a whole lot of it in the upcoming years as our AI systems get closer and closer to the real deal. We're not there yet, but we probably will be at some point.
iNow Posted May 23, 2009 The question seems to be, "What do you mean by treat as human?" Does that mean we "perceive" them as human? Does that mean we give them jobs and try to prevent them from dating our daughters? Does that mean they vote and get to hold seats of power? Does that mean they can declare their superiority over all other lifeforms and pollute the shit out of our planet since god told them it was okay? My inclination is that you are referring specifically to rights... such as those inalienable ones with which we're all endowed. TBH, I have a hard time wrapping my head around what that would even mean when applied to a non-reproducing, non-organic machine.
ydoaPs Posted May 23, 2009 Author TBH, I have a hard time wrapping my head around what that would even mean when applied to a non-reproducing, non-organic machine. Surely if we could build them, it is possible that they could build themselves.
iNow Posted May 23, 2009 The confusing part to me is not at all related to their reproduction, but the granting of inalienable rights and what that would actually mean.
YT2095 Posted May 23, 2009 Without a common frame of reference, and in this case a definition of what constitutes "human" that is agreeable to all, there can be no argument/answer or debate, only futility.
mooeypoo Posted May 24, 2009 Without a common frame of reference, and in this case a definition of what constitutes "human" that is agreeable to all, there can be no argument/answer or debate, only futility. Well, we're going to have to start somewhere. The viability of AI systems is a known fact... we all know it's coming, it's just not here at the moment. We're not there yet, but we're VERY VERY LIKELY to get there. Those are things we should consider morally, imho.
cameron marical Posted May 24, 2009 Do you think that AI will be better at existing in the future than us? Also, off topic, but marcus, why should we know your name?
MM6 Posted May 30, 2009 No. Human is a strict biological classification, Homo sapiens. Australopithecus afarensis and Homo erectus might share certain characteristics of a human, but they're not human. Same with your hypothetical machine. I would call it humanoid at best. But as others have said, these are definitions. It's the philosophical implication of these definitions that interests us. Even though the machine is not Homo sapiens, it's still something of special interest.
Knoxy Posted January 19, 2010 Because a machine is a machine. It can't ever be human. Even if such a condition really existed, there would still be many differences between a machine and a human, like: a machine can't grow, a machine can't reproduce, but a human can... So a machine must fulfill these conditions too before we call it a human or a living body.
Sisyphus Posted January 19, 2010 Because a machine is a machine. It can't ever be human. Even if such a condition really existed, there would still be many differences between a machine and a human, like: a machine can't grow, a machine can't reproduce, but a human can... So a machine must fulfill these conditions too before we call it a human or a living body. So suppose you made a machine that did those things? I know it's not impossible, because such things already exist: us. But whether something is technically "human" is, I think, beside the point. The question is whether it's a person.
Syntho-sis Posted March 2, 2010 Hmm..I just had a weird thought... Synthetic-Cows..
Sisyphus Posted March 2, 2010 Hmm..I just had a weird thought... Synthetic-Cows.. Is there more to that thought?
Mr Skeptic Posted March 2, 2010 I'd consider the sentient robot a non-human person.
Syntho-sis Posted March 3, 2010 Is there more to that thought? If you are able to understand every single biochemical process that defines a 'cow', and somehow recreate these trillions of processes in a lab, and you are lucky enough to be able to amplify this recreation so that it can be mass produced... and so on. I'm wondering, would you call that a cow? Or just a sophisticated chemical recreation? What if we could do the same with humans? Is life only constituted of organic matter? What is organic? Well, it usually means there is carbon involved. Can real life only be made of carbon? What if I use something different? Like silicon? Is that not life? There, that's the rest of my thought.
Cap'n Refsmmat Posted March 3, 2010 Sounds like a good subject for a thread in our new Philosophy forum.
Genecks Posted March 4, 2010 (edited) Hmm..I just had a weird thought... Synthetic-Cows.. Hmm, I just had a weird thought... Implant some wetware into a human... Allow it to have AI. Allow the electromechanical features to change upon AI alterations (the AI understands the host needs to survive). See if the AI takes over the host's brain, actions, and mind. The parasite becomes the dominant member of the body. Hmm... Anyway, if Motoko Kusanagi existed, I'd date her before the hat reached the ground. Personhood? Blah. I suspect I would be satisfied with dating her. I would think she'd have enough programming to make her seem really close to a human, if not exactly the same. Sure, I'll call her a person if she wants. Actually, I've had this idea running around my head: evolution of silicon-based organisms. Given the chance, perhaps they would become what we call electronic, robotic, and such. Such a possibility is almost unreal, yet feasible. As such, I could consider any AI system that can continue to learn, adapt, and express emotion to be similar to a human. The emphasis is on it learning. I suspect it would need some form of sociobiology programming in order to find ways to relate to a human society. In other words, not running a mile and taking the bus instead... some form of civility basis. I think an interesting aspect of Ghost in the Shell is that the owners actually switch out bodies from younger versions to older versions, thus allowing them to "grow old." I'm also curious about what kind of neural darwinism can occur within a person to allow evolution of brain material without destroying the person. Perhaps, given the ability to be immortal, the person's brain changes to become more electronic. Of course, DNA isn't wired to use silicon, I think. I believe some algae can have silicon in their bodies. Possibly by studying the various evolutionary mechanics behind those and emphasizing them within neural tissue evolution, along with a feasible continuance of mental processes, a person could be allowed a basis for neural darwinism into a more sophisticated robot-like, computer-like state with a refined computer-like brain. Carbon offers itself as quite an insulator and resistor. Given the ability to use other materials in the brain and the nervous system, I suspect that the wiring and transmission of data would be much faster. Edited March 4, 2010 by Genecks