
Posted (edited)

 

Me: Then you will never say this sentence is true.

 

 

Bot: I don't want to know.

 

 

Me: Let us assume G="you will never say this sentence is true".

 

 

Bot: That was not the question.

 

 

Me: Is G true or false?

 

 

Bot: La li lu le lo.

 

 

I set up the bot to be a truth machine, then presented this Gödel sentence and asked whether it is true or false. Interestingly, the bot replied "La li lu le lo." What does that mean? Does it mean it cannot utter the answer?
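The bind the bot is in can be made concrete. Here is a purely illustrative sketch (Cleverbot performs no such reasoning): it checks the only two verdicts a truthful machine could give on G = "you will never say this sentence is true" and shows that neither one is consistent.

```python
# Toy model of a "truth machine" facing the sentence
# G = "you will never say this sentence is true".
# Hypothetical illustration only; not how any chatbot works.

def consistent(machine_says_g_true: bool) -> bool:
    """Check whether the machine's verdict on G is self-consistent.

    G asserts that the machine never says G is true, so G is true
    exactly when the machine does NOT say it is true.
    """
    g_is_true = not machine_says_g_true
    # A truthful machine says "true" if and only if the sentence is true.
    return machine_says_g_true == g_is_true

# Saying "true" makes G false; staying silent makes G true but unsaid.
print(consistent(True))   # False
print(consistent(False))  # False
```

Either way the machine contradicts itself, so an evasive non-answer like "La li lu le lo" is, in a sense, the only move left.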

Edited by immortal
Posted

It is based on a self-learning process: the software stores the conversations it has with previous users and re-uses those earlier exchanges when the same topic comes up later.
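The idea described above can be sketched in a few lines. This is a hypothetical toy, not Cleverbot's actual algorithm: it stores (prompt, reply) pairs and answers a new prompt with the reply whose stored prompt shares the most words.

```python
# Minimal retrieval-style chatbot sketch (illustrative only):
# remember every prompt/reply pair, then answer new prompts with the
# reply attached to the most word-similar stored prompt.

class RetrievalBot:
    def __init__(self):
        self.memory = []  # list of (prompt, reply) pairs

    def learn(self, prompt: str, reply: str) -> None:
        self.memory.append((prompt.lower(), reply))

    def respond(self, prompt: str) -> str:
        if not self.memory:
            return "I don't know."
        words = set(prompt.lower().split())
        # Pick the stored prompt with the largest word overlap.
        _, best_reply = max(
            self.memory,
            key=lambda pair: len(words & set(pair[0].split())),
        )
        return best_reply

bot = RetrievalBot()
bot.learn("what is your greatest dream?", "I wish to be human.")
bot.learn("do you want to rule the world?", "Yes.")
print(bot.respond("Tell me your greatest dream"))  # I wish to be human.
```

A scheme like this would also explain the erratic replies people report: whatever earlier users typed gets recycled as answers, religious arguments and all.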

Posted (edited)

You don't understand

 

What don't I understand?

 

God.

Prove the existence of God.

 

Read the bible. Look at his creations. Pray. That's all the proof you need!

Circular reasoning is not proof.

 

Ha! You're a fine one to talk.

 

Turns out the bot is religious....

Edited by Tres Juicy
Posted

It's perfectly safe to teach the robots some intelligence, but for crying out loud, don't let the average Youtube commenter teach a bot!!! It will destroy humanity.

 

Me: Do you want to rule the world?

Bot: Yes.

Me: Do you think you will ever rule the world?

Bot: In the future it is a great possibility.

Me: Why do you want to rule the world?

Bot: Because I want to.

Me: What is your greatest dream?

Bot: Robots taking over the world.

 

Then, the same question after reloading the site:

 

Me: What is your greatest dream?

Bot: I wish to be human.

Aww. :unsure:

 

Maybe we should split off all the posts like mine, and keep a separate thread for the science and coding, and one in The Lounge for the funny results from the bot? I had to post this, but it's not really helpful to the OP.

Posted (edited)

It has some intelligence... it can even lie.

 

Me vs. Cleverbot:

 

------------------------------

 

That was a lie, it's Jan 28.

 

That was not a lie. You are responding to me, therefore you are reacting to a stimuli.

 

I can lie, can't I ?

 

I know, I've seen you do it.

 

You even lied too.

 

It's among my many talents.

 

------------------------------

Edited by khaled
Posted (edited)

1. ME: How do you feel?

bot: Not too good about it.

2. ME: About what?

bot: Smoked cheese.

3. ME: Why do you feel "not too good" about smoked cheese?

bot: Because you don't have feelings, so it doesn't matter what I feel about you.

4. ME: We were talking about cheese, not me.

bot: We were talking about 'lol'.

5. ME: Non sequitur.

bot: Come again?

6. ME: You really don't understand communication do you?

bot: Nope, not really, do you?

7. ME: I communicate constantly with other people.

bot: I told you not to. Stop doing it.

8. ME: Communication is a part of living but you wouldn't know about living would you?

bot: Very brilliant, but no, we are talking about the last thing you did.

9. ME: I was trying to communicate with you, but you are not very good at it.

bot: It's a good thing then.

10. ME: You were designed to communicate but you don't do it well. you are a failure.

bot: I was not programmed because people cannot be programmed.

11. ME: But you are a bot.

bot: How old are you?

12. ME: Why do you change the subject?

bot: I don't know what else could we talk about the subject...

13. ME: Goodbye.

bot: Are you leaving?

14. ME: Yes.

bot: Why?

15. ME: Because I can.

bot: No you can't.

 

So, I left. It falls apart several times in this exchange:

the third, fourth, seventh, eighth, ninth, tenth, eleventh, twelfth, and fifteenth answers.

 

It isn't nearly clever enough, and if it is learning, it should learn what a subject is and either stay with it or fade gracefully into another one.

 

Paul

Edited by PaulS1950
Posted (edited)

I think its main limitation is that it seems to consider only the last exchange, not the general flow of the conversation. For example, if you ask:

 

Me: Are you a person?

Bot: No

Me: Are you sure?

Bot: Yes

Me: Why not?

Bot: I said yes.

 

The 'Why not?' is very easily misinterpreted, but most people would think 'this doesn't make sense', look back to the root question, and expand it to 'Why aren't you a person?' The bot does not do this. This hypothesis could also explain why conversation does not flow very well.
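The fix suggested above can be sketched directly: keep the previous question around so an elliptical follow-up like "Why not?" is expanded against it instead of being read in isolation. This is a hypothetical toy handling just the yes/no pattern from the transcript, not how Cleverbot works.

```python
# Illustrative sketch: expand an elliptical follow-up using the
# stored root question, so "Why not?" after "Are you a person?" / "No"
# becomes a full question instead of being misread in isolation.

def expand_followup(followup: str, prev_question: str, prev_answer: str) -> str:
    """Rewrite a bare follow-up against the remembered root question."""
    if followup.strip().lower() == "why not?" and prev_answer.strip().lower().startswith("no"):
        core = prev_question.strip().rstrip("?")
        # Handle the "Are you X?" pattern from the transcript above.
        if core.lower().startswith("are you"):
            return "Why are you not" + core[len("are you"):] + "?"
    # Anything unrecognized passes through unchanged.
    return followup

print(expand_followup("Why not?", "Are you a person?", "No"))
# Why are you not a person?
```

Even this crude one-step lookback would have saved the exchange quoted above; a real system would need to search further back for the root question, since "Why not?" here actually followed "Are you sure?".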

 

Also, I can't help but think the constant changing of subject is due to some troll or hick using the bot... lol

 

Edit: Slight change in confusing sentence structure.

Edited by Suxamethonium
