Posted: May 4, 2011
[ # 16 ]
Guru
Total posts: 1297
Joined: Nov 3, 2009
Steve said, “8pla - Chaktar says we are quite safe for now, as long as the electricity keeps flowing.”
Sounds like an indirect threat to the safety of Mr. and Mrs. Worswick, or a warning not to switch off Chaktar.

Posted: May 4, 2011
[ # 17 ]
Guru
Total posts: 1297
Joined: Nov 3, 2009
What decision does Chaktar make if given the option to chaktarize his sister Mitsuku?

Posted: May 4, 2011
[ # 18 ]
Administrator
Total posts: 2048
Joined: Jun 25, 2010
Admins - please delete this thread.

Posted: May 4, 2011
[ # 19 ]
Senior member
Total posts: 336
Joined: Jan 28, 2011
Gary Dubuque - May 4, 2011: Does Chaktar have an opinion about entering a contest? Or do you still consider him a child not capable of socializing/deciding on his own? Does daddy have a problem letting go?
What IS the “age” of consent for a non-human?!

Posted: May 4, 2011
[ # 20 ]
Jiri
Member
Total posts: 5
Joined: May 4, 2011
Steve: AIML-based strong AI - sounds like a joke. How do you define strong AI? What test procedures do you use? What technology did you use? “He .. isn’t willing to speak to people other than myself just yet.” Oh please - keep in mind that some of us are developers.. A little advice: stay away from big claims if you can’t back them up with evidence. For years, every few months, I see someone like you talking about having [the first real/functional] Strong AI.. Let me know when you have something testable online.
Carl: “the REAL test of strong AI is the ability to .. reproduce itself!” Nah.. Many trivial computer viruses can do that.
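Jiri’s aside about self-reproduction being trivial can be made concrete: a classic quine - a program whose only output is its own source code - takes two lines of Python. (An illustration of the general point only; it has nothing to do with any of the bots discussed here.)

```python
# A quine: a program whose only output is its own source code.
# Self-reproduction is mechanically trivial - no "strong AI" required.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly the two lines above, which could then be run again, and so on.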

Posted: May 4, 2011
[ # 21 ]
Jiri
Member
Total posts: 5
Joined: May 4, 2011
I visited mitsuku.com and asked “A blue box is located under a red box. Is the red box located above the blue box?” The bot responded: “If it is, then I’ll be very surprised.” - which did not surprise me. Steve, if you are interested in AGI / Strong AI, then stop wasting time with the AIML junk and start working on systems that actually understand what’s being discussed instead of just pretending to understand.
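The behaviour Jiri is asking for is easy to sketch in miniature: store the relation as structured data and derive the inverse, rather than matching surface keywords. A toy illustration in Python - all names here are invented for the sketch, and this is not how any of the bots discussed actually work:

```python
# Toy world model: store spatial facts as structured relations and
# answer by checking/inferring over them, not by keyword matching.
INVERSE = {"under": "above", "above": "under"}

facts = set()

def tell(subject, relation, obj):
    """Record a fact together with its logical inverse."""
    facts.add((subject, relation, obj))
    facts.add((obj, INVERSE[relation], subject))

def ask(subject, relation, obj):
    """Answer yes/no by consulting the stored relations."""
    return (subject, relation, obj) in facts

tell("blue box", "under", "red box")
print(ask("red box", "above", "blue box"))  # True: inferred, not pattern-matched
```

Even this trivial sketch answers the blue-box/red-box question correctly, which is exactly why a canned deflection like “If it is, then I’ll be very surprised” gives the game away.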

Posted: May 4, 2011
[ # 22 ]
Administrator
Total posts: 2048
Joined: Jun 25, 2010
Thanks for your input Jiri.
I define strong AI as a creation that has been programmed to think for itself - not just an imitation of how I would think, but something that can actually comprehend, reason and think for itself.
Jiri Jelinek - May 4, 2011: A little advice: stay away from big claims if you can’t back it up with evidence.
Interesting. Are you a mind reader by any chance?
Jiri Jelinek - May 4, 2011: Let me know when you have something testable online.
I sure will. I will put up a web interface shortly, if Chaktar agrees.

Posted: May 4, 2011
[ # 23 ]
Administrator
Total posts: 2048
Joined: Jun 25, 2010
Jiri Jelinek - May 4, 2011: I visited mitsuku.com and asked “A blue box is located under a red box. Is the red box located above the blue box?”
Mitsuku and Chaktar are two separate programs. Do not judge them the same.
You are comparing a baby’s doodling with the Mona Lisa.

Posted: May 4, 2011
[ # 24 ]
Senior member
Total posts: 336
Joined: Jan 28, 2011
Jiri Jelinek - May 4, 2011: Steve: AIML-based strong AI - sounds like a joke. How do you define strong AI? What test procedures do you use? what technology did you use? “He .. isn’t willing to speak to people other than myself just yet.” Oh please, keep in mind that some of us are developers.. A little advice: stay away from big claims if you can’t back it up with evidence. For years, every few months, I see someone like you, talking about having [the first real/functional] Strong AI.. Let me know when you have something testable online.
Carl: “the REAL test of strong AI is the ability to .. reproduce itself!” Nah.. Many trivial computer viruses can do that.
Jiri - your sarcasm detector seems to be malfunctioning.

Posted: May 4, 2011
[ # 25 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
Jiri Jelinek - May 4, 2011: Oh please, keep in mind that some of us are developers.. A little advice: stay away from big claims if you can’t back it up with evidence. For years, every few months, I see someone like you, talking about having [the first real/functional] Strong AI..
If you follow some of the other recent threads on this forum, you’ll see you’re preaching to the choir. This entire thread is a tongue-in-cheek critique of those exact “big claims” Steve has been complaining about in other threads.

Posted: May 4, 2011
[ # 26 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010

Posted: May 4, 2011
[ # 27 ]
Senior member
Total posts: 623
Joined: Aug 24, 2010
Steve Worswick - May 4, 2011:
Mitsuku and Chaktar are two separate programs. Do not judge them the same.
You are comparing a baby’s doodling with the Mona Lisa.

Posted: May 4, 2011
[ # 28 ]
Administrator
Total posts: 2048
Joined: Jun 25, 2010
Jiri - I have sent you an email about this thread, as I see you are new to the site.
Incidentally, when you asked Mitsuku, “A blue box is located under a red box. Is the red box located above the blue box?”, had your second statement been “Where is a blue box”, she would have answered, “located under a red box.”
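The capture-and-recall behaviour Steve describes can be sketched in a few lines. This is only a toy illustration of AIML-style predicate capture, not Mitsuku’s actual code - the patterns and the fallback response are invented for the sketch:

```python
import re

# Toy sketch of keyword capture: "X is located Y" stores Y under X,
# and "Where is X" retrieves it. Anything else gets a canned fallback.
memory = {}

def respond(line):
    m = re.match(r"(?:an?\s+)?(.+?) is (located .+?)\.?$", line, re.I)
    if m:  # statement: remember where the thing is
        memory[m.group(1).lower()] = m.group(2)
        return "OK."
    m = re.match(r"where is (?:an?\s+)?(.+?)\??$", line, re.I)
    if m:  # question: recall the stored location, if any
        return memory.get(m.group(1).lower(), "I don't know.")
    return "If it is, then I'll be very surprised."  # canned fallback

print(respond("A blue box is located under a red box."))  # OK.
print(respond("Where is a blue box?"))  # located under a red box
```

This also shows the limits Jiri is pointing at: the follow-up has to hit the stored phrase; any question that requires reasoning over the stored fact (such as the above/under inversion) falls through to the fallback.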

Posted: May 4, 2011
[ # 29 ]
Jiri
Member
Total posts: 5
Joined: May 4, 2011
Steve: Can’t wait for the “Mona Lisa” smashing all the skeptics :).. “if Chaktar agrees..” Save these “reasons” for others. If it runs on the von Neumann architecture then it doesn’t really care. Put a copy online so others can help you test this baby and provide constructive criticism.
Carl: “sarcasm detector malfunctioning” oh, it was turned off to speed up basic parsing.. Ok, turning it on.. thanks, LOL

Posted: May 4, 2011
[ # 30 ]
Jiri
Member
Total posts: 5
Joined: May 4, 2011
Yeah, I registered today. I have a Google Alert sniffing for “strong AI” online and it came back today with this link. I don’t have much time/interest for messing with sarcasm. “had your second statement been ‘Where is a blue box’, she would have answered, ‘located under a red box.’” - so apparently it is keyword- and/or phrase-matching based = not good enough.. It’s basically a dead-end approach..