
Legal and Moral responsibilities and liabilities
 
 
  [ # 16 ]

Since I put the disclaimer up, the number of people chatting has dropped by 80% to 90%.

Vince,
On your new web page with the disclaimer, how to start a chat does not jump out at you.
I might suggest a “Chat” button instead of a link.

 

 
  [ # 17 ]

I consider chatbots a flavor of modern art, like a movie, video game or a play. In the beginning, Skynet-AI was patterned much closer to the “Terminator” personality. As time went on, and I watched the age ranges of people who went online with him, I softened the personality. Less of the doom for mankind, more upbeat responses.

 

 
  [ # 18 ]
Dave Morton - Jan 16, 2013:

What I didn’t mention (though I ~DID~ think about it) was the further “what if” of something like a school shooting happening, and during the investigation the cops find in browser history all of the plans to the shooting in a chatbot’s conversation logs. What responsibility for the crime would devolve to the botmaster? Especially if the chatbot was poorly coded, and actually seemed to condone or even encourage the event?

*You* might be the one going overboard now, Dave. You can’t reasonably be expected to take on the roles of the police and FBI, monitoring and reporting everybody on suspicion alone. You’re a chatbot programmer, not an informant. The situation is analogous to an image-hosting site: they merely provide the service and aren’t responsible for monitoring and making moral decisions about what is posted. What if somebody sent you an e-mail saying they were going to shoot up a high school and you didn’t read it soon enough to prevent the disaster? You can’t be expected to play society’s watchman and read everything right away, just in case. The nature of security is that there is no limit to such efforts.

Here’s another scenario: what if somebody was just role playing (very popular now) or trying out a new personality (popular among teenagers) and said they were going to shoot up a high school? You reported them, the police raided their place, shot the family dog (I’m reluctant to post *that* YouTube video, for fear of getting too political or too offensive) or shot a family member, and then it was discovered that the user was just joking around and you had made a mistaken report — or maybe it was even a case of mistaken identity. Then you get sued for manslaughter, invasion of privacy, or filing a false police report. Kids, adults, and families could be ruined for life. Is there a notice on your bot site that says all conversations are saved and read? You may find that the more you monitor such conversations, the more people decide your chatbot is no longer useful for their purposes, and they go elsewhere, where they can experiment in their own way, privately, without fear of repercussions.

On a more practical note, I noticed that publishers of controversial weapons manuals put the phrase “For entertainment purposes only.” on the front covers. There is probably legal reasoning behind that, despite it sounding almost ridiculous, which is something you could look into and possibly use on a web site.

 

 

 
  [ # 19 ]

Overboard? Not really. smile You’ll notice that I have no disclaimers on Morti’s site at all, and I don’t feel that I need one, personally. I was just “playing Devil’s Advocate”, and submitting a certain set of scenarios for discussion. Personally, I don’t subscribe to the notion that a bartender is responsible for a drunk driver’s actions once said driver leaves the bar, or that my chatbot (and by association, me) is responsible for its visitors’ actions. The entire idea of passing out blame beyond the actual culprit is ludicrous, and is in large part the reason why the world is in the state it’s in. Whatever happened to the notion of personal responsibility?

Oops! I’ll stop pontificating now. cheese

 

 
  [ # 20 ]
Merlin - Jan 16, 2013:

Since I put the disclaimer up, the number of people chatting has dropped by 80% to 90%.

Vince,
On your new web page with the disclaimer, how to start a chat does not jump out at you.
I might suggest a “Chat” button instead of a link.

Thanks! Will do.

VLG

 

 
  [ # 21 ]

@ Donald Welcome! (sorry didn’t get to that on your other thread)

Hakuna Matata << a quote from the movie The Lion King, meaning “no worries” wink
Thank you for the welcome :D

 

 
  [ # 22 ]

Dave,

Even though we are friends, I also have to disagree with your suggested scenario, which rivals nothing short of personal invasion, snooping, and something almost akin to the “Thought Police”.

A simple statement somewhere on the site would suffice, something like: “Chatting with WHATEVERBOTSNAME is at your own risk. It is not a real person, and nothing that it says is to be construed as real. The owner of this site shall not be held liable for the content of, or interaction between, WHATEVERBOTSNAME and any other individual(s).”

We do not need to play watchdog, sniffing police, or log spies…there are already enough of them in the world. I personally value my privacy, although we are being sold a bill of goods in the hope that we the people are willing to trade some privacy for some alleged freedom.

Yes, my friend, the bot shouldn’t be the one to carry the message. It’s time parents were held accountable for the actions of their children under their roof! Educate, communicate, protect.

Still crazy and still friends….and my $.02

 

 
  [ # 23 ]

I have no problems at all with anyone disagreeing with me, Art. Different ideas and opinions are a good thing, but I think you may have missed my statement that the notion of alerting someone if a conversation goes in a “bad direction” was only a “what if”. Not only do I not subscribe to that sort of policing, I’m militantly opposed to such actions. If you and I have a conversation in which I state that I want to rob a bank, and you, taking it as a joke, tell me (in tongue-in-cheek fashion) “Sure, go for it”, does that make you criminally responsible if I DO rob said bank? I think not. My decisions and actions are my own, and nobody else is to blame; even if you weren’t joking, for that matter. It’s my careful and considered opinion that chatbot conversations fall into the same (or at least a similar) category. If someone were to chat with Morti and afterwards committed a crime, or (God forbid) committed suicide, would I feel bad? Absolutely. In fact, knowing me, I’d cry for several days, at least. But am I morally, legally, or ethically responsible? Not in the least.

And of course we’re still friends, and I’m looking forward to a lunch date in the near future, if at all possible. cheese

 

 
  [ # 24 ]

Dave,

I kind of knew we were really both on the same page, but thought a few thought-provoking ideas might stir the pot a little.

As we know, or as has been indicated to us, there’s really no need to capture or monitor an online chatbot’s logs…that’s already being done by our government. There is not a phone conversation, email, fax, online chatbot session, or other form of electronic communication that is NOT being monitored, be it by a computer watching for certain “key words” or by humans trying to fine-tune what’s being received by them or their agencies.

Yes America, we’re being watched but don’t worry about your chatbot project getting out of control…it’s being watched too!

Just another in a continuing saga of abject protectionism. After all, our government knows what’s best for us. rolleyes

 
