Do you think learning a language requires empathy?

Status
Not open for further replies.

jutfrank

VIP Member
Joined
Mar 5, 2014
Member Type
English Teacher
Native Language
English
Home Country
England
Current Location
England
Very interesting what you say about the Dota 2 experiment.

I don't think that machines can properly learn language, no. I also believe that not only are we not close to creating sentient AI, it is not in fact possible for us ever to do it.

In answer to your final question—yes, I think so.
 

Glizdka

Key Member
Joined
Apr 13, 2019
Member Type
Other
Native Language
Polish
Home Country
Poland
Current Location
Poland
Could you please tell me why you think it's impossible to create sentient AI? My view is radically different, and I always like to learn about what others think.

In regard to the Dota 2 experiment, I don't know how connected it is with the AI used in Teslas, but Elon Musk is a co-founder of OpenAI, so I'd expect at least some level of connection.
 

jutfrank

VIP Member
Joined
Mar 5, 2014
Member Type
English Teacher
Native Language
English
Home Country
England
Current Location
England
Could you please tell me why you think it's impossible to create sentient AI?

My personal view on this issue would take too many words to explain here. I don't think I could do it in just a couple of paragraphs. Plus, it would sidetrack somewhat from the thread. I'm more than happy to discuss the issue elsewhere, however.
 

Tdol

No Longer With Us (RIP)
Staff member
Joined
Nov 13, 2002
Native Language
British English
Home Country
UK
Current Location
Japan
"The Selfish Gene" is a good read.

Richard Dawkins said that The Selfless Gene might have been a better title, but not as catchy.
 

Raymott

VIP Member
Joined
Jun 29, 2008
Member Type
Academic
Native Language
English
Home Country
Australia
Current Location
Australia
Could you please tell me why you think it's impossible to create sentient AI?
I think it's very possible if, by sentient, you mean able to receive sensory input and respond to it. We have that now.
If you mean self-aware, or self-conscious, that's less clear.
 

Glizdka

Key Member
Joined
Apr 13, 2019
Member Type
Other
Native Language
Polish
Home Country
Poland
Current Location
Poland
I think it's very possible if, by sentient, you mean able to receive sensory input and respond to it. We have that now.
If you mean self-aware, or self-conscious, that's less clear.

I've been thinking about the topic. I'm not entirely sure what I understand by 'sentient'. I'm sure it at least partially consists of emotions, and I do believe artificial intelligence can possess them, though that isn't the case yet.
 

Raymott

VIP Member
Joined
Jun 29, 2008
Member Type
Academic
Native Language
English
Home Country
Australia
Current Location
Australia
I'm sure that robots will be able to display appropriate emotional behaviour. They will be able to frown, smile, maybe even cry if some robotics engineer wants to install a water reservoir in their 'eyes'. But I doubt whether they will feel anything. Feeling requires a body, not just cognition.
 

Tdol

No Longer With Us (RIP)
Staff member
Joined
Nov 13, 2002
Native Language
British English
Home Country
UK
Current Location
Japan
Turing suggested that the differences could be eroded and, once we couldn't tell the difference, the machines would have got there.
 

Raymott

VIP Member
Joined
Jun 29, 2008
Member Type
Academic
Native Language
English
Home Country
Australia
Current Location
Australia
I hope he didn't mean that once humans couldn't tell the difference, the robot was actually feeling something. That would be rather illogical. "I don't know whether you're feeling anything, therefore you are feeling something."

PS: Was that a variant of the Turing Test?
 

probus

Moderator
Staff member
Joined
Jan 7, 2011
Member Type
Retired English Teacher
Native Language
English
Home Country
Canada
Current Location
Canada
Research by Alexander Z. Guiora, Robert C. L. Brannon, and Cecelia Y. Dull of the University of Michigan, "Empathy and Second Language Learning" — this may interest you.

Wow.

For the first time, the thank button seems insufficient to me. I feel the need for a 'wow' button. That stuff will keep me busy for days and days. Thanks indeed.
 

Tdol

No Longer With Us (RIP)
Staff member
Joined
Nov 13, 2002
Native Language
British English
Home Country
UK
Current Location
Japan
I hope he didn't mean that once humans couldn't tell the difference, the robot was actually feeling something. That would be rather illogical. "I don't know whether you're feeling anything, therefore you are feeling something."

PS: Was that a variant of the Turing Test?

No, but if we couldn't spot the machines, then it would no longer be particularly relevant.
 

Raymott

VIP Member
Joined
Jun 29, 2008
Member Type
Academic
Native Language
English
Home Country
Australia
Current Location
Australia
I see. That's what Turing said about the Turing test. If a human couldn't tell whether they were talking to a machine, then the machine could be considered intelligent. But emotions are different. They have more ethical implications. If we make machines that can feel emotions like a human can, we're ethically obliged to treat them as well as we do humans.
In fact, we are probably ethically obliged not to make machines that can become suicidally depressed or suffer needlessly from any of the severely unpleasant emotions that humans can suffer. Bearing that in mind, I don't believe it's irrelevant whether a robot is suffering from agonising mental pain or just appears to be behaving as if it were.
 

probus

Moderator
Staff member
Joined
Jan 7, 2011
Member Type
Retired English Teacher
Native Language
English
Home Country
Canada
Current Location
Canada
If we make machines that can feel emotions like a human can, we're ethically obliged to treat them as well as we do humans.

I think that is at least debatable. The amygdala, where emotions are thought to reside, is common to all mammals. And anybody who has had a dog knows that they love us intensely. Dogs can also be jealous, as I know from personal experience. And while we have laws against cruelty to animals, not everyone agrees that dogs must be treated as well as people.
 

Glizdka

Key Member
Joined
Apr 13, 2019
Member Type
Other
Native Language
Polish
Home Country
Poland
Current Location
Poland
I don't believe passing the Turing test (being unable to distinguish a machine from a human when speaking to it) is the same as being sentient. The robot might just be good at mimicking language and all of its features, whether grammar or vocabulary. It might even be good at correlating responses with the prompts that elicit them, but this is something it can learn by studying how humans speak (recognizing and replicating patterns). However, its responses would not represent its 'thoughts'; they would not be its own words, just replications of the most commonly observed human responses. Much like how Raymott described it.

I'm sure that robots will be able to display appropriate emotional behaviour. They will be able to frown, smile, maybe even cry if some robotics engineer wants to install a water reservoir in their 'eyes'. But I doubt whether they will feel anything. Feeling requires a body, not just cognition.

I don't think, however, that 'feeling' requires a physical body at all. A fully virtual entity is also capable of processing input information and being motivated by the data it analyzes. The fact that our emotions are triggered by a wide variety of neurotransmitters doesn't make them special; artificial intelligence can also operate in distinctive states we would call emotions, even though they are triggered by electrical signals. It can also correlate these states with what causes them, developing long-term relations or pre-emptive responses — e.g. to 'fear' what causes it harm, and to 'love' what is important to it.

The emotions are not any less real just because the entity has no physical, biological body, or because the various, distinctive modes it operates in are triggered by electrical signals rather than chemical ones.
 

Raymott

VIP Member
Joined
Jun 29, 2008
Member Type
Academic
Native Language
English
Home Country
Australia
Current Location
Australia
Yes, I agree about the Turing test. I don't think anyone has suggested that passing it was a sign of sentience. And it has nothing to do with emotions.
I also take probus's point that we don't treat animals as emotionally the same as humans, but I did specify "If we make machines that can feel emotions like a human can ..." and I stand by that. Also, we didn't deliberately set out to engineer the specifications of dogs or other animals, so that limits our moral responsibility somewhat.

We can disagree about whether robots will ever have human-like emotions. It depends to some extent on whether programming the ability to suffer intense negative emotions into a robot is something that we would consider necessary to do. Or perhaps that could become an emergent property that develops regardless of human programming.
 

jutfrank

VIP Member
Joined
Mar 5, 2014
Member Type
English Teacher
Native Language
English
Home Country
England
Current Location
England
I'm sure that robots will be able to display appropriate emotional behaviour. They will be able to frown, smile, maybe even cry if some robotics engineer wants to install a water reservoir in their 'eyes'. But I doubt whether they will feel anything. Feeling requires a body, not just cognition.

Just to say that I fully agree with this.
 

jutfrank

VIP Member
Joined
Mar 5, 2014
Member Type
English Teacher
Native Language
English
Home Country
England
Current Location
England
Turing never meant to suggest that passing the Turing test had anything at all to do with sentience. His idea of an 'imitation game' was proposed merely as a challenge to those researchers working in the field to develop more intelligent machines.
 

Glizdka

Key Member
Joined
Apr 13, 2019
Member Type
Other
Native Language
Polish
Home Country
Poland
Current Location
Poland
My personal view on this issue would take too many words to explain here. I don't think I could do it in just a couple of paragraphs. Plus, it would sidetrack somewhat from the thread. I'm more than happy to discuss the issue elsewhere, however.
Yep... this thread has turned into a discussion about artificial intelligence. Foreshadowing much?
Research by Alexander Z. Guiora, Robert C. L. Brannon, and Cecelia Y. Dull of the University of Michigan, "Empathy and Second Language Learning" — this may interest you.
This was a really good read. I'm glad I read the whole of it. I agree with probus that there should be a 'wow' button.

Thank you all for posting in this thread, but I'd like to return to the original question. I've been thinking about another case of how empathy correlates with using language. What do you think about sociopaths? Especially those who are very convincing and use language extremely well (not only verbal language, but also body language). I was watching one of Jordan Peterson's lectures where he gave the prison interview with Paul Bernardo as an example of how a psychopath can use language to manipulate our emotions. What do you think?
 

jutfrank

VIP Member
Joined
Mar 5, 2014
Member Type
English Teacher
Native Language
English
Home Country
England
Current Location
England
I don't think Paul Bernardo is using language in particular. He's using a whole set of manipulative techniques.
 