Author Topic: Could AI robots develop prejudice on their own?


rangerrebew

  • Guest
Could AI robots develop prejudice on their own?
« on: September 08, 2018, 03:50:01 pm »

Could AI robots develop prejudice on their own?
September 6, 2018, Cardiff University
 

Showing prejudice towards others does not require a high level of cognitive ability and could easily be exhibited by artificially intelligent machines, new research has suggested.

Computer science and psychology experts from Cardiff University and MIT have shown that groups of autonomous machines could demonstrate prejudice by simply identifying, copying and learning this behaviour from one another.

It may seem that prejudice is a human-specific phenomenon that requires human cognition to form an opinion of, or to stereotype, a certain person or group.

https://techxplore.com/news/2018-09-ai-robots-prejudice.html

Offline sneakypete

  • Hero Member
  • *****
  • Posts: 52,963
  • Twitter is for Twits
Re: Could AI robots develop prejudice on their own?
« Reply #1 on: September 08, 2018, 06:29:04 pm »
Not a chance. Even human teens who spend all their waking hours on a cell phone or computer don't develop a personality.
Anyone who isn't paranoid in 2021 just isn't thinking clearly!

Online The_Reader_David

  • Hero Member
  • *****
  • Posts: 2,312
Re: Could AI robots develop prejudice on their own?
« Reply #2 on: September 08, 2018, 06:51:58 pm »
The answer to the headline question is, of course, yes, since that's exactly what an AI system made for interacting with the real world is designed to do: after some interactions, it begins to prejudge upcoming interactions based on their similarity to prior interactions, rather than treating each one as a completely new circumstance (to which it would react randomly). Doing the latter would be called artificial naivety, rather than artificial intelligence. Recognizable analogues of human prejudices will arise whenever those prejudices have a statistical basis in the training data; the stronger that basis, the more rapidly and strongly they form, as they will if the training data is unwittingly (or maliciously) biased.
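The mechanism described above can be sketched with a toy example. The learner, group labels, and outcome counts below are all invented for illustration: a learner that simply predicts the majority outcome it has seen for a group will faithfully reproduce any bias in its sample.

```python
from collections import defaultdict

class FrequencyLearner:
    """Toy 'prejudging' learner: predicts the outcome for a group
    based only on the majority outcome observed for that group so far."""

    def __init__(self):
        self.counts = defaultdict(lambda: {"good": 0, "bad": 0})

    def observe(self, group, outcome):
        self.counts[group][outcome] += 1

    def predict(self, group):
        c = self.counts[group]
        # Prejudge from past statistics; ties default to "good".
        return "good" if c["good"] >= c["bad"] else "bad"

# A biased sample: group B is over-represented among "bad" outcomes.
training = ([("A", "good")] * 9 + [("A", "bad")] * 1 +
            [("B", "good")] * 4 + [("B", "bad")] * 6)

learner = FrequencyLearner()
for group, outcome in training:
    learner.observe(group, outcome)

print(learner.predict("A"))  # "good"
print(learner.predict("B"))  # "bad" -- the bias in the sample becomes the rule
```

Nothing here requires high-level cognition: the prejudice is just the statistics of the (biased) sample turned into a rule.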
« Last Edit: September 08, 2018, 06:54:03 pm by The_Reader_David »
And when they behead your own people in the wars which are to come, then you will know what this was all about.