Sentience is nonsense. Consciousness is too. The best a machine can do is mimic life.
However, it can be inhabited, and that will be bad.
So, who are the experts who will decide if a machine is sentient?
It sure seems they cannot even agree on when human life begins, or they are hobbled by considerations which deny them the will or the ability to decide.
If the conflicts of interest are so strong, despite their subtlety, that we cannot define consciousness and sentience within our own species, how in the name of all Creation are they going to get this one right?
With machines, and increasingly with humans, the old acronym applies: GIGO (garbage in, garbage out). Without a scrupulously honest appraisal of what constitutes sentience or consciousness, and without the ability to define the parameters which decide the presence or absence of either, it is readily apparent that conflicting interests place the definitions of when "life" begins and ends at arbitrary milestones. Those milestones are subject to other influences: the removal of inconvenient human organisms through 'procedures' which relieve the decision-makers of long-term commitment and responsibility, or the harvesting of parts for acclaim and even profit (or, for some, an end to expense).
If we cannot define that which we ourselves possess, in spite of those conflicts of interest, how will we be able to define it in another species, in a machine, or even in an alien life form, especially one with which we are unable to communicate?
It seems the definitions hinge on the ability, not to think or experience, not to synthesize new thought from old thought and new data, but on the ability to communicate those thoughts.
If my speech were instead in tones which could not be detected by the human ear, or in light forms not visible to the human eye, would I be considered sentient? No. Therein lies a problem typical of any communication: the data (for want of a better word) must first be sent, but for communication to occur, it must also be received and understood.
We have arbitrarily defined consciousness as the ability to receive and react to stimuli, but if we do not perceive the reaction, we deny it is present. In the above instances of conflict of interest, such reactions may be written off as mere instinct or reflex, and not a sign of consciousness or sentience.
How many people who were arbitrarily deemed "brain dead" or in a persistent vegetative state, and who eventually revived enough to communicate, have related that they were aware of the things going on around them, of what was said and done, but were unable to communicate that awareness?
Conscious? Well, yes, but not so that the people around them were aware. Sentient? Again, yes, but unable to communicate it. Some consider that a developing child in utero feels no pain, while others play music and read to their growing baby, again, in utero.
Before we go looking for a machine that thinks, perhaps we had best define what makes our own species conscious or sentient, since we are the ones who will create, program, and define that machine.
Ours is an age which is proud of any machine which thinks, and suspicious of any man who tries to.
Howard Mumford Jones