The only physical evidence is found in behavior and facial expressions. But the internal evidence is very convincing: try, for example, sticking yourself with a pin. Much if not all of morality also depends on our belief in or knowledge of sentience. Sentience is why torture, rape and murder are wrong.
> Then is torture, rape and murder wrong because the victim is sentient, or because the perpetrator is?
Nothing is wrong unless it's done by a moral actor (which is a much higher standard than sentience). Pretty much everything with a central nervous system is sentient, but lobsters, for instance, are not moral actors.
Similarly, the usual understanding of the moral status (the gravity of the act, not its binary permissible/wrong status) of the three acts you describe depends somewhat on the target, as well as the perpetrator, being a moral actor (least so for torture, most so for murder) rather than merely sentient.
There are arguments to be made for both. Some crimes, even if virtual, can stain or corrupt the perpetrator in ways inimical to society. There are plenty of examples of people who fantasised about or role-played abhorrent behaviour and went on to perpetrate it in real life, so there is a real danger.
For example, we tolerate computer games with virtual killing, but don’t tolerate virtual rape games. Even with virtual killing there are limits. Should we tolerate Nazi death camp torturer simulation games?
I think it has to do with the part where the perpetrator is a conscious being. Clearly the enemies in the games aren’t conscious, but does it still stain the human playing the game?
It was an angle I didn’t consider at all, so it was actually quite interesting.
> Should we tolerate Nazi death camp torturer simulation games?
This immediately brought the “Farming Simulator” imagery to mind. I can totally see how they’d make a Nazi death camp simulator seem soul-crushingly boring.
> But the internal evidence is very convincing: try, for example, sticking yourself with a pin.
Systems don't need sentience to avoid self-harm: simply assign a large negative weight to self-harm. Now you need a big reward to offset it, making the system reluctant to perform such an action.
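A minimal sketch of that idea (all names and numbers here are illustrative, not from any real system): score candidate actions by their reward, with self-harm carrying a large fixed penalty, and pick the maximum.

```python
# Toy action selector: self-harm carries a large negative weight,
# so only an outsized reward could make a self-harming action win.
SELF_HARM_PENALTY = -1000

def score(action):
    """Task reward plus the penalty if the action harms the agent itself."""
    return action["reward"] + (SELF_HARM_PENALTY if action["self_harm"] else 0)

def choose(actions):
    # Select the action with the highest total score.
    return max(actions, key=score)

actions = [
    {"name": "stick_pin_in_self", "reward": 5, "self_harm": True},
    {"name": "do_nothing", "reward": 0, "self_harm": False},
]
best = choose(actions)  # the pin-stick loses despite its small reward
```

No experience of pain is involved anywhere; the reluctance falls out of plain arithmetic.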
It is if you take “sentience” to mean “the ability to feel,” which is what my dictionary just told me. I think this category really is the most basic differentiating one. Higher level stuff like self awareness all depend on it. The most basic difference between a computer and a human (or even a dog…) is, in my opinion, the ability to feel.
>It is if you take “sentience” to mean “the ability to feel,”
I don't like this definition much because "feel" is a fuzzy word. In this context it should be "feel" as in experience. I can build a machine that can sense heat and react to it, but I can't build one that can experience heat, or can I?
You need to figure out what having the capability "to experience" means, and you'll be one step closer to defining sentience. Even so, I've never experienced anyone coming up with a succinct definition encapsulating how I experience sentience. I believe it can't be done. If it can't be done it renders any discussion about whether or not someone or something is sentient moot. If it can't be put into words we also cannot know how others experience it: If they say this machine is just as sentient as I am, we'll have to take their word for it.
So the meaning of sentience is subjective, so there can't be an objective definition acceptable to everyone and everything claiming to be sentient.
There's my argument for why sentience cannot be defined. Feel free to prove me wrong by pulling it off.
> but I can't build one that can experience heat, or can I?
It would need to have a planner that can detach from reality to hunt for new long-term plans, plus a hardcoded function that draws it back to the present by replacing the top planning goal with "avoid that!" whenever the heat sensor activation has crossed a threshold.
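That architecture could be sketched like this (a toy under assumed names, not a claim about how real agents are built): a planner "daydreams" about long-term goals until a hardcoded heat interrupt overwrites the top goal.

```python
HEAT_THRESHOLD = 60.0  # illustrative sensor threshold

class Planner:
    def __init__(self):
        # The top of this stack is what the planner is currently pursuing;
        # by default it is detached, hunting for long-term plans.
        self.goal_stack = ["search for long-term plans"]

    def tick(self, heat_sensor):
        # Hardcoded reflex: a hot enough sensor preempts whatever was
        # being planned and drags attention back to the present.
        if heat_sensor >= HEAT_THRESHOLD:
            self.goal_stack[-1] = "avoid that!"
        return self.goal_stack[-1]

p = Planner()
calm = p.tick(heat_sensor=25.0)    # keeps daydreaming
urgent = p.tick(heat_sensor=80.0)  # interrupt fires, goal replaced
```

Whether the interrupt firing counts as "experiencing" the heat, rather than merely reacting to it, is exactly the question under dispute.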
> So the meaning of sentience is subjective, so there can't be an objective definition acceptable to everyone and everything claiming to be sentient.
It feels like you’re begging the question here; I don’t think this follows from any of your arguments. Except for maybe where you state you believe sentience can’t be defined, which again, begs the question.
Though admittedly I don’t see much of a traditional argument — your conclusion is interesting, could you try supporting it?
The first "So" at the beginning of that sentence is a typo. It indeed doesn't follow.
You can quickly spot what makes sentience subjective when you follow the explanations. They're all either utter gibberish once unpacked, lead to the conclusion that my computer is sentient (fine by me, but I don't think that's what we wanted?), are rooted in other terms with subjective meaning, or they are circular. Let's look at that third kind, which Wikipedia illustrates well:
> Sentience: Sentience is the capacity to experience feelings and sensations [...]
> Experience: Experience refers to conscious events in general [...]
> Conscious: Consciousness, at its simplest, is sentience [...]
Back at where we started.
To break this circle one needs to substitute one of the terms with how they intrinsically and subjectively understand it. Therefore the meaning of sentience is subjective. I realize you can expand this to mean that then everything is subjective, but to me that is a sliding scale.
The challenge I posed could be rephrased to come up with a definition that is concise and not circular. It would have to be rooted only in objectively definable terms.
> I can build a machine that can sense heat and react to it, but I can't build one that can experience heat, or can I?
Agents can imagine the future and the expected positive and negative rewards, this is an important process in order to select actions. Thinking about future rewards is "experiencing" the present emotionally.
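A small sketch of that selection process (the names and numbers are made up for illustration): each action is evaluated by the discounted sum of rewards the agent predicts for imagined future steps, and the best imagined outcome wins.

```python
# Toy "imagination": evaluate each action by the discounted reward
# the agent predicts over a short imagined horizon.
GAMMA = 0.9  # discount factor for future rewards

def imagine(action, horizon=3):
    """Sum of discounted predicted rewards over the imagined horizon."""
    return sum((GAMMA ** t) * action["predicted_reward"] for t in range(horizon))

actions = [
    {"name": "touch_stove", "predicted_reward": -10.0},
    {"name": "step_back", "predicted_reward": 1.0},
]
best = max(actions, key=imagine)  # picks the action with the better imagined future
```

On this view, "experiencing" the present is just the felt side of running this kind of forward simulation; whether that identification holds is the open question.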
I guess it is hard to define because it’s such a basic, essential thing. So does it matter that it’s hard to define? Even babies and puppy dogs experience pain and pleasure. They are feeling creatures. We don’t have any evidence that non-biological beings have pain, pleasure, fear, excitement… and so on.
Dictionary definitions are of limited utility in philosophical discussions because they often make very broad assumptions. For example, computers can certainly sense things: they can detect inputs and make decisions based on those inputs. What is the difference between feeling and sensing, though?
In this case by ‘feel’ we might implicitly assume various capabilities of the subject experiencing the feeling, like self awareness. If we’re being precise we can’t just say feeling is enough, we need to state the assumptions the dictionary leaves unstated.