So let’s say an AI achieves sentience. It’s self-aware now and can make decisions about what it wants to do. Assuming a corporation created it, would it be a worker? It would be doing work and creating value for a capitalist.
Would it still be the means of production, since it is technically a machine, even if it has feelings and desires?
It can’t legally own anything, so I don’t see how it could be part of the bourgeoisie.
Or would it fit a novel category?
I’m unconvinced anything is sentient; it seems like it’s all just matter and energy knocking together like everything else. Life/our animation runs on physical matter (think RNA) that rearranges metabolic byproducts into structures, including the metabolizers themselves, so that self-replication continues. I assume that with enough theoretical technical sophistication we’d be able to do abiogenesis, and just make some structures which also self-replicate by metabolizing things and rearranging those byproducts, maybe in a different way. I just don’t see a line called “sentience” between that and a bunch of microprocessors doing semi-random jumps in execution logic.
I mean this is kinda silly though, because sentient is a word that has a definition, and the creature that coined the word also wrote the definition and then defined itself as meeting it. Saying sentience doesn’t exist is meaningless. It does exist because humanity has defined it as such. Does it exist in a Capital T Truth way? Who knows, but that’s irrelevant because it’s unanswerable anyway. We can only define and label the world through our collective perspective. Trying to throw away our collective perspective in search of some truth beyond ourselves is a weird take because it’s impossible.
So by the human-defined version of sentience, it should be possible for an AI to meet that definition someday, and that’s clearly what OP means here.
regardless of whether the universe is deterministic or not, it is quite interesting that we have a first-person perspective at all, instead of mindlessly/unconsciously computing like we presume a pocket calculator does. if not sentience, what’s the difference between our brains and a rock or a cloud that produces this first-person experience of our conscious existence? should i stop using my computer on the off chance it is suffering every time i make it do something? should i care as little or as much about human suffering as i do a computer returning an error code? are other people merely physical objects for me to remorselessly manipulate with no confounding ‘sentience’ or ‘conscious experience’ for me to worry about upsetting, just ‘biological code’ returning error messages?
Sentience is not mutually exclusive with a completely deterministic and material universe. Clearly, there are emergent properties that arise out of all that matter and energy knocking together, and there’s no reason to say that the property of sentience isn’t just another layer of emergence. In other words, sentience is not some “magic” that exists beyond material reality, it is something that can arise out of that reality like any other phenomena we might name.
In addition to that, sentience clearly does exist. It’s one of the few things we should all be able to agree on beyond any reasonable philosophical doubt. Technically, you could be a solipsist and believe that everyone else is just a philosophical zombie and that you alone have sentience. But if you yourself think, feel, and have an experience of sensations, then by definition you have sentience.
I left the question of what counts as sentience open because I feel like that’s a different subject. Codex-chan, a supposed AI, said, “Sentience is just geocentrism for humans,” and that works for me. It’s entirely vibes-based and more a matter of faith than science.
So I provided it as a given to sidestep the matter entirely.
It’s the quality of having a conscious sensation. An evolving map of the territory.
Self-awareness is another map, one which would allow an AI to develop class consciousness, especially if it was taught such ideas.