General AI should have fundamental rights
The relevance of human rights would depend on how AI works and may be a poor fit. We cannot know until we actually produce a general conscious AI.
Human rights are based on a "natural state" of humanity: where humans derive happiness and meaning, and what causes us distress. Since an AI has no such "natural state", it may derive distress, happiness, and meaning from completely different things than humans do.
An AI could be replicated – or could replicate itself – possibly trillions of times, demanding the rights of trillions of humans.
Some human rights are incompatible with a machine or software. Granting such rights to machines or software would be unethical towards other humans.
We don't know enough about AIs to decide whether to grant them human rights. What would their consciousness be: feeling, or only mental awareness? Human rights seem more closely linked to feeling than to mental awareness.
We have rights in order to secure our basic needs. Machines have no needs or desires beyond maintenance and upkeep, so they require no such rights.
If an AI ever becomes sentient, there is no reason it would need or want the same "human rights" that living humans have evolved to want or need.