Intelligence does not equal will, and will does not equal the ability to act. At best, a new set of basic emergency rights should be created for sentient beings.
As long as AI does not have the same obligations imposed on it as any human being, it should not have the same rights.
Our human rights are guaranteed to us by the governments of the societies where we live. This "guarantee" is also what "grants" those rights to us in a technical (if not idealistic) sense. If we can technically grant (or deny) rights to humans, we can do the same for non-living entities.