I’m working out the laws of sentience for my own science fiction universe. At this stage I’m conceptualizing, not wording a polished version.
The principles of sentience
- one must never act to harm self or other sentients
- one must practice tit for tat with a tenth extra measure of forgiveness (see the sketch after this list)
- sentients disarm and uplift all subsentients to mitigate self-harm
- sentience is a measure of behavior, applicable only on millennial timescales
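The forgiveness principle maps onto what game theorists call generous tit for tat in the iterated prisoner's dilemma. Here is a minimal Python sketch of that reading, assuming a one-in-ten chance of forgiving a defection; the cooperate/defect framing, the function names, and the always-defecting opponent are all illustrative, not part of the universe's canon:

```python
import random

COOPERATE, DEFECT = "C", "D"

def generous_tit_for_tat(opponent_history, forgiveness=0.1):
    """Cooperate first; afterwards copy the opponent's last move,
    except forgive a defection with probability `forgiveness`
    (the "tenth extra measure of forgiveness")."""
    if not opponent_history:
        return COOPERATE
    last = opponent_history[-1]
    if last == DEFECT and random.random() < forgiveness:
        return COOPERATE
    return last

def play(rounds=20):
    """Toy run against a hypothetical always-defect opponent."""
    opponent_history, my_moves = [], []
    for _ in range(rounds):
        my_moves.append(generous_tit_for_tat(opponent_history))
        opponent_history.append(DEFECT)
    return my_moves

if __name__ == "__main__":
    print("".join(play()))  # e.g. "CDDDCDDDDD..." with occasional forgiving C's
```

The point of the extra tenth is that pure tit for tat can lock two sentients into an endless retaliation spiral after a single misread move; a small forgiveness rate lets them escape it.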
These ideas lead me to ask: where exactly does the Hippocratic principle of “first, do no harm” fail us as humans and lead to the mass-murder orgies of war?
Compare Asimov’s Laws of Robotics (see https://en.wikipedia.org/wiki/R._Daneel_Olivaw).
That is the concept I am inverting.