by Isaac Asimov

  • A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.