{"id":122,"date":"2024-03-22T20:02:58","date_gmt":"2024-03-22T20:02:58","guid":{"rendered":"https:\/\/misterhayden.com\/blog\/?p=122"},"modified":"2024-06-05T13:20:21","modified_gmt":"2024-06-05T13:20:21","slug":"hey-lawdy-mama-asimovs-robotics-laws","status":"publish","type":"post","link":"https:\/\/misterhayden.com\/blog\/2024\/03\/22\/hey-lawdy-mama-asimovs-robotics-laws\/","title":{"rendered":"Hey Lawdy Mama (Asimov&#8217;s Robotics Laws)"},"content":{"rendered":"\n<p>A while back an acquaintance posed the following question on LinkedIn:<\/p>\n\n\n\n<p>&#8220;Why do you think these three laws for robots or AI failed for humans:<br>The first law is that a robot shall not harm a human, or by inaction allow a human to come to harm.<br>The second law is that a robot shall obey any instruction given to it by a human,<br>and the third law is that a robot shall avoid actions or situations that could cause it to come to harm itself.\u201d<\/p>\n\n\n\n<p>Harold (Steve) Hayden&#8217;s Comment:<\/p>\n\n\n\n<p>Well, firstly, the above is not complete for laws 2 and 3; the complete\/correct laws are:<\/p>\n\n\n\n<p>&#8220;A robot must obey orders given it by human beings except where such orders would conflict with the First Law.<\/p>\n\n\n\n<p>A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.&#8221;<\/p>\n\n\n\n<p>Let&#8217;s not forget Asimov later added the \u201cZeroth Law,\u201d above all the others: \u201cA robot may not harm humanity, or, by inaction, allow humanity to come to harm.\u201d<\/p>\n\n\n\n<p>WRT those laws &#8220;failing&#8221;: having been a S\/W engineer for decades, I can say it&#8217;s a formidable challenge to code such laws (as real &#8220;behavior&#8221;), and an even greater challenge to &#8220;teach&#8221; them (e.g. to learning systems\/technologies). &#8220;Embedding&#8221;\/coding such laws so that they produce appropriate behavior in ALL systems is (IMHO) beyond our &#8220;state of the art&#8221;, or at least one heck of a challenge (and that&#8217;s before we even get to the plain old desire\/requirement to implement those laws in the first place).<\/p>\n\n\n\n<p>So, I would submit, they&#8217;ve not been coded into our systems, and therefore have not failed.<\/p>\n\n\n\n<p>FYI, back in the &#8217;80s, as a &#8220;visiting research scientist&#8221; at Vanderbilt University&#8217;s Center for Intelligent Systems, I was co-host for a &#8220;Workshop on Programming Environments for Intelligent Systems&#8221;, and I gave each of the 150+ attendees an EEPROM which I had &#8220;programmed&#8221; to contain the three original laws (before Asimov&#8217;s &#8220;0th&#8221; one appeared) in ASCII format, with the challenge: &#8220;figure out what&#8217;s in them&#8221;. Likely those chips now live in drawers\/landfills, and nobody but me (and whoever reads this) will ever know they exist.<\/p>\n\n\n\n<p>Was &#8220;Liked&#8221; by David Cay Johnston<br>https:\/\/www.linkedin.com\/in\/david-cay-johnston-a722934\/?original_referer=https%3A%2F%2Fmail.yahoo.com%2F<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A while back an acquaintance posed the following question on LinkedIn: &#8220;Why do you think these three laws for robots or AI failed for humans: The first law is that a robot shall not harm a human, or by inaction allow a human to come to harm. The second law is that a robot shall obey any 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/posts\/122"}],"collection":[{"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/comments?post=122"}],"version-history":[{"count":4,"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/posts\/122\/revisions"}],"predecessor-version":[{"id":129,"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/posts\/122\/revisions\/129"}],"wp:attachment":[{"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/media?parent=122"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/categories?post=122"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/misterhayden.com\/blog\/wp-json\/wp\/v2\/tags?post=122"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}