We would like to know your opinion about the three laws mentioned in the movie I, Robot.
The movie refers to the three basic laws of robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the first law.
3. A robot must protect its own existence as long as such protection does not conflict with the first or second law.
We are convinced that these basic laws should also be built into every machine-learning program and, more generally, into any artificial intelligence software.