I loved this book when I read it in high school, and I geeked out and left a sticky note for the author on his office door when I toured UC San Diego. I was excited by almost everything I found here, except for what he says at the end. His argument that robots need emotions to be better servants to people makes no sense to me. Emotions in a "creature" require a will to honor them, and the fact that robots have no emotions is what makes it ethical for them to serve us in the ways they do. His idea of true robot servants with emotions just reads like slavery to me. Why give something the capacity to feel when its purpose can be realized without that capacity? Program it to do what you want instead of making it feel like doing what you want. That conclusion seemed ridiculous to me; I can't understand how he got there. That aside, I loved this book.