A robot could certainly be programmed to get food for a sick, dying friend (drones already deliver Uber Eats, after all), but it will never understand why, or have a phenomenal experience of the act, or a mental state of performing it, or the biological brain state of performing it, and so on.


Interesting. I wonder why?

Perhaps when we deliver food to a sick friend we subconsciously feel an "atta boy" from the parents who "trained" us in how to be kind when we were young, selfish things.

If that's all it is, we could of course "reinforce" the same behavior in an AI.
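
For what it's worth, here's a toy sketch of that idea in Python: a "trainer" hands out a reward for the kind action, and a simple tabular learner's preference drifts toward it. Everything here (the action names, the reward and step-size values) is invented for illustration, not a claim about how any real system is trained.

    import random

    ACTIONS = ["ignore_friend", "deliver_food"]

    def parental_reward(action):
        # The "atta boy": +1 for kindness, 0 otherwise.
        return 1.0 if action == "deliver_food" else 0.0

    values = {a: 0.0 for a in ACTIONS}   # tabular action-value estimates
    epsilon, step_size = 0.1, 0.1

    for _ in range(1000):
        if random.random() < epsilon:
            action = random.choice(ACTIONS)       # explore
        else:
            action = max(values, key=values.get)  # exploit current preference
        reward = parental_reward(action)
        # Nudge the estimate toward the observed reward.
        values[action] += step_size * (reward - values[action])

    print(values)  # deliver_food's value approaches 1.0; ignore_friend stays near 0

After enough steps the agent "prefers" the kind action, though that's arguably the original point: nothing in that dict understands why.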


"Never" is a very broad word.



