
If an AI thinks for you, you're no longer "outsourcing" parts of your mind. What we call "AI" today is technically impressive, but it is not the end point of where AI is likely to go. Imagine, for example, an AI smart enough to emotionally manipulate you: at what point in that interaction do you stop outsourcing your own thinking and instead become a conduit for the thoughts of an artificial entity? It speaks to our collective hubris that we seek to create an intellectually superior entity and yet still think we'll maintain control over it instead of the other way around.




> we seek to create an intellectually superior entity and yet still think we'll maintain control over it instead of the other way around.

Intellect is not the same thing as volition.


> Intellect is not the same thing as volition.

Two questions...

1. Do you think it's impossible for AI to have its own volition?

2. We don't have full control over the design of AI. Current AI models are grown rather than fully designed, and the outcomes of that process are not predictable. Would you want to see limits placed on AI until we have a better grasp of how to design AI with predictable behaviour?


There's a parallel there to drugs. They are most definitely not "intelligent", yet they can still destroy our agency or free will.


