If an AI thinks for you, you're no longer "outsourcing" parts of your mind. What we call "AI" today is technically impressive, but it is not the end point of where AI is likely to go. For example, imagine an AI smart enough to emotionally manipulate you: at what point in that interaction do you lose your agency, no longer "outsourcing" your own thinking but instead acting as a conduit for the thoughts of an artificial entity? It speaks to our collective hubris that we seek to create an intellectually superior entity and yet still assume we'll maintain control over it, rather than the other way around.
1. Do you think it's impossible for AI to have its own volition?
2. We don't have full control over the design of AI. Current AI models are grown rather than fully designed, and the outcomes of that process are not predictable. Would you want to see limits placed on AI until we have a better grasp of how to design systems with predictable behaviour?