From the title, I thought Charlie would be in full rant mode, but this is middle of the road for him. Windows enshittification continues, and will continue, for sure.
But he’s absolutely wrong that most people care about whether Recall stores things or not. They really, really want what Copilot offers (note I didn’t say delivers, yet), which is labor savings on their knowledge work. My kids use a Recall-like tool on the Mac for classroom lecture notes, and they find it helpful. People whose job is making PowerPoints from information in emails are going to find Copilot helpful.
These are the people OpenAI believes won’t have jobs eventually, because this sort of work is going from ‘needs human with college degree’ to ‘automatable’ very rapidly.
Charlie doesn’t offer any alternative architectures for this in his post, and that’s because there aren’t really any — MS delivered a locally encrypted database with RAG, and that’s … as good as it gets for this feature set right now. If nobody wants it, it will die. But, a lot of people want it, because they believe they’ll keep their jobs and get to be more productive in the meantime.
> MS delivered a locally encrypted database with RAG, and that’s … as good as it gets for this feature set right now.
I don't think that's true. At the very least they could have separately encrypted the DB with a locking timeout, the way password managers do. That alone would eliminate a number of people's concerns.
Beyond that, there are a ton of ways they could make the feature safer. They could have a setting where, anytime the DB is unlocked, the system periodically uses Windows Hello to confirm that the user who typed the password is still in front of the machine. They could proactively prompt the user to update rules or pause Recall when it's capturing snapshots the user might not want stored (and I question the overall utility of an AI that can't do that reasonably well for the common cases).
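The auto-lock idea is the same pattern password managers use: keep the decryption key in memory only while the vault is in active use, and wipe it after an idle timeout so a later re-auth is forced. A minimal sketch of that pattern (the class and method names here are illustrative, not any real Recall or Windows API):

```python
import os
import time
import threading

class LockingKeyStore:
    """Hold a decryption key in memory and wipe it after an idle
    timeout, the way password-manager vaults auto-lock."""

    def __init__(self, idle_timeout_s: float = 300.0):
        self._key: bytearray | None = None   # bytearray so we can zero it
        self._timeout = idle_timeout_s
        self._last_use = 0.0
        self._mutex = threading.Lock()

    def unlock(self, key: bytes) -> None:
        """Called after the user authenticates (e.g. via Windows Hello)."""
        with self._mutex:
            self._key = bytearray(key)
            self._last_use = time.monotonic()

    def get_key(self) -> bytes:
        """Return the key for a DB read; relock first if idle too long."""
        with self._mutex:
            if self._key is not None and \
                    time.monotonic() - self._last_use > self._timeout:
                self._wipe()
            if self._key is None:
                raise PermissionError("store is locked; re-authenticate")
            self._last_use = time.monotonic()
            return bytes(self._key)

    def lock(self) -> None:
        """Explicit lock, e.g. on screen lock or user request."""
        with self._mutex:
            self._wipe()

    def _wipe(self) -> None:
        if self._key is not None:
            for i in range(len(self._key)):
                self._key[i] = 0
            self._key = None
```

Any query against the snapshot DB would go through `get_key()`, so an attacker who reaches the machine after the timeout finds only a locked store, not a live key.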
But all those things would be active reminders to the user of how creepy and dangerous the feature is, so instead they have to pretend that they've thought of everything and that there's no reason to be concerned.
These are fair, but I'd argue that they're still largely security theatre, and, to your point, worsen the emotional experience a lot. As a product manager, I think it's a fair shake to be like "We expect that generally your Windows machine is secure, and we do a little extra here, no worries mate!" Not that I believe the priors embedded in that pitch; I don't. But, most customers do already, and it's probably a bad plan to make them doubt those priors.
Anyway, you said it precisely at the end: don't remind the customers how creepy this is.