Your enthusiasm is great! People don't want to quash your enthusiasm, and I'm in the same boat.
But while enthusiasm is great, delusion is not. Since you're striving to be a founder and not a hobbyist, you have to be realistic about what you're trying to build.
What you're describing is fundamentally not possible to provide assurances on without some kind of legitimate AGI, which you lack the resources to build yourself.
Many better-resourced companies are already trying to provide grounded, factually accurate information, so this seems like an area of effort far too broad to ever succeed in.
I would suggest pivoting to demonstrating legitimacy in a very narrow niche before attempting to be a generalist know-it-all. Providing fine-tuning as a service to the point of assured factual grounding is itself a hard enough open challenge in AI.
This is the only wise response in the entire thread. OP, please listen to this very valid criticism. Misinformation in general is not a solvable problem, nor do I believe you could ever approach a good solution.
You are tackling an extremely broad, nuanced, unsolvable problem.
You and your friends are obviously incredibly bright; pivot to something more narrowly focused. Maybe you can fact-check some sub-genre of information where the problem is solvable?
Think sports scores, building heights, and structural engineering. Hard, concrete facts.
As soon as you get into anything with any degree of subjectivity, misinformation becomes impossible to solve.
I honestly thought Hacker News of all places would have given you better advice in line with the above commenter, but what's actually happening is that people are filling you with false hope because you are young.
I was in a similar position when I was younger, and as I've gotten older and had some successes I've learned to listen for valid criticism.
Block out the noise, both positive and negative. Listen to the wise ones.