Is there a way to use Mistral-large with TTS and STT engines so you can converse with it like you can with ChatGPT in the mobile app? It's really great on long drives for learning and talking about stuff, like a customized personal podcast.
Exactly, I absolutely love this feature. And many times the conversation is quite natural and fluid (with good internet connection). I think I'll build something like that myself (:
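If you do build it, the loop is basically three stages chained per turn: speech-to-text, a chat-model call with running history, then text-to-speech. Here's a minimal sketch of that structure; every function here is a stub with a hypothetical name (in a real build you'd swap in an actual STT model, the Mistral chat API, and a TTS engine):

```python
# Minimal voice-assistant loop sketch: STT -> LLM -> TTS.
# All three stages are stubbed; the names and behavior are placeholders,
# not any real library's API.

def transcribe(audio: bytes) -> str:
    """Stub STT: pretend the mic audio decodes to a fixed question."""
    return "What is Mistral-large?"

def query_llm(prompt: str, history: list[dict]) -> str:
    """Stub LLM call: return a canned reply, keeping chat history for context."""
    history.append({"role": "user", "content": prompt})
    reply = f"You asked: {prompt}"
    history.append({"role": "assistant", "content": reply})
    return reply

def speak(text: str) -> bytes:
    """Stub TTS: encode the reply as bytes standing in for audio."""
    return text.encode("utf-8")

def converse_turn(audio: bytes, history: list[dict]) -> bytes:
    """One turn of the conversation: hear, think, speak."""
    prompt = transcribe(audio)
    reply = query_llm(prompt, history)
    return speak(reply)

history: list[dict] = []
out = converse_turn(b"<mic audio>", history)
print(out.decode("utf-8"))
```

Keeping `history` outside the turn function is what makes the conversation feel continuous across turns, the same way the ChatGPT voice mode carries context.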
Not sure. Most of the time GPT-4 is better. Since I'm using Vercel AI playground[1], on almost every query I get a response from all models so it's easy to compare.
On other tasks, overall, GPT-4 seems to be the best, but by a very small margin, so I cancelled my subscription. The native mobile app is really great, though.