My workflow for any complex query is to ask it in multiple AI chats (Gemini, Claude, o3, ...) in parallel and then continue the conversation in whichever chat gave the most useful response.
I built a simple open source app that queries 10+ AI models at once and summarizes their answers with a combiner AI model of your choice.
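If you're curious how that works conceptually, the core idea is a fan-out/combine loop: send the same prompt to every model concurrently, then hand all the answers to one combiner model to summarize. Here's a rough sketch of that pattern in Python; the model names and the query_model helper are placeholders for illustration, not the actual code from the repo:

```python
# Minimal sketch of the fan-out/combine pattern (illustrative only, not the repo's code).
import asyncio

MODELS = ["gemini", "claude", "o3", "gpt-4o"]  # hypothetical model identifiers
COMBINER = "claude"                            # model used to merge the answers

async def query_model(model: str, prompt: str) -> str:
    """Placeholder: call the provider's API for `model` and return its text reply."""
    await asyncio.sleep(0)  # stand-in for the real network call
    return f"[{model}] answer to: {prompt}"

async def ask_all(prompt: str) -> str:
    # Fan out: query every model concurrently.
    answers = await asyncio.gather(*(query_model(m, prompt) for m in MODELS))
    # Combine: ask a single model to summarize and reconcile the answers.
    combined_prompt = (
        "Summarize and reconcile these answers to the same question:\n\n"
        + "\n\n".join(answers)
    )
    return await query_model(COMBINER, combined_prompt)

if __name__ == "__main__":
    print(asyncio.run(ask_all("Explain the CAP theorem in one paragraph.")))
```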
There's a GIF in the GitHub repo that shows it in action. You can try it on your local machine: https://github.com/Nexarithm/multi_model_chat
If you're interested, I also wrote a detailed blog post on the technical details, the features of personal helper sites, and the limitations of vibe coding: https://www.proxai.co/blog/archive/multi-model-ai-chat-app
I'd love to hear your feedback. Feel free to open an issue or a pull request on GitHub!