add gemini support and remove default models #20
Attachment: 2025-04-15_23-01-47_automated_concept_sae_eval_attempt_0_reflection1.pdf

Thank you, would it be possible to make this a more minimal addition, focused on adding Gemini support and changing nothing else?

Adding Gemini support means generalising it further: the backend search only checks for OpenAI or Claude models, so it will fail for Llama, DeepSeek, or the other examples given in the LLM list here.

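As a hypothetical sketch of what generalising that check could look like (the prefixes and provider names below are illustrative assumptions, not the repo's actual model list), a prefix-to-provider table avoids hardcoded `if model in (...)` branches:

```python
# Sketch only: map model-name prefixes to providers instead of checking
# for a fixed set of OpenAI/Claude names. All entries are assumptions.
PROVIDER_PREFIXES = {
    "gpt-": "openai",
    "o1": "openai",
    "claude-": "anthropic",
    "gemini-": "gemini",
    "deepseek-": "deepseek",
    "llama": "openrouter",
}

def resolve_provider(model_name: str) -> str:
    """Return the provider for a model name, or raise if unrecognised."""
    lowered = model_name.lower()
    for prefix, provider in PROVIDER_PREFIXES.items():
        if lowered.startswith(prefix):
            return provider
    raise ValueError(f"Unknown model: {model_name!r}")
```

New providers then only need a new table entry rather than edits to every call site.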
@conglu1997 I assume you mean to remove the examples? I deleted everything but the Gemini part, plus some bugs I found while running end-to-end.

@RichardScottOZ Yes, there are some hardcoded models/clients in the code. I think the best way would be to add something like https://github.com/BerriAI/litellm, but that was out of scope; I just wanted to test one run.

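For illustration, a litellm-style layer exposes one `completion()` call that routes `provider/model` strings to per-provider backends. This is a minimal sketch, assuming stub backends and the `provider/model` naming convention; the real litellm library would replace all of it:

```python
# Sketch of a unified completion() router. Backends here are stubs;
# real code would call each vendor's SDK.
from typing import Callable

_BACKENDS: dict[str, Callable[[str, list], str]] = {}

def register(provider: str):
    """Decorator that registers a backend under a provider name."""
    def deco(fn):
        _BACKENDS[provider] = fn
        return fn
    return deco

@register("gemini")
def _gemini(model: str, messages: list) -> str:
    return f"[gemini:{model}] " + messages[-1]["content"]

@register("openai")
def _openai(model: str, messages: list) -> str:
    return f"[openai:{model}] " + messages[-1]["content"]

def completion(model: str, messages: list) -> str:
    """Route a 'provider/model' string to the registered backend."""
    provider, _, name = model.partition("/")
    if provider not in _BACKENDS:
        raise ValueError(f"No backend for provider {provider!r}")
    return _BACKENDS[provider](name, messages)
```

The point of the pattern is that adding Gemini (or DeepSeek, or OpenRouter) becomes one registered function instead of scattered client instantiations.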
Yes, and the Llama ideation already uses OpenRouter, so that's another option for something like litellm.

Anyway, minor conveniences aside, AI Scientist v2 is very impressive. Well done.

ptal @conglu1997

See #37

@conglu1997 That PR only changes the models defined in llm.py; in your codebase you're also using ad-hoc clients defined in several other places.

Would it be possible to make a similar-style PR? I'm concerned that loads of new env variables, new imports, etc. are being introduced. The ideal would be the minimal change that makes this work.

Can this be generalized to cheaper models as well? #29 |
Tried running it end-to-end using Gemini, ran into many errors and hidden default models, and tried to fix them here. I can split out or remove things like the example project, ptal!