How do I run an interview against a model that has an OpenAI compatible API? #232
KartDriver started this conversation in General · Replies: 0 comments
I read the docs and attempted the following, but I'm not sure I'm doing it right:

```bash
python interview-litellm.py --input senior/recursion.yaml --model AlpacaMOE \
  --apibase http://TOP_SECRET.COM:9393/v1 --apikey MY_SUPER_SECRET_KEY \
  --params params/greedy-openai.json
```
I keep getting:

```
Traceback (most recent call last):
  File "/home/scin/can-ai-code/interview-litellm.py", line 54, in <module>
    selected_model = [x for x in model_info['data'] if x['id'] == args.model.replace('openai/','')]
                                 ~~~~~~~~~~^^^^^^^^
KeyError: 'data'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/scin/can-ai-code/interview-litellm.py", line 60, in <module>
    raise Exception(f'Unable to reach {args.apibase}/models')
```
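For reference, the traceback suggests the script first fetches `{args.apibase}/models` and indexes the response with `['data']`, so the server behind `--apibase` presumably needs to expose an OpenAI-style model list. Below is a minimal check I'd run against the endpoint; it is a sketch, not part of can-ai-code. The URL and key are just the placeholders from the command above, and the expected JSON shape is assumed from the standard OpenAI "list models" format that line 54 appears to rely on:

```python
# Minimal sketch (not part of can-ai-code): check whether the --apibase
# endpoint serves an OpenAI-style model list, i.e. GET {apibase}/models
# returning JSON with a top-level "data" array of {"id": ...} entries.
import requests

API_BASE = "http://TOP_SECRET.COM:9393/v1"  # placeholder from the post
API_KEY = "MY_SUPER_SECRET_KEY"             # placeholder from the post

resp = requests.get(
    f"{API_BASE}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
print(resp.status_code)

body = resp.json()
print(body)

# The interview script appears to expect something shaped like:
#   {"object": "list", "data": [{"id": "AlpacaMOE", ...}]}
# If "data" is missing (or the request fails), line 54 raises the KeyError
# above, which the script then re-raises as "Unable to reach .../models".
print("data" in body, [m.get("id") for m in body.get("data", [])])
```

If that prints a model list that includes the exact string passed to `--model`, the endpoint side looks fine and the problem is likely in how the model name or parameters are passed; if not, the backend isn't serving `/v1/models` in the expected format.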