LiteLLM
Environment
To use this model, you must have a `.env` file containing all required API keys.
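As a sketch, a `.env` file for the `openai/` model used in the example below might contain just an OpenAI key; the exact variable names depend on which providers you call through LiteLLM, so treat this as illustrative:

```text
# .env — placed in the directory passed via the `env` argument (defaults to ".")
OPENAI_API_KEY=sk-...
```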
Bases: UnconstrainedModel
Class for LiteLLM remote model integration. https://github.com/BerriAI/litellm
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `model_name_or_path` | `str` | Name or identifier of the model to use with LiteLLM. Should begin with the provider, e.g. `openai/gpt-4o-mini`. | *required* |
| `env` | `str` | Environment path, defaults to the current directory (`"."`). | `'.'` |
| `config` | `Optional[dict]` | Optional dictionary containing model configuration parameters. | `None` |
| `caching` | `bool` | Whether to enable response caching. | `True` |
| `**kwargs` | | Additional keyword arguments to pass to the model. | `{}` |
Examples:
```python
from blendsql.models import LiteLLM

model = LiteLLM("openai/gpt-4o-mini", config={"temperature": 0.7})
```
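The remaining constructor arguments from the parameters table can be combined in the same call. The values below are illustrative, not defaults: `./secrets` is a hypothetical directory holding the `.env` file.

```python
from blendsql.models import LiteLLM

# Sketch: load API keys from a .env file in a custom directory (hypothetical path)
# and disable response caching for this model instance.
model = LiteLLM(
    "openai/gpt-4o-mini",            # provider-prefixed model identifier
    env="./secrets",                 # directory containing the .env file (defaults to ".")
    config={"temperature": 0.0},     # passed through as model configuration
    caching=False,                   # disable response caching
)
```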
Source code in blendsql/models/unconstrained/_litellm.py