Azure Phi

AzurePhiModel

Bases: LocalModel

Class for the Azure Phi model with guidance server-side integration.

https://github.com/guidance-ai/guidance?tab=readme-ov-file#azure-ai

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `env` |  | Path to directory of `.env` file, or to the file itself to load as a dotfile. Should contain `AZURE_PHI_KEY` and `AZURE_PHI_URL`. | *required* |
| `caching` | `bool` | Bool determining whether we access the model's cache. | `True` |

Examples:

Given the following `.env` file in the directory above current:

```text
AZURE_PHI_KEY=...
AZURE_PHI_URL=...
```

```python
from blendsql.models import AzurePhiModel

model = AzurePhiModel(
    env="..",
)
```
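
Since `_load_model` (shown in the source below) asserts that both `AZURE_PHI_KEY` and `AZURE_PHI_URL` are set, it can help to confirm they load correctly before building the model. A minimal sketch using `python-dotenv` (an assumption; BlendSQL resolves the `env` path itself, so this check is purely illustrative):

```python
# Illustrative pre-flight check, not part of the BlendSQL API.
# Assumes python-dotenv is installed (`pip install python-dotenv`).
import os

from dotenv import load_dotenv

load_dotenv("../.env")  # the same ".." location used in the example above
missing = [k for k in ("AZURE_PHI_KEY", "AZURE_PHI_URL") if not os.getenv(k)]
if missing:
    raise RuntimeError(f"Missing required Azure Phi variables: {missing}")
```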

Source code in `blendsql/models/local/_azure_phi.py`:

```python
class AzurePhiModel(LocalModel):
    """Class for Azure Phi model with guidance serverside integration.

    https://github.com/guidance-ai/guidance?tab=readme-ov-file#azure-ai

    Args:
        env: Path to directory of .env file, or to the file itself to load as a dotfile.
            Should either contain `AZURE_PHI_KEY` and `AZURE_PHI_URL`
        caching: Bool determining whether we access the model's cache

    Examples:
        Given the following `.env` file in the directory above current:
        ```text
        AZURE_PHI_KEY=...
        AZURE_PHI_URL=...
        ```
        ```python
        from blendsql.models import AzurePhiModel

        model = AzurePhiModel(
            env="..",
        )
        ```
    """

    def __init__(
        self,
        config: Optional[dict] = None,
        caching: bool = True,
        **kwargs,
    ):
        if not _has_transformers:
            raise ImportError(
                "Please install transformers with `pip install transformers`!"
            ) from None
        import transformers

        transformers.logging.set_verbosity_error()
        if config is None:
            config = {}

        super().__init__(
            model_name_or_path="microsoft/Phi-3.5-mini-instruct",
            requires_config=True,
            tokenizer=transformers.AutoTokenizer.from_pretrained(
                "microsoft/Phi-3.5-mini-instruct"
            ),
            load_model_kwargs=config,
            caching=caching,
            **kwargs,
        )

    def _load_model(self) -> ModelObj:
        # https://huggingface.co/blog/how-to-generate
        from guidance.models import AzureGuidance

        assert all(os.getenv(k) is not None for k in ["AZURE_PHI_KEY", "AZURE_PHI_URL"])
        lm = AzureGuidance(
            f"{os.getenv('AZURE_PHI_URL')}/guidance#auth={os.getenv('AZURE_PHI_KEY')}",
            echo=False,
            # **self.load_model_kwargs,
        )
        return lm
```
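
For reference, `_load_model` points guidance at an endpoint of the form `{AZURE_PHI_URL}/guidance#auth={AZURE_PHI_KEY}`. A minimal sketch of talking to the same server with `guidance` directly, outside of BlendSQL (assumes the two environment variables are already set and that the hosted Phi endpoint is reachable):

```python
# Illustrative only; mirrors the AzureGuidance construction above.
import os

from guidance import gen
from guidance.models import AzureGuidance

lm = AzureGuidance(
    f"{os.getenv('AZURE_PHI_URL')}/guidance#auth={os.getenv('AZURE_PHI_KEY')}",
    echo=False,
)
# Run a single constrained generation against the hosted Phi model.
lm += "The capital of France is " + gen("answer", max_tokens=10)
print(lm["answer"])
```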