# Azure
Polaris LLM is deployed on Azure using the standard Polaris Azure GPU Terraform module with LLM-specific configuration, enabling secure and efficient language model deployment on protected GPU hardware.
## Accessing the Module
The module is available on the Terraform Registry as `Fr0ntierX/polaris-gpu/azure`. A basic configuration looks like this:
module "polaris_azure_gpu_module" {
source = "../"
subscription_id = "YOUR-SUBSCRIPTION-ID"
name = "polaris-llm"
location = "eastus2"
zone = "2"
admin_username = "azureuser"
admin_password_or_key = "MY_PASSWORD"
authentication_type = "password"
virtual_network_name = "my-precreated-vnet"
virtual_network_resource_group = "my-network-rg"
subnet_name = "my-subnet"
custom_workload = {
image_address = "example.azurecr.io/custom-workload:latest"
port = 11434
registry = {
login_server = "example.azurecr.io"
username = "registry_user"
password = "registry_password"
}
}
polaris_proxy_enable_input_encryption = true
polaris_proxy_enable_output_encryption = true
}
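The example above hard-codes the VM password and registry credentials, which is convenient for a first test but not recommended beyond that. A common Terraform pattern is to declare these as sensitive input variables and supply the values at apply time. The sketch below is abbreviated (non-secret arguments are elided) and the variable names are illustrative, not part of the module's interface:

```hcl
# Illustrative variable names; values are supplied via TF_VAR_* environment
# variables or a .tfvars file kept out of version control.
variable "admin_password" {
  type      = string
  sensitive = true
}

variable "registry_password" {
  type      = string
  sensitive = true
}

module "polaris_azure_gpu_module" {
  source = "Fr0ntierX/polaris-gpu/azure"

  # ... other arguments as in the example above ...

  authentication_type   = "password"
  admin_password_or_key = var.admin_password

  custom_workload = {
    image_address = "example.azurecr.io/custom-workload:latest"
    port          = 11434
    registry = {
      login_server = "example.azurecr.io"
      username     = "registry_user"
      password     = var.registry_password
    }
  }
}
```

With the secrets exported as `TF_VAR_admin_password` and `TF_VAR_registry_password`, the usual `terraform init` and `terraform apply` workflow provisions the instance without credentials stored in the configuration files.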
## LLM-Specific Features
| Feature | Description |
|---|---|
| Framework Support | Pre-configured options for vLLM and Ollama (see the example below) |
| Model Security | Protected loading and serving of proprietary models |
| Prompt Protection | Secure handling of potentially sensitive prompts |
| Response Filtering | Options for secure post-processing of generated content |
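As a concrete illustration of the framework support row, the `custom_workload` block from the configuration above can point at a vLLM server instead of Ollama. The field names below are the same ones used in the main example; the image address is a placeholder and port 8000 is vLLM's default for its OpenAI-compatible server, so adjust both to match your actual image:

```hcl
  # Illustrative vLLM workload; substitute this block into the module configuration above.
  custom_workload = {
    image_address = "example.azurecr.io/vllm-openai:latest" # placeholder vLLM serving image
    port          = 8000                                    # vLLM's default listening port
    registry = {
      login_server = "example.azurecr.io"
      username     = "registry_user"
      password     = "registry_password"
    }
  }
```

Note that the `port = 11434` value in the main example corresponds to Ollama's default API port, so switching frameworks is primarily a matter of changing the container image and port.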
## Learn More
For detailed configuration options and usage examples, refer to: