public class AzureOpenAiLanguageModel extends Object implements dev.langchain4j.model.language.LanguageModel, dev.langchain4j.model.language.TokenCountEstimator
It is recommended to use AzureOpenAiChatModel instead,
as it offers more advanced features such as function calling and multi-turn conversations.
Mandatory parameters for initialization are: endpoint, serviceVersion, apiKey (or an alternate authentication method; see below for more information) and deploymentName. You can also provide your own OpenAIClient instance if you need more flexibility.
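A minimal initialization sketch using the builder with the mandatory parameters; the endpoint, service version, and deployment name below are placeholder values for illustration, not prescribed defaults:

```java
import dev.langchain4j.model.azure.AzureOpenAiLanguageModel;
import dev.langchain4j.model.language.LanguageModel;

public class LanguageModelSetup {

    public static void main(String[] args) {
        // Sketch: all four mandatory parameters supplied via the builder.
        // Endpoint, serviceVersion, and deploymentName are placeholders.
        LanguageModel model = AzureOpenAiLanguageModel.builder()
                .endpoint("https://your-resource.openai.azure.com/")
                .serviceVersion("2024-02-01")
                .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
                .deploymentName("gpt-35-turbo-instruct")
                .build();
    }
}
```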
There are 3 authentication methods:
1. Azure OpenAI API Key Authentication: this is the most common method, using an Azure OpenAI API key. You need to provide the API key as a parameter, using the apiKey() method in the Builder, or the apiKey parameter in the constructor. For example: `builder.apiKey("{key}")`.
2. Non-Azure OpenAI API Key Authentication: this method allows you to use the OpenAI service instead of Azure OpenAI. You can use the nonAzureApiKey() method in the Builder, which also automatically sets the endpoint to "https://api.openai.com/v1". For example: `builder.nonAzureApiKey("{key}")`. The constructor requires a KeyCredential instance, which can be created using `new AzureKeyCredential("{key}")`, and does not set the endpoint.
3. Azure OpenAI client with Microsoft Entra ID (formerly Azure Active Directory) credentials: this requires adding the `com.azure:azure-identity` dependency to your project, which is an optional dependency of this library. You need to provide a TokenCredential instance, using the tokenCredential() method in the Builder, or the tokenCredential parameter in the constructor. As an example, DefaultAzureCredential can be used to authenticate the client: set the client ID, tenant ID, and client secret of the Microsoft Entra application as the environment variables AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET, then provide the DefaultAzureCredential instance to the builder: `builder.tokenCredential(new DefaultAzureCredentialBuilder().build())`.
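The three authentication methods above can be sketched side by side. This assumes the builder methods named in the text (apiKey(), nonAzureApiKey(), tokenCredential()); endpoint and deployment name values are placeholders:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import dev.langchain4j.model.azure.AzureOpenAiLanguageModel;
import dev.langchain4j.model.language.LanguageModel;

public class AuthenticationExamples {

    public static void main(String[] args) {

        // 1. Azure OpenAI API key authentication (most common).
        LanguageModel withApiKey = AzureOpenAiLanguageModel.builder()
                .endpoint("https://your-resource.openai.azure.com/")
                .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
                .deploymentName("gpt-35-turbo-instruct")
                .build();

        // 2. Non-Azure OpenAI key: nonAzureApiKey() also sets the
        //    endpoint to https://api.openai.com/v1 automatically.
        LanguageModel withOpenAiKey = AzureOpenAiLanguageModel.builder()
                .nonAzureApiKey(System.getenv("OPENAI_API_KEY"))
                .deploymentName("gpt-3.5-turbo-instruct")
                .build();

        // 3. Microsoft Entra ID: requires the optional
        //    com.azure:azure-identity dependency. DefaultAzureCredential
        //    reads AZURE_CLIENT_ID, AZURE_TENANT_ID, AZURE_CLIENT_SECRET
        //    from the environment.
        LanguageModel withEntraId = AzureOpenAiLanguageModel.builder()
                .endpoint("https://your-resource.openai.azure.com/")
                .tokenCredential(new DefaultAzureCredentialBuilder().build())
                .deploymentName("gpt-35-turbo-instruct")
                .build();
    }
}
```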
| Modifier and Type | Class and Description |
|---|---|
| `static class` | `AzureOpenAiLanguageModel.Builder` |
| Constructor and Description |
|---|
| `AzureOpenAiLanguageModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty)` |
| `AzureOpenAiLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses)` |
| `AzureOpenAiLanguageModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses)` |
| `AzureOpenAiLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses)` |
| Modifier and Type | Method and Description |
|---|---|
| `static AzureOpenAiLanguageModel.Builder` | `builder()` |
| `int` | `estimateTokenCount(String prompt)` |
| `dev.langchain4j.model.output.Response<String>` | `generate(String prompt)` |
`public AzureOpenAiLanguageModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty)`

`public AzureOpenAiLanguageModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses)`

`public AzureOpenAiLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses)`

`public AzureOpenAiLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses)`
`public dev.langchain4j.model.output.Response<String> generate(String prompt)`

Specified by: `generate` in interface `dev.langchain4j.model.language.LanguageModel`

`public int estimateTokenCount(String prompt)`

Specified by: `estimateTokenCount` in interface `dev.langchain4j.model.language.TokenCountEstimator`

`public static AzureOpenAiLanguageModel.Builder builder()`
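A short usage sketch of the two instance methods above, assuming a model built with the API-key method and a placeholder deployment; `Response.content()` returning the generated text follows the usual langchain4j `Response<T>` contract:

```java
import dev.langchain4j.model.azure.AzureOpenAiLanguageModel;
import dev.langchain4j.model.output.Response;

public class GenerateExample {

    public static void main(String[] args) {
        // Sketch: endpoint and deployment name are placeholders.
        AzureOpenAiLanguageModel model = AzureOpenAiLanguageModel.builder()
                .endpoint("https://your-resource.openai.azure.com/")
                .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
                .deploymentName("gpt-35-turbo-instruct")
                .build();

        String prompt = "Write a haiku about the sea.";

        // estimateTokenCount() comes from the TokenCountEstimator interface.
        int tokens = model.estimateTokenCount(prompt);
        System.out.println("Estimated prompt tokens: " + tokens);

        // generate() comes from the LanguageModel interface and wraps
        // the completion text in a Response<String>.
        Response<String> response = model.generate(prompt);
        System.out.println(response.content());
    }
}
```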
Copyright © 2024. All rights reserved.