Class MistralAiStreamingChatModel

java.lang.Object
dev.langchain4j.model.mistralai.MistralAiStreamingChatModel
All Implemented Interfaces:
dev.langchain4j.model.chat.StreamingChatModel

public class MistralAiStreamingChatModel extends Object implements dev.langchain4j.model.chat.StreamingChatModel
Represents a Mistral AI Chat Model with a chat completion interface, such as mistral-tiny and mistral-small. The model's response is streamed token by token and should be handled with a StreamingChatResponseHandler. A description of the parameters can be found in the Mistral AI API documentation.
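As a minimal usage sketch (not part of this Javadoc): the model is typically built via its builder and invoked through the StreamingChatModel interface, with each token delivered to the handler as it arrives. The API key environment variable and model name below are placeholder assumptions.

```java
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

public class MistralStreamingExample {

    public static void main(String[] args) {
        // Build the model; the apiKey source and modelName are placeholders
        StreamingChatModel model = MistralAiStreamingChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
                .modelName("mistral-small")
                .build();

        model.chat("Tell me a joke", new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                // Called once per streamed token/fragment
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                // Called once the full response has been assembled
                System.out.println("\nDone: " + completeResponse.aiMessage().text());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```

Note that this sketch requires a valid Mistral AI API key at runtime; the handler callbacks are invoked asynchronously as the stream progresses.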
  • Constructor Details

  • Method Details

    • doChat

      public void doChat(dev.langchain4j.model.chat.request.ChatRequest chatRequest, dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler)
      Specified by:
      doChat in interface dev.langchain4j.model.chat.StreamingChatModel
    • defaultRequestParameters

      public dev.langchain4j.model.chat.request.ChatRequestParameters defaultRequestParameters()
      Specified by:
      defaultRequestParameters in interface dev.langchain4j.model.chat.StreamingChatModel
    • listeners

      public List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners()
      Specified by:
      listeners in interface dev.langchain4j.model.chat.StreamingChatModel
    • provider

      public dev.langchain4j.model.ModelProvider provider()
      Specified by:
      provider in interface dev.langchain4j.model.chat.StreamingChatModel
    • supportedCapabilities

      public Set<dev.langchain4j.model.chat.Capability> supportedCapabilities()
      Specified by:
      supportedCapabilities in interface dev.langchain4j.model.chat.StreamingChatModel
    • builder
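The doChat method above can also be called directly with an explicit ChatRequest, which is useful when request parameters need to be set per call. The following is a sketch under the same placeholder assumptions (API key env var, model name, message text are illustrative):

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

public class DoChatExample {

    public static void main(String[] args) {
        MistralAiStreamingChatModel model = MistralAiStreamingChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // placeholder env var
                .modelName("mistral-tiny")
                .build();

        // Lower-level entry point: build a ChatRequest explicitly
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Summarize streaming in one sentence."))
                .build();

        model.doChat(request, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```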