# Class: Ollama

Unified language model interface.
## Hierarchy

- `BaseEmbedding`

  ↳ **`Ollama`**

## Implements

- `LLM`
## Constructors

### constructor

• **new Ollama**(`init`): `Ollama`

#### Parameters

| Name | Type |
| :--- | :--- |
| `init` | `Partial<Ollama> & { model: string }` |

#### Returns

`Ollama`

#### Overrides

BaseEmbedding.constructor

#### Defined in

packages/core/src/llm/ollama.ts:40
## Properties

### additionalChatOptions

• `Optional` **additionalChatOptions**: `Record<string, unknown>`

#### Defined in

packages/core/src/llm/ollama.ts:37
### baseURL

• **baseURL**: `string` = `"http://127.0.0.1:11434"`

#### Defined in

packages/core/src/llm/ollama.ts:32
### callbackManager

• `Optional` **callbackManager**: `CallbackManager`

#### Defined in

packages/core/src/llm/ollama.ts:38
### contextWindow

• **contextWindow**: `number` = `4096`

#### Defined in

packages/core/src/llm/ollama.ts:35
### embedBatchSize

• **embedBatchSize**: `number` = `DEFAULT_EMBED_BATCH_SIZE`

#### Inherited from

BaseEmbedding.embedBatchSize

#### Defined in

packages/core/src/embeddings/types.ts:8
### hasStreaming

• `Readonly` **hasStreaming**: `true`

#### Defined in

packages/core/src/llm/ollama.ts:28
### model

• **model**: `string`

#### Defined in

packages/core/src/llm/ollama.ts:31
### requestTimeout

• **requestTimeout**: `number`

#### Defined in

packages/core/src/llm/ollama.ts:36
### temperature

• **temperature**: `number` = `0.7`

#### Defined in

packages/core/src/llm/ollama.ts:33
### topP

• **topP**: `number` = `0.9`

#### Defined in

packages/core/src/llm/ollama.ts:34
## Accessors

### metadata

• `get` **metadata**(): `LLMMetadata`

#### Returns

`LLMMetadata`

#### Implementation of

LLM.metadata

#### Defined in

packages/core/src/llm/ollama.ts:51
## Methods

### chat

▸ **chat**(`params`): `Promise<AsyncIterable<ChatResponseChunk>>`

Get a chat response from the LLM.

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsStreaming` |
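A hedged usage sketch of the streaming overload (this assumes an Ollama server is reachable at `baseURL`, and that the message shape and `chunk.delta` field follow the library's `ChatMessage` and `ChatResponseChunk` types):

```typescript
import { Ollama } from "llamaindex";

const llm = new Ollama({ model: "llama2" });

// Passing `stream: true` selects the streaming overload, which resolves
// to an AsyncIterable of ChatResponseChunk.
const stream = await llm.chat({
  messages: [{ role: "user", content: "Why is the sky blue?" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.delta);
}
```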