Chat completion
Completion
Chat completion generates a conversational response from a given set of messages with a provided model.
| package main

import (
    "fmt"
    "log"

    "github.com/parakeet-nest/parakeet/completion"
    "github.com/parakeet-nest/parakeet/enums/option"
    "github.com/parakeet-nest/parakeet/llm"
)

func main() {
    ollamaUrl := "http://localhost:11434"
    model := "deepseek-coder"

    systemContent := `You are an expert in computer programming.
Please make friendly answer for the noobs.
Add source code examples if you can.`

    userContent := `I need a clear explanation regarding the following question:
Can you create a "hello world" program in Golang?
And, please, be structured with bullet points`

    options := llm.SetOptions(map[string]interface{}{
        option.Temperature:   0.5,
        option.RepeatLastN:   2,
        option.RepeatPenalty: 2.0,
    })

    query := llm.Query{
        Model: model,
        Messages: []llm.Message{
            {Role: "system", Content: systemContent},
            {Role: "user", Content: userContent},
        },
        Options: options,
        Stream:  false,
    }

    answer, err := completion.Chat(ollamaUrl, query)
    if err != nil {
        log.Fatal("😡:", err)
    }
    fmt.Println(answer.Message.Content)
}
|
✋ To keep a conversational memory for the next chat completion, update the list of messages with the previous question and answer, as in the sketch below.
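For instance, continuing the program above, you could append the model's answer and a follow-up question before calling completion.Chat again (a minimal sketch; the follow-up question is illustrative):
| // Keep the context: append the previous answer and a new question,
// then run a second completion with the extended message list.
query.Messages = append(query.Messages,
    llm.Message{Role: "assistant", Content: answer.Message.Content},
    llm.Message{Role: "user", Content: "Can you comment every line of the program?"},
)

answer, err = completion.Chat(ollamaUrl, query)
if err != nil {
    log.Fatal("😡:", err)
}
fmt.Println(answer.Message.Content)
|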
Completion with stream
| package main

import (
    "fmt"
    "log"

    "github.com/parakeet-nest/parakeet/completion"
    "github.com/parakeet-nest/parakeet/llm"
)

func main() {
    ollamaUrl := "http://localhost:11434"
    model := "deepseek-coder"

    systemContent := `You are an expert in computer programming.
Please make friendly answer for the noobs.
Add source code examples if you can.`

    userContent := `I need a clear explanation regarding the following question:
Can you create a "hello world" program in Golang?
And, please, be structured with bullet points`

    options := llm.Options{
        Temperature: 0.5,
        RepeatLastN: 2,
    }

    query := llm.Query{
        Model: model,
        Messages: []llm.Message{
            {Role: "system", Content: systemContent},
            {Role: "user", Content: userContent},
        },
        Options: options,
    }

    // The handler is called for every chunk of the streamed answer.
    _, err := completion.ChatStream(ollamaUrl, query,
        func(answer llm.Answer) error {
            fmt.Print(answer.Message.Content)
            return nil
        })
    if err != nil {
        log.Fatal("😡:", err)
    }
}
|
Chat completion with conversational memory
In memory history
To store the messages in memory, use history.MemoryMessages:
| package main

import (
    "fmt"
    "log"

    "github.com/parakeet-nest/parakeet/completion"
    "github.com/parakeet-nest/parakeet/history"
    "github.com/parakeet-nest/parakeet/llm"
)

func main() {
    ollamaUrl := "http://localhost:11434"
    model := "tinydolphin" // fast, with short and accurate answers

    conversation := history.MemoryMessages{
        Messages: make(map[string]llm.MessageRecord),
    }

    systemContent := `You are an expert with the Star Trek series. use the history of the conversation to answer the question`

    userContent := `Who is James T Kirk?`

    options := llm.Options{
        Temperature: 0.5,
        RepeatLastN: 2,
    }

    query := llm.Query{
        Model: model,
        Messages: []llm.Message{
            {Role: "system", Content: systemContent},
            {Role: "user", Content: userContent},
        },
        Options: options,
    }

    // Ask the first question
    answer, err := completion.ChatStream(ollamaUrl, query,
        func(answer llm.Answer) error {
            fmt.Print(answer.Message.Content)
            return nil
        },
    )
    if err != nil {
        log.Fatal("😡:", err)
    }

    // Save the question and the answer to the conversation history
    _, err = conversation.SaveMessage("1", llm.Message{
        Role:    "user",
        Content: userContent,
    })
    if err != nil {
        log.Fatal("😡:", err)
    }

    _, err = conversation.SaveMessage("2", llm.Message{
        Role:    "system",
        Content: answer.Message.Content,
    })
    if err != nil {
        log.Fatal("😡:", err)
    }

    // New question
    userContent = `Who is his best friend?`

    previousMessages, _ := conversation.GetAllMessages()

    // (Re)create the conversation:
    // instructions first, then the history, then the new question
    conversationMessages := []llm.Message{}
    conversationMessages = append(conversationMessages, llm.Message{Role: "system", Content: systemContent})
    conversationMessages = append(conversationMessages, previousMessages...)
    conversationMessages = append(conversationMessages, llm.Message{Role: "user", Content: userContent})

    query = llm.Query{
        Model:    model,
        Messages: conversationMessages,
        Options:  options,
    }

    answer, err = completion.ChatStream(ollamaUrl, query,
        func(answer llm.Answer) error {
            fmt.Print(answer.Message.Content)
            return nil
        },
    )
    fmt.Println()
    if err != nil {
        log.Fatal("😡:", err)
    }
}
|
Bbolt history
Bbolt is an embedded key/value database for Go.
To store the messages in a bbolt bucket, use history.BboltMessages:
| conversation := history.BboltMessages{}
conversation.Initialize("../conversation.db")
|
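The store is meant as a drop-in replacement for history.MemoryMessages, so the calls from the in-memory example carry over; a minimal sketch, assuming the same SaveMessage and GetAllMessages methods:
| conversation := history.BboltMessages{}
conversation.Initialize("../conversation.db")

// Persist an exchange; unlike MemoryMessages, it survives restarts.
_, err := conversation.SaveMessage("1", llm.Message{
    Role:    "user",
    Content: "Who is James T Kirk?",
})
if err != nil {
    log.Fatal("😡:", err)
}

// Later, possibly in another run of the program, reload the history.
previousMessages, err := conversation.GetAllMessages()
if err != nil {
    log.Fatal("😡:", err)
}
fmt.Println(previousMessages)
|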
Note
👀 you will find a complete example in:
Conversational history: remove messages
In Memory
- Remove a message by id: history.RemoveMessage(id string)
Note
👀 you will find a complete example in:
Bbolt Memory
- Remove a message by id: history.RemoveMessage(id string)
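Both stores expose the same call. For instance, with the conversation value from the in-memory example above (the id "2" is the one passed to SaveMessage there):
| // Drop the saved answer from the history by its id.
conversation.RemoveMessage("2")
|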
Conversational history: handling sessions
In Memory
- Save a message within a session: history.SaveMessageWithSession(sessionId, messageId string, message llm.Message)
- Remove the oldest message of a session: history.RemoveTopMessageOfSession(sessionId string)
Note
👀 you will find a complete example in:
Bbolt Memory
- Save a message within a session: history.SaveMessageWithSession(sessionId, messageId string, message llm.Message)
- Remove the oldest message of a session: history.RemoveTopMessageOfSession(sessionId string)
Note
👀 you will find a complete example in:
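A minimal sketch of the session variants, again on a conversation value from the earlier examples (the session and message ids are illustrative; return values are omitted):
| // Store an exchange under a session id, so that several
// independent conversations can share one history store.
conversation.SaveMessageWithSession("session-42", "1", llm.Message{
    Role:    "user",
    Content: "Who is James T Kirk?",
})

// When the session grows too long, drop its oldest message.
conversation.RemoveTopMessageOfSession("session-42")
|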
In Memory and Bbolt
history.RemoveTopMessage() error
: removes the oldest message from the Messages list.
history.KeepLastN(n int) error
: keeps the last n messages in the Messages list (and removes the older ones).
history.KeepLastNOfSession(sessionId string, n int) error
: keeps the last n messages of the session in the Messages list (and removes the older ones).
history.GetLastNMessages(n int) ([]llm.Message, error)
: returns the last n messages in the Messages list.
Note
👀 you will find a complete example in:
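For instance, to bound how much history is sent back to the model (a sketch using the conversation value from the earlier examples):
| // Keep only the three most recent messages in the history...
err := conversation.KeepLastN(3)
if err != nil {
    log.Fatal("😡:", err)
}

// ...or read the last two without removing anything.
lastMessages, err := conversation.GetLastNMessages(2)
if err != nil {
    log.Fatal("😡:", err)
}
fmt.Println(lastMessages)
|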
Complex conversation
You can use the helper function llm.Conversation. It creates or extends a conversation with the provided messages, accepting either single messages or slices of messages as variadic parameters:
| conversationMessages := llm.Conversation(
    llm.Message{Role: "system", Content: "Enable deep thinking subroutine."},
    llm.Message{Role: "system", Content: systemInstructions},
    []llm.Message{
        {Role: "user", Content: question},
        {Role: "assistant", Content: assistantMessage},
    },
    llm.Message{Role: "user", Content: userMessage},
)
|
It returns a slice of messages ([]llm.Message).
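The result can feed a query directly; a minimal sketch reusing the model and options variables from the earlier examples:
| query := llm.Query{
    Model:    model,
    Messages: conversationMessages,
    Options:  options,
}
answer, err := completion.Chat(ollamaUrl, query)
|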