Protected endpoint

If your Ollama endpoint is protected with a header token, you can specify the token like this:

```go
query := llm.Query{
    Model: model,
    Messages: []llm.Message{
        {Role: "system", Content: systemContent},
        {Role: "user", Content: userContent},
    },
    Options:          options,
    TokenHeaderName:  "X-TOKEN",
    TokenHeaderValue: "john_doe",
}
```
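For context, here is a minimal sketch of what such a protection could look like on the server side: a small reverse proxy in front of Ollama that rejects any request missing the expected header. The port, header name, and token value are assumptions chosen to match the example above; this is an illustration, not part of the library.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Ollama listening locally on its default port (assumption).
	ollama, err := url.Parse("http://localhost:11434")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(ollama)

	// Reject any request that does not carry the expected header token,
	// then forward the rest to Ollama unchanged.
	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Header.Get("X-TOKEN") != "john_doe" {
			http.Error(w, "forbidden", http.StatusForbidden)
			return
		}
		proxy.ServeHTTP(w, r)
	})

	// The protected endpoint the client points to instead of Ollama directly.
	log.Fatal(http.ListenAndServe(":8080", handler))
}
```

With a guard like this in place, setting TokenHeaderName and TokenHeaderValue on the query is all the client needs to authenticate.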