SDKs & Libraries
The Mira API is fully compatible with the OpenAI format, allowing you to use any existing OpenAI SDK — just change the base URL. Below are official tools and guides for integrating with popular libraries.
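Concretely, "OpenAI-compatible" means the same request shape sent to a different host. As a minimal sketch (standard library only; the key below is a placeholder), this is the raw request that the SDKs in this guide ultimately produce:

```python
import json
import urllib.request

# Build the same POST /v1/chat/completions request the OpenAI SDKs send,
# but addressed to the Mira host. Actually sending it requires a valid key.
request = urllib.request.Request(
    "https://api.vmira.ai/v1/chat/completions",
    data=json.dumps({
        "model": "mira",
        "messages": [{"role": "user", "content": "Hello, Mira!"}],
    }).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-mira-YOUR_API_KEY",
    },
    method="POST",
)

# urllib.request.urlopen(request) would return a standard chat-completion
# JSON body; here we only inspect the request we built.
print(request.full_url)
print(json.loads(request.data)["model"])
```

Every SDK section below is a wrapper around exactly this call.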
Official Tools
Mira Code CLI
Mira Code is the official CLI for AI-powered development right in your terminal. Install it globally via npm:
npm install -g mira-code
After installation, authenticate via device code flow and start working:
mira auth login
mira "Explain this codebase"
OpenAI SDK Compatibility
Since the Mira API implements the OpenAI format, you can use official OpenAI SDKs in any programming language. Simply change the base URL to https://api.vmira.ai/v1 and use your Mira API key.
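For codebases that already use the OpenAI Python SDK, there is an even lighter path: that SDK also reads its settings from the environment variables OPENAI_API_KEY and OPENAI_BASE_URL. A sketch of the idea (worth verifying against your SDK version; the key is a placeholder):

```python
import os

# The OpenAI Python SDK falls back to these environment variables when the
# client is constructed without explicit arguments, so setting them is
# enough to redirect an existing application to Mira without code changes.
os.environ["OPENAI_API_KEY"] = "sk-mira-YOUR_API_KEY"
os.environ["OPENAI_BASE_URL"] = "https://api.vmira.ai/v1"

# From here, OpenAI() with no arguments would target the Mira endpoint.
print(os.environ["OPENAI_BASE_URL"])
```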
Python
Install the official OpenAI Python package and point it to the Mira base URL:
pip install openai
from openai import OpenAI
client = OpenAI(
    api_key="sk-mira-YOUR_API_KEY",
    base_url="https://api.vmira.ai/v1",
)

response = client.chat.completions.create(
    model="mira",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, Mira!"},
    ],
)

print(response.choices[0].message.content)

JavaScript / TypeScript
Use the openai package for Node.js with a custom base URL:
npm install openai
import OpenAI from "openai";
const client = new OpenAI({
  apiKey: "sk-mira-YOUR_API_KEY",
  baseURL: "https://api.vmira.ai/v1",
});

const completion = await client.chat.completions.create({
  model: "mira",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello, Mira!" },
  ],
});

console.log(completion.choices[0].message.content);

Go
The openai-go library supports a custom base URL via an option:
go get github.com/openai/openai-go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"
)

func main() {
	client := openai.NewClient(
		option.WithAPIKey("sk-mira-YOUR_API_KEY"),
		option.WithBaseURL("https://api.vmira.ai/v1"),
	)

	completion, err := client.Chat.Completions.New(
		context.Background(),
		openai.ChatCompletionNewParams{
			Model: "mira",
			Messages: []openai.ChatCompletionMessageParamUnion{
				openai.UserMessage("Hello, Mira!"),
			},
		},
	)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println(completion.Choices[0].Message.Content)
}

Rust
The async-openai library lets you set a custom base URL via configuration:
[dependencies]
async-openai = "0.23"
tokio = { version = "1", features = ["full"] }

use async_openai::{
    config::OpenAIConfig,
    types::{
        ChatCompletionRequestUserMessageArgs,
        CreateChatCompletionRequestArgs,
    },
    Client,
};

#[tokio::main]
async fn main() {
    let config = OpenAIConfig::new()
        .with_api_key("sk-mira-YOUR_API_KEY")
        .with_api_base("https://api.vmira.ai/v1");
    let client = Client::with_config(config);

    let request = CreateChatCompletionRequestArgs::default()
        .model("mira")
        .messages(vec![
            ChatCompletionRequestUserMessageArgs::default()
                .content("Hello, Mira!")
                .build()
                .unwrap()
                .into(),
        ])
        .build()
        .unwrap();

    let response = client.chat().create(request).await.unwrap();
    println!("{}", response.choices[0].message.content.as_ref().unwrap());
}

Other Languages
Any OpenAI-compatible SDK works with Mira: point it at https://api.vmira.ai/v1 and authenticate with your Mira API key, following the same pattern as the examples above.
Framework Integrations
LangChain (Python)
LangChain supports any OpenAI-compatible API through the ChatOpenAI class:
pip install langchain-openai
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
    model="mira",
    api_key="sk-mira-YOUR_API_KEY",
    base_url="https://api.vmira.ai/v1",
)

response = llm.invoke("Explain quantum computing in simple terms.")
print(response.content)

LangChain (JavaScript)
In the JavaScript version of LangChain, likewise use ChatOpenAI, passing the base URL through its configuration option:
npm install @langchain/openai
import { ChatOpenAI } from "@langchain/openai";
const llm = new ChatOpenAI({
  model: "mira",
  apiKey: "sk-mira-YOUR_API_KEY",
  configuration: {
    baseURL: "https://api.vmira.ai/v1",
  },
});

const response = await llm.invoke("Explain quantum computing.");
console.log(response.content);

LlamaIndex
LlamaIndex lets you use Mira as an LLM backend through the OpenAI-compatible provider:
pip install llama-index-llms-openai-like
from llama_index.llms.openai_like import OpenAILike
llm = OpenAILike(
    model="mira",
    api_key="sk-mira-YOUR_API_KEY",
    api_base="https://api.vmira.ai/v1",
    is_chat_model=True,
)

response = llm.complete("Summarize the theory of relativity.")
print(response)

Community & Contributions
If you have built a library or tool for Mira integration, contact us at developers@vmira.ai — we would love to add it to this list.