How to Call Multiple LLMs Such as OpenAI and Anthropic Through One Interface: A 2026 Guide to the Abso Library
AI Summary (BLUF)
Abso is a lightweight, OpenAI-compatible JavaScript library that provides a unified interface for calling multiple LLM providers (OpenAI, Anthropic, Groq, Ollama, etc.) with full type safety. It supports streaming, embeddings, and unified tool calling, with tokenizers and cost calculation on the roadmap.
Introduction
Abso is an open-source JavaScript library that provides a unified interface for calling various LLMs while maintaining full type safety. It is a drop-in replacement for the OpenAI SDK, allowing you to switch between providers with minimal code changes.
Features
- OpenAI-compatible API 🔁
- Call any LLM provider (OpenAI, Anthropic, Groq, Ollama, and more)
- Lightweight & fast ⚡
- Embeddings support 🧮
- Unified tool calling 🛠️
- Tokenizer and cost calculation (soon) 🔢
- Smart routing (soon)
Provider Support
The following table outlines the current support status for key features across all integrated providers.
| Provider | Chat | Streaming | Tool Calling | Embeddings | Tokenizer | Cost Calculation |
|---|---|---|---|---|---|---|
| OpenAI | ✅ | ✅ | ✅ | ✅ | 🚧 | 🚧 |
| Anthropic | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| xAI Grok | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| Mistral | ✅ | ✅ | ✅ | ❌ | 🚧 | 🚧 |
| Groq | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| Ollama | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| OpenRouter | ✅ | ✅ | ✅ | ❌ | ❌ | 🚧 |
| Voyage | ❌ | ❌ | ❌ | ✅ | ❌ | ❌ |
| Azure | 🚧 | 🚧 | 🚧 | 🚧 | ❌ | 🚧 |
| Bedrock | 🚧 | 🚧 | 🚧 | 🚧 | ❌ | 🚧 |
| Gemini | ✅ | ✅ | ✅ | ❌ | 🚧 | ❌ |
| DeepSeek | ✅ | ✅ | ✅ | ❌ | 🚧 | ❌ |
| Perplexity | ✅ | ✅ | ❌ | ❌ | 🚧 | ❌ |
Legend: ✅ = Supported, ❌ = Not supported, 🚧 = Under development
Installation
Install the package via npm:
```shell
npm install abso-ai
```
Usage
The simplest way to get started is to import the pre-configured abso instance and make a chat completion call:
```javascript
import { abso } from "abso-ai"

const result = await abso.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
})

console.log(result.choices[0].message.content)
```
Manually Selecting a Provider
Abso automatically infers the correct provider for a given model, but you can override this by specifying a provider in the request.
```javascript
const result = await abso.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "openai/gpt-4o",
  provider: "openrouter",
})

console.log(result.choices[0].message.content)
```
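To illustrate the idea behind this inference (a standalone sketch of the general technique, not Abso's actual implementation), a prefixed model string such as "openai/gpt-4o" can be split on its first slash, falling back to a default provider when no prefix is present:

```javascript
// Standalone illustration of prefix-based provider inference.
// Not Abso's internal code; just the general idea.
function inferProvider(model, fallback = "openai") {
  const slash = model.indexOf("/")
  if (slash === -1) return { provider: fallback, model }
  return {
    provider: model.slice(0, slash),
    model: model.slice(slash + 1),
  }
}

console.log(inferProvider("openai/gpt-4o")) // → { provider: "openai", model: "gpt-4o" }
console.log(inferProvider("gpt-4o"))        // → { provider: "openai", model: "gpt-4o" }
```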
Streaming
Streaming works seamlessly with the same API. Use the stream option and iterate over the chunks. A helper method finalChatCompletion() is available to assemble the complete response.
```javascript
const stream = await abso.chat.completions.create({
  messages: [{ role: "user", content: "Say this is a test" }],
  model: "gpt-4o",
  stream: true,
})

for await (const chunk of stream) {
  console.log(chunk)
}

// Helper to get the final result
const fullResult = await stream.finalChatCompletion()
console.log(fullResult)
```
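The feature list above also mentions unified tool calling. Since Abso advertises OpenAI compatibility, a tool-calling request presumably follows the OpenAI `tools` schema; the sketch below assumes that shape, and the `get_weather` tool is a hypothetical example (a valid API key is required to actually run it):

```javascript
import { abso } from "abso-ai"

const result = await abso.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What's the weather in Paris?" }],
  // Tool definition in the OpenAI function-calling format; the
  // "get_weather" tool here is hypothetical.
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
})

// If the model decides to call the tool, the call shows up here.
console.log(result.choices[0].message.tool_calls)
```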
Embeddings
Generate embeddings for text inputs with a simple call:
```javascript
const embeddings = await abso.embeddings.create({
  model: "text-embedding-3-small",
  input: ["A cat was playing with a ball on the floor"],
})

console.log(embeddings.data[0].embedding)
```
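Embedding vectors are typically compared with cosine similarity for semantic search; a minimal, provider-agnostic helper (not part of Abso) looks like this:

```javascript
// Cosine similarity between two embedding vectors: values close to 1
// mean the texts are semantically similar. Works with vectors of any
// dimensionality, from any provider.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    na += a[i] * a[i]
    nb += b[i] * b[i]
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb))
}

console.log(cosineSimilarity([1, 0], [1, 0])) // → 1 (identical direction)
console.log(cosineSimilarity([1, 0], [0, 1])) // → 0 (orthogonal)
```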
Tokenizers (Coming Soon)
The tokenizer API will allow you to count tokens for any supported model:
```javascript
const tokens = await abso.chat.tokenize({
  messages: [{ role: "user", content: "Hello, world!" }],
  model: "gpt-4o",
})

console.log(`${tokens.count} tokens`)
```
Custom Providers
You can configure built-in providers by passing a configuration object when instantiating Abso:
```javascript
import { Abso } from "abso-ai"

const abso = new Abso({
  openai: { apiKey: "your-openai-key" },
  anthropic: { apiKey: "your-anthropic-key" },
  // add other providers as needed
})

const result = await abso.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
})

console.log(result.choices[0].message.content)
```
Alternatively, you can change which providers are loaded by passing a custom providers array to the constructor.
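As a rough sketch of the providers-array form: the class name `OpenAIProvider` and its import below are assumptions made for illustration, not confirmed exports; check the Abso documentation for the actual provider classes and constructor options.

```javascript
import { Abso } from "abso-ai"
// NOTE: "OpenAIProvider" is a hypothetical export name used only to
// illustrate the shape of a providers array.
import { OpenAIProvider } from "abso-ai"

const abso = new Abso({
  // Only the providers listed here are loaded.
  providers: [new OpenAIProvider({ apiKey: process.env.OPENAI_API_KEY })],
})
```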
Observability
Abso integrates with Lunary for instant observability into your LLM usage. Sign up for Lunary to get your public key, then set the LUNARY_PUBLIC_KEY environment variable to enable monitoring.
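For example, in a shell session (the key value is a placeholder, and `app.js` stands in for your application's entry point):

```shell
# Export the Lunary public key before starting the app; Abso reads it
# from the environment and enables monitoring automatically.
export LUNARY_PUBLIC_KEY="your-lunary-public-key"
node app.js
```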
Contributing
We welcome contributions! Please see our Contributing Guide for more details.
Roadmap
- More providers
- Built-in caching
- Tokenizers
- Cost calculation
- Smart routing
FAQ
Which LLM providers does Abso support, and how do I switch between them?
Abso supports more than a dozen providers, including OpenAI, Anthropic, Groq, and Ollama, and automatically infers the provider from the model name. You can also specify one manually via the `provider` field in a request, e.g. `provider: "openrouter"`.
Does Abso support streaming output, and how is it used?
Yes. Set `stream: true` and iterate over the streamed chunks. You can also call `finalChatCompletion()` to assemble the complete final response.
How do I generate embeddings with Abso?
Call `abso.embeddings.create()` with a `model` and an `input` array, for example `abso.embeddings.create({ model: "text-embedding-3-small", input: ["your text"] })`.