add-provider-package
by vercel

add-provider-package is a focused guide for creating a new @ai-sdk/<provider> package in vercel/ai. It helps contributors and API teams follow the AI SDK provider architecture, package structure, and implementation workflow when adding a provider.
This skill scores 68/100, which means it is listable for directory users but best treated as a contributor-oriented implementation guide rather than a turnkey install-and-run skill. The repository evidence shows substantial real workflow content for adding a new `@ai-sdk/<provider>` package, but adoption requires reading a long document and following repository-specific conventions without helper scripts or a quick-start install path.
- Strong triggerability: the description clearly states it is for creating a new `@ai-sdk/<provider>` package in the AI SDK.
- Real operational substance: the skill has a long step-by-step guide, code fences, architecture explanation, and file-path conventions for package creation.
- Evidence-backed trust signal: it links to a concrete pull request example (`vercel/ai` PR #8136) showing a real implementation reference.
- Repository-specific fit: the guide appears aimed at contributors to `vercel/ai`, not general agents adding provider packages in arbitrary codebases.
- Limited execution support: there are no install commands, scripts, support files, or packaged resources to reduce implementation guesswork.
Overview of add-provider-package skill
The add-provider-package skill is a focused implementation guide for creating a new @ai-sdk/<provider> package inside vercel/ai. It is best for maintainers, contributors, and API platform teams who want to integrate a model provider into the AI SDK with the right package shape, adapter layering, and repository conventions.
What add-provider-package actually helps you do
The real job-to-be-done is not “write some wrapper code.” The add-provider-package skill helps you add a provider package that fits the AI SDK’s internal architecture, exports the expected provider surface, and follows the repo’s package layout, test strategy, and implementation patterns.
Who should use add-provider-package
Use the add-provider-package skill if you are:
- Adding a new model provider to the vercel/ai monorepo
- Building a third-party provider package modeled after first-party packages
- Trying to map an external API into AI SDK abstractions like language models or embeddings
- Looking for a concrete path beyond a generic “build an adapter” prompt
It is especially useful if you already understand the provider’s HTTP API but need help translating that into AI SDK package conventions.
Best-fit use cases
The add-provider-package skill is strongest when you need to:
- Create a new packages/<provider> folder with the expected structure
- Implement provider classes against @ai-sdk/provider interfaces
- Reuse patterns from existing providers instead of inventing your own
- Understand first-party vs third-party expectations before investing time
Key differentiators from a generic coding prompt
A generic prompt can draft adapter code. add-provider-package is more valuable when you need repository-aware guidance: where the package lives, how the provider architecture is layered, what files typically exist, and how a complete provider package is expected to look in this codebase.
Most important adoption constraint
This skill is narrow by design. It is for @ai-sdk/<provider> package creation, not for general API SDK design, unrelated wrappers, or arbitrary plugin systems. If your goal is outside the AI SDK provider architecture, this skill will feel overly specific.
How to Use add-provider-package skill
Install context for add-provider-package
This skill lives in vercel/ai under skills/add-provider-package. In a Skills-enabled workflow, install it with:
npx skills add vercel/ai --skill add-provider-package
If your environment already exposes repository skills automatically, you may only need to invoke the skill by name.
Start with the one file that matters
Read skills/add-provider-package/SKILL.md first. This repository snapshot shows the skill’s main guidance is concentrated there, including:
- first-party vs third-party package expectations
- the layered provider architecture
- package structure
- step-by-step implementation guidance
- a reference PR for a complete example
Because there are no extra resources/, rules/, or helper scripts surfaced here, most of the useful signal is in SKILL.md plus existing provider packages elsewhere in the monorepo.
What input add-provider-package needs from you
To get useful output, give the skill concrete provider details, not just “add support for X.” The minimum helpful inputs are:
- provider name and package target, such as @ai-sdk/acme
- API auth method
- supported model types: chat, completion, embeddings, image, etc.
- streaming behavior
- request and response schemas
- error format and rate-limit behavior
- any provider-specific quirks, like tool calling or JSON mode differences
Without these details, the skill can outline structure but cannot reliably shape the adapter.
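One way to make those inputs concrete is to collect them as a small typed object before prompting. This is a hypothetical checklist shape, not anything defined by the AI SDK; the `ProviderSpec` name, the fields, and the "acme" provider are all illustrative.

```typescript
// Hypothetical checklist of provider details, made typed so nothing is
// forgotten before invoking the skill. None of these names come from
// @ai-sdk/provider.
interface ProviderSpec {
  packageName: string; // e.g. "@ai-sdk/acme" (illustrative)
  auth: { scheme: "bearer" | "header" | "query"; headerName?: string };
  modelTypes: Array<"chat" | "completion" | "embedding" | "image">;
  streaming: "sse" | "chunked-json" | "none";
  quirks: string[]; // tool calling, JSON mode differences, rate limits, etc.
}

const acmeSpec: ProviderSpec = {
  packageName: "@ai-sdk/acme",
  auth: { scheme: "bearer" },
  modelTypes: ["chat", "embedding"],
  streaming: "sse",
  quirks: ["tool calls arrive as a single final event"],
};
```

Pasting an object like this into the prompt gives the skill every capability fact it needs in one place.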
Turn a rough goal into a strong prompt
Weak prompt:
Use add-provider-package to add Acme AI to the SDK.
Stronger prompt:
Use add-provider-package to scaffold a new packages/acme provider for vercel/ai. Acme uses API key auth via Authorization: Bearer <key>. It supports text generation and embeddings, with SSE streaming for text. I need the package structure, main source files, likely exports, and the mapping from Acme endpoints to AI SDK model interfaces. Show the repo files I should create first and call out any ambiguous areas I must resolve from the API docs.
This works better because it gives the skill enough information to choose the right provider surface and expose unresolved integration risks early.
Recommended workflow for add-provider-package usage
A practical add-provider-package usage flow is:
- Confirm whether you are targeting a first-party or third-party package.
- Read SKILL.md for the architecture and expected package layout.
- Inspect the linked reference example PR: https://github.com/vercel/ai/pull/8136/files.
- Compare one or two existing provider packages in packages/.
- Ask the skill to map your provider API to AI SDK interfaces before generating files.
- Then ask for a package skeleton, implementation plan, and test checklist.
This sequence reduces rework because interface mismatches are caught before code generation.
Repository paths to inspect after SKILL.md
For real implementation decisions, the best next reading path is usually:
- skills/add-provider-package/SKILL.md
- the reference PR linked inside the skill
- existing packages/<provider>/src/* implementations in the monorepo
- shared interfaces in @ai-sdk/provider
- helper patterns in @ai-sdk/provider-utils
The skill explicitly describes this layered architecture, so reading those areas is how you validate generated code against the repo’s current patterns.
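The layering can be sketched in miniature. This is a deliberately simplified stand-in: the real interfaces live in @ai-sdk/provider and the real helpers in @ai-sdk/provider-utils, so every name and signature below is an assumption used only to show where each concern belongs.

```typescript
// Layer 1: spec interfaces (real code: @ai-sdk/provider).
// Simplified stand-in, not the actual interface.
interface LanguageModel {
  readonly modelId: string;
  doGenerate(prompt: string): Promise<{ text: string }>;
}

// Layer 2: shared helpers (real code: @ai-sdk/provider-utils).
// Hypothetical helper with the same spirit as the repo's key loading.
function loadApiKey(envVar: string, override?: string): string {
  const key = override ?? process.env[envVar];
  if (!key) throw new Error(`Missing API key: set ${envVar}`);
  return key;
}

// Layer 3: the concrete provider package (what you are adding).
function createAcme(options: { apiKey?: string } = {}) {
  const apiKey = loadApiKey("ACME_API_KEY", options.apiKey);
  return {
    languageModel(modelId: string): LanguageModel {
      return {
        modelId,
        async doGenerate(prompt) {
          // A real implementation would call the provider HTTP API here.
          void apiKey;
          return { text: `stub response to: ${prompt}` };
        },
      };
    },
  };
}
```

Reading the actual interfaces in the monorepo tells you which methods and result shapes replace these stubs.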
What add-provider-package covers well
The add-provider-package guide is most useful for:
- package scaffolding
- architectural fit within the AI SDK
- identifying which provider interfaces matter
- understanding how to model a provider as an adapter instead of a bespoke client
- using an existing provider addition as a reference implementation
What it does not decide for you
The skill will not remove the need to interpret the upstream provider API docs. You still need to decide:
- which capabilities deserve first-class support
- how to handle provider-specific request options
- which models to expose
- how to translate nonstandard errors and streaming payloads
- whether unsupported features should be omitted or surfaced behind provider-specific options
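Error translation is one of those decisions. As a minimal sketch, assuming a hypothetical provider error payload and a made-up normalized shape (the AI SDK has its own error classes, which this does not reproduce), the mapping might look like:

```typescript
// Hedged sketch: maps a hypothetical provider error onto a small
// normalized shape. Field names and codes are illustrative only.
interface NormalizedError {
  code: "rate_limited" | "auth" | "bad_request" | "unknown";
  retryable: boolean;
  message: string;
}

function normalizeAcmeError(
  status: number,
  body: { error?: { type?: string; msg?: string } },
): NormalizedError {
  const message = body.error?.msg ?? "unknown provider error";
  if (status === 429 || body.error?.type === "rate_limit") {
    return { code: "rate_limited", retryable: true, message };
  }
  if (status === 401 || status === 403) {
    return { code: "auth", retryable: false, message };
  }
  if (status >= 400 && status < 500) {
    return { code: "bad_request", retryable: false, message };
  }
  return { code: "unknown", retryable: status >= 500, message };
}
```

The point is that every provider needs some function in this role, and its branches come from the provider docs, not from the skill.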
Practical prompt patterns for add-provider-package for API Development
If you are using add-provider-package for API Development, ask for outputs in decision order:
- capability mapping
- package/file plan
- type/interface plan
- request/response transformation plan
- tests and edge cases
Example:
Use add-provider-package to plan an @ai-sdk/zen package. First, map Zen's endpoints to AI SDK interfaces. Second, propose the package file tree. Third, list the core transforms for text generation, embeddings, and streaming. Finally, list the top 10 edge cases to test.
This produces a more implementable result than asking for a full code dump up front.
Common blockers before coding
The biggest blockers are usually not syntax errors but missing product decisions:
- Is this intended to be first-party or third-party?
- Does the provider API actually match AI SDK abstractions cleanly?
- Are streaming events stable enough to adapt?
- Which model capabilities are mature enough to expose?
- Are there existing providers with a closer shape you should copy?
Use the skill early to surface those questions before building the package.
add-provider-package skill FAQ
Is add-provider-package only for Vercel maintainers?
No. The skill is useful for outside contributors and third-party package authors too. The source explicitly distinguishes third-party packages from first-party @ai-sdk/<provider> packages and notes that first-party additions should be discussed first.
Is add-provider-package good for beginners?
It is usable by beginners who already know the target provider’s API, but it is not a beginner-first tutorial on TypeScript, package publishing, or SDK design. Its value is highest for people who need repo-specific guidance and architectural fit.
How is this different from asking an LLM to build a provider wrapper?
A normal prompt may generate plausible code without matching the AI SDK’s package structure or interfaces. The add-provider-package skill anchors the work to the monorepo’s adapter architecture and points you toward a concrete reference implementation.
Can I use add-provider-package outside the vercel/ai repo?
Yes, as a pattern reference. But the closer your project is to the AI SDK’s provider abstractions and package layout, the more transferable the output will be. If your codebase uses different interfaces or publishing conventions, expect adaptation work.
When should I not use add-provider-package?
Skip it if you are:
- building a general-purpose API client
- integrating a provider outside the AI SDK model
- looking for frontend app examples rather than provider package code
- not prepared to inspect existing provider implementations for parity
Does add-provider-package include a full end-to-end example?
It includes a reference pointer to a full provider addition via PR: https://github.com/vercel/ai/pull/8136/files. That is one of the most valuable parts of the skill because it shows what a completed addition looks like in context.
How to Improve add-provider-package skill
Give add-provider-package capability-level inputs
The fastest way to improve output quality is to describe provider capabilities precisely. Instead of “supports chat,” provide:
- endpoint names
- streaming protocol
- tool calling support
- structured output behavior
- embeddings dimensionality or request format
- auth and headers
- retry or rate-limit quirks
This lets the skill reason about interface fit rather than guessing from marketing terms.
Ask for a gap analysis before code generation
A strong first step is:
Use add-provider-package to identify the gaps between this provider API and AI SDK expectations before proposing code.
This is often better than asking for scaffolding immediately, because it surfaces missing features, incompatible streaming formats, or areas that need provider-specific options.
Reference a similar existing provider
If you know a provider in vercel/ai with a similar API shape, say so. For example:
Use add-provider-package and model this after the provider package that has the closest SSE text streaming and embeddings support.
This improves consistency and reduces invented abstractions.
Request file-by-file output, not one giant draft
The skill is easier to validate when you ask for:
- package tree
- src/index.ts exports
- provider factory
- model implementation files
- tests
- package metadata
This makes review more reliable than generating an entire package in one pass.
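To make the "src/index.ts exports" step concrete, here is a hypothetical shape for that file, following the common factory-plus-default-instance pattern. The names, settings, and URL are assumptions for illustration, not the convention checked into vercel/ai; validate against an existing provider package.

```typescript
// Hypothetical src/index.ts for an "acme" provider package.
export interface AcmeProviderSettings {
  apiKey?: string;
  baseURL?: string;
}

export function createAcme(settings: AcmeProviderSettings = {}) {
  const baseURL = settings.baseURL ?? "https://api.acme.example/v1";
  return {
    // Stub model constructors; real ones return objects implementing
    // the @ai-sdk/provider interfaces.
    languageModel: (modelId: string) => ({ modelId, baseURL }),
    textEmbeddingModel: (modelId: string) => ({ modelId, baseURL }),
  };
}

// Default instance, the usual convenience export.
export const acme = createAcme();
```

Reviewing one small file like this per pass is much easier than validating an entire generated package at once.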
Common failure modes to watch for
When using add-provider-package, review outputs for:
- unsupported capabilities exposed as if they are stable
- streaming mapped too optimistically
- provider-specific options leaking into generic interfaces
- missing error normalization
- package structure that does not match existing repo conventions
- code that ignores first-party vs third-party process differences
Improve prompts with concrete API samples
If the first output is too abstract, add real request and response samples from the provider docs. This is one of the highest-leverage improvements because provider packages live or die on transformation correctness.
Good follow-up prompt:
Here are the exact JSON request and SSE response shapes for text generation. Revise the add-provider-package plan so the model implementation and streaming parser match these payloads.
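Why this matters: streaming parsers are exactly the code that breaks when the payload shape is guessed. As a minimal sketch, assuming SSE events of the form `data: {"delta":"..."}` terminated by `data: [DONE]` (a common convention, but one you must confirm against the provider's real samples):

```typescript
// Minimal SSE text accumulator for a hypothetical stream format.
// Assumes `data: {"delta":"..."}` events and a `data: [DONE]` sentinel.
function parseSseText(raw: string): string {
  let text = "";
  for (const line of raw.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip comments/blank lines
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const event = JSON.parse(payload) as { delta?: string };
    text += event.delta ?? "";
  }
  return text;
}

const sample = [
  'data: {"delta":"Hel"}',
  'data: {"delta":"lo"}',
  "data: [DONE]",
].join("\n");
// parseSseText(sample) === "Hello"
```

Feeding real captured payloads through a parser like this is a fast way to check the skill's proposed transforms before wiring them into model implementations.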
Iterate on unresolved decisions explicitly
After the first pass, ask the skill to separate:
- confident implementation steps
- assumptions
- open questions requiring provider docs
- likely tests
That structure makes the skill more actionable and reduces hidden guesswork.
Best way to validate add-provider-package output
Treat the output as a repo-aware implementation plan, then validate it against:
- SKILL.md
- the linked reference PR
- one or two existing provider packages
- @ai-sdk/provider interfaces
- the target provider’s official API docs
That validation loop is the best way to improve add-provider-package usage from “helpful draft” to a package you can actually merge or publish.
