AI engine
In CyberSEO Pro and RSS Retriever, the AI engine is a unique identifier that fully describes both the AI provider (or endpoint) and the specific model being used. It's a single string that combines the provider_id or endpoint_id with the model_id, separated by a hyphen (e.g., openai-gpt-4o) or, in the case of OpenRouter, a slash (e.g., meta-llama/llama-3.1-405b-instruct).
There are three main formats:
- provider_id-model_id — for integrated providers (e.g., OpenAI, Google, Anthropic, xAI)
- provider_id/model_id — for OpenRouter
- endpoint_id-model_id — for custom endpoints defined by the user
The format is unified regardless of whether the provider is integrated or custom. The part before the first hyphen is always treated as the provider_id (for integrated providers) or the endpoint_id (for user-defined APIs). Everything after the first hyphen is interpreted as the model_id — even if it contains additional hyphens or slashes.
Essentially, the endpoint_id defines the API location, which means it implicitly identifies the provider. Whether it's OpenAI, a third-party like OpenRouter, or a self-hosted model, the unique ID used to register the endpoint becomes the prefix for all associated models. This makes engine strings unambiguous and allows full flexibility when configuring models.
💡 In all fields of the plugin interface where you enter an AI engine, there’s also a “Quick Select” drop-down menu that shows examples of engines for different AI providers and models. Beginners are strongly encouraged to use this menu to avoid mistakes while learning the format.
Examples:
- Integrated AI provider: openai-gpt-4o
- OpenRouter (built-in API access): meta-llama/llama-3.1-405b-instruct
- Custom endpoint (added manually by user): openrouter-meta-llama/llama-3.1-405b-instruct
The AI model, in contrast, refers to just the model ID passed directly to the API, without including the provider or endpoint.
Examples of model usage:
[openai_gpt model="gpt-4o"]
[claude model="claude-3-sonnet"]
[gemini model="gemini-pro"]
[xai model="grok"]
[custom_ai id="deepseek" model="deepseek-chat"]
[or_text model="meta-llama/llama-3.1-405b-instruct"]
In each case, the provider is determined either by the shortcode name (e.g., [openai_gpt]) or by the id attribute in the shortcode (e.g., [custom_ai id="..."]).
Where AI engines are used
AI engine strings are used in the following places:
- Universal shortcodes, e.g.:
[ai_generate engine="openai-gpt-4o"]
[gpt_article engine="meta-llama/llama-3.1-405b-instruct"]
- Plugin UI settings where a single field defines both provider and model (e.g., rewriter, translator, comment bot, etc.)
This format allows maximum flexibility and compatibility with native APIs, OpenRouter, and any custom-defined AI endpoints.
Rules for naming engines with custom endpoints
You can register custom AI endpoints via the Accounts tab in the plugin settings. When creating a custom endpoint, you specify a unique endpoint_id, which serves as the first part of your engine string.
The format is:
engine = endpoint_id + "-" + model_id
The model_id can include hyphens or slashes — that doesn't affect parsing. Everything after the first hyphen is treated as the model ID.
Example of using DeepSeek via a user-defined endpoint:
- Endpoint ID: deepseek
- Endpoint URL: https://api.deepseek.com/chat/completions
- Model ID: deepseek-chat
- Engine: deepseek-deepseek-chat
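The engine string above may look odd at first glance because both halves start with "deepseek", but the first-hyphen rule keeps it unambiguous. A minimal round-trip sketch (illustrative Python, not the plugin's internal code):

```python
def make_engine(endpoint_id: str, model_id: str) -> str:
    """Compose an engine string: endpoint_id + "-" + model_id."""
    return f"{endpoint_id}-{model_id}"

def split_engine(engine: str):
    """Recover (endpoint_id, model_id) by splitting at the first hyphen."""
    endpoint_id, _, model_id = engine.partition("-")
    return endpoint_id, model_id

engine = make_engine("deepseek", "deepseek-chat")
print(engine)                # deepseek-deepseek-chat
print(split_engine(engine))  # ('deepseek', 'deepseek-chat')
```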
This approach allows you to integrate third-party APIs, self-hosted models, and new providers not yet supported directly by the plugin. Since the endpoint_id points to a specific API URL, it also inherently defines the provider for that engine string.
Summary
- Use engine with [ai_generate], [gpt_article], or in plugin fields where both provider and model must be specified.
- Use model with dedicated shortcodes like [openai_gpt], [claude], [custom_ai], or [or_text].
| Parameter | Example | Used In |
|---|---|---|
| engine | openai-gpt-4o | Plugin settings user interface, [ai_generate engine="..."], [gpt_article engine="..."] |
| | meta-llama/llama-3.1-405b-instruct | |
| | deepseek-deepseek-chat | |
| model | gpt-4o | [openai_gpt model="..."] |
| | claude-3-sonnet | [claude model="..."] |
| | meta-llama/llama-3.1-405b-instruct | [or_text model="..."] |
If you're unsure, use [ai_generate] — it's universal and compatible with all supported and custom-defined engines.