AI

Introducing Pipelines for Long-Running AI Workflows

AllTopicsToday
Published: January 18, 2026
Last updated: January 18, 2026 8:22 pm

This blog post focuses on new features and enhancements. For a complete list, including bug fixes, see the release notes.

Clarifai's compute orchestration lets you deploy models on your own compute, control how they scale, and decide where inference runs across clusters and node pools.

As AI systems move beyond single inference calls to long-running tasks, multi-step workflows, and agent-driven execution, orchestration becomes more than just launching containers. You need to manage execution over time, handle failures, and route traffic intelligently across your compute.

This release builds on that foundation with native support for long-running pipelines, model routing across node pools and environments, and agentic model execution using the Model Context Protocol (MCP).

Deploying pipelines for long-running, multi-step AI workflows

AI systems rarely break on a single inference call. Workflows break when they span multiple steps, run for hours, or need to recover from failures.

Today, teams string together scripts, cron jobs, and queue workers to manage these workflows. As agentic workloads and MLOps pipelines grow more complex, that setup becomes harder to operate, debug, and scale.

In Clarifai 12.0, pipelines are the native way to define, execute, and manage long-running, multi-step AI workflows directly on the Clarifai platform.

Why use pipelines?

Most AI platforms are optimized for short-lived inference calls. Real production workflows look very different:

  • Multi-step agent logic across tools, models, and external APIs
  • Long-running jobs such as batch processing, fine-tuning, and evaluation
  • End-to-end MLOps workflows that require reproducibility, versioning, and control

Pipelines are built to handle this class of problems.

Clarifai Pipelines serve as the orchestration backbone for advanced AI systems. They let you define container-based steps, control execution order and parallelism, manage state and secrets, and monitor execution from start to finish, without bolting together separate orchestration infrastructure.

Each pipeline is versioned, reproducible, and runs on Clarifai-managed compute, giving you granular control over how complex AI workflows run at scale.

Read on to learn how pipelines work, what you can build with them, and how to get started using the CLI and API.

How pipelines work

At a high level, a Clarifai pipeline is a versioned, multi-step workflow made up of containerized steps that run asynchronously on Clarifai compute.

Each step is an isolated unit of execution with its own code, dependencies, and resource settings. A pipeline defines how those steps are connected, whether they run sequentially or in parallel, and how data flows between them.

You define and upload a pipeline once, then trigger executions that can run for minutes, hours, or longer.

Initialize the pipeline project

Initializing a pipeline scaffolds a complete pipeline project using the same structure and conventions as Clarifai custom models.

Each pipeline step follows the exact same footprint (a configuration file, a dependency file, and an executable Python entry point) that developers already use when uploading models to Clarifai.

A typical scaffolded pipeline looks like this:
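A plausible layout, assuming the same conventions as Clarifai custom-model projects (the directory names and the versioned 1/ folder below are illustrative assumptions, not the exact scaffold output):

```
my-pipeline/
├── config.yaml                  # pipeline-level: step ordering and coordination
└── steps/
    ├── step-1/
    │   ├── config.yaml          # step inputs, runtime, compute requirements
    │   ├── requirements.txt     # Python dependencies for this step
    │   └── 1/
    │       └── pipeline_step.py # execution logic for this step
    └── step-2/
        └── ...
```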

At the pipeline level, config.yaml defines how steps are connected and coordinated, including execution order, parameters, and dependencies between steps.
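As a rough sketch, a pipeline-level config.yaml might wire two steps together like this. The field names below are illustrative assumptions, not the exact Clarifai schema; see the pipeline documentation for the real format:

```yaml
# Hypothetical pipeline-level config.yaml (field names are illustrative).
pipeline:
  id: my-pipeline
  steps:
    - id: preprocess             # runs first
    - id: run-model
      depends_on: [preprocess]   # runs after preprocess completes
```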

Each step is a self-contained unit that looks and behaves exactly like a custom model:

  • config.yaml defines the inputs, runtime, and compute requirements for the step
  • requirements.txt specifies the Python dependencies for that step
  • pipeline_step.py contains the actual execution logic, where you write code to process data, call models, and interact with external systems
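To make the shape of a step concrete, here is a minimal sketch of the execution logic that would live in pipeline_step.py. A real entry point subclasses a base class from the Clarifai SDK; the plain function below is a stand-in (its name and signature are assumptions) that only shows data flowing into and out of a step:

```python
# Hypothetical stand-in for a pipeline step's execution logic.
# A real step would subclass the Clarifai SDK's step base class; this plain
# function only illustrates the input -> process -> output shape of a step.

def run_step(inputs: dict) -> dict:
    """Process the upstream step's output and emit data for the next step."""
    text = inputs.get("text", "")
    # Placeholder "work": a real step might call a model or an external API here.
    return {"summary": text[:100], "length": len(text)}

result = run_step({"text": "hello world"})
```

Because each step is containerized, whatever this function returns is what the next step in the pipeline receives as its input.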

This is why building pipelines feels instantly familiar. If you have already uploaded a custom model to Clarifai, you are working with the same configuration style, the same versioning model, and the same deployment mechanism; you are just configuring a multi-step workflow.

Upload the pipeline

Clarifai builds and versions each step as a containerized artifact, ensuring reproducible execution.

Run the pipeline

Once a run is triggered, you can monitor progress, inspect logs, and manage the execution directly through the platform.

Under the hood, pipeline execution is powered by Argo Workflows, which lets Clarifai reliably orchestrate long-running, multi-step jobs with proper dependency management, retries, and failure handling.

Pipelines are designed to support everything from automated MLOps workflows to advanced AI agent orchestration, without requiring you to operate your own workflow engine.

Note: Pipelines are currently available in public preview.

Try them now. We welcome your feedback as we continue to iterate. For step-by-step guides on defining steps, uploading pipelines, managing runs, and building more advanced workflows, see the detailed documentation.

Model routing with multi-node pool deployments

Compute orchestration now supports routing models across multiple node pools within a single deployment.

Model routing lets a deployment reference multiple existing node pools in deployment_config.yaml. These node pools can belong to different clusters and can span cloud, on-premises, or hybrid environments.

Here is how model routing works:

  • Node pools are treated as an ordered priority list. By default, requests are routed to the first node pool.
  • A node pool is considered fully loaded when queued requests exceed the configured age or volume thresholds and the deployment has reached its max_replicas limit, or the node pool has reached maximum instance capacity.
  • When this happens, the next node pool in the list automatically warms up and some of your traffic is routed to it.
  • The deployment's min_replicas applies only to the primary node pool.
  • The deployment's max_replicas is applied to each node pool independently, not as a global total.

This approach enables high availability and predictable scaling without duplicating deployments or manually managing failover. A deployment can now span multiple compute pools while operating as a single resilient service.
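The routing behavior above could be expressed in deployment_config.yaml roughly like this. Apart from min_replicas, max_replicas, and the file name itself, the field names are illustrative assumptions; check the deployment documentation for the exact schema:

```yaml
# Hypothetical multi-node-pool deployment_config.yaml (field names illustrative).
deployment:
  id: my-model-deployment
  min_replicas: 1        # applies only to the primary (first) node pool
  max_replicas: 4        # enforced per node pool, not as a global total
  nodepools:             # ordered priority list; traffic spills over in order
    - id: gpu-pool-primary      # default target for requests
      compute_cluster_id: cluster-cloud
    - id: gpu-pool-overflow     # warms up when the primary pool is fully loaded
      compute_cluster_id: cluster-onprem
```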

Learn more about multi-node pool deployments in the documentation.

Agentic capabilities with MCP support

Clarifai extends its support for agentic AI systems by pairing agent-enabled models with Model Context Protocol (MCP) integration. Models remain fully managed on the Clarifai platform and can discover and call tools from both custom and open-source MCP servers during inference.

Agentic models with MCP integration

To upload a model with agentic functionality, use AgenticModelClass, which extends the standard model classes to support tool discovery and execution. The upload workflow is the same as for existing custom models and uses the same project structure, configuration files, and deployment process.

An agentic model is configured to work with an MCP server that exposes the tools the model can call during inference.

The main features are:

  • Repeated tool calls within a single prediction or generation request
  • Tool discovery and execution handled by the agentic model classes
  • Support for both streaming and non-streaming inference
  • Compatibility with the OpenAI-compatible API and the Clarifai SDK
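Because agentic models speak the OpenAI-compatible API, a client request is just an ordinary chat-completion payload. The sketch below builds one using only the standard library; the model identifier is an illustrative assumption, and the tool-call loop itself runs server-side inside the agentic model class:

```python
import json

# Sketch of an OpenAI-compatible chat request to a Clarifai-hosted agentic
# model. The model identifier is illustrative; tool discovery and any repeated
# tool calls happen server-side during this single request.
payload = {
    "model": "openai/gpt-oss-20b",  # illustrative model id, not a real path
    "messages": [
        {"role": "user", "content": "Check the weather in Tokyo and summarize it."}
    ],
    "stream": False,  # streaming responses are also supported
}

body = json.dumps(payload)   # what an OpenAI-compatible client would send
decoded = json.loads(body)
```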

A complete example of uploading and running an agentic model is available in this repository, which shows how to upload a GPT-OSS-20B model with agentic functionality enabled using AgenticModelClass.

Deploying a public MCP server on Clarifai

Clarifai already supports deploying custom MCP servers, letting teams build their own tool servers and run them on the platform. This release goes further by making it easy to deploy public MCP servers directly on the platform.

Teams can now add public MCP servers with a simple configuration, without having to host or manage server infrastructure. These servers can be deployed once and shared across models and workflows, giving agentic models access to the same tools.

This example shows how to deploy a public open-source MCP server as an API endpoint on Clarifai.

Pay-as-you-go billing with prepaid credits

We have introduced a new pay-as-you-go (PAYG) plan designed to make billing simpler and more predictable for self-service users.

The PAYG plan has no monthly minimum and far fewer feature gates. You prepay credits to use across the platform and pay only for what you consume. For reliability, the plan also includes automatic recharge, so long-running jobs do not stop unexpectedly when you run low on credits.

To help you get started, all authenticated users receive a one-time $5 welcome credit that can be used across inference, compute orchestration, deployment, and more. You can also claim an additional $5 credit for your organization.

To learn more about how prepaid credits work, what has changed from earlier plans, and why we made this change, read the full blog post.

Clarifai as an inference provider for the Vercel AI SDK

Clarifai is now available as an inference provider for the Vercel AI SDK. Clarifai-hosted models can be used directly through an OpenAI-compatible interface via @ai-sdk/openai-compatible, without changing existing application logic.

This lets you swap Clarifai-hosted models into production inference while continuing to use the same Vercel AI SDK workflows you already rely on. See the documentation for details.

New reasoning models in the Ministral 3 family

Clarifai has published two new open-weight reasoning models from the Ministral 3 family.

Ministral-3-3B-Reasoning-2512

Designed for efficiency, this compact reasoning model delivers strong performance while remaining practical to deploy on real hardware.

Ministral-3-14B-Reasoning-2512

The largest model in the Ministral 3 family, it offers reasoning performance approaching much larger systems while retaining the benefits of an efficient open-weight design.

Both models are available now and can be used across Clarifai inference, orchestration, and deployment workflows.

Additional changes

Platform updates

We have made several targeted improvements across the platform to improve usability and day-to-day workflows.

  • Added clear-filter controls to the Control Center to make charts easier to navigate and interpret.
  • Improved the Teams and Logs view so that today's audit logs are included when selecting the last 7 days.
  • In the Playground's Compare Mode, you can now stop a response directly from the right panel.

Python SDK updates

This release includes extensive improvements to the Python SDK and CLI, focused on stability, local runners, and developer experience.

  • Local model runner reliability improvements, including vLLM compatibility, checkpoint download, and runner ID conflict fixes.
  • Better artifact management and an interactive config.yaml flow during model upload.
  • Expanded test coverage and improved error handling across the runner, model loading, and OpenAI-compatible API calls.
  • Several additional fixes and improvements covering dependency upgrades, environment handling, and CLI robustness. See the release notes for more information.

Ready to start building?

Start building today with Clarifai Pipelines and run long-running, multi-step workflows directly on the platform. Define steps, upload them with the CLI, and monitor their execution across your compute.

For production environments, model routing lets you scale across multiple node pools and clusters with built-in spillover and high availability.

If you are building agentic systems, you can also enable agentic model support with your MCP servers to give your models access to tools during inference.

Pipelines are available in public preview. We welcome your feedback as we build.
