This blog post focuses on new features and improvements. For a complete list, including bug fixes, please see the release notes.
Three-Command CLI Workflow for Model Deployment
Getting models from development to production typically involves multiple tools, configuration files, and deployment steps. You scaffold a model locally, test it in isolation, configure infrastructure, write deployment scripts, and then push to production. Every step requires context switching and manual coordination.
With Clarifai 12.2, we've streamlined this into a 3-command workflow: model init, model serve, and model deploy. These commands handle scaffolding, local testing, and production deployment, with automatic infrastructure provisioning, GPU selection, and health checks built in.
This isn't just faster. It removes the friction between building a model and running it at scale. The CLI handles dependency management, runtime configuration, and deployment orchestration, so you can focus on model logic instead of infrastructure setup.
This release also introduces Training on Pipelines, allowing you to train models directly within pipeline workflows using dedicated compute resources. We've added Video Intelligence support via the UI, improved artifact lifecycle management, and expanded deployment capabilities with dynamic nodepool routing and new cloud provider support.
Let's walk through what's new and how to get started.
Streamlined Model Deployment: 3 Commands to Production
The typical model deployment workflow involves multiple steps: scaffold a project structure, install dependencies, write configuration files, test locally, containerize, provision infrastructure, and deploy. Each step requires switching contexts and managing configuration across different tools.
Clarifai's CLI consolidates this into three commands that handle the entire lifecycle, from scaffolding to production deployment.
How It Works
1. Initialize a model project
clarifai model init --toolkit vllm --model-name Qwen/Qwen3-0.6B
This scaffolds a complete model directory with the structure Clarifai expects: config.yaml, requirements.txt, and model.py. You can use built-in toolkits (HuggingFace, vLLM, LMStudio, Ollama) or start from scratch with a base template.
The generated config.yaml includes sensible defaults for runtime settings, compute requirements, and deployment configuration. You can modify these or leave them as-is for basic deployments.
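As a rough illustration, a scaffolded config might look like the sketch below. The key names here are representative of the kinds of settings the scaffold generates (model identity, build settings, compute requirements), not a guaranteed match for the real file, so treat the generated config.yaml as the source of truth:

```yaml
# Illustrative config.yaml sketch -- key names are representative,
# not guaranteed to match the generated file exactly.
model:
  id: qwen3-0-6b
  model_type_id: text-to-text

build_info:
  python_version: "3.12"

inference_compute_info:
  cpu_limit: "2"
  cpu_memory: "8Gi"
  num_accelerators: 1
  accelerator_memory: "24Gi"
```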
2. Test locally
clarifai model serve
This starts a local inference server that behaves exactly like the production deployment. You can test your model with real requests, verify behavior, and iterate quickly without deploying to the cloud.
The serve command supports multiple modes:
- Environment mode: Runs directly in your local Python environment
- Docker mode: Builds and runs in a container for production parity
- Standalone gRPC mode: Exposes a gRPC endpoint for integration testing
3. Deploy to production
clarifai model deploy
This command handles everything: it validates your config, builds the container, provisions infrastructure (cluster, nodepool, deployment), and monitors until the model is ready.
The CLI shows structured deployment phases with progress indicators, so you know exactly what's happening at each step. Once deployed, you get a public API endpoint that is ready to handle inference requests.
Intelligent Infrastructure Provisioning
The CLI now handles GPU selection automatically during model initialization. GPU auto-selection analyzes your model's memory requirements and toolkit specifications, then selects appropriate GPU instances.
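The selection logic itself lives inside the CLI, but the core sizing arithmetic is easy to reason about. As a rough, hypothetical sketch (not Clarifai's actual algorithm): the weights dominate, so parameter count times bytes per parameter, padded for activations and KV cache, gives a floor on the GPU memory an instance needs.

```python
def estimate_min_vram_gib(params_billion: float,
                          bytes_per_param: int = 2,
                          overhead: float = 1.3) -> float:
    """Rough lower bound on GPU memory needed to serve a model.

    params_billion: parameter count in billions (e.g. 0.6 for Qwen3-0.6B)
    bytes_per_param: 2 for fp16/bf16 weights, 1 for int8 quantization
    overhead: multiplier covering activations, KV cache, runtime buffers
    """
    weight_bytes = params_billion * 1e9 * bytes_per_param
    return weight_bytes * overhead / 2**30  # bytes -> GiB

# A 0.6B-parameter model in bf16 fits on almost any modern GPU,
# while a 70B model at the same precision needs multi-GPU memory.
print(round(estimate_min_vram_gib(0.6), 2))  # ~1.45 GiB
print(round(estimate_min_vram_gib(70), 1))   # ~169.5 GiB
```

This is why a 0.6B model like the one scaffolded above can land on a small instance, while larger models trigger selection of an h100-class GPU or bigger.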
Multi-cloud instance discovery works across cloud providers. You can use GPU shorthands like h100 or legacy instance names, and the CLI normalizes them across AWS, Azure, DigitalOcean, and other supported providers.
Custom Docker base images let you optimize build times. If you have a pre-built image with common dependencies, the CLI can use it as a base layer for faster toolkit builds.
Deployment Lifecycle Management
Once deployed, you need visibility into how models are running and the ability to manage them. The CLI provides commands for the full deployment lifecycle:
Check deployment status:
clarifai model status --deployment
View logs:
clarifai model logs --deployment
Undeploy:
clarifai model undeploy --deployment
The CLI also supports managing deployments directly by ID, which is useful for scripting or CI/CD pipelines.
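A CI/CD gate could chain these lifecycle commands together. The sketch below is purely illustrative: the subcommands are the ones above, but the exact flag shape and status output format are assumptions, so check `clarifai model status --help` for the real interface:

```bash
#!/usr/bin/env bash
# Hypothetical CI sketch: wait for a deployment to come up, then tail logs.
# Status text ("ready") and flag usage are illustrative assumptions.
set -euo pipefail

DEPLOYMENT_ID="$1"

until clarifai model status --deployment "$DEPLOYMENT_ID" | grep -qi "ready"; do
  echo "waiting for deployment ${DEPLOYMENT_ID}..."
  sleep 10
done

clarifai model logs --deployment "$DEPLOYMENT_ID"
```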
Enhanced Local Development
Local testing is crucial for fast iteration, but it often diverges from production behavior. The CLI bridges this gap with local runners that mirror production environments.
The model serve command now supports:
- Concurrency controls: Limit the number of simultaneous requests to simulate production load
- Optional Docker image retention: Keep built images for faster restarts during development
- Health-check configuration: Configure health-check settings using flags like --health-check-port, --disable-health-check, and --auto-find-health-check-port
Local runners also support the same inference modes as production (streaming, batch, multi-input), so you can test complex workflows locally before deploying.
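Putting the health-check flags together, a few example invocations (a sketch of likely usage; run `clarifai model serve --help` for the authoritative flag list):

```bash
# Serve in the current Python environment with defaults
clarifai model serve

# Pin the health-check port, or let the CLI find a free one
clarifai model serve --health-check-port 8080
clarifai model serve --auto-find-health-check-port

# Skip health checks entirely, e.g. for a quick local smoke test
clarifai model serve --disable-health-check
```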
Simplified Configuration
Model configuration used to require manually editing YAML files with exact field names and nested structures. The CLI now handles normalization automatically.
When you initialize a model, config.yaml includes only the fields you need to customize. Sensible defaults fill in the rest. If you add fields with slightly incorrect names or formats, the CLI normalizes them during deployment.
This reduces configuration errors and makes it easier to migrate existing models to Clarifai.
Why This Matters
The three-command workflow removes friction from model deployment. You go from idea to production API in minutes instead of hours or days. The CLI handles infrastructure complexity, so you don't need to be an expert in Kubernetes, Docker, or cloud compute to deploy models at scale.
This also standardizes deployment across teams. Everyone uses the same commands, the same configuration format, and the same testing workflow. This makes it easier to share models, reproduce deployments, and onboard new team members.
For a complete guide to the new CLI workflow, including examples and advanced configuration options, see the Deploy Your First Model via CLI documentation.
Training on Pipelines
Clarifai Pipelines, introduced in 12.0, let you define and execute long-running, multi-step AI workflows. With 12.2, you can now train models directly within pipeline workflows using dedicated compute resources.
Training on Pipelines integrates model training into the same orchestration layer as inference and data processing. This means training jobs run on the same infrastructure as your other workloads, with the same autoscaling, monitoring, and cost controls.
How It Works
You can initialize training pipelines from templates via the CLI. This creates a pipeline structure with pre-configured training steps. You specify your dataset, model architecture, and training parameters in the pipeline configuration, then run it like any other pipeline.
The platform handles:
- Provisioning GPUs for training workloads
- Scaling compute based on job requirements
- Saving checkpoints as Artifacts for versioning
- Monitoring training metrics and logs
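A training pipeline configuration ties these pieces together in one place. The sketch below is entirely hypothetical: every key is invented to show the shape of a dataset, architecture, and training parameters living in a single pipeline config, not the actual Pipelines schema (see the Pipelines documentation for that):

```yaml
# Hypothetical sketch only -- not the real Pipelines schema.
pipeline:
  id: train-classifier
  steps:
    - id: train
      template: model-training          # pre-configured training step
      params:
        dataset_id: my-dataset
        model_architecture: resnet-50
        epochs: 20
        learning_rate: 0.001
        checkpoint_artifact: my-checkpoints   # saved as an Artifact
```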
Once training completes, the resulting model is automatically compatible with Clarifai's Compute Orchestration platform, so you can deploy it using the same model deploy workflow. Read more about Pipelines here.
UI Experience
We've also introduced a new UI for training models within pipelines. You can configure training parameters, select datasets, and monitor progress directly from the platform without writing code or managing infrastructure.
This makes it easier for teams without deep ML engineering expertise to train custom models and integrate them into production workflows.
Training on Pipelines is available in Public Preview. For more details, see the Pipelines documentation.
Artifact Lifecycle Improvements
With 12.2, we've improved how Artifacts handle expiration and versioning.
Artifacts no longer expire automatically by default. Previously, artifacts had a default retention policy that would delete them after a certain period. Now, artifacts persist indefinitely unless you explicitly set an expires_at value during upload.
This gives you full control over artifact lifecycle management. You can set expiration dates for temporary outputs (like intermediate checkpoints during experimentation) while keeping production artifacts indefinitely.
The CLI now displays latest-version-id alongside artifact visibility, making it easier to reference the newest version without listing all versions first.
These changes make Artifacts more predictable and easier to manage for long-term storage of pipeline outputs.
Video Intelligence
Clarifai now supports video intelligence via the UI. You can connect video streams to your application and apply AI analysis to detect objects, track movement, and generate insights in real time.
This expands Clarifai's capabilities beyond image and text processing to handle live video feeds, enabling use cases like security monitoring, retail analytics, and automated content moderation for video platforms.
Video Intelligence is available now.
Deployment Enhancements
We've made several improvements to how deployments work across compute infrastructure.
Dynamic nodepool routing lets you attach multiple nodepools to a single deployment with configurable scheduling strategies. This gives you more control over how traffic is distributed across different compute resources, which is useful for handling spillover traffic or routing to specific hardware based on request type.
Deployment visibility has been improved with status chips and enhanced list views across Deployments, Nodepools, and Clusters. You can see at a glance which deployments are healthy, which are scaling, and which need attention.
New cloud provider support: We've added DigitalOcean and Azure as supported instance providers, giving you more flexibility in where you deploy models.
Start and stop deployments explicitly: You can now pause deployments without deleting them. This preserves configuration while freeing up compute resources, which is useful for dev/test environments or models with intermittent traffic.
The redesigned Deployment details page provides expanded status visibility, including replica counts, nodepool health, and request metrics, all in one view.
Additional Changes
Platform Updates
We've introduced several UI improvements to make the platform easier to navigate and use:
- New Model Library UI provides a streamlined experience for browsing and exploring models
- Universal Search added to the navbar for quick access to models, datasets, and workflows
- New account experience with improved onboarding and settings management
- Home 3.0 interface with a refreshed design and better organization of recent activity
Playground Enhancements
The Playground now includes major upgrades to the Universal Search experience, with multi-panel (compare mode) support, improved workspace handling, and smarter model auto-selection. Model selections are panel-aware to prevent cross-panel conflicts, and the UI can display simplified model names for a cleaner experience.
Pipeline Step Visibility
You can now set pipeline steps to be publicly visible during initialization via both the CLI and builder APIs. By default, pipelines and pipeline step templates are created with PRIVATE visibility, but you can override this when sharing workflows across teams or with the community.
Modules Deprecation
Support for Modules has been fully dropped. Modules previously extended Clarifai's UIs and enabled customized backend processing, but they have been replaced by more flexible alternatives like Artifacts and Pipelines.
Python SDK Updates
We've made several improvements to the Python SDK, including:
- Fixed the ModelRunner health server starting twice, which could cause "Address already in use" errors
- Added admission-control support for model runners
- Improved signal handling and zombie-process reaping in runner containers
- Refactored the MCP server implementation for better logging clarity
For a complete list of SDK updates, see the Python SDK changelog.
Ready to Start Building?
You can start using the new 3-command deployment workflow today. Initialize a model with clarifai model init, test it locally with clarifai model serve, and deploy to production with clarifai model deploy.
For teams running long-running training jobs, Training on Pipelines provides a way to integrate model training into the same orchestration layer as your inference workloads, with dedicated compute and automatic checkpoint management.
Video Intelligence support adds real-time video stream processing to the platform, and the deployment improvements give you more control over how models run across different compute environments.
The new CLI workflow is available now. Check out the Deploy Your First Model via CLI guide to get started, or explore the full 12.2 release notes for complete details.
Sign up here to get started with Clarifai, or check out the documentation for more information.
If you have questions or need help while building, join us on Discord. Our community and team are there to help.
