Thursday, January 15, 2026

How AutoScout24 built a Bot Factory to standardize AI agent development with Amazon Bedrock


AutoScout24 is Europe’s leading automotive marketplace platform, connecting buyers and sellers of new and used cars, motorcycles, and commercial vehicles across multiple European countries. Their long-term vision is to build a Bot Factory, a centralized framework for creating and deploying artificial intelligence (AI) agents that can perform tasks and make decisions within workflows, to significantly improve operational efficiency across their organization.

From disparate experiments to a standardized framework

As generative AI agents (systems that can reason, plan, and act) become more powerful, the opportunity to improve internal productivity at AutoScout24 was clear, and several engineering teams began experimenting with the technology. As AI innovation accelerated across AutoScout24, they recognized an opportunity to pioneer a standardized approach to AI development. While AutoScout24 had successfully experimented with various tools and frameworks on Amazon Web Services (AWS), they envisioned a unified, enterprise-grade framework that would enable faster innovation. Their goal was to establish a paved path that would make it easier for teams across the organization to build secure, scalable, and maintainable AI agents. The AutoScout24 AI Platform Engineering team partnered with the AWS Prototyping and Cloud Engineering (PACE) team in a three-week AI bootcamp. The goal was to move from fragmented experiments to a coherent strategy by creating a reusable blueprint, a Bot Factory, to standardize how future AI agents are built and operated within the company.

The challenge: identifying a high-impact use case

To ground the Bot Factory blueprint in a tangible business case, the team targeted a significant operational cost: internal developer support. The problem was well defined. AutoScout24 AI Platform engineers were spending up to 30% of their time on repetitive tasks such as answering questions, granting access to tools, and locating documentation. This support tax reduced overall productivity: it diverted skilled engineers from high-priority feature development and forced other developers to wait for routine requests to be completed. An automated support bot was an ideal first use case because it needed to perform two core agent capabilities:

  1. Knowledge retrieval: answering “how-to” questions by searching internal documentation, a capability known as Retrieval Augmented Generation (RAG).
  2. Action execution: performing tasks in other systems, such as assigning a GitHub Copilot license, which requires secure API integration, or “tool use.”

By building a bot that could do both, the team could validate the blueprint while delivering immediate business value.

Architectural overview

In this post, we explore the architecture that AutoScout24 used to build their standardized AI development framework, enabling rapid deployment of secure and scalable AI agents.

The architecture is designed with a simple, decoupled flow to ensure the system is both resilient and easy to maintain. The diagram provides a simplified view focused on the core generative AI workflow. In a production setting, additional AWS services such as AWS Identity and Access Management (IAM), Amazon CloudWatch, AWS X-Ray, AWS CloudTrail, AWS Web Application Firewall (WAF), and AWS Key Management Service (KMS) can be integrated to enhance security, observability, and operational governance.

Here is how a request flows through the system:

  1. User interaction via Slack: A developer posts a message in a support channel, for example, “@SupportBot, can I get a GitHub Copilot license?“
  2. Secure ingress via Amazon API Gateway and AWS Lambda: Slack sends the event to an Amazon API Gateway endpoint, which triggers an AWS Lambda function. This function performs a critical security check, verifying the request’s cryptographic signature to confirm it is authentically from Slack.
  3. Decoupling via Amazon Simple Queue Service (SQS): The verified request is placed onto an Amazon SQS First-In, First-Out (FIFO) queue. This decouples the front end from the agent, improving resilience. Using a FIFO queue with the message’s thread timestamp as the MessageGroupId ensures that replies within a single conversation are processed in order, which is important for maintaining coherent conversations.
  4. Agent execution via Amazon Bedrock AgentCore: The SQS queue triggers a Lambda function when messages arrive, which invokes the agent running in the AgentCore Runtime. AgentCore manages the operational tasks, including orchestrating calls to the foundation model and the agent’s tools. The orchestrator agent’s logic, built with Strands Agents, analyzes the user’s prompt and determines the correct specialized agent to invoke: either the Knowledge Base agent for a question or the GitHub agent for an action request.
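Steps 2 and 3 can be sketched as two small handler helpers: a pure HMAC check of Slack’s request signature (Slack signs the string `v0:<timestamp>:<raw body>` with the app’s signing secret) and an enqueue call onto the FIFO queue. The function names and queue setup are illustrative, not AutoScout24’s actual code; the SQS call assumes content-based deduplication is enabled on the queue.

```python
import hashlib
import hmac
import time


def verify_slack_signature(signing_secret: str, timestamp: str, body: str, signature: str) -> bool:
    """Return True only if the request was genuinely signed by Slack."""
    # Reject requests older than five minutes to prevent replay attacks.
    if abs(time.time() - int(timestamp)) > 60 * 5:
        return False
    # Slack's signature base string is "v0:<timestamp>:<raw request body>".
    base_string = f"v0:{timestamp}:{body}".encode()
    expected = "v0=" + hmac.new(signing_secret.encode(), base_string, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(expected, signature)


def enqueue_event(queue_url: str, body: str, thread_ts: str) -> None:
    """Place a verified Slack event on the FIFO queue for the agent."""
    import boto3  # lazy import so the signature check stays testable offline

    sqs = boto3.client("sqs")
    # MessageGroupId = thread timestamp keeps replies within one Slack
    # thread strictly ordered, as described in step 3 above.
    sqs.send_message(QueueUrl=queue_url, MessageBody=body, MessageGroupId=thread_ts)
```

The enqueue helper is only invoked after verification succeeds, so unauthenticated traffic never reaches the agent.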

A crucial implementation detail is how the system leverages AgentCore’s full session isolation. To maintain conversational context, the system generates a unique, deterministic sessionId for each Slack thread by combining the channel ID and the thread’s timestamp. This sessionId is passed with every agent invocation within that thread. Interactions in a thread share the same sessionId, so the agent treats them as one continuous conversation, while interactions in other threads get different sessionIds, keeping their contexts separate. In effect, each conversation runs in an isolated session: AgentCore spins up separate resources per sessionId, so context and state don’t leak between threads. In practice, this means that if a developer sends multiple messages in a single Slack thread, the agent remembers the earlier parts of that conversation. Each thread’s history is preserved automatically by AgentCore.
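As a sketch of that scheme (the exact ID format AutoScout24 uses is not shown in the post), a deterministic session ID can be derived by hashing the channel ID together with the thread timestamp; hashing gives a fixed-length, collision-resistant identifier, which also helps satisfy any length constraints the runtime may place on session IDs.

```python
import hashlib


def build_session_id(channel_id: str, thread_ts: str) -> str:
    """Derive a deterministic AgentCore session ID from a Slack thread."""
    # Same channel + thread always hashes to the same session ID, so
    # AgentCore resumes that thread's context; other threads map to
    # different IDs and therefore isolated sessions.
    return hashlib.sha256(f"{channel_id}:{thread_ts}".encode()).hexdigest()
```

Because the function is deterministic, no lookup table is needed: any Lambda invocation can recompute the session ID directly from the incoming Slack event.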

This session management strategy is also essential for observability. Given a unique sessionId, the interaction can be traced using AWS X-Ray, which offers insight into the entire flow, from the Slack message arriving at API Gateway to the message being enqueued in SQS. The trace follows the orchestrator’s processing, the call to the foundation model, subsequent tool invocations (such as a knowledge base lookup or a GitHub API call), and finally the response back to Slack.

Metadata and timing annotate each step of the flow, showing where time is spent. If a step fails or is slow (for example, a timeout on an external API call), X-Ray pinpoints which step caused the issue. This is invaluable for diagnosing problems quickly and building confidence in the system’s behavior.

The solution: a reusable blueprint powered by AWS

The Bot Factory architecture designed by the AutoScout24 and AWS teams is event-driven, serverless, and built on a foundation of managed AWS services. This approach provides a resilient and scalable pattern that can be adapted for new use cases.

The solution builds on Amazon Bedrock and its integrated capabilities:

  • Amazon Bedrock provides access to high-performing foundation models (FMs), which act as the reasoning engine for the agent.
  • Amazon Bedrock Knowledge Bases enables the RAG capability, allowing the agent to connect to AutoScout24’s internal documentation and retrieve information to answer questions accurately.
  • Amazon Bedrock AgentCore is a key component of the operational side of the blueprint. It provides the fully managed, serverless runtime environment to deploy, operate, and scale the agents.
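To illustrate the Knowledge Bases lookup, the sketch below shapes a request for the Bedrock Agent Runtime `retrieve` API and extracts the returned passages. The knowledge base ID and helper names are hypothetical, and the boto3 import is deferred so the request builder can be exercised without AWS credentials.

```python
def build_retrieve_request(kb_id: str, query: str, top_k: int = 5) -> dict:
    """Build the parameter dict for the Bedrock Agent Runtime retrieve API."""
    return {
        "knowledgeBaseId": kb_id,
        "retrievalQuery": {"text": query},
        "retrievalConfiguration": {
            "vectorSearchConfiguration": {"numberOfResults": top_k}
        },
    }


def query_knowledge_base(kb_id: str, query: str) -> list:
    """Return the text of the top matching documentation passages."""
    import boto3  # lazy import: only needed when actually calling AWS

    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve(**build_retrieve_request(kb_id, query))
    return [r["content"]["text"] for r in response["retrievalResults"]]
```

In the agents-as-tools pattern described later, a wrapper like this is what the Knowledge Base agent would call before asking the foundation model to compose a grounded answer.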

This solution provides a significant advantage for AutoScout24. Instead of building foundational infrastructure for session management, security, and observability, they use AgentCore’s purpose-built services. This lets the team focus on the agent’s business logic rather than the underlying infrastructure. AgentCore also provides built-in security and isolation features. Each agent invocation runs in its own isolated container, helping to prevent data leakage between sessions. Agents are assigned specific IAM roles to restrict their AWS permissions (following the principle of least privilege). Credentials or tokens needed by agent tools (such as a GitHub API key) are stored securely in AWS Secrets Manager and accessed at runtime. These features give the team a secure environment for running agents with minimal custom infrastructure.
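A minimal sketch of that runtime credential fetch, assuming the token is stored as a JSON payload like `{"token": "..."}` under a hypothetical secret name (the post does not show AutoScout24’s actual secret layout):

```python
import json


def parse_github_token(secret_string: str) -> str:
    """Extract the token from the secret's JSON payload (assumed shape)."""
    return json.loads(secret_string)["token"]


def load_github_token(secret_name: str = "bot-factory/github-api-token") -> str:
    """Fetch the GitHub API token from AWS Secrets Manager at runtime."""
    import boto3  # lazy import so the parsing helper stays testable offline

    client = boto3.client("secretsmanager")
    secret = client.get_secret_value(SecretId=secret_name)
    return parse_github_token(secret["SecretString"])
```

Because the token is fetched on demand under the agent’s IAM role, it never appears in source code, container images, or environment files.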

The agent itself was built using the Strands Agents SDK, an open-source framework that simplifies defining an agent’s logic, tools, and behavior in Python. The combination proves effective: Strands to build the agent, and AgentCore to securely run it at scale. The team adopted a sophisticated “agents-as-tools” design pattern, in which a central orchestrator agent acts as the main controller. This orchestrator doesn’t contain the logic for every possible task. Instead, it intelligently delegates requests to specialized, single-purpose agents. For the support bot, these included a Knowledge Base agent for handling informational queries and a GitHub agent for executing actions like assigning licenses. This modular design makes it simple to extend the system with new capabilities, such as adding a PR review agent, without re-architecting the entire pipeline. Running these agents on Amazon Bedrock further enhances flexibility, since the team can choose from a broad range of foundation models. More powerful models can be applied to complex reasoning tasks, while lighter, cost-efficient models are well suited to routine worker agents such as GitHub license requests or operational workflows. This ability to mix and match models lets AutoScout24 balance cost, performance, and accuracy across their agent architecture.

Orchestrator agent: built with the Strands SDK

Using the Strands Agents SDK helped the team define the orchestrator agent with concise, declarative code. The framework uses a model-driven approach, where the developer focuses on defining the agent’s instructions and tools, and the foundation model handles the reasoning and planning. The orchestrator agent can be expressed in just a few dozen lines of Python. The example snippet below (simplified for readability, not intended for direct use) shows how the agent is configured with a model, a system prompt, and a list of tools (which in this architecture represent the specialized agents):

# A simplified, representative example of the orchestrator agent logic
# built with the Strands Agents SDK and deployed on Amazon Bedrock AgentCore.
from bedrock_agentcore.runtime import BedrockAgentCoreApp
from strands import Agent
from strands.models import BedrockModel
from tools import knowledge_base_query_tool, github_copilot_seat_agent

# Initialize the AgentCore application, which acts as the serverless container
app = BedrockAgentCoreApp()

class OrchestratorAgent:
    def __init__(self):
        # 1. Define the model: point to a foundation model in Amazon Bedrock.
        self.model = BedrockModel(model_id="anthropic.claude-3-sonnet-20240229-v1:0")

        # 2. Define the prompt: give the agent its core instructions.
        self.system_prompt = """
        You are a helpful and friendly support bot for the AutoScout24 Platform Engineering team.
        Your goal is to answer developer questions and automate common requests.
        Use your tools to answer questions or perform actions.
        If you cannot handle a request, politely say so.
        """

        # 3. Define the tools: provide the agent with its capabilities.
        # These tools are entry points to other specialized Strands agents.
        self.tools = [
            knowledge_base_query_tool,
            github_copilot_seat_agent,
        ]

        # Create the agent instance
        self.agent = Agent(
            model=self.model,
            system_prompt=self.system_prompt,
            tools=self.tools,
        )

    def __call__(self, user_input: str):
        # Run the agent to get a response for the user's input
        return self.agent(user_input)

# Define the entry point that AgentCore will invoke when a new event arrives from SQS
@app.entrypoint
def main(event):
    # Extract the user's query from the incoming event
    user_query = event.get("prompt")

    # Instantiate and run the orchestrator agent
    return OrchestratorAgent()(user_query)

Another example is the GitHub Copilot license agent, implemented as a Strands tool function. The following snippet shows how the team defined it using the @tool decorator. The function creates a GitHubCopilotSeatAgent, passes the user’s request (a GitHub username) to it, and returns the result:

from strands import Agent, tool

class GitHubCopilotSeatAgent:
    # Initialization of self.model, self.system_prompt, and self.tools
    # is omitted here for brevity.
    def __call__(self, query: str):
        agent = Agent(model=self.model, system_prompt=self.system_prompt, tools=self.tools)
        return agent(query)

@tool
def github_copilot_seat_agent(github_username: str) -> str:
    agent = GitHubCopilotSeatAgent()
    response = agent(f"Request GitHub Copilot license for user: {github_username}")
    return str(response)

Key benefits of this approach include a clear separation of concerns. The developer writes declarative code focused on the agent’s purpose, while the complex infrastructure logic, including scaling, session management, and secure execution, is handled by Amazon Bedrock AgentCore. This abstraction enables rapid development and allowed AutoScout24 to move from prototype to production more quickly. The tools list effectively makes other agents callable functions, allowing the orchestrator to delegate tasks without needing to know their internal implementation.

The impact: a validated blueprint for enterprise AI

The Bot Factory project delivered results that extended beyond the initial prototype. It created immediate business value and established a strategic foundation for future AI innovation at AutoScout24. The key outcomes were:

  • A production-ready support bot: The team deployed a functional Slack bot that is actively reducing the manual support load on the AutoScout24 AI Platform Engineering team, addressing the 30% of time previously spent on repetitive tasks.
  • A reusable Bot Factory blueprint: The project produced a validated, reusable architectural pattern. Now, teams at AutoScout24 can build a new agent by starting from this proven template (Slack -> API Gateway -> SQS -> AgentCore). This significantly accelerates innovation by letting teams focus on their unique business logic rather than reinventing the infrastructure. The modular design also prepares them for more advanced multi-agent collaboration, potentially using standards like the Agent-to-Agent (A2A) protocol as their needs evolve.
  • Enabling broader AI development: By abstracting away the infrastructure complexity, the Bot Factory empowers more people to build AI solutions. A domain expert in security or data analytics can now create a new tool or specialized agent and “plug it in” to the factory without needing to be an expert in distributed systems.

Conclusion: a new model for enterprise agents

AutoScout24’s partnership with AWS turned fragmented generative AI experiments into a scalable, standardized framework. By adopting Amazon Bedrock AgentCore, the team moved their support bot from prototype to production while staying focused on their Bot Factory vision. AgentCore manages session state and scaling, so engineers can concentrate on high-value business logic instead of infrastructure. The result is more than a support bot: it is a reusable foundation for building enterprise agents. With AgentCore, AutoScout24 can move from prototype to production efficiently, setting a model for how organizations can standardize generative AI development on AWS. To start building enterprise agents with Amazon Bedrock, explore the following resources:


About the authors

Andrew Shved is a Senior AWS Prototyping Architect who leads teams and customers in building and shipping generative AI–driven solutions, from early prototypes to production on AWS.

Muhammad Uzair Aslam is a tenured Technical Program Manager on the AWS Prototyping team, where he works closely with customers to accelerate their cloud and AI journeys. He thrives on diving deep into technical details and turning complexity into impactful, value-driven solutions.

Arslan Mehboob is a Platform Engineer and AWS-certified solutions architect with deep expertise in cloud infrastructure, scalable systems, and software engineering. He currently builds resilient cloud platforms and is passionate about AI and emerging technologies.

Vadim Shiianov is a Data Scientist specializing in machine learning and AI-driven systems for real-world business applications. He works on designing and deploying ML and generative AI solutions that translate complex data into measurable impact. He is passionate about emerging technologies and building practical, scalable systems around them.
