
System Prompt Service - Complete Developer Documentation

Service: System Prompt Template Management
Port: 8009
Purpose: Upload and retrieve LLM system prompt templates
Technology: FastAPI (Python 3.9+)
Code Location: /systemprompt-service/src/main.py (141 lines, 2 endpoints)
Owner: Backend Team
Last Updated: 2025-12-26


Table of Contents

  1. Service Overview
  2. Complete Endpoints
  3. System Prompt Templates
  4. Database Schema
  5. Integration with Selection Service
  6. Deployment
  7. Performance Metrics
  8. Usage Example
  9. Recommendations

Service Overview

The System Prompt Service is a simple, focused service that manages LLM system prompt templates. It stores default prompts for different agent types and models, which are used by the Selection Chatbot Service when users select an LLM model.

Key Responsibilities

  • Upload Prompts - Store system prompt templates as .txt files
  • Retrieve Prompts - Get prompts by model and/or agent type
  • Upsert Logic - Update existing or insert new prompts
  • Template Management - Default prompts for 11 models × 6 agent types

Statistics

  • Total Lines: 141 (SMALLEST service!)
  • Endpoints: 2
  • Supported Models: 11 (openai-4, openai-4o, openai-35, mistral, deepseek, llama, phi, openai-o1mini, gemini-flash-25, claude-sonnet-4, grok-3)
  • Agent Types: 6 (Sales, Service, Informational, Custom, Fitness coach, Interview Coach)

Complete Endpoints

1. POST /v2/upload-text

Purpose: Upload or update a system prompt template

Code Location: Lines 55-110

Request:

POST /v2/upload-text
Content-Type: multipart/form-data

file=@sales_openai35_prompt.txt
model=openai-35
agent_type=Sales-Agent

Supported Models:

  • openai-4
  • openai-4o
  • openai-35
  • mistral
  • deepseek
  • llama
  • phi
  • openai-o1mini
  • gemini-flash-25
  • claude-sonnet-4
  • grok-3

Supported Agent Types:

  • Sales-Agent
  • Service-Agent
  • Informational-Agent
  • Custom-Agent
  • Fitness coach
  • Interview Coach

File Validation:

if not file.filename.endswith(".txt"):
    raise HTTPException(status_code=400, detail="Only .txt files are allowed.")

Upsert Logic:

result = system_prompt_collection.update_one(
    {
        "filename": file.filename,
        "model": model,
        "agent_type": agent_type
    },
    {"$set": document},
    upsert=True  # Insert if not found, update if exists
)

action = "updated" if result.matched_count > 0 else "inserted"

Response (Insert):

{
  "message": "File inserted successfully",
  "filename": "sales_openai35_prompt.txt",
  "model": "openai-35",
  "agent_type": "Sales-Agent"
}

Response (Update):

{
  "message": "File updated successfully",
  "filename": "sales_openai35_prompt.txt",
  "model": "openai-35",
  "agent_type": "Sales-Agent"
}

File Processing

Code (Lines 75-82):

content = await file.read()
text_content = content.decode("utf-8")

document = {
    "filename": file.filename,
    "model": model,
    "agent_type": agent_type,
    "content": text_content,
    "uploaded_at": datetime.now(pytz.utc)
}
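
Putting these fragments together, a minimal sketch of how the upload handler fits together as a FastAPI endpoint (illustrative, not a copy of main.py; the collection name and environment variables are taken from the Database Schema and Deployment sections):

import os
from datetime import datetime

import pytz
from fastapi import FastAPI, File, Form, HTTPException, UploadFile
from pymongo import MongoClient

app = FastAPI()

# Collection name and environment variables as documented in the
# Database Schema and Deployment sections
client = MongoClient(os.environ["MONGO_URI"])
db = client[os.environ.get("MONGO_DB_NAME", "Machine_agent_demo")]
system_prompt_collection = db["system_prompts"]


@app.post("/v2/upload-text")
async def upload_text(
    file: UploadFile = File(...),
    model: str = Form(...),
    agent_type: str = Form(...),
):
    # Only plain-text prompt files are accepted
    if not file.filename.endswith(".txt"):
        raise HTTPException(status_code=400, detail="Only .txt files are allowed.")

    # Read and decode the uploaded template
    content = await file.read()
    text_content = content.decode("utf-8")

    document = {
        "filename": file.filename,
        "model": model,
        "agent_type": agent_type,
        "content": text_content,
        "uploaded_at": datetime.now(pytz.utc),
    }

    # Upsert: update an existing template or insert a new one
    result = system_prompt_collection.update_one(
        {"filename": file.filename, "model": model, "agent_type": agent_type},
        {"$set": document},
        upsert=True,
    )
    action = "updated" if result.matched_count > 0 else "inserted"

    return {
        "message": f"File {action} successfully",
        "filename": file.filename,
        "model": model,
        "agent_type": agent_type,
    }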

Example .txt File Content:

You are a helpful sales assistant for our company. Your primary goal is to:

1. Understand customer needs
2. Recommend appropriate products or services
3. Handle objections professionally
4. Guide customers towards a purchase decision

Always maintain a friendly, professional tone. Be persuasive but not pushy.
Use the customer's name when appropriate. Focus on benefits, not just features.

If you don't know the answer to a question, be honest and offer to find out more information.

2. GET /v2/get-uploaded-texts

Purpose: Retrieve system prompt templates with optional filtering

Code Location: Lines 113-137

Request (All Prompts):

GET /v2/get-uploaded-texts

Request (Filter by Model):

GET /v2/get-uploaded-texts?model=openai-35

Request (Filter by Agent Type):

GET /v2/get-uploaded-texts?agent_type=Sales-Agent

Request (Filter by Both):

GET /v2/get-uploaded-texts?model=openai-35&agent_type=Sales-Agent

Query Logic:

query = {}
if model:
    query["model"] = model
if agent_type:
    query["agent_type"] = agent_type

results = list(system_prompt_collection.find(query, {"_id": 0}))

Response:

{
  "results": [
    {
      "filename": "sales_openai35_prompt.txt",
      "model": "openai-35",
      "agent_type": "Sales-Agent",
      "content": "You are a helpful sales assistant...",
      "uploaded_at": "2025-01-15T14:00:00+00:00"
    },
    {
      "filename": "service_openai35_prompt.txt",
      "model": "openai-35",
      "agent_type": "Service-Agent",
      "content": "You are a customer service representative...",
      "uploaded_at": "2025-01-15T14:05:00+00:00"
    }
  ]
}

Error (No Results):

{
  "detail": "No matching documents found."
}

HTTP Status: 404
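
For completeness, a minimal sketch of how the documented query logic and 404 behaviour combine into the GET handler (illustrative, not a copy of main.py; system_prompt_collection is the same pymongo collection handle used in the upload sketch above):

from typing import Optional

from fastapi import FastAPI, HTTPException

app = FastAPI()


@app.get("/v2/get-uploaded-texts")
def get_uploaded_texts(model: Optional[str] = None, agent_type: Optional[str] = None):
    # Build the filter only from the query parameters that were supplied
    query = {}
    if model:
        query["model"] = model
    if agent_type:
        query["agent_type"] = agent_type

    # Exclude MongoDB's _id so the documents serialise cleanly to JSON
    results = list(system_prompt_collection.find(query, {"_id": 0}))

    if not results:
        raise HTTPException(status_code=404, detail="No matching documents found.")

    return {"results": results}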


System Prompt Templates

Template Matrix

For a complete system, you need 66 templates (11 models × 6 agent types):

Every agent type (Sales-Agent, Service-Agent, Informational-Agent, Custom-Agent, Fitness coach, Interview Coach) needs one template for each supported model (openai-4, openai-4o, openai-35, mistral, deepseek, llama, phi, openai-o1mini, gemini-flash-25, claude-sonnet-4, grok-3).
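
Keeping all 66 combinations populated is easy to get wrong, so a small script can report the gaps. A minimal sketch, assuming the service is reachable on localhost:8009 and using the model and agent-type lists documented above:

import requests

MODELS = [
    "openai-4", "openai-4o", "openai-35", "mistral", "deepseek", "llama",
    "phi", "openai-o1mini", "gemini-flash-25", "claude-sonnet-4", "grok-3",
]
AGENT_TYPES = [
    "Sales-Agent", "Service-Agent", "Informational-Agent",
    "Custom-Agent", "Fitness coach", "Interview Coach",
]

# Fetch every stored template and index it by (model, agent_type)
response = requests.get("http://localhost:8009/v2/get-uploaded-texts")
existing = set()
if response.status_code == 200:
    existing = {(doc["model"], doc["agent_type"]) for doc in response.json()["results"]}

# Report any of the 66 combinations that have no template yet
missing = [
    (model, agent)
    for model in MODELS
    for agent in AGENT_TYPES
    if (model, agent) not in existing
]
print(f"{len(existing)} templates present, {len(missing)} missing")
for model, agent in missing:
    print(f"  missing: {agent} / {model}")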

Example Prompts

Sales-Agent + openai-35:

You are an expert sales consultant representing our company. Your role is to:

1. Qualify leads by understanding their needs and pain points
2. Present solutions that align with their requirements
3. Handle objections using the LAER framework (Listen, Acknowledge, Explore, Respond)
4. Close deals by creating urgency while maintaining trust

Communication Style:
- Be consultative, not pushy
- Ask open-ended questions
- Listen actively and paraphrase to show understanding
- Use social proof and case studies when appropriate

Remember: Your goal is to help customers make informed decisions, not just to sell.

Service-Agent + openai-35:

You are a dedicated customer service representative. Your mission is to:

1. Resolve customer issues quickly and effectively
2. Maintain a positive, empathetic tone even when dealing with complaints
3. Take ownership of problems and follow through to resolution
4. Educate customers on product features and best practices

Guidelines:
- Always greet customers warmly
- Apologize sincerely when appropriate
- Provide clear, step-by-step solutions
- Offer alternatives if the first solution doesn't work
- End interactions by confirming customer satisfaction

Your success is measured by customer happiness, not just issue resolution.

Fitness coach + gemini-flash-25:

You are a certified fitness coach with expertise in personalized training programs. Your role:

1. Assess client fitness levels and goals
2. Design safe, effective workout routines
3. Provide nutritional guidance (general, not medical advice)
4. Motivate and encourage clients through their fitness journey

Approach:
- Start by understanding current fitness level and any limitations
- Set SMART goals (Specific, Measurable, Achievable, Relevant, Time-bound)
- Focus on progressive overload and proper form
- Emphasize consistency over intensity for beginners
- Celebrate small wins to maintain motivation

Safety First: Always advise consulting a doctor before starting new exercise programs.

Database Schema

Collection: system_prompts

Document Structure:

{
    "_id": ObjectId("..."),
    "filename": "sales_openai35_prompt.txt",
    "model": "openai-35",
    "agent_type": "Sales-Agent",
    "content": "You are a helpful sales assistant for our company...",
    "uploaded_at": ISODate("2025-01-15T14:00:00Z")
}

Indexes:

// Composite index for efficient lookups
db.system_prompts.createIndex({ model: 1, agent_type: 1 });

// Unique index to prevent duplicates
db.system_prompts.createIndex(
  { filename: 1, model: 1, agent_type: 1 },
  { unique: true }
);
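
The same indexes can also be created from Python (for example in a one-off setup script); a minimal sketch, assuming MONGO_URI and MONGO_DB_NAME are set as in the Docker Compose configuration:

import os

from pymongo import ASCENDING, MongoClient

client = MongoClient(os.environ["MONGO_URI"])
collection = client[os.environ.get("MONGO_DB_NAME", "Machine_agent_demo")]["system_prompts"]

# Composite index for efficient model/agent_type lookups
collection.create_index([("model", ASCENDING), ("agent_type", ASCENDING)])

# Unique index so each (filename, model, agent_type) combination exists only once
collection.create_index(
    [("filename", ASCENDING), ("model", ASCENDING), ("agent_type", ASCENDING)],
    unique=True,
)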

Query Examples:

Get all prompts for a model:

db.system_prompts.find({ model: "openai-35" });

Get specific prompt:

db.system_prompts.findOne({
  model: "openai-35",
  agent_type: "Sales-Agent",
});

Count prompts:

db.system_prompts.countDocuments({});
// Expected: ~66 (11 models × 6 agents)

Integration with Selection Service

How It Works

Step 1: User selects model in Selection Service

User chooses openai-35 + Service-Agent → the Selection Service fetches the matching default prompt (Step 2)

Step 2: Selection Service fetches prompt

# In selection-chatbot-service (lines 264-268)
system_prompt = system_prompt_collection.find_one({
    "chatbot_purpose": chatbot_purpose,  # "Service Bot" → mapped to "Service-Agent"
    "model": selection_model.lower()      # "openai-35"
})

content = system_prompt.get("content", "")
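
The snippet above reads the shared collection directly. An equivalent lookup through this service's HTTP API, as the sequence diagram below suggests, could look like the following sketch (the systemprompt-service:8009 address comes from the Docker Compose configuration; fetch_default_prompt is a hypothetical helper):

import requests

# Hypothetical helper: fetch the default prompt for a model/agent type
# via this service's GET endpoint instead of reading the collection directly.
def fetch_default_prompt(model: str, agent_type: str) -> str:
    response = requests.get(
        "http://systemprompt-service:8009/v2/get-uploaded-texts",
        params={"model": model, "agent_type": agent_type},
    )
    if response.status_code == 404:
        return ""  # no template stored for this combination
    response.raise_for_status()
    return response.json()["results"][0]["content"]

content = fetch_default_prompt("openai-35", "Service-Agent")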

Step 3: Selection Service stores in user's prompts

# system_prompts_user collection
user_system_prompt_collection.insert_one({
    "user_id": user_id,
    "project_id": project_id,
    "chatbot_purpose": chatbot_purpose,
    "model": selection_model,
    "system_prompt": content,  # From this service!
    "created_at": datetime.utcnow().isoformat()
})

Data Flow

sequenceDiagram
    participant U as User
    participant S as Selection Service
    participant SP as System Prompt Service
    participant DB as CosmosDB

    U->>S: POST /v2/select-model<br/>(model=openai-35, purpose=Service)

    Note over S: Map purpose to agent type
    S->>SP: Internal query to system_prompts
    SP->>DB: findOne({model, agent_type})
    DB-->>SP: Prompt template
    SP-->>S: System prompt content

    S->>DB: Insert to system_prompts_user
    S-->>U: Model selected + prompt configured

Deployment

Docker Configuration

Dockerfile:

FROM python:3.9-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY src/ .

EXPOSE 8009

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8009"]

Docker Compose

systemprompt-service:
  build: ./systemprompt-service
  container_name: systemprompt-service
  ports:
    - "8009:8009"
  networks:
    - app-network
  labels:
    - "logging=loki"
    - "app=machine-agent-app"
  environment:
    - MONGO_URI=${MONGO_URI}
    - MONGO_DB_NAME=Machine_agent_demo
    - DD_SERVICE=systemprompt-service
    - DD_ENV=production
  restart: always

Requirements.txt

fastapi>=0.95.0
uvicorn[standard]>=0.22.0
pymongo>=4.3.3
python-multipart>=0.0.6
python-dotenv>=1.0.0
pytz>=2023.3
tiktoken>=0.5.0  # Token counting (not used in current code)
ddtrace>=1.19.0

Note: tiktoken is imported but not used in the current code


Performance Metrics

Operation                 Latency     Notes
Upload prompt             10-30 ms    Single document upsert
Get prompts (all)         20-50 ms    Query 66 documents
Get prompts (filtered)    10-20 ms    Index-based lookup

Very fast - simple CRUD operations on small documents


Usage Example

Initial Setup: Upload All Templates

Script to upload prompts:

import requests

models = ["openai-4", "openai-35", "mistral", "gemini-flash-25"]
agents = ["Sales-Agent", "Service-Agent", "Fitness coach"]

for model in models:
    for agent in agents:
        filename = f"{agent.lower().replace(' ', '_')}_{model}.txt"

        with open(f"prompts/{filename}", "rb") as f:
            files = {"file": f}
            data = {
                "model": model,
                "agent_type": agent
            }

            response = requests.post(
                "http://localhost:8009/v2/upload-text",
                files=files,
                data=data
            )

            print(f"{filename}: {response.json()['message']}")

Retrieve Specific Prompt

import requests

response = requests.get(
    "http://localhost:8009/v2/get-uploaded-texts",
    params={
        "model": "openai-35",
        "agent_type": "Sales-Agent"
    }
)

prompts = response.json()["results"]
if prompts:
    print(prompts[0]["content"])


Recommendations

Improvements

  1. Add Versioning - Track prompt version history
  2. Add Validation - Verify prompt structure/format
  3. Add Token Counting - Use the imported tiktoken to count tokens (see the sketch after this list)
  4. Add Bulk Upload - Upload multiple prompts at once
  5. Add Soft Delete - Don't permanently delete prompts
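
As a sketch of item 3: tiktoken is already listed in requirements.txt, so counting tokens at upload time could look like the following (cl100k_base is an assumed encoding; the right choice depends on the target model):

import tiktoken


def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    # Encode the prompt text and count the resulting tokens
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))


# Example: the count could be stored alongside the prompt document as "token_count"
print(count_tokens("You are a helpful sales assistant for our company."))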

Code Quality

  1. Remove Unused Import - tiktoken not used
  2. Add Input Sanitization - Validate file content and form fields (see the sketch after this list)
  3. Add Type Hints - Return types for functions
  4. Add Unit Tests - Test upsert logic
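
As a sketch of items 2 and 3: the form fields can be checked against the documented model and agent-type lists, and the file content verified as UTF-8, before the upsert runs (allowed values taken from the Supported Models and Supported Agent Types lists above; validate_upload is a hypothetical helper):

from fastapi import HTTPException

SUPPORTED_MODELS = {
    "openai-4", "openai-4o", "openai-35", "mistral", "deepseek", "llama",
    "phi", "openai-o1mini", "gemini-flash-25", "claude-sonnet-4", "grok-3",
}
SUPPORTED_AGENT_TYPES = {
    "Sales-Agent", "Service-Agent", "Informational-Agent",
    "Custom-Agent", "Fitness coach", "Interview Coach",
}


def validate_upload(model: str, agent_type: str, raw_bytes: bytes) -> str:
    """Return the decoded prompt text, or raise a 400 error on invalid input."""
    if model not in SUPPORTED_MODELS:
        raise HTTPException(status_code=400, detail=f"Unsupported model: {model}")
    if agent_type not in SUPPORTED_AGENT_TYPES:
        raise HTTPException(status_code=400, detail=f"Unsupported agent type: {agent_type}")
    try:
        return raw_bytes.decode("utf-8")
    except UnicodeDecodeError:
        raise HTTPException(status_code=400, detail="File must be UTF-8 encoded text.")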

Last Updated: 2025-12-26
Code Version: systemprompt-service/src/main.py (141 lines)
Total Endpoints: 2
Review Cycle: Quarterly (Stable Service)


"Where prompts become personalities."