
Main Product vs Internal Tools vs Platforms

Strategic classification of AI systems: customer-facing apps, internal tools, and data pipelines.

Phase 6 – Main Product vs Internal Tools vs Platforms vs Pipelines

This phase decides where each capability lives: in the main product, an internal tool, a shared platform, or a pipeline.


6.1 Classify the “Thing” You’re Building

Ask:

  • Who is the primary user?
    • End user / customer.
    • Internal staff.
    • External devs / partners.
    • No direct human user (pipeline).

6.2 Main App vs Internal Tool

Main app:

  • Part of normal user flow.
  • Used primarily by customers.
  • Non-sensitive data.

Internal tool:

  • Operated by support, ops, admin, data/ML teams.
  • Complex, dangerous, noisy views (logs, raw data).
  • Admin roles & heavy monitoring.

6.3 App Feature vs API vs Pipeline

Ask:

  • Does this run interactively or offline?
    • Interactive: inside the request path → API → possibly LLM/agent.
    • Offline: ETL, ingestion, training → pipeline.
  • Is this valuable as a standalone capability to others?
    • Yes → API product (external or internal).

6.4 Data Producers & Consumers

Per data type (events, docs, logs, embeddings):

  • Identify producers:
    • Web/mobile app, ingestion services, webhooks.
  • Identify consumers:
    • UI, LLMs, ML services, analytics, partners.

If there are many producers and many consumers, build a separate data/ML platform as an internal product:

  • Ingestion services
  • Kafka
  • Warehouse
  • ETL & feature store
  • Indexers for RAG
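One way to make the "many producers and many consumers" call concrete is a quick tally per data type. The threshold here is an assumption for illustration, not a hard rule:

```python
def needs_platform(producers: set[str], consumers: set[str], many: int = 3) -> bool:
    """Flag a data type for a shared platform when BOTH sides cross the threshold.

    A single producer or single consumer rarely justifies platform overhead.
    """
    return len(producers) >= many and len(consumers) >= many

# Example: events written by three services, read by four consumers.
event_producers = {"web_app", "mobile_app", "webhooks"}
event_consumers = {"ui", "llm_context", "analytics", "partners"}
```

Running this per data type (events, docs, logs, embeddings) gives a rough shortlist of what belongs on a shared platform.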

6.5 Decision Tree Summary

  1. Human vs Non-human
    • No human → pipeline / background service.
  2. Which human?
    • Customer → main app.
    • Internal staff → internal console.
    • External devs → API product.
  3. UX vs API-first
    • UI heavy → screen/workflow.
    • API heavy → service with docs + keys.
  4. Isolation
    • Separate service if:
      • Different scaling.
      • Different SLAs.
      • High risk (LLM cost, heavy GPU computation).
      • Experiment-heavy.

6.6 Example Mapping

AI Chat in product:

  • Main UI: part of web & mobile app.
  • Agents: separate orchestrator service.
  • LLM: separate LLM gateway service (vLLM/OpenAI).
  • Vector store: separate vector DB.

Document ingestion/indexing:

  • Occasional:
    • “Upload docs” button in app, background worker.
  • High-volume:
    • Separate ingestion service + admin console.

AI ranking API for partners:

  • Developer portal UI (web).
  • API product (separate gateway, billing, quotas).
  • Ranking model microservice under the hood.
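The AI-chat mapping above could be written down as a plain topology table, pairing each component with its deployment unit and the isolation reason from the decision tree. Names are illustrative:

```python
# component -> (deployment unit, reason for isolation)
ai_chat_topology = {
    "chat_ui":      ("web/mobile app",       "part of the normal user flow"),
    "agents":       ("orchestrator service", "experiment-heavy, separate scaling"),
    "llm":          ("LLM gateway service",  "high cost / GPU risk"),
    "vector_store": ("vector DB",            "different scaling and SLAs"),
}
```

Keeping the reason next to each split makes it easy to revisit the boundary later, when cost or scaling assumptions change.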