⚙️ Build with PromptKt

The PromptKt Kotlin library gives you a programmatic API for document chunking, embeddings, prompt templates, multi-step agent pipelines, and MCP tool integration.

Repository Structure

The repository is organized into four top-level Maven modules, built in dependency order:

  - promptkt/ — core model API (promptkt-core) and AI provider implementations (promptkt-provider-*)
  - promptex/ — extensions: document utilities (promptex-docs), agent pipelines (promptex-pips), and MCP integration (promptex-mcp)
  - promptrt/
  - promptfx/ — the PromptFx UI and its view plugins

Building the Project

Requirements: Java 17+ and Maven 3.9.3+.

# Build all modules in order
mvn -B install -DskipTests --file promptkt/pom.xml
mvn -B install -DskipTests --file promptex/pom.xml
mvn -B install -DskipTests --file promptrt/pom.xml
mvn -B install -DskipTests --file promptfx/pom.xml
# Run tests (excludes tests requiring API keys)
mvn -B test --file promptkt/pom.xml
Integration tests that require live API keys are tagged @Tag("openai") or @Tag("gemini") and are excluded from the default test run. See the Wiki for how to run them.

Core API Concepts

Text Completion & Chat

The AiModelProvider interface (in promptkt-core) defines the core API for text completion and chat. Provider implementations are in the promptkt-provider-* modules.

// Get a chat response from any configured AiModelProvider
val response = plugin.chat(listOf(
    ChatMessage(role = Role.System, content = "You are a helpful assistant."),
    ChatMessage(role = Role.User, content = "What is RAG?")
))
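
For reference, the data model used in the snippet above can be sketched as plain Kotlin types. These are illustrative stand-ins built from the names in the snippet (Role, ChatMessage); PromptKt's actual declarations may differ:

```kotlin
// Illustrative stand-ins for the chat data model; not PromptKt's actual declarations.
enum class Role { System, User, Assistant }

data class ChatMessage(val role: Role, val content: String)

// A provider turns a message history into a reply.
fun interface ChatProvider {
    fun chat(messages: List<ChatMessage>): String
}
```

Modeling the provider as a single-method functional interface keeps test doubles trivial: a fake provider is just a lambda.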

Prompt Templates

Prompts are defined as Mustache templates in YAML prompt library files and loaded at runtime via PromptLibrary:

val prompt = PromptLibrary.lookupPrompt("my-prompt-id")
val filled = prompt.fill(mapOf("language" to "French", "text" to "Hello, world!"))
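
A library entry backing the lookup above might look like the following. The exact YAML schema is defined by PromptLibrary; this sketch only shows the Mustache placeholders matching the fill call (field names are illustrative):

```yaml
# Hypothetical prompt library entry; field names are illustrative.
my-prompt-id:
  template: |
    Translate the following text into {{language}}:
    {{text}}
```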

Document Chunking & Embeddings

Use the utilities in promptex-docs to load documents, chunk them, and compute embeddings for a local vector store:

val chunks = DocumentChunker().chunk(file, chunkSize = 500)
val embeddings = embeddingPlugin.embedTexts(chunks.map { it.text })
// Store embeddings in LocalEmbeddingIndex for retrieval
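
Retrieval over a local embedding index reduces to nearest-neighbor search. A minimal sketch of cosine-similarity ranking, independent of PromptKt's LocalEmbeddingIndex (the function names here are illustrative):

```kotlin
import kotlin.math.sqrt

// Cosine similarity between two embedding vectors of the same dimension.
fun cosineSimilarity(a: DoubleArray, b: DoubleArray): Double {
    require(a.size == b.size) { "Vectors must have the same dimension" }
    var dot = 0.0; var normA = 0.0; var normB = 0.0
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (sqrt(normA) * sqrt(normB))
}

// Rank stored chunks by similarity to a query embedding and keep the top k.
fun topK(query: DoubleArray, store: Map<String, DoubleArray>, k: Int): List<String> =
    store.entries
        .sortedByDescending { cosineSimilarity(query, it.value) }
        .take(k)
        .map { it.key }
```

A brute-force scan like this is fine for small local stores; larger corpora would call for an approximate nearest-neighbor index.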

RAG Pipelines

The document Q&A pipeline uses embeddings to retrieve relevant chunks and passes them as context to the completion model. You can assemble the same pipeline in code using promptex-pips:

val pipeline = DocQaPipeline(
    embeddingPlugin = embeddingPlugin,
    completionPlugin = completionPlugin,
    index = embeddingIndex
)
val answer = pipeline.ask("What does the paper say about transformers?")
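
Under the hood, a document Q&A step stitches the retrieved chunks into the completion prompt as context. A hedged sketch of that assembly step (this helper is illustrative, not DocQaPipeline's actual implementation):

```kotlin
// Illustrative prompt assembly for RAG; not DocQaPipeline's actual logic.
fun buildRagPrompt(question: String, chunks: List<String>): String = buildString {
    appendLine("Answer the question using only the context below.")
    appendLine()
    chunks.forEachIndexed { i, chunk ->
        // Number each chunk so the model can cite its sources.
        appendLine("[${i + 1}] $chunk")
    }
    appendLine()
    append("Question: $question")
}
```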

Agent Pipelines & Tool Use

The promptex-pips module provides AiTask, ExecContext, and WorkflowSolver abstractions for multi-step agentic workflows with tool use:

val task = MyCustomTask()
val context = ExecContext(monitor = myMonitor, resources = mapOf(...))
val result = task.execute(context)
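
At its core, a multi-step workflow is sequential composition of tasks: each step's output feeds the next. A generic sketch of that idea (these types are illustrative stand-ins, not PromptKt's actual AiTask, ExecContext, or WorkflowSolver signatures):

```kotlin
// Illustrative task abstraction; not PromptKt's actual AiTask API.
fun interface Task<I, O> {
    fun execute(input: I): O
}

// Compose two tasks sequentially: the output of one feeds the next.
infix fun <A, B, C> Task<A, B>.then(next: Task<B, C>): Task<A, C> {
    val first = this
    return Task { input -> next.execute(first.execute(input)) }
}
```

A solver over such tasks can then treat a whole pipeline as a single Task, which is what makes multi-step agent workflows composable.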

MCP Integration

The promptex-mcp module provides MCP (Model Context Protocol) server and client implementations, allowing PromptFx to expose tools to external agents or consume external MCP servers.

Adding a Custom AI Provider

To add a new AI provider (e.g. a locally hosted model or a different cloud API), use promptkt-provider-sample as a starting template:

  1. Copy promptkt-provider-sample/ to a new module directory.
  2. Implement the AiModelProvider (and optionally EmbeddingPlugin / ImageGenerator) interfaces.
  3. Register your provider via the service-loader mechanism (add it to META-INF/services/).
  4. Add the new module as a dependency in promptfx/pom.xml to enable it in the UI.
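
Step 3 relies on the standard Java ServiceLoader mechanism: a resource file named after the service interface's fully qualified name lists implementation classes, one per line, and consumers discover them on the classpath at runtime. A minimal sketch (the interface name here is illustrative):

```kotlin
import java.util.ServiceLoader

// Illustrative service interface; PromptKt's actual service interfaces
// include AiModelProvider.
interface ExampleProvider { val name: String }

// Implementations are declared in a classpath resource file at
// META-INF/services/<fully.qualified.InterfaceName>, one fully qualified
// implementation class name per line. ServiceLoader then instantiates them:
fun discoverProviders(): List<ExampleProvider> =
    ServiceLoader.load(ExampleProvider::class.java).toList()
```

Because discovery is classpath-driven, step 4 (adding the module as a dependency of promptfx) is what actually makes the new provider visible to the UI.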

Extending the UI

PromptFx supports custom view plugins via the NavigableWorkspaceView interface. See promptfx-sample-view-plugin/ for a working example. Plugins are discovered at runtime via the standard Java service-loader mechanism.

API Key Configuration

API keys are never hardcoded. Load them from:

Resources

View on GitHub ↗
Read the Wiki ↗