
AIPACK - Run, Build, and Share AI Packs

Check out the site: https://aipack.ai for more information and links.

Open-source Agentic Runtime to run, build, and share AI Packs.

  • Supports all major AI providers and models.
  • Efficient and small (< 20MB), with zero dependencies.
  • Built in Rust, using Lua for embedded scripting (small and efficient).
  • Runs locally, completely IDE-agnostic.
  • Or in the cloud—server or serverless.


Quick Start

Install

For now, the install requires building directly from source via Rust; this works great on all OSes.
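
One way to build it, as a sketch (this assumes a standard Cargo binary crate layout; check the repository for the authoritative steps):

# Clone the repository and install the `aip` binary with Cargo
# (assumes a recent stable Rust toolchain is already installed)
git clone https://github.com/aipack-ai/aipack.git
cd aipack
cargo install --path .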

NOTE: Ironically, while the binary is relatively small (<20MB with batteries included), the build process can take up quite a bit of space. However, Cargo should clean it up afterward.
Binaries and installation instructions will be available at https://aipack.ai

DISCLAIMER: For now (v0.6.x), AIPACK works on Linux and Mac, and requires WSL on Windows.

IMPORTANT: Proper Windows support is coming sometime in v0.6.x, and definitely by v0.7.x (around mid to end of March).

Run

# In a terminal, go to your project
cd /path/to/my/project/

# initialize workspace .aipack/ and ~/.aipack-base
aip init

# Make sure to export the desired API key
export OPENAI_API_KEY="sk...."
export ANTHROPIC_API_KEY="...."
export GEMINI_API_KEY="..."
# For more keys, see below

# To proofread your README.md (namespace: demo, pack_name: proof)
aip run demo@proof -f ./README.md

# You can use just @pack_name if no other pack has the same name in a different namespace
aip run @proof -f ./README.md

# To do some code crafting (will create _craft-code.md)
aip run demo@craft/code

# Or create your own .aip file and run it (the .aip extension can be omitted)
aip run path/to/file.aip

# This is a good agent to run to ask questions about aipack
# It can even generate aipack code
aip run core@ask-aipack
# prompt file will be at `.aipack/.prompt/core@ask-aipack/ask-prompt.md`

jc@coder

  • You can install jc@coder with aip install jc@coder, and then
  • Run it with aip run jc@coder or aip run @coder if you don't have any other @coder pack in a different namespace.

This is the agent I use every day for my production coding.

IMPORTANT 1: Make sure everything is committed before usage (at least while you are learning about aipack).

IMPORTANT 2: Make sure to have your API_KEY in an environment variable (on Mac, there is experimental keychain support)

OPENAI_API_KEY
ANTHROPIC_API_KEY
GEMINI_API_KEY
XAI_API_KEY
DEEPSEEK_API_KEY
GROQ_API_KEY
COHERE_API_KEY

Info

  • Website: https://aipack.ai

  • AIPACK Overview Video

  • Preview 'devai' intro video for v0.5

  • Built on top of the Rust genai library, which supports all the top AI providers and models (OpenAI, Anthropic, Gemini, DeepSeek, Groq, Ollama, xAI, and Cohere).

  • Top new features: (see full CHANGELOG)

    • 2025-03-02 (v0.6.7) - Fixes and tune-up, pack install tests, and other refactoring
    • 2025-03-02 (v0.6.4) - Fixes, and now supports the first repo pack: aip install jc@coder
    • 2025-02-28 (v0.6.3) - aip pack .., aip install local..., ai_response.price_usd, and more
    • 2025-02-26 (v0.6.0) - BIG UPDATE to AIPACK, now with pack support (aip run demo@craft/code)
    • 2025-02-22 (v0.5.11) - Huge update with parametric agents and coder (more info soon)
    • 2025-01-27 (v0.5.9) - DeepSeek distill model support for Groq and Ollama (local)
    • 2025-01-23 (v0.5.7) - aipack run craft/text or aipack run craft/code (example of the cool new agent module support)
    • 2025-01-06 (v0.5.4) - DeepSeek deepseek-chat support
    • 2024-12-08 (v0.5.1) - Added support for xAI
  • WINDOWS DISCLAIMER:

    • This CLI uses a path scheme from Mac/Unix-like systems, which might not function correctly in the Windows cmd/batch command line.
    • Full Windows local path support is in development.
    • RECOMMENDATION: Use PowerShell or WSL on Windows. Please log issues if small changes would better accommodate Windows PowerShell/WSL.
  • Thanks to

How it works

  • One Agent == One Markdown
    • A .aip agent file is just a Markdown file with a section for each stage of the agent processing.
    • See below for all the possible stages.
  • aip run demo@proof -f "./*.md"
    • will run the installed agent file main.aip with
      • namespace demo
      • pack name proof
      • full path ~/.aipack-base/pack/installed/demo/proof/main.aip
    • You can pass inputs to your agent with
      • -f "path/with/optional/**/glob.*" or -f "README.md" (the Lua code then gets a {path = .., name = ..} FileMeta-type structure as each input; see the Lua sketch after this list)
      • -i "some string" -i "another input" (the Lua code then gets those strings as inputs)
      • Each input results in one run of the agent.
  • aip run some/path/to/agent
    • can end with .aip; in that case, the file is run directly
    • if there is no .aip extension, then
      • ...agent.aip will be executed if it exists
      • or ...agent/main.aip will be executed if it exists
  • aipack agents are simple .aip files that can be placed anywhere on disk.
    • e.g., aip run ./my-path/to/my-agent.aip ...
  • Multi AI Provider / Models - aipack uses genai and therefore supports OpenAI, Anthropic, Gemini, Groq, Ollama, Cohere, and more to come.
  • Lua is used for all scripting (thanks to the great mlua crate).
  • Handlebars is used for all prompt templating (thanks to the great Rust native handlebars crate).
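
To make the -f / -i input handling above concrete, here is a minimal Lua sketch of a # Data stage. It assumes the per-run value is exposed to the stage as input (as the list above describes); the returned field names (file_path, file_name, question) are illustrative only, and the exact API should be checked against the Lua modules doc.

-- # Data stage (Lua): runs once per input (hedged sketch, not the official API)
-- With -f, input is a FileMeta-like table such as { path = "...", name = "..." }
-- With -i, input is the plain string passed on the command line
if type(input) == "table" then
  -- File input: forward path/name so later stages can use them
  return { file_path = input.path, file_name = input.name }
else
  -- String input from -i
  return { question = input }
end

The table returned here is the data that the Handlebars stages (# System, # Instruction) can then template against (see the Multi Stage table below).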

Multi Stage

A single aipack agent file may include any of the following stages.

Stage         | Language   | Description
# Before All  | Lua        | Reshape/generate inputs and add command global data to scope (the "map" of the map/reduce capability).
# Data        | Lua        | Gather additional data per input and return it for the next stages.
# System      | Handlebars | Customize the prompt with the data and before_all data.
# Instruction | Handlebars | Customize the prompt with the data and before_all data.
# Assistant   | Handlebars | Optional, for special customizations such as the "Jedi Mind Trick."
# Output      | Lua        | Processes the ai_response from the LLM. If this stage is omitted, ai_response.content is output to the terminal.
# After All   | Lua        | Called with inputs and outputs for post-processing after all inputs are completed.
  • # Before All / # After All can be considered as the map/reduce of the agent; they run before and after the input processing (a minimal Lua sketch follows below).
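
As a hedged illustration of the output/reduce side, here is a minimal Lua sketch of the two stages that run after the LLM call. Only ai_response.content is confirmed by this README; the variable names inputs and outputs in # After All are taken from the table above, and the availability of print and the exact shape of outputs are assumptions to verify against the Lua modules doc.

-- # Output stage (Lua): runs once per input and receives ai_response
-- (if this stage is omitted, ai_response.content is printed to the terminal)
local content = ai_response.content
-- Returning a value is assumed to make it available later as one of the outputs
return content

-- # After All stage (Lua): runs once, after every input has been processed,
-- and is called with the inputs and the collected outputs
for i, output in ipairs(outputs) do
  print("output " .. i .. ": " .. #tostring(output) .. " chars")
end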

more info on stages

See the aipack documentation at _init/doc/README.md (which includes the Lua modules doc).

You can also run the ask-aipack agent.

# IMPORTANT - Make sure you have the `OPENAI_API_KEY` or the key of your model in your environment
aip run core@ask-aipack
# prompt file will be at `.aipack/.prompt/core@ask-aipack/ask-prompt.md`

