
Commit
Merge branch 'main' into ms/remove-service-context
marcusschiesser committed Feb 12, 2025
2 parents c24fb86 + a87efb9 commit cdca687
Showing 359 changed files with 8,415 additions and 7,803 deletions.
16 changes: 16 additions & 0 deletions .changeset/large-mugs-approve.md
@@ -0,0 +1,16 @@
---
"@llamaindex/milvus": minor
"@llamaindex/qdrant": minor
"@llamaindex/next-node-runtime-test": minor
"@llamaindex/azure": minor
"@llamaindex/cloudflare-hono": minor
"@llamaindex/anthropic": minor
"@llamaindex/llamaindex-test": minor
"llamaindex": minor
"@llamaindex/core": minor
"@llamaindex/doc": minor
"@llamaindex/examples": minor
"@llamaindex/e2e": minor
---

Remove re-exports from llamaindex main package
15 changes: 0 additions & 15 deletions .changeset/long-insects-collect.md

This file was deleted.

5 changes: 5 additions & 0 deletions .changeset/moody-eggs-destroy.md
@@ -0,0 +1,5 @@
---
"@llamaindex/doc": patch
---

docs: update chat engine docs
8 changes: 0 additions & 8 deletions .changeset/nasty-walls-grab.md

This file was deleted.

5 changes: 0 additions & 5 deletions .changeset/olive-foxes-watch.md

This file was deleted.

7 changes: 0 additions & 7 deletions .changeset/polite-coats-return.md

This file was deleted.

5 changes: 5 additions & 0 deletions .changeset/quiet-teachers-shave.md
@@ -0,0 +1,5 @@
---
"@llamaindex/doc": patch
---

docs: update workflow doc
5 changes: 0 additions & 5 deletions .changeset/sharp-walls-marry.md

This file was deleted.

5 changes: 0 additions & 5 deletions .changeset/smart-cameras-run.md

This file was deleted.

6 changes: 6 additions & 0 deletions .changeset/tall-kids-prove.md
@@ -0,0 +1,6 @@
---
"@llamaindex/core": patch
"llamaindex": patch
---

feat: asChatEngine function for index
8 changes: 0 additions & 8 deletions .changeset/tender-swans-call.md

This file was deleted.

6 changes: 0 additions & 6 deletions .changeset/violet-panthers-rescue.md

This file was deleted.

2 changes: 1 addition & 1 deletion .github/workflows/preview.yml
@@ -25,4 +25,4 @@ jobs:
run: pnpm run build

- name: Pre Release
run: pnpx pkg-pr-new publish ./packages/* ./packages/providers/*
run: pnpx pkg-pr-new publish --compact --pnpm ./packages/* ./packages/providers/* ./packages/providers/storage/*
5 changes: 0 additions & 5 deletions .github/workflows/test.yml
@@ -83,11 +83,6 @@ jobs:
run: pnpm install
- name: Build
run: pnpm run build
- name: Use Build For Examples
run: |
pnpm link ../packages/llamaindex/
cd readers && pnpm link ../../packages/llamaindex/
working-directory: ./examples
- name: Run Type Check
run: pnpm run type-check
- name: Run Circular Dependency Check
4 changes: 1 addition & 3 deletions .husky/pre-commit
@@ -1,3 +1 @@
pnpm format
pnpm lint
npx lint-staged
pnpm run lint-staged
3 changes: 2 additions & 1 deletion .vscode/settings.json
@@ -14,5 +14,6 @@
"[json]": {
"editor.defaultFormatter": "esbenp.prettier-vscode"
},
"prettier.prettierPath": "./node_modules/prettier"
"prettier.prettierPath": "./node_modules/prettier",
"prettier.configPath": "prettier.config.mjs"
}
70 changes: 70 additions & 0 deletions apps/next/CHANGELOG.md
@@ -1,5 +1,75 @@
# @llamaindex/doc

## 0.0.41

### Patch Changes

- Updated dependencies [1c908fd]
- @llamaindex/openai@0.1.51
- @llamaindex/node-parser@0.0.24
- @llamaindex/workflow@0.0.10
- @llamaindex/readers@1.0.25
- @llamaindex/cloud@2.0.24
- @llamaindex/core@0.4.23
- llamaindex@0.8.37

## 0.0.40

### Patch Changes

- Updated dependencies [cb608b5]
- @llamaindex/openai@0.1.50
- @llamaindex/node-parser@0.0.23
- @llamaindex/workflow@0.0.9
- @llamaindex/readers@1.0.24
- @llamaindex/cloud@2.0.23
- @llamaindex/core@0.4.22
- llamaindex@0.8.36

## 0.0.39

### Patch Changes

- 6d4d96f: chore: update examples and docs to use unified imports
- Updated dependencies [15563a0]
- @llamaindex/openai@0.1.49
- llamaindex@0.8.35

## 0.0.38

### Patch Changes

- Updated dependencies [9f8ad37]
- Updated dependencies [7265f74]
- llamaindex@0.8.34
- @llamaindex/openai@0.1.48

## 0.0.37

### Patch Changes

- Updated dependencies [2019a04]
- @llamaindex/openai@0.1.47
- llamaindex@0.8.33

## 0.0.36

### Patch Changes

- f02621e: Fix internal links between chapters
- Updated dependencies [34faf48]
- Updated dependencies [4df1fe6]
- Updated dependencies [9456616]
- Updated dependencies [d6c270e]
- Updated dependencies [1892e1c]
- Updated dependencies [1931bbc]
- llamaindex@0.8.32
- @llamaindex/core@0.4.21
- @llamaindex/cloud@2.0.22
- @llamaindex/openai@0.1.46
- @llamaindex/node-parser@0.0.22
- @llamaindex/readers@1.0.23

## 0.0.35

### Patch Changes
2 changes: 1 addition & 1 deletion apps/next/package.json
@@ -1,6 +1,6 @@
{
"name": "@llamaindex/doc",
"version": "0.0.35",
"version": "0.0.41",
"private": true,
"scripts": {
"build": "pnpm run build:docs && next build",
14 changes: 10 additions & 4 deletions apps/next/src/app/(home)/page.tsx
@@ -76,15 +76,19 @@ export default function HomePage() {
>
<MagicMove
code={[
`import { OpenAI } from "llamaindex";
`import { OpenAI } from "@llamaindex/openai";
const llm = new OpenAI();
const response = await llm.complete({ prompt: "How are you?" });`,
`import { OpenAI } from "llamaindex";
`import { OpenAI } from "@llamaindex/openai";
const llm = new OpenAI();
const response = await llm.chat({
messages: [{ content: "Tell me a joke.", role: "user" }],
});`,
`import { OpenAI, ChatMemoryBuffer } from "llamaindex";
`import { ChatMemoryBuffer } from "llamaindex";
import { OpenAI } from "@llamaindex/openai";
const llm = new OpenAI({ model: 'gpt4o-turbo' });
const buffer = new ChatMemoryBuffer({
tokenLimit: 128_000,
@@ -94,7 +98,9 @@ const response = await llm.chat({
messages: buffer.getMessages(),
stream: true
});`,
`import { OpenAIAgent, ChatMemoryBuffer } from "llamaindex";
`import { ChatMemoryBuffer } from "llamaindex";
import { OpenAIAgent } from "@llamaindex/openai";
const agent = new OpenAIAgent({
llm,
tools: [...myTools]
20 changes: 20 additions & 0 deletions apps/next/src/content/docs/llamaindex/examples/agent_gemini.mdx
@@ -5,4 +5,24 @@ title: Gemini Agent
import { DynamicCodeBlock } from 'fumadocs-ui/components/dynamic-codeblock';
import CodeSourceGemini from "!raw-loader!../../../../../../../examples/gemini/agent.ts";

## Installation

import { Tab, Tabs } from "fumadocs-ui/components/tabs";

<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install llamaindex @llamaindex/google
```

```shell tab="yarn"
yarn add llamaindex @llamaindex/google
```

```shell tab="pnpm"
pnpm add llamaindex @llamaindex/google
```
</Tabs>

## Source

<DynamicCodeBlock lang="ts" code={CodeSourceGemini} />
@@ -12,9 +12,8 @@ Here's a simple example of how to use the Context-Aware Agent:
import {
Document,
VectorStoreIndex,
OpenAIContextAwareAgent,
OpenAI,
} from "llamaindex";
import { OpenAI, OpenAIContextAwareAgent } from "@llamaindex/openai";

async function createContextAwareAgent() {
// Create and index some documents
@@ -58,4 +57,3 @@ In this example, the Context-Aware Agent uses the retriever to fetch relevant co
## Available Context-Aware Agents

- `OpenAIContextAwareAgent`: A context-aware agent using OpenAI's models.
- `AnthropicContextAwareAgent`: A context-aware agent using Anthropic's models.
29 changes: 26 additions & 3 deletions apps/next/src/content/docs/llamaindex/examples/other_llms.mdx
@@ -7,14 +7,36 @@ import CodeSource from "!raw-loader!../../../../../../../examples/mistral";

By default LlamaIndex.TS uses OpenAI's LLMs and embedding models, but we support [lots of other LLMs](../modules/llms) including models from Mistral (Mistral, Mixtral), Anthropic (Claude) and Google (Gemini).

If you don't want to use an API at all you can [run a local model](../../examples/local_llm)
If you don't want to use an API at all you can [run a local model](../../examples/local_llm).

This example runs you through the process of setting up a Mistral model:


## Installation

import { Tab, Tabs } from "fumadocs-ui/components/tabs";

<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install llamaindex @llamaindex/mistral
```

```shell tab="yarn"
yarn add llamaindex @llamaindex/mistral
```

```shell tab="pnpm"
pnpm add llamaindex @llamaindex/mistral
```
</Tabs>

## Using another LLM

You can specify what LLM LlamaIndex.TS will use on the `Settings` object, like this:

```typescript
import { MistralAI, Settings } from "llamaindex";
import { MistralAI } from "@llamaindex/mistral";
import { Settings } from "llamaindex";

Settings.llm = new MistralAI({
model: "mistral-tiny",
@@ -29,7 +51,8 @@ You can see examples of other APIs we support by checking out "Available LLMs" i
A frequent gotcha when trying to use a different API as your LLM is that LlamaIndex will also by default index and embed your data using OpenAI's embeddings. To completely switch away from OpenAI you will need to set your embedding model as well, for example:

```typescript
import { MistralAIEmbedding, Settings } from "llamaindex";
import { MistralAIEmbedding } from "@llamaindex/mistral";
import { Settings } from "llamaindex";

Settings.embedModel = new MistralAIEmbedding();
```
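Taken together, fully switching away from OpenAI means setting both globals. A minimal sketch, assuming the `@llamaindex/mistral` package is installed and a `MISTRAL_API_KEY` is available in the environment:

```typescript
import { MistralAI, MistralAIEmbedding } from "@llamaindex/mistral";
import { Settings } from "llamaindex";

// Route all LLM calls through Mistral...
Settings.llm = new MistralAI({ model: "mistral-tiny" });

// ...and all embedding calls too, so no OpenAI key is needed anywhere.
Settings.embedModel = new MistralAIEmbedding();
```

With both set, any index or query engine created afterwards uses Mistral end to end.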
21 changes: 21 additions & 0 deletions apps/next/src/content/docs/llamaindex/getting_started/index.mdx
@@ -5,6 +5,8 @@ description: Install llamaindex by running a single command.

import { Tab, Tabs } from "fumadocs-ui/components/tabs";

To install llamaindex, run the following command:

<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install llamaindex
@@ -19,6 +21,25 @@
```
</Tabs>

In most cases, you'll also need an LLM package to use LlamaIndex. For example, to use the OpenAI LLM, you would install the following:

<Tabs groupId="install" items={["npm", "yarn", "pnpm"]} persist>
```shell tab="npm"
npm install @llamaindex/openai
```

```shell tab="yarn"
yarn add @llamaindex/openai
```

```shell tab="pnpm"
pnpm add @llamaindex/openai
```
</Tabs>

Go to [Using other LLM APIs](/docs/llamaindex/examples/other_llms) to find out how to use other LLMs.
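With both packages installed, a first request can be as short as the sketch below (it assumes an `OPENAI_API_KEY` environment variable; the prompt is just an example):

```typescript
import { OpenAI } from "@llamaindex/openai";

// Reads OPENAI_API_KEY from the environment by default.
const llm = new OpenAI();

const response = await llm.complete({ prompt: "How are you?" });
console.log(response.text);
```

The top-level `await` requires an ES module context (e.g. `"type": "module"` in `package.json`).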


## What's next?

<Cards>
@@ -9,7 +9,7 @@ LlamaIndex.TS is written in TypeScript and designed to be used in TypeScript pro
We do lots of work on strong typing to make sure you have a great typing experience with LlamaIndex.TS.

```ts twoslash
import { PromptTemplate } from '@llamaindex/core/prompts'
import { PromptTemplate } from 'llamaindex'
const promptTemplate = new PromptTemplate({
template: `Context information from multiple sources is below.
---------------------
@@ -29,7 +29,7 @@ promptTemplate.format({
```

```ts twoslash
import { FunctionTool } from '@llamaindex/core/tools'
import { FunctionTool } from 'llamaindex'
import { z } from 'zod'

// ---cut-before---