LlamaIndex

JS + Python
Database & backend

Connect LLMs to your own data — PDFs, databases, APIs.

What it does

LlamaIndex specializes in ingesting and indexing your data (documents, PDFs, databases) so an LLM can answer questions about it. It's the go-to library for building "chat with your documents" features.

When to use it

Use this when you want an AI to answer questions based on your own documents, databases, or data.

Real example

You want users to upload a PDF and immediately ask questions about it. Prompt: "Use LlamaIndex to read the uploaded PDF with SimpleDirectoryReader, build a VectorStoreIndex, create a query engine, and return the result of query_engine.query(user_question) as the API response."
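The prompt above maps almost one-to-one onto LlamaIndex's API. A minimal sketch of what the generated code might look like, assuming llama-index >= 0.10 (the `llama_index.core` package layout) and an `OPENAI_API_KEY` in the environment, since OpenAI is the default LLM backend; the function name `answer_from_pdf` is a hypothetical helper, not part of the library:

```python
def answer_from_pdf(upload_dir: str, user_question: str) -> str:
    """Answer a question from the PDFs in upload_dir using LlamaIndex RAG."""
    # Imports kept inside the helper so the sketch is self-contained;
    # assumes `pip install llama-index` and an OPENAI_API_KEY env var.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    # Read every file in the directory (PDFs are parsed automatically).
    documents = SimpleDirectoryReader(upload_dir).load_data()

    # Chunk and embed the documents into an in-memory vector index.
    index = VectorStoreIndex.from_documents(documents)

    # Build a query engine: retrieval + LLM answer synthesis in one object.
    query_engine = index.as_query_engine()

    # Retrieve the most relevant chunks and ask the LLM to answer from them.
    return str(query_engine.query(user_question))
```

In an API route you would call `answer_from_pdf("./uploads", user_question)` and return the string; for production you would typically persist the index rather than rebuild it per request.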

Good to know

LlamaIndex is more narrowly focused than LangChain, concentrating on data ingestion and retrieval rather than general agent orchestration. If your main use case is RAG (an AI answering questions from your documents), LlamaIndex is often the simpler choice.

Install

$ pip install llama-index

Use cases

RAG · document Q&A · data indexing · PDF chat · LLM

Language

JS + Python

Category

Database & backend
