# Integrating Open Source Models with LlamaIndex

LlamaIndex is a data framework for building LLM applications over your own data. This guide shows how to use open source models from Hugging Face with LlamaIndex.

## Installation

```bash
pip install llama-index llama-index-llms-huggingface llama-index-embeddings-huggingface
```

## Basic Setup

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Open source chat model from Hugging Face (Llama 2 is gated: accept the license
# on Hugging Face and run `huggingface-cli login` before downloading)
Settings.llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",
)

# Local embedding model, so indexing does not fall back to the default OpenAI embeddings
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Load the documents in ./data, build a vector index, and query it
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What is the main topic?")
```

## Advanced Features

  • Hierarchical indexing for large documents (first sketch below)
  • Custom retrievers and query engines (second sketch below)
  • Multi-document agents (third sketch below)
  • Persistent storage with vector databases (fourth sketch below)
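
Hierarchical indexing parses documents into layers of chunks and retrieves small chunks that can be merged back into their parents when enough of them match. A minimal sketch, assuming the `llama_index.core` module layout; the chunk sizes and query are illustrative, and class locations can shift between releases:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, StorageContext
from llama_index.core.node_parser import HierarchicalNodeParser, get_leaf_nodes
from llama_index.core.retrievers import AutoMergingRetriever
from llama_index.core.query_engine import RetrieverQueryEngine

documents = SimpleDirectoryReader("data").load_data()

# Parse each document into a hierarchy of chunks (large -> medium -> small)
node_parser = HierarchicalNodeParser.from_defaults(chunk_sizes=[2048, 512, 128])
nodes = node_parser.get_nodes_from_documents(documents)

# Keep every node in the docstore, but only embed and index the leaf chunks
storage_context = StorageContext.from_defaults()
storage_context.docstore.add_documents(nodes)
leaf_index = VectorStoreIndex(get_leaf_nodes(nodes), storage_context=storage_context)

# Retrieve leaf chunks and merge them up into parent chunks when enough children match
retriever = AutoMergingRetriever(leaf_index.as_retriever(similarity_top_k=6), storage_context)
query_engine = RetrieverQueryEngine.from_args(retriever)
response = query_engine.query("Summarize the key points of the report.")
```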
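A custom retriever only needs to subclass `BaseRetriever` and implement `_retrieve`; it can then be wrapped in a standard query engine. The hybrid vector-plus-keyword combination below is a hypothetical illustration:

```python
from typing import List

from llama_index.core import QueryBundle
from llama_index.core.retrievers import BaseRetriever
from llama_index.core.schema import NodeWithScore
from llama_index.core.query_engine import RetrieverQueryEngine

class HybridRetriever(BaseRetriever):
    """Returns the union of the results of two existing retrievers."""

    def __init__(self, vector_retriever: BaseRetriever, keyword_retriever: BaseRetriever):
        self._vector_retriever = vector_retriever
        self._keyword_retriever = keyword_retriever
        super().__init__()

    def _retrieve(self, query_bundle: QueryBundle) -> List[NodeWithScore]:
        vector_nodes = self._vector_retriever.retrieve(query_bundle)
        keyword_nodes = self._keyword_retriever.retrieve(query_bundle)
        # Deduplicate by node id, preferring the vector score when both retrievers hit
        combined = {n.node.node_id: n for n in keyword_nodes}
        combined.update({n.node.node_id: n for n in vector_nodes})
        return list(combined.values())

# query_engine = RetrieverQueryEngine.from_args(HybridRetriever(vector_ret, keyword_ret))
```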
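For multi-document agents, each document (or folder of documents) gets its own index and query engine, exposed to an agent as a tool; the agent decides which tool to call for a given question. A sketch using `QueryEngineTool` and the ReAct agent from the 0.10-style API; the folder names are made up, and agent APIs have changed between LlamaIndex releases:

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.core.tools import QueryEngineTool
from llama_index.core.agent import ReActAgent

tools = []
for name in ["report_2022", "report_2023"]:  # hypothetical document folders
    docs = SimpleDirectoryReader(f"data/{name}").load_data()
    engine = VectorStoreIndex.from_documents(docs).as_query_engine()
    tools.append(
        QueryEngineTool.from_defaults(
            query_engine=engine,
            name=name,
            description=f"Answers questions about {name}",
        )
    )

# The agent picks the relevant document tool(s) for each question
agent = ReActAgent.from_tools(tools, llm=Settings.llm, verbose=True)
response = agent.chat("Compare the conclusions of the two reports.")
```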
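Finally, the index can be persisted in a vector database instead of being rebuilt on every run. A sketch using Chroma via the `llama-index-vector-stores-chroma` integration (install `chromadb` and that package first); the collection name and path are placeholders, and other supported vector stores follow the same `StorageContext` pattern:

```python
import chromadb
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, StorageContext
from llama_index.vector_stores.chroma import ChromaVectorStore

# Create (or reopen) a persistent Chroma collection on disk
client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("my_docs")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Build the index once; embeddings are written into the Chroma collection
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Later runs reconnect to the same collection without re-indexing
index = VectorStoreIndex.from_vector_store(vector_store)
```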