llama_index

To keep the Amazon Q GitHub integration from opening PRs against the upstream repository, this is a copy of a temporary fork, disconnected from the original repository.

https://github.com/hmatsu47/llama_index

Science Score: 39.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (2.0%) to scientific vocabulary

Scientific Fields

Artificial Intelligence and Machine Learning (Computer Science) - 87% confidence
Last synced: 4 months ago

Repository

To keep the Amazon Q GitHub integration from opening PRs against the upstream repository, this is a copy of a temporary fork, disconnected from the original repository.

Basic Info
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 2
  • Releases: 0
Created 8 months ago · Last pushed 5 months ago
Metadata Files
Readme · Changelog · Contributing · License · Code of conduct · Citation · Security

README.md

LlamaIndex


LlamaIndex (GPT Index) is a data framework for your LLM application. Building with LlamaIndex typically involves working with LlamaIndex core and a chosen set of integrations (or plugins). There are two ways to start building with LlamaIndex in Python:

  1. Starter: llama-index. A starter Python package that includes core LlamaIndex as well as a selection of integrations.

  2. Customized: llama-index-core. Install core LlamaIndex and add your chosen LlamaIndex integration packages on LlamaHub that are required for your application. There are over 300 LlamaIndex integration packages that work seamlessly with core, allowing you to build with your preferred LLM, embedding, and vector store providers.

The LlamaIndex Python library is namespaced such that import statements which include core imply that the core package is being used. In contrast, those statements without core imply that an integration package is being used.

```python
# typical pattern
from llama_index.core.xxx import ClassABC  # core submodule xxx
from llama_index.xxx.yyy import (
    SubclassABC,
)  # integration yyy for submodule xxx

# concrete example
from llama_index.core.llms import LLM
from llama_index.llms.openai import OpenAI
```
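
For illustration only, a minimal sketch of how the two halves fit together, assuming the llama-index-llms-openai integration package is installed (the API key value is a placeholder): an integration class is a drop-in subclass of the core abstraction it implements.

```python
import os

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"  # placeholder; a real key is only needed for actual calls

from llama_index.core.llms import LLM  # abstract base class from core
from llama_index.llms.openai import OpenAI  # concrete class from an integration package

llm: LLM = OpenAI()  # the integration instance satisfies the core LLM interface
print(isinstance(llm, LLM))  # True
```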

Important Links

LlamaIndex.TS (TypeScript/JavaScript)

Documentation

X (formerly Twitter)

LinkedIn

Reddit

Discord

Ecosystem

Overview

NOTE: This README is not updated as frequently as the documentation. Please check out the documentation above for the latest updates!

Context

  • LLMs are a phenomenal piece of technology for knowledge generation and reasoning. They are pre-trained on large amounts of publicly available data.
  • How do we best augment LLMs with our own private data?

We need a comprehensive toolkit to help perform this data augmentation for LLMs.

Proposed Solution

That's where LlamaIndex comes in. LlamaIndex is a "data framework" to help you build LLM apps. It provides the following tools:

  • Offers data connectors to ingest your existing data sources and data formats (APIs, PDFs, docs, SQL, etc.).
  • Provides ways to structure your data (indices, graphs) so that this data can be easily used with LLMs.
  • Provides an advanced retrieval/query interface over your data: Feed in any LLM input prompt, get back retrieved context and knowledge-augmented output.
  • Allows easy integrations with your outer application framework (e.g. with LangChain, Flask, Docker, ChatGPT, or anything else).

LlamaIndex provides tools for both beginner users and advanced users. Our high-level API allows beginner users to use LlamaIndex to ingest and query their data in 5 lines of code. Our lower-level APIs allow advanced users to customize and extend any module (data connectors, indices, retrievers, query engines, reranking modules), to fit their needs.
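
As a rough sketch of that five-line flow, assuming the starter llama-index package and an OPENAI_API_KEY in the environment (the directory and question strings are placeholders; the Example Usage section below covers this in more detail):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# ingest local files, build an in-memory vector index, and ask a question
documents = SimpleDirectoryReader("YOUR_DATA_DIRECTORY").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("YOUR_QUESTION"))
```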

Contributing

Interested in contributing? Contributions to LlamaIndex core, as well as integrations that build on the core, are both accepted and highly encouraged! See our Contribution Guide for more details.

New integrations should meaningfully integrate with existing LlamaIndex framework components. At the discretion of LlamaIndex maintainers, some integrations may be declined.

Documentation

Full documentation can be found here

Please check it out for the most up-to-date tutorials, how-to guides, references, and other resources!

Example Usage

```sh
# custom selection of integrations to work with core
pip install llama-index-core
pip install llama-index-llms-openai
pip install llama-index-llms-replicate
pip install llama-index-embeddings-huggingface
```

Examples are in the docs/examples folder. Indices are in the indices folder (see list of indices below).

To build a simple vector store index using OpenAI:

```python
import os

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("YOUR_DATA_DIRECTORY").load_data()
index = VectorStoreIndex.from_documents(documents)
```

To build a simple vector store index using non-OpenAI LLMs, e.g. Llama 2 hosted on Replicate, where you can easily create a free trial API token:

```python
import os

os.environ["REPLICATE_API_TOKEN"] = "YOUR_REPLICATE_API_TOKEN"

from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.replicate import Replicate
from transformers import AutoTokenizer

# set the LLM
llama2_7b_chat = "meta/llama-2-7b-chat:8e6975e5ed6174911a6ff3d60540dfd4844201974602551e10e9e87ab143d81e"
Settings.llm = Replicate(
    model=llama2_7b_chat,
    temperature=0.01,
    additional_kwargs={"top_p": 1, "max_new_tokens": 300},
)

# set tokenizer to match LLM
Settings.tokenizer = AutoTokenizer.from_pretrained(
    "NousResearch/Llama-2-7b-chat-hf"
)

# set the embed model
Settings.embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-small-en-v1.5"
)

documents = SimpleDirectoryReader("YOUR_DATA_DIRECTORY").load_data()
index = VectorStoreIndex.from_documents(
    documents,
)
```

To query:

```python
query_engine = index.as_query_engine()
query_engine.query("YOUR_QUESTION")
```

By default, data is stored in-memory. To persist to disk (under ./storage):

```python
index.storage_context.persist()
```

To reload from disk:

```python
from llama_index.core import StorageContext, load_index_from_storage

# rebuild storage context
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# load index
index = load_index_from_storage(storage_context)
```
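
Once reloaded, the index can be queried exactly as before persisting; a brief sketch (the question string is a placeholder):

```python
# the reloaded index exposes the same query interface as a freshly built one
query_engine = index.as_query_engine()
print(query_engine.query("YOUR_QUESTION"))
```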

Dependencies

We use poetry as the package manager for all Python packages. As a result, the dependencies of each Python package can be found by referencing the pyproject.toml file in each package's folder.

```bash
cd <desired-package-folder>
pip install poetry
poetry install --with dev
```

Citation

Reference to cite if you use LlamaIndex in a paper:

```bibtex
@software{Liu_LlamaIndex_2022,
  author = {Liu, Jerry},
  doi = {10.5281/zenodo.1234},
  month = {11},
  title = {{LlamaIndex}},
  url = {https://github.com/jerryjliu/llama_index},
  year = {2022}
}
```

Owner

  • Name: hmatsu47
  • Login: hmatsu47
  • Kind: user
  • Location: Aichi, Japan

MySQL / AWS / Alibaba Cloud / SC (RISS #001158) & NW & DB

GitHub Events

Total
  • Issues event: 13
  • Delete event: 6
  • Issue comment event: 74
  • Push event: 37
  • Pull request review comment event: 123
  • Pull request review event: 129
  • Pull request event: 21
  • Create event: 15
Last Year
  • Issues event: 13
  • Delete event: 6
  • Issue comment event: 74
  • Push event: 37
  • Pull request review comment event: 123
  • Pull request review event: 129
  • Pull request event: 21
  • Create event: 15

Dependencies

.github/workflows/build_package.yml actions
  • actions/checkout v4 composite
  • astral-sh/setup-uv v6 composite
.github/workflows/codeql.yml actions
  • actions/checkout v4 composite
  • github/codeql-action/analyze v3 composite
  • github/codeql-action/autobuild v3 composite
  • github/codeql-action/init v3 composite
.github/workflows/core-typecheck.yml actions
  • actions/checkout v4 composite
  • astral-sh/setup-uv v6 composite
.github/workflows/lint.yml actions
  • actions/checkout v4 composite
  • astral-sh/setup-uv v6 composite
.github/workflows/llama_dev_tests.yml actions
  • actions/checkout v4 composite
  • astral-sh/setup-uv v6 composite
.github/workflows/publish_release.yml actions
  • CSchoel/release-notes-from-changelog v1 composite
  • actions/checkout v4 composite
  • actions/create-release v1 composite
  • actions/upload-release-asset v1 composite
  • astral-sh/setup-uv v6 composite
.github/workflows/publish_sub_package.yml actions
  • actions/checkout v4 composite
  • astral-sh/setup-uv v6 composite
.github/workflows/unit_test.yml actions
  • actions/checkout v4 composite
  • astral-sh/setup-uv v6 composite
llama-index-core/tests/docker-compose.yml docker
  • docker.elastic.co/elasticsearch/elasticsearch 8.10.0
  • ghcr.io/chroma-core/chroma latest
llama-index-integrations/embeddings/llama-index-embeddings-elasticsearch/docker-compose.yml docker
  • docker.elastic.co/elasticsearch/elasticsearch 8.10.0
llama-index-integrations/vector_stores/llama-index-vector-stores-clickhouse/docker-compose.yml docker
  • clickhouse/clickhouse-server 24.1
llama-index-integrations/vector_stores/llama-index-vector-stores-elasticsearch/tests/docker-compose.yml docker
  • docker.elastic.co/elasticsearch/elasticsearch 8.13.2
llama-index-integrations/vector_stores/llama-index-vector-stores-opensearch/tests/docker-compose.yml docker
  • opensearchproject/opensearch latest
llama-index-networks/examples/demo/contributor-1/Dockerfile docker
  • python 3.10-slim build
llama-index-networks/examples/demo/contributor-2/Dockerfile docker
  • python 3.10-slim build
llama-index-networks/examples/demo/contributor-3/Dockerfile docker
  • python 3.10-slim build
llama-index-networks/examples/demo/docker-compose.yml docker
  • contributor_1 latest
  • contributor_2 latest
  • contributor_3 latest
llama-index-networks/examples/privacy_safe_retrieval/contributor-1/Dockerfile docker
  • python 3.10-slim build
llama-index-networks/examples/privacy_safe_retrieval/contributor-2/Dockerfile docker
  • python 3.10-slim build
llama-index-networks/examples/privacy_safe_retrieval/docker-compose.yml docker
  • contributor_1 latest
  • contributor_2 latest
docs/poetry.lock pypi
  • 155 dependencies
docs/pyproject.toml pypi
  • appnope 0.1.4
  • asttokens 2.4.1
  • attrs 23.2.0
  • babel 2.14.0
  • beautifulsoup4 4.12.3
  • bleach 6.1.0
  • certifi 2024.7.4
  • charset-normalizer 3.3.2
  • colorama 0.4.6
  • comm 0.2.2
  • debugpy 1.8.1
  • decorator 5.1.1
  • defusedxml 0.7.1
  • executing 2.0.1
  • fastjsonschema 2.19.1
  • ghp-import 2.1.0
  • griffe-fieldz ^0.2.0
  • idna 3.7
  • ipykernel 6.29.3
  • ipython 8.22.2
  • jedi 0.19.1
  • jinja2 ^3.1.6
  • jsonschema 4.21.1
  • jsonschema-specifications 2023.12.1
  • jupyter-client 8.6.1
  • jupyter-core 5.7.2
  • jupyterlab-pygments 0.3.0
  • jupytext 1.16.1
  • llama_deploy <1
  • markdown ^3.5.2
  • markdown-it-py ^3.0.0
  • markupsafe 2.1.5
  • matplotlib-inline 0.1.6
  • mdit-py-plugins 0.4.0
  • mdurl 0.1.2
  • mergedeep 1.3.4
  • mistune 3.0.2
  • mkdocs ^1.6.1
  • mkdocs-autorefs ^1.0.1
  • mkdocs-click ^0.8.1
  • mkdocs-github-admonitions-plugin ^0.0.2
  • mkdocs-include-dir-to-nav ^1.2.0
  • mkdocs-jupyter ^0.24.6
  • mkdocs-material ^9.5.13
  • mkdocs-material-extensions 1.3.1
  • mkdocs-redirects ^1.2.1
  • mkdocs-render-swagger-plugin ^0.1.2
  • mkdocstrings ^0.26.1
  • nbclient 0.10.0
  • nbconvert 7.16.2
  • nbformat 5.10.2
  • nest-asyncio 1.6.0
  • packaging 24.0
  • paginate 0.5.6
  • pandocfilters 1.5.1
  • parso 0.8.3
  • pathspec 0.12.1
  • pexpect 4.9.0
  • platformdirs 4.2.0
  • prompt-toolkit 3.0.43
  • psutil 5.9.8
  • ptyprocess 0.7.0
  • pure-eval 0.2.2
  • pygments 2.17.2
  • pymdown-extensions 10.7.1
  • python ^3.11
  • python-dateutil 2.9.0.post0
  • pyyaml 6.0.1
  • pyyaml-env-tag 0.1
  • pyzmq 25.1.2
  • referencing 0.33.0
  • regex 2023.12.25
  • requests ^2.32.0
  • rpds-py 0.18.0
  • six 1.16.0
  • soupsieve 2.5
  • stack-data 0.6.3
  • tinycss2 1.2.1
  • toml 0.10.2
  • tornado 6.4.2
  • traitlets 5.14.2
  • urllib3 2.2.2
  • watchdog 4.0.0
  • wcwidth 0.2.13
  • webencodings 0.5.1
llama-dev/pyproject.toml pypi
  • click *
  • packaging >=25.0
  • rich >=14.0.0
  • tomli >=2.2.1
llama-dev/tests/data/llama-index-integrations/storage/subcat/pkg2/pyproject.toml pypi
llama-dev/tests/data/llama-index-integrations/vector_stores/pkg1/pyproject.toml pypi
llama-dev/tests/data/llama-index-packs/pack1/pyproject.toml pypi
llama-dev/tests/data/llama-index-packs/pack2/pyproject.toml pypi
llama-dev/tests/data/llama-index-utils/util/pyproject.toml pypi
llama-dev/uv.lock pypi
  • chardet 5.2.0
  • click 8.1.8
  • colorama 0.4.6
  • coverage 7.8.0
  • diff-cover 9.2.4
  • exceptiongroup 1.2.2
  • iniconfig 2.1.0
  • jinja2 3.1.6
  • llama-dev 0.1
  • markdown-it-py 3.0.0
  • markupsafe 3.0.2
  • mdurl 0.1.2
  • packaging 25.0
  • pluggy 1.5.0
  • pygments 2.19.1
  • pytest 8.3.5
  • pytest-cov 6.1.1
  • rich 14.0.0
  • tomli 2.2.1
  • typing-extensions 4.13.2
llama-index-cli/pyproject.toml pypi
  • black <=23.9.1,>=23.7.0 develop
  • codespell >=v2.2.6 develop
  • ipython 8.10.0 develop
  • jupyter ^1.0.0 develop
  • llama-index-vector-stores-chroma ^0.4.1 develop
  • mypy 0.991 develop
  • pre-commit 3.2.0 develop
  • pylint 2.15.10 develop
  • pytest 7.2.1 develop
  • pytest-mock 3.11.1 develop
  • ruff 0.0.292 develop
  • tree-sitter-languages ^1.8.0 develop
  • types-Deprecated >=0.1.0 develop
  • types-PyYAML ^6.0.12.12 develop
  • types-protobuf ^4.24.0.4 develop
  • types-redis 4.5.5.0 develop
  • types-requests 2.28.11.8 develop
  • types-setuptools 67.1.0.0 develop
  • llama-index-core ^0.12.0
  • llama-index-embeddings-openai ^0.3.0
  • llama-index-llms-openai ^0.3.0
  • python >=3.9,<4.0
llama-index-core/pyproject.toml pypi
  • PyYAML >=6.0.1
  • SQLAlchemy [asyncio]>=1.4.49
  • aiohttp >=3.8.6,<4
  • aiosqlite *
  • banks >=2.0.0,<3
  • dataclasses-json *
  • deprecated >=1.2.9.3
  • dirtyjson >=1.0.8,<2
  • eval-type-backport >=0.2.0,<0.3 ; python_version < '3.10'
  • filetype >=1.2.0,<2
  • fsspec >=2023.5.0
  • httpx *
  • nest-asyncio >=1.5.8,<2
  • networkx >=3.0
  • nltk >3.8.1
  • numpy *
  • pillow >=9.0.0
  • pydantic >=2.8.0
  • requests >=2.31.0
  • tenacity >=8.2.0,!=8.4.0,<10.0.0
  • tiktoken >=0.7.0
  • tqdm >=4.66.1,<5
  • typing-extensions >=4.5.0
  • typing-inspect >=0.8.0
  • wrapt *
llama-index-core/uv.lock pypi
  • 121 dependencies
llama-index-experimental/pyproject.toml pypi
  • black <=23.9.1,>=23.7.0 develop
  • codespell >=v2.2.6 develop
  • ipython 8.10.0 develop
  • jupyter ^1.0.0 develop
  • mypy 0.991 develop
  • pre-commit 3.2.0 develop
  • pylint 2.15.10 develop
  • pytest 7.2.1 develop
  • pytest-mock 3.11.1 develop
  • ruff 0.0.292 develop
  • tree-sitter-languages ^1.8.0 develop
  • types-Deprecated >=0.1.0 develop
  • types-PyYAML ^6.0.12.12 develop
  • types-protobuf ^4.24.0.4 develop
  • types-redis 4.5.5.0 develop
  • types-requests 2.28.11.8 develop
  • types-setuptools 67.1.0.0 develop
  • duckdb ^1.0.0
  • llama-index-core ^0.12.13
  • llama-index-finetuning ^0.3.2
  • pandas *
  • python >=3.10,<4.0
llama-index-finetuning/pyproject.toml pypi
  • black <=23.9.1,>=23.7.0 develop
  • codespell >=v2.2.6 develop
  • ipython 8.10.0 develop
  • jupyter ^1.0.0 develop
  • mypy 0.991 develop
  • pre-commit 3.2.0 develop
  • pylint 2.15.10 develop
  • pytest 7.2.1 develop
  • pytest-mock 3.11.1 develop
  • ruff 0.0.292 develop
  • tree-sitter-languages ^1.8.0 develop
  • types-Deprecated >=0.1.0 develop
  • types-PyYAML ^6.0.12.12 develop
  • types-protobuf ^4.24.0.4 develop
  • types-redis 4.5.5.0 develop
  • types-requests 2.28.11.8 develop
  • types-setuptools 67.1.0.0 develop
  • llama-index-core ^0.12.0
  • llama-index-embeddings-adapter ^0.3.0
  • llama-index-llms-azure-openai ^0.3.0
  • llama-index-llms-mistralai ^0.4.0
  • llama-index-postprocessor-cohere-rerank ^0.3.0
  • mistralai >=1.7.0
  • python >=3.10,<4.0
  • sentence-transformers >=2.3.0
llama-index-integrations/agent/llama-index-agent-coa/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/agent/llama-index-agent-coa/uv.lock pypi
  • 186 dependencies
llama-index-integrations/agent/llama-index-agent-dashscope/pyproject.toml pypi
  • dashscope >=1.17.0
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/agent/llama-index-agent-dashscope/uv.lock pypi
  • 187 dependencies
llama-index-integrations/agent/llama-index-agent-introspective/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/agent/llama-index-agent-introspective/uv.lock pypi
  • 186 dependencies
llama-index-integrations/agent/llama-index-agent-lats/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/agent/llama-index-agent-lats/uv.lock pypi
  • 186 dependencies
llama-index-integrations/agent/llama-index-agent-llm-compiler/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • llama-index-llms-openai >=0.3.0,<0.4
llama-index-integrations/agent/llama-index-agent-llm-compiler/uv.lock pypi
  • 190 dependencies
llama-index-integrations/agent/llama-index-agent-openai/pyproject.toml pypi
  • llama-index-core >=0.12.18,<0.13
  • llama-index-llms-openai >=0.3.0,<0.4
  • openai >=1.14.0
llama-index-integrations/agent/llama-index-agent-openai/uv.lock pypi
  • 191 dependencies
llama-index-integrations/agent/llama-index-agent-openai-legacy/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • llama-index-llms-openai >=0.3.0,<0.4
llama-index-integrations/agent/llama-index-agent-openai-legacy/uv.lock pypi
  • 190 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-agentops/pyproject.toml pypi
  • agentops >=0.2.2,<0.3
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-agentops/uv.lock pypi
  • 189 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-aim/pyproject.toml pypi
  • aim >=3.17.5,<4
  • jwt >=1.3.1,<2
  • llama-index-core >=0.10.1,<0.11
llama-index-integrations/callbacks/llama-index-callbacks-aim/uv.lock pypi
  • 204 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-argilla/pyproject.toml pypi
  • argilla >=1.22.0
  • argilla-llama-index >=1.0.0
llama-index-integrations/callbacks/llama-index-callbacks-argilla/uv.lock pypi
  • 220 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-arize-phoenix/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • openinference-instrumentation-llama-index >=4.1.0
llama-index-integrations/callbacks/llama-index-callbacks-arize-phoenix/uv.lock pypi
  • 196 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-deepeval/pyproject.toml pypi
  • deepeval *
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-deepeval/uv.lock pypi
  • 256 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-honeyhive/pyproject.toml pypi
  • honeyhive >=0.1.79,<0.2
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-honeyhive/uv.lock pypi
  • 223 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-langfuse/pyproject.toml pypi
  • langfuse >=2.21.2,<3
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-langfuse/uv.lock pypi
  • 191 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-literalai/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-literalai/uv.lock pypi
  • 189 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-openinference/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-openinference/uv.lock pypi
  • 186 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-opik/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/callbacks/llama-index-callbacks-opik/uv.lock pypi
  • 189 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-promptlayer/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • promptlayer >=0.4.2,<0.5
llama-index-integrations/callbacks/llama-index-callbacks-promptlayer/uv.lock pypi
  • 187 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-uptrain/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • uptrain >=0.7.1
llama-index-integrations/callbacks/llama-index-callbacks-uptrain/uv.lock pypi
  • 209 dependencies
llama-index-integrations/callbacks/llama-index-callbacks-wandb/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • wandb >=0.16.2,<0.17
llama-index-integrations/callbacks/llama-index-callbacks-wandb/uv.lock pypi
  • 195 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-adapter/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • torch >=2.0.0
llama-index-integrations/embeddings/llama-index-embeddings-adapter/uv.lock pypi
  • 204 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-alephalpha/pyproject.toml pypi
  • aleph-alpha-client >=7.0.1,<8
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-alephalpha/uv.lock pypi
  • 194 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-alibabacloud-aisearch/pyproject.toml pypi
  • alibabacloud-searchplat20240529 >=1.1.0,<2
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-alibabacloud-aisearch/uv.lock pypi
  • 206 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-anyscale/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • llama-index-llms-openai >=0.3.0,<0.4
llama-index-integrations/embeddings/llama-index-embeddings-anyscale/uv.lock pypi
  • 190 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-autoembeddings/pyproject.toml pypi
  • chonkie [all]
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-autoembeddings/uv.lock pypi
  • 247 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-azure-inference/pyproject.toml pypi
  • aiohttp >=3.10.0,<4
  • azure-ai-inference >=1.0.0b5
  • azure-identity >=1.15.0,<2
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-azure-inference/uv.lock pypi
  • 193 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-azure-openai/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
  • llama-index-embeddings-openai >=0.3.0,<0.4
  • llama-index-llms-azure-openai >=0.3.0,<0.4
llama-index-integrations/embeddings/llama-index-embeddings-azure-openai/uv.lock pypi
  • 200 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-bedrock/pyproject.toml pypi
  • aioboto3 >=13.1.1,<14
  • boto3 >=1.34.23,<2
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-bedrock/uv.lock pypi
  • 196 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-clarifai/pyproject.toml pypi
  • clarifai >=10.0.1,<11
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-clarifai/uv.lock pypi
  • 204 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-clip/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-clip/uv.lock pypi
  • 186 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-cloudflare-workersai/pyproject.toml pypi
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-cloudflare-workersai/uv.lock pypi
  • 186 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-cohere/pyproject.toml pypi
  • cohere >=5.15,<6
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-cohere/uv.lock pypi
  • 192 dependencies
llama-index-integrations/embeddings/llama-index-embeddings-dashscope/pyproject.toml pypi
  • dashscope >1.10.0
  • llama-index-core >=0.12.0,<0.13
llama-index-integrations/embeddings/llama-index-embeddings-dashscope/uv.lock pypi
  • 187 dependencies