openelm

Evolution Through Large Models

https://github.com/carperai/openelm

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org, zenodo.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.1%) to scientific vocabulary
Last synced: 7 months ago

Repository

Evolution Through Large Models

Basic Info
  • Host: GitHub
  • Owner: CarperAI
  • License: mit
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 6.51 MB
Statistics
  • Stars: 731
  • Watchers: 25
  • Forks: 89
  • Open Issues: 7
  • Releases: 6
Created over 3 years ago · Last pushed over 2 years ago
Metadata Files
Readme Contributing License Code of conduct Citation

README.md

DOI

OpenELM

OpenELM is an open-source library by CarperAI, designed to enable evolutionary search with language models in both code and natural language.

The OpenELM project has the following goals:

  1. Release an open-source version of ELM with its associated diff models.
  2. Integrate with both open-source language models (run locally or on Colab) and with closed models via paid APIs, such as the OpenAI API. We want to support users with many different compute profiles!
  3. Provide a simple interface to a range of example environments for evolutionary search, to let users adapt these easily for their domain.
  4. Demonstrate the potential of evolution with LLMs.

For QDAIF, the poetry domain is currently implemented in main; other experiment code with few-shot LMX domains is in the experimental branch.

Install

pip install openelm

To use the sodarace environment, you must first run pip install swig.

Then:

pip install openelm[sodaracer]

See the pyproject.toml for further install options.

Features

LLM integration with evolutionary algorithms

OpenELM supports the quality-diversity algorithms MAP-Elites, CVT-MAP-Elites, and Deep Grid MAP-Elites, as well as a simple genetic algorithm baseline.
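As a toy illustration of how CVT-MAP-Elites differs from grid-based MAP-Elites: instead of a fixed grid, the behaviour space is tessellated by k centroids, and each individual is binned to its nearest centroid. A hedged sketch (not OpenELM's implementation):

```python
import random

# CVT-MAP-Elites niche assignment: an individual's behaviour descriptor is
# mapped to the index of the nearest centroid (squared Euclidean distance).
def nearest_centroid(behaviour, centroids):
    """Return the index of the centroid closest to `behaviour`."""
    return min(range(len(centroids)),
               key=lambda i: sum((b - c) ** 2
                                 for b, c in zip(behaviour, centroids[i])))

# Random centroids stand in for a proper centroidal Voronoi tessellation.
rng = random.Random(0)
centroids = [(rng.random(), rng.random()) for _ in range(5)]
print("niche:", nearest_centroid((0.5, 0.5), centroids))
```

In a full implementation the centroids would be computed once (e.g. by k-means over sampled behaviour points) so that niches evenly cover the space.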

Evolutionary operators

OpenELM supports:

  1. Prompt-based mutation with instruct models
  2. Diff models (specialised for code)
  3. Crossover with language models
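Conceptually, prompt-based mutation treats an LLM call as the mutation operator: the parent is embedded in a prompt, and the model's completion becomes the child. A minimal sketch, with a stub standing in for the model (the prompt wording and stub are illustrative, not OpenELM's actual implementation):

```python
# Prompt-based mutation: the parent individual is embedded in a prompt and
# the LLM's completion is taken as the mutated child.
def mutate_with_llm(individual: str, llm) -> str:
    prompt = ("Here is a program:\n"
              f"{individual}\n"
              "Write an improved variant of the program above:\n")
    return llm(prompt)

# Stub "model" for illustration only: echoes the program with a marker
# appended. In practice a real LLM call would take its place.
stub_llm = lambda prompt: prompt.splitlines()[1] + "  # mutated"

print(mutate_with_llm("x = 1", stub_llm))  # → x = 1  # mutated
```

Diff models and crossover follow the same pattern, differing only in how the prompt is assembled from one or more parents.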

LLM support, efficiency, and safety

OpenELM’s language models are instantiated as LangChain classes by default, which means that OpenELM can support practically any existing LLM API, as well as models run on your local GPU via Hugging Face Transformers.

We also provide optional Nvidia Triton Inference Server support, intended for use cases where low latency on 8 or more GPUs is important. Finally, for code generation domains, we provide a sandbox environment, consisting of a container server backed with gVisor (a container runtime that introduces an additional barrier between the host and the container) as well as a heuristic-based safety guard.

Baseline environments

  1. Sodarace. A 2D physics-based simulation of robots moving across a variety of terrains. These robots are created by Python programs generated from an LLM.
  2. Image Generation. OpenELM can evolve over generated images by generating code that returns NumPy arrays containing the images. This serves as a simple test environment for code generation.
  3. Programming Puzzles. OpenELM can be used to generate diverse solutions to programming puzzles. This environment supports co-evolution of both the problem and the solution at the same time.
  4. Prompts. OpenELM contains a generic environment suitable for evolving prompts for language models, customizable with LangChain templates to the desired domain.
  5. We also include a poetry environment, demonstrating the use of LLMs to evaluate both the quality and diversity of generated creative writing text, as described in a recent CarperAI blog post on Quality-Diversity with AI Feedback (QDAIF).
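To make the code-generation loop concrete, here is a hedged sketch of the idea behind the image environment: execute a candidate program string and score its output against a target image. The draw() convention and the scoring are illustrative, not OpenELM's actual interface:

```python
# A candidate "program" is a code string that must produce an image (here a
# tiny 2x2 grid of ints); fitness is negative pixel-wise error vs a target.
TARGET = [[0, 255], [255, 0]]

def evaluate_program(src: str) -> float:
    scope = {}
    try:
        exec(src, scope)            # run the generated code
        image = scope["draw"]()     # convention: it defines draw() -> grid
    except Exception:
        return float("-inf")        # broken programs get the worst fitness
    error = sum(abs(a - b)
                for row_a, row_b in zip(image, TARGET)
                for a, b in zip(row_a, row_b))
    return -error

good = "def draw():\n    return [[0, 255], [255, 0]]"
print(evaluate_program(good))  # → 0 (perfect match)
```

The real environment uses NumPy arrays and runs generated code inside the sandbox described below rather than a bare exec.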

Architecture

Roughly, ELM consists of a pipeline of different components:

  1. The Environment class. This class defines the mechanics of how to initialize members of the population, mutate them with the desired operator, and how to measure the fitness (and diversity) of individuals.
  2. The MAPElites class. This class describes how the evolutionary algorithm works, and can be viewed as a wrapper around the environment defining the selection algorithm for generated individuals.
  3. The MutationModel class, which is responsible for running the LLM to actually generate new individuals. This functions as a wrapper around the LangChain API. The environment is expected to call the MutationModel when a new individual is needed.
  4. The ELM class, which calls the MAPElites algorithm class and runs the search.
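The interplay of the four components can be sketched as a toy pipeline. The class names mirror the description above, but every signature and the numeric toy domain are illustrative, not OpenELM's actual API:

```python
import random

class MutationModel:
    """Stands in for the LLM wrapper; here, a Gaussian perturbation."""
    def generate(self, parent, rng):
        return parent + rng.gauss(0, 0.1)

class Environment:
    """Knows how to create, mutate, and score individuals."""
    def __init__(self, model):
        self.model = model          # env calls the model for new individuals
    def random_individual(self, rng):
        return rng.random()
    def mutate(self, parent, rng):
        return self.model.generate(parent, rng)
    def fitness(self, ind):
        return 1.0 - abs(ind - 0.5)

class MAPElites:
    """Selection wrapper around the environment: one elite per niche."""
    def __init__(self, env, bins=10):
        self.env, self.bins, self.archive = env, bins, {}
    def _insert(self, ind):
        cell = min(int(min(max(ind, 0.0), 1.0) * self.bins), self.bins - 1)
        fit = self.env.fitness(ind)
        if cell not in self.archive or fit > self.archive[cell][0]:
            self.archive[cell] = (fit, ind)
    def search(self, iterations, rng):
        self._insert(self.env.random_individual(rng))
        for _ in range(iterations):
            parent = rng.choice(list(self.archive.values()))[1]
            self._insert(self.env.mutate(parent, rng))
        return self.archive

class ELM:
    """Top-level driver: builds the algorithm and runs the search."""
    def __init__(self, env):
        self.algorithm = MAPElites(env)
    def run(self, iterations=300, seed=0):
        return self.algorithm.search(iterations, random.Random(seed))

archive = ELM(Environment(MutationModel())).run()
print(len(archive), "niches filled")
```

In OpenELM the MutationModel wraps a real LLM call and the Environment's individuals are programs or text rather than numbers, but the control flow is the same.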

All options for these classes are defined in configs.py via dataclasses, which are registered as hydra configs and can be overridden via the command line when running one of the example scripts, such as run_elm.py.
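The override mechanics can be illustrated with a plain dataclass and a much-simplified key=value parser in the style of hydra's command-line syntax. The field names here are hypothetical, not OpenELM's real config schema:

```python
from dataclasses import dataclass, fields, replace

@dataclass
class ELMConfig:
    env: str = "sodarace"     # hypothetical fields for illustration
    iterations: int = 100

def apply_overrides(cfg, argv):
    """Apply hydra-style `key=value` overrides (greatly simplified)."""
    types = {f.name: f.type for f in fields(cfg)}
    updates = {}
    for arg in argv:
        key, value = arg.split("=", 1)
        # Coerce to the field's annotated type (only int handled here).
        updates[key] = int(value) if types[key] in (int, "int") else value
    return replace(cfg, **updates)

cfg = apply_overrides(ELMConfig(), ["env=image_evolution", "iterations=50"])
print(cfg)
```

hydra itself also handles config groups, nesting, and validation; this sketch only shows why dataclass-backed configs make command-line overrides cheap.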

Running ELM

python run_elm.py will start an ELM evolutionary search using the defaults listed in configs.py. These can be overridden via the command line. For example, run python run_elm.py env=image_evolution to use the Image Evolution environment.

Sandbox

To use the code execution sandbox, see the sandboxing readme for instructions to set it up in a Docker container with the gVisor runtime.

Triton

We also have code available to run models in Nvidia's Triton Inference Server. See the Triton readme to get started.

Contributing

If you'd like to contribute or have questions, go to the #openelm channel on the CarperAI discord!

Owner

  • Name: CarperAI
  • Login: CarperAI
  • Kind: organization

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it using the metadata from this file."
authors:
  - family-names: "Bradley"
    given-names: "Herbie"
    orcid: "https://orcid.org/0000-0001-5390-1257"
  - family-names: "Fan"
    given-names: "Honglu"
  - family-names: "Carvalho"
    given-names: "Francisco"
  - family-names: "Fisher"
    given-names: "Matthew"
  - family-names: "Castricato"
    given-names: "Louis"
    orcid: "https://orcid.org/0000-0003-2996-886X"
  - family-names: "reciprocated"
  - family-names: "dmayhem93"
  - family-names: "Purohit"
    given-names: "Shivanshu"
  - family-names: "Lehman"
    given-names: "Joel"
    orcid: "https://orcid.org/0000-0002-9535-1123"
title: "OpenELM"
version: 0.1.8
doi: 10.5281/zenodo.7361753
date-released: 2023-01-30
url: "https://github.com/CarperAI/OpenELM"

GitHub Events

Total
  • Watch event: 39
  • Fork event: 8
Last Year
  • Watch event: 39
  • Fork event: 8

Committers

Last synced: 11 months ago

All Time
  • Total Commits: 108
  • Total Committers: 9
  • Avg Commits per committer: 12.0
  • Development Distribution Score (DDS): 0.639
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Herbie Bradley m****l@h****m 39
Honglu Fan h****5@g****m 28
TheExGenesis 7****s 17
mathyouf 1****m@g****m 13
Louis Castricato w****p@g****m 4
Kyoung Whan Choe c****g@g****m 3
reciprocated 5****d 2
Dakota d****3@g****m 1
Andrew 3****9 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 11 months ago

All Time
  • Total issues: 20
  • Total pull requests: 114
  • Average time to close issues: 2 months
  • Average time to close pull requests: 13 days
  • Total issue authors: 7
  • Total pull request authors: 17
  • Average comments per issue: 1.0
  • Average comments per pull request: 0.76
  • Merged pull requests: 99
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • herbiebradley (5)
  • johnhalloran321 (1)
  • dataf3l (1)
  • mathyouf (1)
  • daia99 (1)
  • LinuxIsCool (1)
  • pecanjk (1)
Pull Request Authors
  • herbiebradley (12)
  • daia99 (11)
  • honglu2875 (9)
  • TheExGenesis (5)
  • ryanz8 (5)
  • kywch (4)
  • dsctt (3)
  • HannahBenita (2)
  • snat-s (2)
  • LouisCastricato (2)
  • marcobellagente93 (1)
  • eltociear (1)
  • maxreciprocate (1)
  • mathyouf (1)
  • dmahan93 (1)
Top Labels
Issue Labels
documentation (1) question (1)
Pull Request Labels
enhancement (4)