https://github.com/alixunxing/mario-gpt

Generating Mario Levels with GPT2. Code for the paper "MarioGPT: Open-Ended Text2Level Generation through Large Language Models" https://arxiv.org/abs/2302.05981


Science Score: 23.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references: found 3 DOI reference(s) in README
  • Academic publication links: links to arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity: low similarity (11.7%) to scientific vocabulary
Last synced: 6 months ago

Repository


Basic Info
Statistics
  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Fork of shyamsn97/mario-gpt
Created about 3 years ago · Last pushed about 3 years ago

https://github.com/alixunxing/mario-gpt/blob/main/

# MarioGPT: Open-Ended Text2Level Generation through Large Language Models

[![Paper](https://img.shields.io/badge/paper-arxiv.2302.05981-B31B1B.svg)](https://arxiv.org/abs/2302.05981) [![PyPi version](https://badgen.net/pypi/v/mario-gpt/)](https://pypi.org/project/mario-gpt) HuggingFace Spaces [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/16KR9idJUim6RAiyPASoQAaC768AvOGxP?usp=sharing)

[Playing Generated Level](#interacting-with-levels) | Generated Level
:-------------------------:|:-------------------------:
![alt text](static/example_interactive.gif) | ![alt text](static/test_level.png)
How does it work?
----

Architecture | Example Prompt Generations
:-------------------------:|:-------------------------:
![alt text](static/architecture.png) | ![alt text](static/prompt-samples.png)

MarioGPT is a fine-tuned GPT2 model (specifically, [distilgpt2](https://huggingface.co/distilgpt2)) trained on a subset of Super Mario Bros and Super Mario Bros: The Lost Levels levels, provided by [The Video Game Level Corpus](https://github.com/TheVGLC/TheVGLC). MarioGPT is able to generate levels guided by a simple text prompt. This generation is not perfect, but we believe it is a great first step toward more controllable and diverse level / environment generation.

Forward generation:

![alt text](static/timelapse_0.gif)

Requirements
----
- python3.8+

Installation
---------------
from pypi
```
pip install mario-gpt
```
or from source
```
git clone git@github.com:shyamsn97/mario-gpt.git
cd mario-gpt
python setup.py install
```

Generating Levels
-------------
Since our models are built off of the amazing [transformers](https://github.com/huggingface/transformers) library, we host our model at https://huggingface.co/shyamsn97/Mario-GPT2-700-context-length

This code snippet is the minimal code you need to generate a mario level!
```python
from mario_gpt import MarioLM, SampleOutput

# pretrained_model = shyamsn97/Mario-GPT2-700-context-length
mario_lm = MarioLM()

# use cuda to speed stuff up
# import torch
# device = torch.device('cuda')
# mario_lm = mario_lm.to(device)

prompts = ["many pipes, many enemies, some blocks, high elevation"]

# generate level of size 1400, pump temperature up to ~2.4 for more stochastic but playable levels
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=1400,
    temperature=2.0,
    use_tqdm=True
)

# show string list
generated_level.level

# show PIL image
generated_level.img

# save image
generated_level.img.save("generated_level.png")

# save text level to file
generated_level.save("generated_level.txt")

# play in interactive
generated_level.play()

# run Astar agent
generated_level.run_astar()

# continue generation
generated_level_continued = mario_lm.sample(
    seed=generated_level,
    prompts=prompts,
    num_steps=1400,
    temperature=2.0,
    use_tqdm=True
)

# load from text file
loaded_level = SampleOutput.load("generated_level.txt")

# play from loaded (should be the same level that we generated)
loaded_level.play()
...
```

##### See [notebook](notebooks/Sampling.ipynb) for a more in-depth tutorial on generating levels

Interacting with Levels
-------------
Right now there are two ways to interact with generated levels:

1. [Huggingface demo](https://huggingface.co/spaces/multimodalart/mariogpt) -- Thanks to the amazing work by [multimodalart](https://github.com/multimodalart), you can generate and play levels interactively in the browser! In addition, GPUs are provided so you don't have to own one yourself.
2. Using the [play and astar methods](mario_gpt/simulator/simulator.py). These require you to have Java installed on your computer (Java 8+ tested). For interactive play, use the `play()` method; for the A* agent, use the `run_astar()` method.
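The snippet above passes `temperature=2.0` to `sample` and notes that pushing it toward ~2.4 yields more stochastic levels. The sketch below illustrates why, using the standard temperature-scaled softmax over next-token logits; it is generic sampling math, not code taken from the MarioGPT implementation.

```python
# Sketch of temperature scaling over next-token logits.
# This is the standard technique, shown here for intuition only;
# it is not MarioGPT's internal sampling code.
import math
from typing import List

def softmax_with_temperature(logits: List[float], temperature: float) -> List[float]:
    """Higher temperature flattens the distribution -> more stochastic output."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for three candidate tile tokens:
logits = [2.0, 1.0, 0.1]
low = softmax_with_temperature(logits, 0.5)   # peaked: near-greedy choices
high = softmax_with_temperature(logits, 2.4)  # flat: more diverse choices

# The top token dominates less at high temperature:
print(round(low[0], 3), round(high[0], 3))
```

At low temperature the model almost always picks its single favorite tile, which tends to produce repetitive levels; at high temperature the probability mass spreads out, trading consistency for diversity.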
Example:

```python
from mario_gpt import MarioLM

mario_lm = MarioLM()
prompts = ["many pipes, many enemies, some blocks, high elevation"]
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=1400,
    temperature=2.0,
    use_tqdm=True
)

# play in interactive
generated_level.play()

# run Astar agent
generated_level.run_astar()
```

## Future Plans
Here's a list of some stuff that will be added to the codebase!

- [x] Basic inference code
- [x] Add MarioBert Model
- [x] Add Interactive simulator
- [ ] Inpainting functionality from paper
- [ ] Open-ended level generation code
- [ ] Training code from paper
- [ ] Different generation methods (e.g. constrained beam search, etc.)

Authors
-------
Shyam Sudhakaran, Miguel González-Duque, Claire Glanois, Matthias Freiberger, Elias Najarro, Sebastian Risi

Citation
------
If you use the code for academic or commercial use, please cite the associated paper:

```
@misc{https://doi.org/10.48550/arxiv.2302.05981,
  doi = {10.48550/ARXIV.2302.05981},
  url = {https://arxiv.org/abs/2302.05981},
  author = {Sudhakaran, Shyam and González-Duque, Miguel and Glanois, Claire and Freiberger, Matthias and Najarro, Elias and Risi, Sebastian},
  keywords = {Artificial Intelligence (cs.AI), Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {MarioGPT: Open-Ended Text2Level Generation through Large Language Models},
  publisher = {arXiv},
  year = {2023},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```
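The `generated_level.level` attribute used in the examples above returns the level as a list of strings, one character per tile. A quick way to eyeball whether a prompt like "many pipes, many enemies" was respected is to count tile characters. The tile alphabet below is an assumption based on common VGLC conventions; the exact characters are defined by the corpus, so treat this as a hypothetical helper, not part of the mario-gpt API.

```python
# Hypothetical helper: count tile types in a text level to eyeball
# prompt adherence (e.g. "many pipes, many enemies").
# Tile characters are ASSUMED from common VGLC conventions and may
# differ from the corpus actually used to train MarioGPT.
from collections import Counter
from typing import Dict, List

ASSUMED_TILES: Dict[str, str] = {
    "E": "enemy",
    "<": "pipe (top-left)",
    ">": "pipe (top-right)",
    "[": "pipe (body-left)",
    "]": "pipe (body-right)",
    "?": "question block",
    "S": "brick block",
    "X": "ground",
}

def tile_counts(level_rows: List[str]) -> Counter:
    """Count every known tile character across all rows of a text level."""
    counts: Counter = Counter()
    for row in level_rows:
        for ch in row:
            if ch in ASSUMED_TILES:
                counts[ASSUMED_TILES[ch]] += 1
    return counts

# Toy 4-row level for illustration, not a real MarioGPT sample:
toy_level = [
    "----------",
    "--E---<>--",
    "------[]--",
    "XXXXXXXXXX",
]
counts = tile_counts(toy_level)
print(counts["enemy"])  # 1
```

A real check would run this on `generated_level.level` and compare counts across prompts such as "no enemies" versus "many enemies".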

Owner

  • Login: alixunxing
  • Kind: user
