ollamar
ollamar: An R package for running large language models - Published in JOSS (2025)
Science Score: 93.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found)
- ✓ .zenodo.json file (found)
- ✓ DOI references (found 7 DOI reference(s) in README and JOSS metadata)
- ✓ Academic publication links (links to joss.theoj.org)
- ○ Committers with academic emails
- ○ Institutional organization owner
- ✓ JOSS paper metadata (published in Journal of Open Source Software)
Keywords
ai
api
llm
llms
ollama
ollama-api
r
Last synced: 6 months ago
Repository
R library to run Ollama language models
Basic Info
- Host: GitHub
- Owner: hauselin
- License: other
- Language: R
- Default Branch: main
- Homepage: https://hauselin.github.io/ollama-r/
- Size: 4.09 MB
Statistics
- Stars: 108
- Watchers: 4
- Forks: 15
- Open Issues: 0
- Releases: 3
Topics
ai
api
llm
llms
ollama
ollama-api
r
Created almost 2 years ago · Last pushed 7 months ago
Metadata Files
Readme
Changelog
Contributing
License
Code of conduct
README.Rmd
---
output: github_document
---
```{r, include = FALSE}
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
fig.path = "man/figures/README-",
out.width = "100%"
)
```
# ollamar
[CRAN status](https://CRAN.R-project.org/package=ollamar)
[R-CMD-check](https://github.com/hauselin/ollama-r/actions/workflows/R-CMD-check.yaml)
[CRAN downloads](https://cran.r-project.org/package=ollamar)
[DOI](https://doi.org/10.21105/joss.07211)
The [Ollama R library](https://hauselin.github.io/ollama-r/) is the easiest way to integrate R with [Ollama](https://ollama.com/), which lets you run language models locally on your own machine.
The library also makes it easy to work with data structures (e.g., conversational/chat histories) that are standard for different LLMs (such as those provided by OpenAI and Anthropic). It also lets you specify different output formats (e.g., dataframes, text/vector, lists) that best suit your needs, allowing easy integration with other libraries/tools and parallelization via the `httr2` library.
To use this R library, ensure the [Ollama](https://ollama.com) app is installed. Ollama can use GPUs for accelerating LLM inference. See [Ollama GPU documentation](https://github.com/ollama/ollama/blob/main/docs/gpu.md) for more information.
See [Ollama's GitHub page](https://github.com/ollama/ollama) for more information. This library uses the [Ollama REST API](https://github.com/ollama/ollama/blob/main/docs/api.md) (see the documentation for details) and was last tested on Ollama v0.5.4.
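Under the hood, calls like `generate()` are HTTP POSTs to this REST API on the local server. As a rough sketch of the request that `ollamar` builds for you (assuming the server is running on the default port 11434; field names follow the Ollama API documentation), the same call can be made directly with `httr2`:

```{r eval=FALSE}
library(httr2)

# POST to the /api/generate endpoint on the default local port
resp <- request("http://127.0.0.1:11434/api/generate") |>
  req_body_json(list(
    model = "llama3.1",
    prompt = "tell me a 5-word story",
    stream = FALSE  # ask for a single JSON object instead of a stream
  )) |>
  req_perform()

# the generated text is in the "response" field of the JSON body
resp_body_json(resp)$response
```

In practice you rarely need to do this yourself; it is shown only to make the client's behavior transparent.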
> Note: You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
## Ollama R vs Ollama Python/JS
This library has been inspired by the official [Ollama Python](https://github.com/ollama/ollama-python) and [Ollama JavaScript](https://github.com/ollama/ollama-js) libraries. If you're coming from Python or JavaScript, you should feel right at home. Alternatively, if you plan to use Ollama with Python or JavaScript, using this R library will help you understand the Python/JavaScript libraries as well.
## Installation
1. Download and install the [Ollama](https://ollama.com) app.
- [macOS](https://ollama.com/download/Ollama-darwin.zip)
- [Windows preview](https://ollama.com/download/OllamaSetup.exe)
- Linux: `curl -fsSL https://ollama.com/install.sh | sh`
- [Docker image](https://hub.docker.com/r/ollama/ollama)
2. Open/launch the Ollama app to start the local server.
3. Install either the stable or latest/development version of `ollamar`.
Stable version:
```{r eval=FALSE}
install.packages("ollamar")
```
For the latest/development version with more features and bug fixes (see the latest changes [here](https://hauselin.github.io/ollama-r/news/index.html)), you can install it from GitHub using the `install_github` function from the `remotes` library. If it doesn't work or you don't have the `remotes` library, run `install.packages("remotes")` in R or RStudio before running the code below.
```{r eval=FALSE}
# install.packages("remotes") # run this line if you don't have the remotes library
remotes::install_github("hauselin/ollamar")
```
## Example usage
Below is a basic demonstration of how to use the library. For details, see the [getting started vignette](https://hauselin.github.io/ollama-r/articles/ollamar.html) on our [main page](https://hauselin.github.io/ollama-r/).
`ollamar` uses the [`httr2` library](https://httr2.r-lib.org/index.html) to make HTTP requests to the Ollama server, so many functions in this library return an `httr2_response` object by default. If the response object says `Status: 200 OK`, the request was successful.
```{r eval=FALSE}
library(ollamar)
test_connection() # test connection to Ollama server
# if you see "Ollama local server not running or wrong server," Ollama app/server isn't running
# download a model
pull("llama3.1") # download a model (equivalent bash code: ollama run llama3.1)
# generate a response/text based on a prompt; returns an httr2 response by default
resp <- generate("llama3.1", "tell me a 5-word story")
resp
# the printed httr2 response object looks like this:
#> POST http://127.0.0.1:11434/api/generate  # endpoint
#> Status: 200 OK  # if successful, the status code should be 200 OK
#> Content-Type: application/json
#> Body: In memory (414 bytes)
# get just the text from the response object
resp_process(resp, "text")
# get the text as a tibble dataframe
resp_process(resp, "df")
# alternatively, specify the output type when calling the function initially
txt <- generate("llama3.1", "tell me a 5-word story", output = "text")
# list available models (models you've pulled/downloaded)
list_models()
#>              name   size parameter_size quantization_level            modified
#> 1    codegemma:7b   5 GB             9B               Q4_0 2024-07-27T23:44:10
#> 2 llama3.1:latest 4.7 GB           8.0B               Q4_0 2024-07-31T07:44:33
```
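For multi-turn conversations, the package also provides a `chat()` function that takes a list of role/content messages, the same chat-history structure used by the OpenAI and Anthropic APIs. A minimal sketch (treat the exact arguments as an assumption; see the package reference for the full signature):

```{r eval=FALSE}
library(ollamar)

# a chat history is a list of messages, each with a role and content
messages <- list(
  list(role = "system", content = "You are a terse assistant."),
  list(role = "user", content = "Why is the sky blue?")
)

# send the whole history to the model; output = "text" returns just the reply
chat("llama3.1", messages, output = "text")
```

Because the history is a plain list, you can append the model's reply and the next user turn to it and call `chat()` again to continue the conversation.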
## Citing `ollamar`
If you use this library, please cite [this paper](https://joss.theoj.org/papers/10.21105/joss.07211) using the following BibTeX entry:
```bibtex
@article{Lin2025JOSS,
author = {Lin, Hause and Safi, Tawab},
title = {ollamar: An R package for running large language models},
journal = {Journal of Open Source Software},
volume = {10},
number = {105},
pages = {7211},
year = {2025},
month = jan,
doi = {10.21105/joss.07211},
url = {https://joss.theoj.org/papers/10.21105/joss.07211}
}
```
Owner
- Name: Hause Lin
- Login: hauselin
- Kind: user
- Website: hauselin.com
- Twitter: hauselin
- Repositories: 182
- Profile: https://github.com/hauselin
Researcher at MIT & World Bank
JOSS Publication
ollamar: An R package for running large language models
Published
January 24, 2025
Volume 10, Issue 105, Page 7211
Authors
Tags
large language models
Ollama
natural language processing
artificial intelligence
GitHub Events
Total
- Create event: 1
- Release event: 1
- Issues event: 7
- Watch event: 59
- Issue comment event: 12
- Push event: 34
- Fork event: 5
Last Year
- Create event: 1
- Release event: 1
- Issues event: 7
- Watch event: 59
- Issue comment event: 12
- Push event: 34
- Fork event: 5
Committers
Last synced: 7 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Hause Lin | h****n@g****m | 165 |
| Tawab Safi | a****t@g****m | 9 |
| Kenneth Enevoldsen | k****n@g****m | 2 |
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 29
- Total pull requests: 3
- Average time to close issues: 5 days
- Average time to close pull requests: 1 day
- Total issue authors: 9
- Total pull request authors: 2
- Average comments per issue: 0.9
- Average comments per pull request: 1.0
- Merged pull requests: 3
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 25
- Pull requests: 2
- Average time to close issues: 5 days
- Average time to close pull requests: about 9 hours
- Issue authors: 7
- Pull request authors: 1
- Average comments per issue: 0.8
- Average comments per pull request: 1.5
- Merged pull requests: 2
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- hauselin (20)
- asaficontact (2)
- dhicks (1)
- otoomet (1)
- KC0120 (1)
- keshovsharma (1)
- wenbopeng (1)
- xiaoluolorn (1)
- KennethEnevoldsen (1)
- slackmodel (1)
Pull Request Authors
- KennethEnevoldsen (4)
- asaficontact (2)
Top Labels
Issue Labels
enhancement (3)
documentation (1)
Pull Request Labels
Packages
- Total packages: 2
- Total downloads: cran, 1,048 last month
- Total dependent packages: 0 (may contain duplicates)
- Total dependent repositories: 0 (may contain duplicates)
- Total versions: 7
- Total maintainers: 1
proxy.golang.org: github.com/hauselin/ollama-r
- Documentation: https://pkg.go.dev/github.com/hauselin/ollama-r#section-documentation
- License: other
- Latest release: v1.2.2 (published about 1 year ago)
Rankings
- Dependent packages count: 6.2%
- Average: 6.4%
- Dependent repos count: 6.7%
Last synced: 6 months ago
cran.r-project.org: ollamar
'Ollama' Language Models
- Homepage: https://hauselin.github.io/ollama-r/
- Documentation: http://cran.r-project.org/web/packages/ollamar/ollamar.pdf
- License: MIT + file LICENSE
- Latest release: 1.2.2 (published about 1 year ago)
Rankings
- Dependent packages count: 27.8%
- Dependent repos count: 35.7%
- Average: 49.5%
- Downloads: 84.9%
Maintainers (1)
Last synced: 6 months ago
Dependencies
DESCRIPTION
cran
- glue * imports
- httr2 * imports
- jsonlite * imports
- tibble * imports
- testthat >= 3.0.0 suggests
.github/workflows/R-CMD-check.yaml
actions
- actions/checkout v4 composite
- r-lib/actions/check-r-package v2 composite
- r-lib/actions/setup-pandoc v2 composite
- r-lib/actions/setup-r v2 composite
- r-lib/actions/setup-r-dependencies v2 composite
.github/workflows/pkgdown.yaml
actions
- JamesIves/github-pages-deploy-action v4.5.0 composite
- actions/checkout v4 composite
- r-lib/actions/setup-pandoc v2 composite
- r-lib/actions/setup-r v2 composite
- r-lib/actions/setup-r-dependencies v2 composite
