Science Score: 75.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ✓ DOI references: Found 2 DOI reference(s) in README
- ✓ Academic publication links: Links to sciencedirect.com, zenodo.org
- ○ Academic email domains
- ✓ Institutional organization owner: Organization eurac-eebgroup has institutional domain (www.eurac.edu)
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (15.6%) to scientific vocabulary
Keywords
Repository
brickllm with langgraph
Basic Info
- Host: GitHub
- Owner: EURAC-EEBgroup
- License: bsd-3-clause
- Language: Python
- Default Branch: main
- Homepage: https://eurac-eebgroup.github.io/brick-llm/
- Size: 2.99 MB
Statistics
- Stars: 21
- Watchers: 4
- Forks: 5
- Open Issues: 3
- Releases: 11
Topics
Metadata Files
README.md
🧱 BrickLLM
BrickLLM is a Python library for generating RDF files following the BrickSchema ontology using Large Language Models (LLMs).
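For readers unfamiliar with Brick, the kind of output BrickLLM targets looks like the following Turtle fragment. This is a hand-written illustration using standard Brick classes and relationships, not actual model output; the `bldg:` entity names and IRI are made up for the example:

```turtle
@prefix bldg: <urn:example#> .
@prefix brick: <https://brickschema.org/schema/Brick#> .

bldg:building_1 a brick:Building .
bldg:floor_1 a brick:Floor ;
    brick:isPartOf bldg:building_1 .
bldg:room_1 a brick:Room ;
    brick:isPartOf bldg:floor_1 .
bldg:temp_sensor_1 a brick:Temperature_Sensor ;
    brick:isPointOf bldg:room_1 .
```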
🧰 Features
- Generate BrickSchema-compliant RDF files from natural language descriptions of buildings and facilities
- Support for multiple LLM providers (OpenAI, Anthropic, Fireworks)
- Customizable graph execution with LangGraph
- Easy-to-use API for integrating with existing projects
💻 Installation
You can install BrickLLM using pip:
```bash
pip install brickllm
```
Development Installation
[Poetry](https://python-poetry.org/) is used for dependency management during development. To install BrickLLM for contributing, follow these steps:

```bash
# Clone the repository
git clone https://github.com/EURAC-EEBgroup/brickllm-lib.git
cd brick-llm

# Create a virtual environment
python -m venv .venv

# Activate the virtual environment
source .venv/bin/activate  # Linux/Mac
.venv\Scripts\activate     # Windows

# Install Poetry and dependencies
pip install poetry
poetry install

# Install pre-commit hooks
poetry run pre-commit install
```

🚀 Quick Start
Here's a simple example of how to use BrickLLM:
[!NOTE] You must first create a `.env` file containing the API keys for the chosen LLM provider (unless you run a local model) and load them into the environment.
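As a sketch of what that setup step involves, here is a minimal stdlib-only `.env` loader. In practice the python-dotenv package's `load_dotenv()` is the usual choice; `load_env` is a hypothetical helper name written here only to illustrate what loading a `.env` file into the environment means:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Illustrative .env loader: exports KEY=VALUE lines into os.environ.
    python-dotenv's load_dotenv() does this more robustly."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        # Skip blank lines and comments; keep only KEY=VALUE pairs
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            # setdefault: do not overwrite variables already in the environment
            os.environ.setdefault(key.strip(), value.strip())
```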
```python
from brickllm.graphs import BrickSchemaGraph

building_description = """
I have a building located in Bolzano.
It has 3 floors and each floor has 1 office.
There are 2 rooms in each office and each room has three sensors:
- Temperature sensor;
- Humidity sensor;
- CO sensor.
"""

# Create an instance of BrickSchemaGraph with a predefined provider
brick_graph = BrickSchemaGraph(model="openai")

# Display the graph structure
brick_graph.display()

# Prepare input data
input_data = {"user_prompt": building_description}

# Run the graph
result = brick_graph.run(input_data=input_data, stream=False)

# Print the result
print(result)

# Save the result to a file
brick_graph.save_ttl_output("my_building.ttl")
```
Using Custom LLM Models
BrickLLM supports using custom LLM models. Here's an example using OpenAI's GPT-4o:

```python
from brickllm.graphs import BrickSchemaGraph
from langchain_openai import ChatOpenAI

custom_model = ChatOpenAI(temperature=0, model="gpt-4o")
brick_graph = BrickSchemaGraph(model=custom_model)

# Prepare input data
input_data = {"user_prompt": building_description}

# Run the graph with the custom model
result = brick_graph.run(input_data=input_data, stream=False)
```

Using Local LLM Models
BrickLLM supports local LLM models via the Ollama framework. Currently, only our finetuned model is supported.
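Whichever setup option you follow, it helps to confirm the local server is actually reachable before pointing BrickLLM at it. The sketch below does that with the standard library; `ollama_alive` is a hypothetical helper, and the default URL and the "Ollama is running" banner are Ollama's defaults, not part of the BrickLLM API:

```python
from urllib.request import urlopen
from urllib.error import URLError

def ollama_alive(url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama server answers with its status banner."""
    try:
        with urlopen(url, timeout=2) as resp:
            # Ollama's root endpoint replies with a plain-text banner
            return resp.read().decode().strip() == "Ollama is running"
    except (URLError, OSError, ValueError):
        # Connection refused, timeout, or malformed URL: treat as not running
        return False
```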
### Option 1: Using Docker Compose

You can easily set up and run the Ollama environment using Docker Compose. The finetuned model file will be automatically downloaded inside the container. Follow these steps:

1. Clone the repository and navigate to the `finetuned` directory containing the `Dockerfile` and `docker-compose.yml`.
2. Run the following command to build and start the container:
```bash
docker-compose up --build -d
```
3. Verify that the container is running on localhost:11434:
```bash
docker ps
```
If the output looks like this:
```
CONTAINER ID   IMAGE                         COMMAND            CREATED          STATUS          PORTS       NAMES
1e9bff7c2f7b   finetuned-ollama-llm:latest   "/entrypoint.sh"   42 minutes ago   Up 42 minutes   11434/tcp   compassionate_wing
```
run the image again with the port published:
```bash
docker run -d -p 11434:11434 finetuned-ollama-llm:latest
docker ps
```
The output should now look like:
```
CONTAINER ID   IMAGE                         COMMAND            CREATED         STATUS         PORTS                      NAMES
df8b31d4ed86   finetuned-ollama-llm:latest   "/entrypoint.sh"   7 seconds ago   Up 7 seconds   0.0.0.0:11434->11434/tcp   eloquent_jennings
```
Check that Ollama is running on port 11434:
```bash
curl http://localhost:11434
```
The result should be:
```
Ollama is running
```
This will download the model file, create the model in Ollama, and serve it on port `11434`. The necessary directories will be created automatically.

### Option 2: Manual Setup

If you prefer to set up the model manually, follow these steps:

1. Download the `.gguf` file from here.
2. Create a file named `Modelfile` with the following content:
```bash
FROM ./unsloth.Q4_K_M.gguf
```
3. Place the downloaded `.gguf` file in the same folder as the `Modelfile`.
4. Ensure Ollama is running on your system.
5. Run the following command to create the model in Ollama:
```bash
ollama create llama3.1:8b-brick-v8 -f Modelfile
```

Once you've set up the model in Ollama, you can use it in your code as follows (the `instructions` prompt is cut off in the source at this point):

```python
from brickllm.graphs import BrickSchemaGraphLocal

instructions = """
Your job is to generate a RDF graph in Turtle format from a description of
energy systems and sensors of a building in the following input, using the
Brick ontology.
### Instructions:
- Each subject, object of predicate must start with a @prefix.
- Use the prefix bldg: with IRI
```

📖 Documentation
For more detailed information on how to use BrickLLM, please refer to our documentation.
▶️ Web Application
A web app exposing the library through a graphical interface is available at the following link (). The application can also be run locally as described in the dedicated BrickLLM App repository.
Note: The tool is currently being deployed on our servers and on the MODERATE platform. It will be online shortly!
🤝 Contributing
We welcome contributions to BrickLLM! Please see our contributing guidelines for more information.
📜 License
BrickLLM is released under the BSD-3-Clause License. See the LICENSE file for details.
📧 Contact
For any questions or support, please contact:
- Marco Perini marco.perini@eurac.edu
- Daniele Antonucci daniele.antonucci@eurac.edu
- Rocco Giudice rocco.giudice@polito.it
📝 Citation
SoftwareX paper: here.
Please cite us if you use the library.
💙 Acknowledgements
This work was carried out within European projects:

- Moderate: Horizon Europe research and innovation programme under grant agreement No 101069834, with the aim of contributing to the development of open products useful for defining plausible scenarios for the decarbonization of the built environment.

BrickLLM is developed and maintained by the Energy Efficiency in Buildings group at EURAC Research. Thanks to the contribution of:
- Moderate project: Horizon Europe research and innovation programme under grant agreement No 101069834
- Politecnico di Torino, in particular Rocco Giudice, Marco Savino Piscitelli and Alfonso Capozzoli from BAEDALab.
Thank you to Brick for the great work it is doing.
Owner
- Name: Energy Efficient Buildings @EURAC
- Login: EURAC-EEBgroup
- Kind: organization
- Location: Bolzano - Bozen
This research group focuses on energy flexible buildings and building clusters.
Citation (CITATION.cff)
```yaml
cff-version: 1.1.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Marco"
    given-names: "Perini"
    orcid: "https://orcid.org/0009-0008-6620-829X"
  - family-names: "Daniele"
    given-names: "Antonucci"
    orcid: "https://orcid.org/0000-0002-4736-0711"
  - family-names: "Rocco"
    given-names: "Giudice"
    orcid: "https://orcid.org/0009-0009-4013-4373"
title: "EURAC-EEBgroup/brick-llm"
version: v1.1.1
doi: 10.5281/zenodo.14039358
date-released: 2024-11-05
url: "https://github.com/EURAC-EEBgroup/brick-llm"
```
GitHub Events
Total
- Create event: 12
- Issues event: 8
- Release event: 10
- Watch event: 22
- Issue comment event: 18
- Push event: 48
- Pull request review event: 5
- Pull request event: 12
- Fork event: 4
Last Year
- Create event: 12
- Issues event: 8
- Release event: 10
- Watch event: 22
- Issue comment event: 18
- Push event: 48
- Pull request review event: 5
- Pull request event: 12
- Fork event: 4
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 4
- Total pull requests: 4
- Average time to close issues: about 2 months
- Average time to close pull requests: 3 days
- Total issue authors: 3
- Total pull request authors: 2
- Average comments per issue: 1.0
- Average comments per pull request: 0.0
- Merged pull requests: 3
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 4
- Pull requests: 4
- Average time to close issues: about 2 months
- Average time to close pull requests: 3 days
- Issue authors: 3
- Pull request authors: 2
- Average comments per issue: 1.0
- Average comments per pull request: 0.0
- Merged pull requests: 3
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- PeriniM (2)
- bbartling (2)
- jainmilan (1)
Pull Request Authors
- PeriniM (4)
- Giudice7 (2)