rag-constitucion-chile
Platform to compare Chile's current constitution with its new proposed constitution using LLMs.
Science Score: 31.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (8.0%) to scientific vocabulary
Keywords
Repository
Basic Info
- Host: GitHub
- Owner: aastroza
- License: MIT
- Language: Python
- Default Branch: main
- Homepage: https://rag-constitucion-chile.streamlit.app/
- Size: 39.3 MB
Statistics
- Stars: 0
- Watchers: 2
- Forks: 1
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
RAG-Constitucion-Chile
Platform to compare Chile's current constitution with its new proposed constitution. This is a proof of concept using retrieval-augmented generation (RAG), where the sources are the articles of each constitution. You can check out a demo at https://rag-constitucion-chile.streamlit.app/. Coming soon to https://discolab.cl/.
Getting Started
Before you run this project, you will need a few prerequisites in place:
✅ OpenAI API Key: To interact with OpenAI's API, you need an API key. You can obtain one by creating an account on OpenAI's website and following their instructions to generate a key; a quick check that the key is picked up is sketched right after this list.
✅ Modal Account: The application is containerized for deployment using the Modal platform. Visit the Modal website to sign up for an account if you don't have one already.
✅ Streamlit Account (Optional): You can run Streamlit apps locally without an account, but a Streamlit account lets you deploy and share your apps, which is useful for demonstrating the project to others. If you wish to use this feature, sign up for a Streamlit account.
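The code in citation.py (shown further down this page) reads the key from the environment via load_dotenv() and os.environ["OPENAI_API_KEY"]. As a minimal sketch, assuming the key lives in a local .env file, you can verify it is picked up before launching anything; the script name and messages below are illustrative and not part of the repository:

```python
# check_key.py -- hypothetical helper script, not part of the repository.
# Confirms OPENAI_API_KEY is reachable the same way citation.py reads it.
import os

from dotenv import load_dotenv

load_dotenv()  # loads variables from a local .env file (e.g. OPENAI_API_KEY=...)

key = os.environ.get("OPENAI_API_KEY")
if key:
    print(f"OPENAI_API_KEY found (ends in ...{key[-4:]})")
else:
    raise SystemExit("OPENAI_API_KEY is not set; add it to .env or export it.")
```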
Installation
conda create --name rag-constitucion-chile -c conda-forge python=3.10
conda activate rag-constitucion-chile
pip install -r requirements.txt
[CLIENT] Streamlit app
streamlit run st_app.py
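The repository's st_app.py is not reproduced on this page. Purely as an illustration of how the functions in citation.py (listed further down) could be wired into a Streamlit client, a minimal app might look like the sketch below; the widget labels and layout are assumptions, not the project's actual st_app.py:

```python
# sketch_app.py -- illustrative only; the real client is st_app.py in the repository.
import streamlit as st

# Importing citation builds both citation query engines at module load time.
from citation import (
    query_engine_vigente,
    query_engine_propuesta,
    get_response,
    get_final_response,
)

st.title("RAG Constitución Chile")
topic = st.text_input("Tema a comparar")

if st.button("Comparar") and topic:
    with st.spinner("Consultando ambas constituciones..."):
        vigente = get_response(query_engine_vigente, topic)
        propuesta = get_response(query_engine_propuesta, topic)
        resumen = get_final_response(topic, vigente["answer"], propuesta["answer"])
    st.subheader("Diferencias")
    st.write(resumen)
    st.subheader("Fuentes")
    st.write("Constitución Actual:", vigente["sources"])
    st.write("Constitución Propuesta:", propuesta["sources"])
```

Note that importing citation triggers index construction for both constitutions, so the first load is slow and requires the OpenAI key to be configured.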
Owner
- Name: Alonso Astroza Tagle
- Login: aastroza
- Kind: user
- Location: Santiago, Chile
- Company: IDS UDD, GeoVictoria
- Website: https://aastroza.github.io/
- Twitter: aastroza
- Repositories: 41
- Profile: https://github.com/aastroza
Data Scientist (15+ yrs exp) & AI enthusiast🌱. Prof at Universidad del Desarrollo, teaching Data Product Dev🎓. Building data products w/ LLMs like ChatGPT🤖.
Citation (citation.py)
import os

from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
from llama_index.query_engine import CitationQueryEngine
from llama_index import (
    VectorStoreIndex,
    SimpleDirectoryReader,
    LLMPredictor,
    ServiceContext,
)
from dotenv import load_dotenv
import openai

load_dotenv()
openai.api_key = os.environ["OPENAI_API_KEY"]


def create_query_engine(documents_path="data/documents/", persist_dir="./citation"):
    """Build a citation-aware query engine over the documents in documents_path."""
    service_context = ServiceContext.from_defaults(
        llm_predictor=LLMPredictor(llm=ChatOpenAI(model_name='gpt-4', temperature=0))
    )
    documents = SimpleDirectoryReader(documents_path).load_data()
    index = VectorStoreIndex.from_documents(documents, service_context=service_context)
    index.storage_context.persist(persist_dir=persist_dir)
    query_engine = CitationQueryEngine.from_args(
        index,
        similarity_top_k=3,
        # here we can control how granular citation sources are, the default is 512
        citation_chunk_size=1024,
    )
    return query_engine


def get_response(query_engine, prompt):
    """Query one constitution and collect the cited chapter/article for each source."""
    response = query_engine.query(prompt)
    sources = []
    for node in response.source_nodes:
        [source, capitulo, articulo] = node.node.get_text().split('\n', 3)[0:3]
        sources.append(f'[{source.replace("Source ", "").replace(":", "")}] {capitulo}, {articulo}\n')
    result = {
        "question": prompt,
        "sources": sources,
        "answer": response,
    }
    return result


def get_final_response(query, response_vigente, response_propuesta):
    """Ask the LLM to summarize the differences between the two constitutions on a topic."""
    template = """
    You are a Constitutional Lawyer. You are asked to give a brief response about
    the differences of two constitutions about this topic: {query}.
    The first constitution is the current one, and the second one is a proposed one.
    Always refer to the first constitution as Constitución Actual and the second one as Constitución Propuesta.
    The first constitution says the following about the topic: {first_response}.
    The second constitution says the following about the topic: {second_response}.
    Please detail the differences between the two constitutions about this topic.
    Please be concise and respond in Spanish.
    """
    prompt = PromptTemplate(template=template, input_variables=["query", "first_response", "second_response"])
    llm_chain = LLMChain(prompt=prompt, llm=ChatOpenAI(model_name='gpt-4', temperature=0))
    final_response = llm_chain.predict(query=query, first_response=response_vigente.response, second_response=response_propuesta.response)
    return final_response


# Build one query engine per constitution (current and proposed).
query_engine_vigente = create_query_engine(documents_path="data/documents", persist_dir="./citation")
query_engine_propuesta = create_query_engine(documents_path="data/documents_propuesta", persist_dir="./citation_propuesta")
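As a usage sketch for the module above (the topic string and variable names below are illustrative assumptions), comparing what each constitution says about a topic could look like this:

```python
# Illustrative usage of the functions defined in citation.py
# (run inside that module, or after `from citation import ...`).
topic = "derecho a la educación"  # hypothetical example query

respuesta_vigente = get_response(query_engine_vigente, topic)
respuesta_propuesta = get_response(query_engine_propuesta, topic)

# get_final_response reads .response from the raw query-engine answers,
# so pass the "answer" objects rather than the whole dicts.
comparacion = get_final_response(
    topic,
    respuesta_vigente["answer"],
    respuesta_propuesta["answer"],
)

print(comparacion)
print(respuesta_vigente["sources"])
print(respuesta_propuesta["sources"])
```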
GitHub Events
Total
Last Year
Committers
Last synced: 12 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Alonso Astroza Tagle | a****a@g****m | 14 |
| Vokturz | v****7@g****m | 3 |
Issues and Pull Requests
Last synced: 12 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- langchain *
- llama-index *
- nltk *
- openai *
- python-dotenv *
- streamlit *