scoopika-core
The core of Scoopika, an ecosystem for building controllable AI-powered applications
Science Score: 18.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (13.7%) to scientific vocabulary
Keywords
Repository
The core of Scoopika, an ecosystem for building controllable AI-powered applications
Basic Info
- Host: GitHub
- Owner: Scoopika
- License: apache-2.0
- Language: Python
- Default Branch: main
- Homepage: https://scoopika.vercel.app/
- Size: 140 KB
Statistics
- Stars: 1
- Watchers: 0
- Forks: 1
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
🦄 Scoopika: Core
Build controllable AI-powered applications & agents.
NOTICE: This work is still under development.
NOTICE: This repository contains the core functionality of Scoopika. It is not intended for application-level use, but for developers who want to build new open-source projects on top of it. If you're interested in using Scoopika in your application, check out scoopika-py for Python or scoopika-js for TypeScript.
What is Scoopika?
Scoopika is an ecosystem of tools for building controllable and predictable AI-powered applications around LLMs: applications that work with function calling and agents.
Some of Scoopika's features:
- Build context-aware applications that enable users to interact with their data in natural language.
- Define rules and validation steps for the function-calling process, so your functions never again receive inputs they don't expect (a minimal sketch of this idea follows the list).
- Create agents that perform custom tasks easily, in minutes.
- Vector stores that provide history to the other parts of Scoopika and work with any vector database you use.
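To make the validation point above concrete, the usual pattern is to check LLM-proposed arguments against a schema before the target function ever runs. The sketch below illustrates that general idea with pydantic (one of this repository's dependencies); it is not Scoopika's actual API, and the `get_weather` tool and its argument schema are made up for the example.
```python
from pydantic import BaseModel, ValidationError


class GetWeatherArgs(BaseModel):
    """Schema that any LLM-proposed arguments must satisfy."""
    city: str
    unit: str = "celsius"


def get_weather(args: GetWeatherArgs) -> str:
    # The function body only ever sees validated, typed arguments.
    return f"Fetching weather for {args.city} in {args.unit}"


def call_tool_safely(raw_args: dict) -> str:
    """Validate raw LLM output before invoking the tool."""
    try:
        args = GetWeatherArgs(**raw_args)
    except ValidationError as err:
        # In a real pipeline this error could be fed back to the LLM for a retry.
        return f"Rejected tool call: {err}"
    return get_weather(args)


print(call_tool_safely({"city": "Berlin"}))   # accepted
print(call_tool_safely({"place": "Berlin"}))  # rejected: required field 'city' is missing
```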
Run locally
To run the core locally, first clone this repository and then install the requirements:
```bash
pip install -r requirements.txt
```
Now you can start using the core. Its main classes are ToolSelection and ArgumentsSelection, and it also provides the prompts.dynamic function for creating custom LLM prompts.
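The actual signatures of these classes and of prompts.dynamic aren't documented in this README, so the snippet below is only a hypothetical sketch of the kind of thing a dynamic prompt helper does: assemble a tool-selection prompt from whatever tools are currently registered. The function name dynamic_prompt, its parameters, and the tool dictionaries are assumptions for illustration, not the core's real interface.
```python
# Hypothetical illustration of a dynamic prompt builder for tool selection;
# not the real scoopika-core interface, whose signatures are not documented here.
from typing import Dict, List


def dynamic_prompt(task: str, tools: List[Dict[str, str]]) -> str:
    """Build an LLM prompt that lists the tools available for a given task."""
    tool_lines = "\n".join(f"- {t['name']}: {t['description']}" for t in tools)
    return (
        "You are an assistant that must pick exactly one tool for the task.\n"
        f"Task: {task}\n"
        f"Available tools:\n{tool_lines}\n"
        "Respond with the tool name only."
    )


tools = [
    {"name": "get_weather", "description": "Look up the weather for a city"},
    {"name": "search_docs", "description": "Search internal documentation"},
]
print(dynamic_prompt("What's the weather in Berlin?", tools))
```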
Again, the core is not intended for application use; you can study how it works and figure out how to build on top of it, or configure it for your own use case.
Scoopika's main functionality can be found in the scoopika-server repository.
Acknowledgement
We couldn't have built this project without all the effort the LangChain team has put into their great open-source project; much of the functionality in this core is built on top of their framework, and we appreciate it (we are not affiliated with LangChain).
For all license notices, we recommend checking the NOTICE file.
If you're the author of a work that this core depends on and you want us to add your name, citation, or license to the notice, please contact me at: kais.radwan.personal@gmail.com
Owner
- Name: scoopika
- Login: Scoopika
- Kind: organization
- Repositories: 1
- Profile: https://github.com/Scoopika
Controllable AI function-calling
Citation (CITATIONS.md)
```
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Chase"
    given-names: "Harrison"
title: "LangChain"
date-released: 2022-10-17
url: "https://github.com/langchain-ai/langchain"
```
```
cff-version: 1.2.0
preferred-citation:
  type: article
  message: "If you use spaCy, please cite it as below."
  authors:
    - family-names: "Honnibal"
      given-names: "Matthew"
    - family-names: "Montani"
      given-names: "Ines"
    - family-names: "Van Landeghem"
      given-names: "Sofie"
    - family-names: "Boyd"
      given-names: "Adriane"
  title: "spaCy: Industrial-strength Natural Language Processing in Python"
  doi: "10.5281/zenodo.1212303"
  year: 2020
```
```
cff-version: "1.2.0"
date-released: 2020-10
message: "If you use this software, please cite it using these metadata."
title: "Transformers: State-of-the-Art Natural Language Processing"
url: "https://github.com/huggingface/transformers"
authors:
- family-names: Wolf
given-names: Thomas
- family-names: Debut
given-names: Lysandre
- family-names: Sanh
given-names: Victor
- family-names: Chaumond
given-names: Julien
- family-names: Delangue
given-names: Clement
- family-names: Moi
given-names: Anthony
- family-names: Cistac
given-names: Perric
- family-names: Ma
given-names: Clara
- family-names: Jernite
given-names: Yacine
- family-names: Plu
given-names: Julien
- family-names: Xu
given-names: Canwen
- family-names: "Le Scao"
given-names: Teven
- family-names: Gugger
given-names: Sylvain
- family-names: Drame
given-names: Mariama
- family-names: Lhoest
given-names: Quentin
- family-names: Rush
given-names: "Alexander M."
preferred-citation:
type: conference-paper
authors:
- family-names: Wolf
given-names: Thomas
- family-names: Debut
given-names: Lysandre
- family-names: Sanh
given-names: Victor
- family-names: Chaumond
given-names: Julien
- family-names: Delangue
given-names: Clement
- family-names: Moi
given-names: Anthony
- family-names: Cistac
given-names: Perric
- family-names: Ma
given-names: Clara
- family-names: Jernite
given-names: Yacine
- family-names: Plu
given-names: Julien
- family-names: Xu
given-names: Canwen
- family-names: "Le Scao"
given-names: Teven
- family-names: Gugger
given-names: Sylvain
- family-names: Drame
given-names: Mariama
- family-names: Lhoest
given-names: Quentin
- family-names: Rush
given-names: "Alexander M."
booktitle: "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations"
month: 10
start: 38
end: 45
title: "Transformers: State-of-the-Art Natural Language Processing"
year: 2020
publisher: "Association for Computational Linguistics"
url: "https://www.aclweb.org/anthology/2020.emnlp-demos.6"
address: "Online"
```
GitHub Events
Dependencies
- Jinja2 ==3.1.3
- MarkupSafe ==2.1.5
- PyYAML ==6.0.1
- Pygments ==2.17.2
- SQLAlchemy ==2.0.28
- aiohttp ==3.9.3
- aiosignal ==1.3.1
- anyio ==4.3.0
- attrs ==23.2.0
- black ==24.3.0
- blis ==0.7.11
- catalogue ==2.0.10
- certifi ==2024.2.2
- charset-normalizer ==3.3.2
- click ==8.1.7
- cloudpathlib ==0.16.0
- confection ==0.1.4
- cymem ==2.0.8
- dataclasses-json ==0.6.4
- distro ==1.9.0
- filelock ==3.13.1
- frozenlist ==1.4.1
- fsspec ==2024.3.1
- greenlet ==3.0.3
- h11 ==0.14.0
- httpcore ==1.0.4
- httpx ==0.27.0
- huggingface-hub ==0.21.4
- idna ==3.6
- jsonpatch ==1.33
- jsonpointer ==2.4
- kor ==1.0.1
- langchain ==0.1.12
- langchain-community ==0.0.28
- langchain-core ==0.1.32
- langchain-openai ==0.0.8
- langchain-text-splitters ==0.0.1
- langcodes ==3.3.0
- langsmith ==0.1.27
- markdown-it-py ==3.0.0
- marshmallow ==3.21.1
- mdurl ==0.1.2
- mpmath ==1.3.0
- multidict ==6.0.5
- murmurhash ==1.0.10
- mypy-extensions ==1.0.0
- networkx ==3.2.1
- numpy ==1.26.4
- nvidia-cublas-cu12 ==12.1.3.1
- nvidia-cuda-cupti-cu12 ==12.1.105
- nvidia-cuda-nvrtc-cu12 ==12.1.105
- nvidia-cuda-runtime-cu12 ==12.1.105
- nvidia-cudnn-cu12 ==8.9.2.26
- nvidia-cufft-cu12 ==11.0.2.54
- nvidia-curand-cu12 ==10.3.2.106
- nvidia-cusolver-cu12 ==11.4.5.107
- nvidia-cusparse-cu12 ==12.1.0.106
- nvidia-nccl-cu12 ==2.19.3
- nvidia-nvjitlink-cu12 ==12.4.99
- nvidia-nvtx-cu12 ==12.1.105
- openai ==1.14.1
- orjson ==3.9.15
- packaging ==23.2
- pandas ==1.5.3
- pathspec ==0.12.1
- platformdirs ==4.2.0
- preshed ==3.0.9
- pydantic ==1.10.14
- python-dateutil ==2.9.0.post0
- python-dotenv ==1.0.1
- pytz ==2024.1
- regex ==2023.12.25
- requests ==2.31.0
- rich ==13.7.1
- safetensors ==0.4.2
- six ==1.16.0
- smart-open ==6.4.0
- sniffio ==1.3.1
- spacy ==3.7.4
- spacy-legacy ==3.0.12
- spacy-loggers ==1.0.5
- srsly ==2.4.8
- sympy ==1.12
- tenacity ==8.2.3
- thinc ==8.2.3
- tiktoken ==0.6.0
- tokenizers ==0.15.2
- torch ==2.2.1
- tqdm ==4.66.2
- transformers ==4.38.2
- triton ==2.2.0
- typer ==0.9.0
- typing-inspect ==0.9.0
- typing_extensions ==4.10.0
- urllib3 ==2.2.1
- wasabi ==1.1.2
- weasel ==0.3.4
- yarl ==1.9.4