https://github.com/alan-turing-institute/crop
CROP is a Research Observation Platform
Science Score: 10.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ✓ Committers with academic emails: 9 of 12 committers (75.0%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (14.3%) to scientific vocabulary
Keywords
Keywords from Contributors
Repository
Basic Info
Statistics
- Stars: 27
- Watchers: 16
- Forks: 6
- Open Issues: 54
- Releases: 0
Topics
Metadata Files
README.md
CROP 
CROP is a Research Observations Platform designed and created by the Research Engineering Group at the Alan Turing Institute in collaboration with Dr Ruchi Choudhary's research group and Growing Underground.
Summary
The aim of CROP is to prototype a digital twin of the Growing Underground underground farm.
CROP is a cloud-based application which utilizes the flexibility, customization, and evolution that a cloud-native system provides to refine, simplify, and improve the processes and architecture of the system with regard to our research needs.
The digital twin:
* collects heterogeneous IoT sensor data,
* provides 3D visualisation of the underground farm and sensor locations,
* helps to analyse and forecast farm conditions at various points in time.
Key Functionalities
- The CROP web application is the main interface for the digital twin. Users can
  - explore collected heterogeneous IoT sensor data,
  - analyse farm conditions at various points in time,
  - use the interactive 3D visualisation of the farm,
  - forecast future farm conditions using machine learning models built into the platform.
- The CROP database is constantly updated from multiple streams of data: Hyper API, Stark energy usage platform, and others.
- For forecasting, CROP uses two models:
  - An Autoregressive Integrated Moving Average (ARIMA) model uses past temperature and relative humidity data from the farm to forecast conditions a few days into the future. The documentation for our Python implementation of this is available here.
  - A Greenhouse Energy Simulation (GES) model uses past sensor data, weather data, and farm operational parameters (lighting schedules, fan settings, etc.) to forecast conditions a few days into the future using Gaussian Process Regression. The GES model can predict various alternative scenarios, such as how conditions would change if the lights were switched on at a different time or the fan settings were changed.
The Unity 3D model can be found in this repo.
Implementation
CROP is implemented using a well-established software stack (given below) and exploits four different services on the Azure cloud computing platform.
Software stack
The four Azure services used are:

* Function — Azure Function Apps, running the ingress and model functions
* WebApp — the Flask webapp, deployed as an Azure App Service
* Storage — Azure Blob Storage
* PostgreSQL — the relational database
Repository structure
The directories are structured as follows:
* core has all the code for interacting with the database.
This includes defining the database schema in structure.py, some utilities, and modules for each of the ingress functions, for reading and writing the data from various sensors and other data sources.
* tests has the test suite for core.
* webapp has the Flask webapp.
* ingress_functions has the configurations and code for the Azure functions for importing data into the database.
The code itself for each function is nearly trivial:
It just calls the corresponding function in core.
This folder mainly exists to hold the host.json and function.json files for each function.
* models has the code for the forecasting models, ARIMA and GES.
* media has illustrations, logos, etc.
* util_scripts has various utilities that don't fit the other categories.
* .secrets has shell scripts for setting environment variables with secrets, such as database passwords, to facilitate running a local copy of the webapp for development use.
The version-controlled files are merely templates, to be filled in with the actual values for the secrets.
All of tests, webapp, ingress_functions, util_scripts, and models import and use various parts of core. None of them import from or talk to each other.
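The pattern described above — one shared schema/database module that the ingress functions write through and the webapp reads through — can be illustrated with a small sketch. This uses the standard-library sqlite3 module and an invented sensor_readings table purely for illustration; the real project defines its schema with SQLAlchemy in core/structure.py and runs against PostgreSQL.

```python
# Illustrative sketch of the "one shared schema, many consumers" pattern.
# Uses stdlib sqlite3; the real CROP schema is defined with SQLAlchemy in
# core/structure.py against PostgreSQL. Table and column names are invented.
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS sensor_readings (
    id        INTEGER PRIMARY KEY,
    sensor_id INTEGER NOT NULL,
    timestamp TEXT    NOT NULL,
    value     REAL    NOT NULL
);
"""

def insert_reading(conn, sensor_id, timestamp, value):
    """Ingress-style helper: write one reading (cf. core's ingress modules)."""
    conn.execute(
        "INSERT INTO sensor_readings (sensor_id, timestamp, value) VALUES (?, ?, ?)",
        (sensor_id, timestamp, value),
    )

def latest_reading(conn, sensor_id):
    """Webapp-style helper: read the most recent value for a sensor."""
    return conn.execute(
        "SELECT timestamp, value FROM sensor_readings "
        "WHERE sensor_id = ? ORDER BY timestamp DESC LIMIT 1",
        (sensor_id,),
    ).fetchone()

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
insert_reading(conn, 1, "2023-01-01T00:00:00", 19.5)
insert_reading(conn, 1, "2023-01-01T01:00:00", 19.8)
ts, value = latest_reading(conn, 1)
```

The point of the pattern is that writers (ingress functions) and readers (webapp, models) never talk to each other directly: they only share the schema module and the database underneath it.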
The repository root also has three Docker files:
* Dockerfile_ingress_functions builds a container that holds all the functions in ingress_functions, in an environment to run them on an Azure function app.
* Dockerfile_models_functions builds a similar container that holds the functions in models, for running the predictive models on an Azure function app.
* Dockerfile_webapp builds the webapp, ready to be deployed as an Azure app service. It builds on webappbase.
* Dockerfile_webappbase is a container that installs some of the dependencies needed by the webapp.
It's separated from webapp to improve build times, and shouldn't need to be rebuilt except in rare circumstances.
Some of the subfolders have their own READMEs. See them for more details of each part.
Development credits
Gentelella - a free to use (MIT license) Bootstrap admin template on which the webapp is built.
Deployment
We employ a continuous-delivery toolchain using GitHub Actions, which deploys the latest version of CROP when a push or a PR is made to the main or develop branches.
The GitHub Actions:
* Build a Docker container for the webapp, and push it to Docker Hub.
* Build a Docker container for the Azure function apps that collect data into the database (ingress functions), and push it to Docker Hub.
* Publish the Azure function app for running the forecasting models.
* Run the CROP test suite.
The Azure services for the webapp and the ingress functions listen to updates on Docker Hub, and deploy the latest container once it has been pushed to the Hub.
The main and develop branches are deployed to production and testing versions of the platform, respectively. The former is relatively stable, the latter may be broken in the course of development.
Scripted deployment
A complete set of infrastructure for a CROP deployment on the Microsoft Azure cloud can be set up using Pulumi.
For instructions on how to do this, see here.
Developer instructions/running locally
When developing the code for the CROP platform, it may be useful/necessary to run the webapp locally. To do this:
* Clone this repo, and change directory to it.
* (Recommended) create and activate a new Python environment, using conda or venv, or your favourite virtual environment tool.
* Run pip install . (note the dot). This will install the contents of the core/ directory (which contains the database schema and related code, including many of the queries) as a Python package called cropcore. It will also install all the dependencies listed in requirements.txt.
* You will need to set several environment variables, containing e.g. database credentials, and API keys. These will be in a file called .secrets/crop.sh. There is a template file called .secrets/template_crop.sh where you can fill in the info as you have it (then copy to crop.sh), or you can ask for a copy of this file from an existing developer.
* Once you have it, run source .secrets/crop.sh.
* Ensure that you have npm (the Node package manager) installed. To install this on macOS, you can do brew install npm.
* Change to the webapp/ directory, and run the command ./run.sh. You should then be able to navigate to the webapp by pointing your browser to http://localhost:5000.
Getting help
If you find a bug or need support, please submit an issue here.
How to contribute
We welcome contributions! If you are willing to propose new features or have bug fixes to contribute, please submit a pull request here if you have permissions, or fork the repository, make your changes, and submit a pull request from your fork.
Owner
- Name: The Alan Turing Institute
- Login: alan-turing-institute
- Kind: organization
- Email: info@turing.ac.uk
- Website: https://turing.ac.uk
- Repositories: 477
- Profile: https://github.com/alan-turing-institute
The UK's national institute for data science and artificial intelligence.
GitHub Events
Total
- Watch event: 2
- Fork event: 2
Last Year
- Watch event: 2
- Fork event: 2
Committers
Last synced: 7 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Markus Hauru | m****u@t****k | 405 |
| Tomas Lazauskas | 1****z | 261 |
| nbarlowATI | n****w@t****k | 240 |
| David Salvador Jasin | d****n@t****k | 220 |
| misspawty | f****1@u****k | 128 |
| myyong | m****g@t****k | 121 |
| mastoffel | m****l@g****m | 39 |
| Flora Roumpani | f****i@t****k | 28 |
| Rebecca Ward | r****1@c****k | 17 |
| Melanie Jans-Singh | m****2@c****k | 10 |
| dependabot[bot] | 4****] | 3 |
| Jack Roberts | j****s@t****k | 2 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 7 months ago
All Time
- Total issues: 57
- Total pull requests: 43
- Average time to close issues: 4 months
- Average time to close pull requests: 2 days
- Total issue authors: 4
- Total pull request authors: 5
- Average comments per issue: 0.63
- Average comments per pull request: 0.44
- Merged pull requests: 42
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- mhauru (33)
- nbarlowATI (17)
- rmward61 (5)
- dsj976 (2)
Pull Request Authors
- mhauru (21)
- nbarlowATI (12)
- dsj976 (6)
- mastoffel (3)
- rmward61 (1)
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- actions/checkout v2 composite
- reviewdog/action-suggester v1 composite
- rickstaa/action-black v1 composite
- actions/checkout v3 composite
- docker/login-action v1 composite
- actions/checkout v3 composite
- docker/login-action v1 composite
- Azure/functions-action v1 composite
- actions/checkout master composite
- actions/setup-python v1 composite
- actions/checkout v2 composite
- creyD/prettier_action v4.2 composite
- actions/checkout v3 composite
- actions/setup-python v2 composite
- harmon758/postgresql-action v1 composite
- actions/checkout v3 composite
- docker/login-action v1 composite
- rocker/r-ver 4.0.5 build
- eslint ^8.25.0 development
- eslint-config-google ^0.14.0 development
- eslint-config-prettier ^8.5.0 development
- prettier ^2.6.2 development
- alertifyjs ^1.11.0
- bootstrap ^3.3.6
- bootstrap-daterangepicker ^2.1.24
- chart.js ^3.8.0
- chartjs-adapter-moment ^1.0.0
- datatables.net ^1.10.12
- datatables.net-bs ^1.10.12
- datatables.net-buttons ^1.2.1
- datatables.net-buttons-bs ^1.2.1
- datatables.net-fixedheader ^3.1.2
- datatables.net-fixedheader-bs ^3.1.2
- datatables.net-keytable ^2.1.2
- datatables.net-responsive ^2.1.0
- datatables.net-responsive-bs ^2.1.0
- datatables.net-scroller ^1.4.2
- datatables.net-scroller-bs ^1.4.2
- font-awesome ^4.6.3
- jquery ^2.2.3
- moment ^2.13.0
- parsleyjs ^2.3.13
- plotly.js-dist ^2.15.0
- Jinja2 ==3.0.3
- PyYAML ==6.0
- SQLAlchemy-Utils ==0.38.2
- WTForms ==2.3.3
- azure-functions ==1.11.2
- azure-storage-blob ==12.13.0
- bcrypt ==3.2.2
- black ==22.3.0
- flask ==2.1.1
- flask-cors ==3.0.10
- flask-sqlalchemy ==2.5.1
- flask_login ==0.6.1
- flask_migrate ==3.1.0
- flask_wtf ==1.0.1
- jinjasql ==0.1.8
- matplotlib ==3.5.1
- numpy ==1.22.3
- pandas ==1.4.2
- pip ==22.0.4
- pre-commit *
- psycopg2-binary ==2.9.3
- pylint ==2.13.5
- pytest ==7.1.1
- python-dateutil ==2.8.2
- requests ==2.28.1
- requests_mock ==1.9.3
- scikit-learn ==1.0.2
- scipy ==1.8.0
- selenium ==4.1.3
- sqlalchemy ==1.4.39
- werkzeug ==2.1.2