mibiscreen

Prediction tool for Microbiome based Remediation

https://github.com/mibipret/mibiscreen

Science Score: 57.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 3 DOI reference(s) in README
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.3%) to scientific vocabulary

Keywords

microbiome python remediation
Last synced: 6 months ago

Repository

Prediction tool for Microbiome based Remediation

Basic Info
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 22
  • Releases: 7
Topics
microbiome python remediation
Created almost 2 years ago · Last pushed 7 months ago
Metadata Files
Readme Contributing License Code of conduct Citation

README.dev.md

mibiscreen developer documentation

Development install

```shell
# Create a virtual environment, e.g. with
python -m venv env

# activate virtual environment
source env/bin/activate

# make sure to have a recent version of pip and setuptools
python -m pip install --upgrade pip setuptools

# (from the project root directory)
# install mibiscreen as an editable package
python -m pip install --no-cache-dir --editable .

# install development dependencies
python -m pip install --no-cache-dir --editable .[dev]
```

Afterwards, check that the install directory is present in the PATH environment variable.
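
A quick way to confirm that the editable install is picked up from the active environment (plain pip and Python, nothing specific to mibiscreen):

```shell
# check that the package can be imported
python -c "import mibiscreen"

# show the installed version and location reported by pip
python -m pip show mibiscreen
```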

Running the tests

There are two ways to run tests.

The first way requires an activated virtual environment with the development tools installed:

```shell
pytest -v
```

The second way is to use tox, which can be installed separately (e.g. with pip install tox) and does not need to live inside the virtual environment you use for installing mibiscreen; it builds the necessary virtual environments itself when you simply run:

```shell
tox
```

Testing with tox keeps the testing environment separate from your development environment, which typically accumulates (old) packages during development that can interfere with testing; tox avoids this problem.
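
During development it is often convenient to run only a subset of the tests; pytest supports selecting by file or by keyword (the file name below is purely illustrative, not an actual path in the repository):

```shell
# run only the tests in a single file (illustrative path)
pytest -v tests/test_example.py

# run only tests whose names match a keyword expression
pytest -v -k "sample"
```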

Test coverage

In addition to just running the tests to see if they pass, they can be used for coverage statistics, i.e. to determine how much of the package's code is actually executed during tests. In an activated virtual environment with the development tools installed, inside the package directory, run:

```shell
coverage run
```

This runs tests and stores the result in a .coverage file. To see the results on the command line, run

```shell
coverage report
```

coverage can also generate output in HTML and other formats; see coverage help for more information.
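
For example, the following standard coverage.py commands (not specific to mibiscreen) show uncovered line numbers in the terminal and write an HTML report:

```shell
# terminal report including the line numbers that were not executed
coverage report -m

# HTML report, written to the htmlcov/ directory by default
coverage html
```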

Running linters locally

For linting and sorting imports, we use ruff. Running the linters requires an activated virtual environment with the development tools installed.

```shell
# linter
ruff check .

# linter with automatic fixing
ruff check . --fix
```

To improve the readability of your code style, you can use yapf.
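
A typical yapf invocation might look like the following (a sketch; the directory names are placeholders and depend on the actual project layout):

```shell
# reformat files in place, recursing into source and test directories
yapf --in-place --recursive mibiscreen/ tests/
```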

You can enable automatic linting with ruff on commit by enabling the git hook from .githooks/pre-commit, like so:

```shell
git config --local core.hooksPath .githooks
```

Testing docs locally

To build the documentation locally, first make sure mkdocs and its dependencies are installed:

```shell
python -m pip install .[doc]
```

Then you can build the documentation and serve it locally with:

```shell
mkdocs serve
```

This will return a URL (e.g. http://127.0.0.1:8000/mibiscreen/) where the docs site can be viewed.
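
If you only want to check that the documentation builds cleanly (for example before a release), mkdocs can also write a static site to disk instead of serving it:

```shell
# build the static site into the site/ directory
mkdocs build

# additionally treat warnings such as broken internal links as errors
mkdocs build --strict
```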

Versioning

Bumping the version across all files is done with bump-my-version, e.g.

```shell
bump-my-version bump major  # bumps from e.g. 0.3.2 to 1.0.0
bump-my-version bump minor  # bumps from e.g. 0.3.2 to 0.4.0
bump-my-version bump patch  # bumps from e.g. 0.3.2 to 0.3.3
```
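
To preview which files would be changed without actually modifying them, recent versions of bump-my-version support a dry run (a sketch, using standard bump-my-version options):

```shell
# show the planned changes without writing them
bump-my-version bump patch --dry-run -v
```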

Making a release

To create a release you need write permission on the repository.

This section describes how to make a release:

  1. preparation
  2. making a release on GitHub

(1/2) Preparation

  1. Checkout the main branch locally
  2. Verify that the information (especially the author list) in CITATION.cff is correct.
  3. Make sure the version has been updated.
  4. Run the unit tests with pytest -v
  5. Make sure the docs build and look good (see the sketch after this list)
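
Taken together, the preparation checks might look roughly like this on the command line (a sketch, assuming an activated development environment with the doc dependencies installed):

```shell
# start from an up-to-date main branch
git checkout main
git pull

# run the unit tests
pytest -v

# build the docs and fail on warnings
mkdocs build --strict
```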

(2/2) GitHub

When all is well, navigate to the releases on GitHub.

  1. Press the "Draft a new release" button
  2. Select the "Choose a tag" drop-down and write out the new version (e.g. v1.3.2)
  3. Press "Generate release notes" to automatically fill in the title (with the version number) and generate a description (the changelog from the merged pull requests)
  4. Press the Publish release button

This will create the release on GitHub and automatically trigger:

  1. The .github/workflows/publish.yml workflow which will build the package and publish it on PyPI
  2. The Zenodo-GitHub integration, which makes a snapshot of your repository, assigns a DOI to it, and adds the new version to the main Zenodo entry for your software at 10.5281/zenodo.10878799
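
For completeness: roughly the same release steps can also be performed from the command line with the GitHub CLI (a sketch, assuming gh is installed and authenticated; the tag is only an example and is not part of the documented workflow):

```shell
# create a release for a new tag and auto-generate the release notes
gh release create v1.3.2 --generate-notes
```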

Owner

  • Name: MiBiPreT
  • Login: MiBiPreT
  • Kind: organization

Micro-Bioremediation Prediction Tool

Citation (CITATION.cff)

# YAML 1.2
---
cff-version: "1.2.0"
title: "mibiscreen"
authors:
  -
    family-names: Zech
    given-names: Alraune
    orcid: "https://orcid.org/0000-0002-8783-6198"
  -
    family-names: Aseyednezhad
    given-names: Sona
    orcid: "https://orcid.org/0000-0001-7324-2009"
  -
    family-names: Richardson
    given-names: Robin
    orcid: "https://orcid.org/0000-0002-9984-2720"
  -
    family-names: Camphuijsen
    given-names: Jaro
    orcid: "https://orcid.org/0000-0002-8928-7831"
date-released: 2024-03-26
doi: 10.5281/zenodo.10878799
version: "0.5.0"
repository-code: "https://github.com/MiBiPreT/mibiscreen"
keywords:
  - bioremediation
  - contaminant data analysis
  - contaminant transport modelling
  - bio-geo-chemical modelling
message: "If you use this software, please cite it using these metadata."
license: Apache-2.0

GitHub Events

Total
  • Create event: 15
  • Release event: 3
  • Issues event: 26
  • Delete event: 9
  • Issue comment event: 46
  • Push event: 59
  • Pull request event: 16
  • Pull request review event: 25
  • Pull request review comment event: 33
Last Year
  • Create event: 15
  • Release event: 3
  • Issues event: 26
  • Delete event: 9
  • Issue comment event: 46
  • Push event: 59
  • Pull request event: 16
  • Pull request review event: 25
  • Pull request review comment event: 33