pybryt

Python library for pedagogical auto-assessment

https://github.com/microsoft/pybryt

Science Score: 64.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
    2 of 13 committers (15.4%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.8%) to scientific vocabulary

Keywords

auto-assessment educators pybryt-library python-library

Keywords from Contributors

standardization
Last synced: 4 months ago

Repository

Python library for pedagogical auto-assessment

Basic Info
Statistics
  • Stars: 66
  • Watchers: 6
  • Forks: 18
  • Open Issues: 3
  • Releases: 22
Topics
auto-assessment educators pybryt-library python-library
Created almost 5 years ago · Last pushed over 3 years ago
Metadata Files
Readme Changelog Contributing License Code of conduct Citation Security Support

README.md


PyBryt - Python Library


PyBryt is an auto-assessment Python library for teaching and learning.

  • The PyBryt library is a free, open-source Python library that provides auto-assessment of student submissions. Our goal is to empower students and educators to learn about technology through fun, guided, hands-on content aimed at specific learning goals.
  • The PyBryt library is an open-source Python library focused on the auto-assessment and validation of Python code.
  • The PyBryt library has been developed as open source to help learning and training institutions auto-assess the work completed by learners.
  • The PyBryt library works with existing auto-grading solutions such as Otter Grader, OkPy, or Autolab.

Features

Educators and institutions can leverage the PyBryt library to integrate auto-assessment and reference models into hands-on labs and assessments.

  • Educators do not have to enforce the structure of the solution;
  • Learners practice the design process, code design, and solution implementation;
  • Meaningful, pedagogical feedback for learners;
  • Analysis of the complexity of a learner's solution;
  • Plagiarism detection and support for reference solutions;
  • Easy integration into existing organizational or institutional grading infrastructure.
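The first point above can be illustrated with a toy sketch in plain Python (this is a made-up illustration of value-based checking, not PyBryt's actual API; student_a, student_b, and value_check are hypothetical names): two structurally different solutions satisfy the same check because only the values they produce are examined.

```python
# Two structurally different student solutions to the same task
# (sum of squares) -- an imperative loop vs. a generator expression.
def student_a(nums):
    total = 0
    for n in nums:
        total += n ** 2
    return total

def student_b(nums):
    return sum(n * n for n in nums)

# A value-based check only asks: does the expected value get produced?
# It never inspects the structure of the solution.
def value_check(solution, args, expected):
    return solution(*args) == expected

print(value_check(student_a, ([1, 2, 3],), 14))  # True
print(value_check(student_b, ([1, 2, 3],), 14))  # True
```

Both submissions pass, even though neither was forced into a prescribed structure.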

Getting Started

See the Getting Started page in the PyBryt documentation for steps to install and use PyBryt for the first time. You can also check the Microsoft Learn interactive modules Introduction to PyBryt and Advanced PyBryt to learn more about how to use the library to auto-assess your learners' activities.

Testing

All demos are located in the demo folder.

First install PyBryt with pip:

pip install pybryt

Then launch the index.ipynb notebook in any of the directories under demo from Jupyter Notebook; each notebook demonstrates the process of using PyBryt to assess student submissions.

Technical Report

We continuously interact with computerized systems to achieve goals and perform tasks in our personal and professional lives. Therefore, the ability to program such systems is a skill needed by everyone. Consequently, computational thinking skills are essential for everyone, which creates a challenge for the educational system to teach these skills at scale and allow students to practice them. To address this challenge, we present a novel approach to providing formative feedback to students on programming assignments. Our approach uses dynamic evaluation to trace intermediate results generated by students' code and compares them to the reference implementation provided by their teachers. We have implemented this method as a Python library and demonstrate its use to give students relevant feedback on their work while allowing teachers to challenge their students' computational thinking skills. Paper available at PyBryt: auto-assessment and auto-grading for computational thinking
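The core idea described above, dynamically tracing the intermediate values a student's code produces and comparing them against values an instructor expects, can be sketched in plain Python with sys.settrace. This is a simplified illustration of the approach, not PyBryt's actual implementation or API; trace_values, check_submission, and student_median are hypothetical names.

```python
import sys

def trace_values(func, *args):
    """Run func, recording every local-variable value observed during execution."""
    observed = []

    def tracer(frame, event, arg):
        # Only record values from the function under test.
        if event == "line" and frame.f_code is func.__code__:
            observed.extend(list(frame.f_locals.values()))
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    observed.append(result)
    return observed

def check_submission(func, args, expected_intermediates):
    """Pass if every expected value appears somewhere in the trace."""
    observed = trace_values(func, *args)
    return all(any(o == e for o in observed) for e in expected_intermediates)

# A student's implementation of the median (odd-length input for simplicity).
def student_median(values):
    ordered = sorted(values)  # intermediate result the reference looks for
    mid = len(ordered) // 2
    return ordered[mid]

# The instructor's reference expects the sorted list to appear at some point.
print(check_submission(student_median, ([3, 1, 2],), [[1, 2, 3]]))  # True
```

A submission that never produces the expected intermediate value (say, one that returns max(values)) would fail the same check, which is how feedback can target the design process rather than just the final answer.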

Citing Technical Report

@misc{pyles2021pybryt,
  title={PyBryt: auto-assessment and auto-grading for computational thinking},
  author={Christopher Pyles and Francois van Schalkwyk and Gerard J. Gorman and Marijan Beg and Lee Stott and Nir Levy and Ran Gilad-Bachrach},
  year={2021},
  eprint={2112.02144},
  archivePrefix={arXiv},
  primaryClass={cs.HC}
}

Citing the Codebase

Please use the "Cite this repository" option in the repo menu or the CITATION.cff file in the root of this repo.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

Owner

  • Name: Microsoft
  • Login: microsoft
  • Kind: organization
  • Email: opensource@microsoft.com
  • Location: Redmond, WA

Open source projects and samples from Microsoft

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "Stott"
  given-names: "Lee"
  orcid: "https://orcid.org/0000-0002-3715-0892"
- family-names: "Gilad-Bachrach"
  given-names: "Ran"
  orcid: "https://orcid.org/0000-0002-4001-8307"
- family-names: Pyles
  given-names: Christopher
  orcid: "https://orcid.org/0000-0001-8520-7593"
- family-names: "Beg"
  given-names: "Marijan"
  orcid: "https://orcid.org/0000-0002-6670-3994"
- family-names: "Levy"
  given-names: "Nir"
  orcid: "https://orcid.org/0000-0002-4256-4934"
- family-names: "Gorman"
  given-names: "Gerard John"
  orcid: "https://orcid.org/0000-0003-0563-3678"
- family-names: "Percival"
  given-names: "James Robert"
  orcid: "https://orcid.org/0000-0002-6556-0055"
- family-names: "Rhodri"
  given-names: "Nelson"
  orcid: "https://orcid.org/0000-0003-2768-5735"
title: "Pybryt - Python library for pedagogical auto-assessment"
version: 0.7.0
date-released: 2022-04-28
url: "https://github.com/microsoft/pybryt"

GitHub Events

Total
  • Watch event: 6
Last Year
  • Watch event: 6

Committers

Last synced: about 1 year ago

All Time
  • Total Commits: 615
  • Total Committers: 13
  • Avg Commits per committer: 47.308
  • Development Distribution Score (DDS): 0.285
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Chris Pyles c****s@b****u 440
Chris Pyles 4****s 67
Lee Stott l****t@m****m 34
github-actions 4****] 19
Marijan Beg m****g@i****k 17
Anthony Shaw a****w@g****m 10
Nir Levy n****y@m****m 10
nir-levy n****y@n****m 8
Microsoft Open Source m****e 5
root r****t@D****n 2
Mark Patterson 3****7 1
Ran Gilad-Bachrach 1****b 1
microsoft-github-operations[bot] 5****] 1

Issues and Pull Requests

Last synced: 5 months ago

All Time
  • Total issues: 37
  • Total pull requests: 64
  • Average time to close issues: about 2 months
  • Average time to close pull requests: 2 days
  • Total issue authors: 6
  • Total pull request authors: 7
  • Average comments per issue: 0.62
  • Average comments per pull request: 0.88
  • Merged pull requests: 63
  • Bot issues: 1
  • Bot pull requests: 12
Past Year
  • Issues: 1
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 1
  • Pull request authors: 0
  • Average comments per issue: 0.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 1
  • Bot pull requests: 0
Top Authors
Issue Authors
  • chrispyles (25)
  • marijanbeg (7)
  • ranigb (2)
  • microsoft-github-policy-service[bot] (1)
  • muzzahmed (1)
  • leestott (1)
Pull Request Authors
  • chrispyles (41)
  • github-actions[bot] (12)
  • leestott (4)
  • tonybaloney (3)
  • marijanbeg (2)
  • markpatterson27 (1)
  • ranigb (1)
Top Labels
Issue Labels
Pull Request Labels
release (13)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 249 last-month
  • Total docker downloads: 157
  • Total dependent packages: 1
  • Total dependent repositories: 17
  • Total versions: 24
  • Total maintainers: 2
pypi.org: pybryt

Python auto-assessment library

  • Versions: 24
  • Dependent Packages: 1
  • Dependent Repositories: 17
  • Downloads: 249 Last month
  • Docker Downloads: 157
Rankings
Dependent repos count: 3.5%
Dependent packages count: 4.6%
Average: 7.1%
Forks count: 8.6%
Stargazers count: 8.8%
Downloads: 10.0%
Maintainers (2)
Last synced: 4 months ago

Dependencies

docs/requirements.txt pypi
  • furo *
  • nbsphinx *
  • sphinx-argparse *
  • sphinx-click *
  • sphinxcontrib-apidoc *
requirements.txt pypi
  • Cython *
  • IPython *
  • astunparse *
  • click *
  • dill *
  • ipykernel *
  • nbconvert *
  • nbformat *
  • numpy *
  • pandas *
.github/workflows/build-docs.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/prevent-stable-merges.yml actions
.github/workflows/release.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/run-tests.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
environment.yml conda
  • pip
  • python 3.8.*
setup.py pypi