fluence

A deep learning library based on PyTorch, focused on low-resource language research and robustness

https://github.com/prajjwal1/fluence

Science Score: 10.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.2%) to scientific vocabulary

Keywords

attention deep-learning nlp pytorch transformers
Last synced: 6 months ago

Repository

A deep learning library based on PyTorch, focused on low-resource language research and robustness

Basic Info
  • Host: GitHub
  • Owner: prajjwal1
  • License: apache-2.0
  • Language: Python
  • Default Branch: master
  • Homepage:
  • Size: 3.12 MB
Statistics
  • Stars: 70
  • Watchers: 4
  • Forks: 3
  • Open Issues: 0
  • Releases: 0
Topics
attention deep-learning nlp pytorch transformers
Created about 6 years ago · Last pushed about 4 years ago
Metadata Files
Readme Contributing License

README.md

Winner of the PyTorch Global Hackathon 2020.

Fluence is a PyTorch-based deep learning library focused on providing computationally efficient, low-resource methods and algorithms for NLP. Although the main focus is to provide support for transformers on NLP tasks, it can be extended to other domains and architectures as well. It is currently in a pre-alpha stage.

List of implemented papers

#### Adaptive Methods

- [Adaptive Attention Span in Transformers (ACL 2019)](https://arxiv.org/abs/1905.07799)
- [Adaptively Sparse Transformers (EMNLP 2019)](https://arxiv.org/abs/1909.00015)
- [Reducing Transformer Depth on Demand with Structured Dropout (ICLR 2020)](https://arxiv.org/abs/1909.11556)

#### Debiasing

- [Learning Robust Representations by Projecting Superficial Statistics Out (ICLR 2019)](https://openreview.net/pdf?id=rJEjjoR9K7)
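To make the first of these concrete, the soft span mask from "Adaptive Attention Span in Transformers" can be sketched in a few lines of plain PyTorch. This is a conceptual illustration of the mechanism under a simplified formulation, not fluence's API; all names below are made up for the example.

```python
import torch
import torch.nn as nn

class SoftSpanMask(nn.Module):
    """Conceptual soft masking from "Adaptive Attention Span in Transformers".

    Each head learns a span z in [0, max_span]; attention weights beyond the
    span are ramped down to zero over `ramp` positions. Simplified: distance
    is measured from the latest key position rather than per query.
    """

    def __init__(self, n_heads: int, max_span: int, ramp: int = 32):
        super().__init__()
        self.max_span = max_span
        self.ramp = ramp
        # One learnable span fraction per head.
        self.span_frac = nn.Parameter(torch.zeros(n_heads, 1, 1))

    def forward(self, attn: torch.Tensor) -> torch.Tensor:
        # attn: (batch, heads, query_len, key_len) softmaxed attention weights.
        key_len = attn.size(-1)
        dist = torch.arange(key_len - 1, -1, -1, device=attn.device, dtype=attn.dtype)
        z = self.span_frac.clamp(0, 1) * self.max_span           # current span per head
        mask = ((z + self.ramp - dist) / self.ramp).clamp(0, 1)  # 1 inside span, 0 beyond
        masked = attn * mask
        # Renormalise so each query's weights still sum to 1.
        return masked / masked.sum(dim=-1, keepdim=True).clamp(min=1e-8)

# Toy usage: 2 sequences, 4 heads, 8 query and key positions.
attn = torch.softmax(torch.randn(2, 4, 8, 8), dim=-1)
masked_attn = SoftSpanMask(n_heads=4, max_span=8)(attn)
```

In the paper the span parameters are trained jointly with the model, with an L1 penalty on the spans encouraging heads to keep their attention windows short.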

Why Fluence?

Fluence is targeted towards two main goals:

1. Compute efficiency: low-resource research
2. Robustness: algorithms that either enhance our understanding of current methods or show where SoTA methods fail

It is as straightforward to use as HF Transformers and integrates fully with PyTorch. Please note that the current modules (meta-trainer, siamese-trainer) that rely on the inherited Trainer work with transformers==3.0; newer versions ship a modified Trainer.
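For orientation, wiring up a custom trainer that inherits from transformers' Trainer looks roughly like the sketch below under transformers==3.0 (the override points are what changed in later releases, hence the pin). The class name is a hypothetical placeholder, not fluence's meta-trainer or siamese-trainer.

```python
from transformers import Trainer, TrainingArguments

class SiameseStyleTrainer(Trainer):
    """Hypothetical placeholder: fluence's real trainers layer their own
    step/loss logic on top of the inherited Trainer."""
    pass

# Typical wiring; `model` and `train_dataset` come from your own code.
args = TrainingArguments(output_dir="out", num_train_epochs=1,
                         per_device_train_batch_size=8)
# trainer = SiameseStyleTrainer(model=model, args=args, train_dataset=train_dataset)
# trainer.train()
```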

Installing

For the stable version:

```bash
pip3 install --user fluence
```

For the development version (recommended):

```bash
git clone https://github.com/prajjwal1/fluence
cd fluence
python3 setup.py install --user
```
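A quick sanity check after either install, just to confirm the package imports:

```bash
python3 -c "import fluence; print('fluence imported successfully')"
```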

Overview

The library contains implementations of the following approaches (many more to come):

| Module | Method with documentation |
| ---------------- | ------------------------------ |
| fluence.adaptive | Adaptive Methods |
| fluence.datasets | Datasets |
| fluence.optim | Optimizers |
| fluence.sampling | Importance Sampling |
| fluence.models | Siamese Methodology, Debiasing |
| fluence.prune | Pruning |
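As a flavour of what sits behind fluence.prune, the structured (layer-level) dropout from the LayerDrop paper listed above can be sketched in plain PyTorch. This is a conceptual example with made-up names, not fluence's implementation.

```python
import torch
import torch.nn as nn

class LayerDropStack(nn.Module):
    """Conceptual LayerDrop: each layer is skipped with probability p_drop
    during training, which later makes it cheap to prune whole layers at
    inference. Illustration only, not fluence's API."""

    def __init__(self, layers: nn.ModuleList, p_drop: float = 0.2):
        super().__init__()
        self.layers = layers
        self.p_drop = p_drop

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            # Randomly skip the whole layer at training time.
            if self.training and torch.rand(1).item() < self.p_drop:
                continue
            x = layer(x)
        return x

# Toy usage with feed-forward layers standing in for transformer blocks.
stack = LayerDropStack(nn.ModuleList([nn.Linear(16, 16) for _ in range(6)]), p_drop=0.3)
out = stack(torch.randn(4, 16))
```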

Documentation

Please head to the documentation to learn how you can integrate fluence with your workflow. Since it's an early release, there might be bugs; please file an issue if you encounter one. Docs are a work in progress.

Contribution

You can contribute by either filing an issue or sending a pull request (if you encounter a bug or want a feature added). Please check out the contributing guide for more details.

Tests

Fluence comes with an extensive test suite for high test coverage.

```bash
pytest tests/ -v
```
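During development you can also run a subset of the suite with pytest's standard -k filter (the keyword here is only an example):

```bash
pytest tests/ -v -k adaptive
```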

Author: Prajjwal Bhargava (@prajjwal_1)

Owner

  • Name: Prajj
  • Login: prajjwal1
  • Kind: user
  • Location: New York
  • Company: @facebookresearch

AI Research at @facebookresearch

GitHub Events

Total
  • Watch event: 1
Last Year
  • Watch event: 1

Committers

Last synced: 8 months ago

All Time
  • Total Commits: 93
  • Total Committers: 1
  • Avg Commits per committer: 93.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
prajjwal1 p****n@p****m 93

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 1
  • Total pull requests: 25
  • Average time to close issues: 9 days
  • Average time to close pull requests: about 15 hours
  • Total issue authors: 1
  • Total pull request authors: 1
  • Average comments per issue: 0.0
  • Average comments per pull request: 0.16
  • Merged pull requests: 25
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • prajjwal1 (1)
Pull Request Authors
  • prajjwal1 (25)
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 48 last month
  • Total dependent packages: 0
  • Total dependent repositories: 1
  • Total versions: 6
  • Total maintainers: 1
pypi.org: fluence

PyTorch-based deep learning library focused on providing computationally efficient, low-resource methods and algorithms for NLP

  • Versions: 6
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 48 last month
Rankings
Dependent packages count: 7.3%
Stargazers count: 8.4%
Average: 18.0%
Forks count: 19.2%
Dependent repos count: 22.1%
Downloads: 32.9%
Maintainers (1)
Last synced: 6 months ago

Dependencies

setup.py pypi
  • dataclasses *
  • numpy *
  • tqdm *