spacy

💫 Industrial-strength Natural Language Processing (NLP) in Python

https://github.com/explosion/spacy

Science Score: 36.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • ○
    CITATION.cff file
  • ✓
    codemeta.json file
    Found codemeta.json file
  • ✓
    .zenodo.json file
    Found .zenodo.json file
  • ○
    DOI references
  • ○
    Academic publication links
  • ✓
    Committers with academic emails
    28 of 753 committers (3.7%) from academic institutions
  • ○
    Institutional organization owner
  • ○
    JOSS paper metadata
  • ○
    Scientific vocabulary similarity
    Low similarity (15.8%) to scientific vocabulary

Keywords

ai artificial-intelligence cython data-science deep-learning entity-linking machine-learning named-entity-recognition natural-language-processing neural-network neural-networks nlp nlp-library python spacy text-classification tokenization

Keywords from Contributors

lemmatization gsoc-2018 greek-language greek jax transformers cryptography cryptocurrencies machine-learning-library mxnet
Last synced: 6 months ago

Repository

💫 Industrial-strength Natural Language Processing (NLP) in Python

Basic Info
  • Host: GitHub
  • Owner: explosion
  • License: MIT
  • Language: Python
  • Default Branch: master
  • Homepage: https://spacy.io
  • Size: 194 MB
Statistics
  • Stars: 32,333
  • Watchers: 567
  • Forks: 4,568
  • Open Issues: 200
  • Releases: 130
Topics
ai artificial-intelligence cython data-science deep-learning entity-linking machine-learning named-entity-recognition natural-language-processing neural-network neural-networks nlp nlp-library python spacy text-classification tokenization
Created over 11 years ago · Last pushed 9 months ago
Metadata Files
Readme Contributing Funding License Citation

README.md

spaCy: Industrial-strength NLP

spaCy is a library for advanced Natural Language Processing in Python and Cython. It's built on the very latest research, and was designed from day one to be used in real products.

spaCy comes with pretrained pipelines and currently supports tokenization and training for 70+ languages. It features state-of-the-art speed and neural network models for tagging, parsing, named entity recognition, text classification and more, multi-task learning with pretrained transformers like BERT, as well as a production-ready training system and easy model packaging, deployment and workflow management. spaCy is commercial open-source software, released under the MIT license.

Version 3.8 out now! Check out the release notes here.

[Badges: tests · current release version · PyPI version · conda version · Python wheels · code style: black · PyPI downloads · conda downloads]

Documentation

| Documentation | |
| --- | --- |
| spaCy 101 | New to spaCy? Here's everything you need to know! |
| Usage Guides | How to use spaCy and its features. |
| New in v3.0 | New features, backwards incompatibilities and migration guide. |
| Project Templates | End-to-end workflows you can clone, modify and run. |
| API Reference | The detailed reference for spaCy's API. |
| GPU Processing | Use spaCy with CUDA-compatible GPU processing. |
| Models | Download trained pipelines for spaCy. |
| Large Language Models | Integrate LLMs into spaCy pipelines. |
| Universe | Plugins, extensions, demos and books from the spaCy ecosystem. |
| spaCy VS Code Extension | Additional tooling and features for working with spaCy's config files. |
| Online Course | Learn spaCy in this free and interactive online course. |
| Blog | Read about current spaCy and Prodigy development, releases, talks and more from Explosion. |
| Videos | Our YouTube channel with video tutorials, talks and more. |
| Live Stream | Join Matt as he works on spaCy and chat about NLP, live every week. |
| Changelog | Changes and version history. |
| Contribute | How to contribute to the spaCy project and code base. |
| Swag | Support us and our work with unique, custom-designed swag! |
| Tailored Solutions | Custom NLP consulting, implementation and strategic advice by spaCy's core development team. Streamlined, production-ready, predictable and maintainable. Send us an email or take our 5-minute questionnaire, and we'll be in touch! Learn more → |

Where to ask questions

The spaCy project is maintained by the spaCy team. Please understand that we won't be able to provide individual support via email. We also believe that help is much more valuable if it's shared publicly, so that more people can benefit from it.

| Type | Platforms |
| --- | --- |
| Bug Reports | GitHub Issue Tracker |
| Feature Requests & Ideas | GitHub Discussions |
| Usage Questions | GitHub Discussions |
| General Discussion | GitHub Discussions |

Features

  • Support for 70+ languages
  • Trained pipelines for different languages and tasks
  • Multi-task learning with pretrained transformers like BERT
  • Support for pretrained word vectors and embeddings
  • State-of-the-art speed
  • Production-ready training system
  • Linguistically-motivated tokenization
  • Components for named entity recognition, part-of-speech-tagging, dependency parsing, sentence segmentation, text classification, lemmatization, morphological analysis, entity linking and more
  • Easily extensible with custom components and attributes
  • Support for custom models in PyTorch, TensorFlow and other frameworks
  • Built in visualizers for syntax and NER
  • Easy model packaging, deployment and workflow management
  • Robust, rigorously evaluated accuracy

For more details, see the facts, figures and benchmarks.
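As a quick illustration of several of these components working together, here is a minimal sketch (assuming spaCy and the small English pipeline en_core_web_sm, covered in the installation and model sections below, are installed) that runs tagging, parsing and named entity recognition on a short text:

```python
import spacy

# Load the small English pipeline
# (install it first with: python -m spacy download en_core_web_sm)
nlp = spacy.load("en_core_web_sm")

doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

# Part-of-speech tags, lemmas and syntactic dependencies per token
for token in doc:
    print(token.text, token.pos_, token.lemma_, token.dep_)

# Named entities recognized in the text
for ent in doc.ents:
    print(ent.text, ent.label_)
```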

Install spaCy

For detailed installation instructions, see the documentation.

  • Operating system: macOS / OS X · Linux · Windows (Cygwin, MinGW, Visual Studio)
  • Python version: Python >=3.7, <3.13 (only 64 bit)
  • Package managers: pip · conda (via conda-forge)

pip

Using pip, spaCy releases are available as source packages and binary wheels. Before you install spaCy and its dependencies, make sure that your pip, setuptools and wheel are up to date.

```bash
pip install -U pip setuptools wheel
pip install spacy
```

To install additional data tables for lemmatization and normalization you can run pip install spacy[lookups] or install spacy-lookups-data separately. The lookups package is needed to create blank models with lemmatization data, and to lemmatize in languages that don't yet come with pretrained models and aren't powered by third-party libraries.
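As a hedged illustration (a minimal sketch assuming spacy-lookups-data is installed via the lookups extra), a blank English pipeline can be given a lookup-based lemmatizer roughly like this:

```python
import spacy

# Requires the spacy-lookups-data package (pip install spacy[lookups])
nlp = spacy.blank("en")
nlp.add_pipe("lemmatizer", config={"mode": "lookup"})
nlp.initialize()  # loads the lookup tables for the language

doc = nlp("The cats were chasing mice")
print([token.lemma_ for token in doc])
```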

When using pip it is generally recommended to install packages in a virtual environment to avoid modifying system state:

```bash
python -m venv .env
source .env/bin/activate
pip install -U pip setuptools wheel
pip install spacy
```

conda

You can also install spaCy from conda via the conda-forge channel. For the feedstock including the build recipe and configuration, check out this repository.

```bash
conda install -c conda-forge spacy
```

Updating spaCy

Some updates to spaCy may require downloading new statistical models. If you're running spaCy v2.0 or higher, you can use the validate command to check if your installed models are compatible and if not, print details on how to update them:

```bash
pip install -U spacy
python -m spacy validate
```

If you've trained your own models, keep in mind that your training and runtime inputs must match. After updating spaCy, we recommend retraining your models with the new version.
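For instance, one way to sanity-check this (a minimal sketch, not a replacement for the validate command) is to compare the runtime spaCy version against the version range recorded in a pipeline's metadata:

```python
import spacy

# Assumes the en_core_web_sm pipeline package is installed
nlp = spacy.load("en_core_web_sm")

print("runtime spaCy version:", spacy.__version__)
# Pipeline packages record the spaCy version range they were built for
print("pipeline built for:", nlp.meta.get("spacy_version"))
```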

For details on upgrading from spaCy 2.x to spaCy 3.x, see the migration guide.

Download model packages

Trained pipelines for spaCy can be installed as Python packages. This means that they're a component of your application, just like any other module. Models can be installed using spaCy's download command, or manually by pointing pip to a path or URL.

| Documentation | |
| --- | --- |
| Available Pipelines | Detailed pipeline descriptions, accuracy figures and benchmarks. |
| Models Documentation | Detailed usage and installation instructions. |
| Training | How to train your own pipelines on your data. |

```bash
# Download best-matching version of specific model for your spaCy installation
python -m spacy download en_core_web_sm

# pip install .tar.gz archive or .whl from path or URL
pip install /Users/you/en_core_web_sm-3.0.0.tar.gz
pip install /Users/you/en_core_web_sm-3.0.0-py3-none-any.whl
pip install https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.0.0/en_core_web_sm-3.0.0.tar.gz
```

Loading and using models

To load a model, use spacy.load() with the model name or a path to the model data directory.

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("This is a sentence.")
```

You can also import a model directly via its full name and then call its load() method with no arguments.

```python
import spacy
import en_core_web_sm

nlp = en_core_web_sm.load()
doc = nlp("This is a sentence.")
```

For more info and examples, check out the models documentation.

Compile from source

The other way to install spaCy is to clone its GitHub repository and build it from source. This is the common way if you want to make changes to the code base. You'll need to make sure that you have a development environment consisting of a Python distribution including header files, a compiler, pip, virtualenv and git installed. The compiler part is the trickiest. How to do that depends on your system.

| Platform | |
| --- | --- |
| Ubuntu | Install system-level dependencies via apt-get: sudo apt-get install build-essential python-dev git |
| Mac | Install a recent version of XCode, including the so-called "Command Line Tools". macOS and OS X ship with Python and git preinstalled. |
| Windows | Install a version of the Visual C++ Build Tools or Visual Studio Express that matches the version that was used to compile your Python interpreter. |

For more details and instructions, see the documentation on compiling spaCy from source and the quickstart widget to get the right commands for your platform and Python version.

```bash
git clone https://github.com/explosion/spaCy
cd spaCy

python -m venv .env
source .env/bin/activate

# make sure you are using the latest pip
python -m pip install -U pip setuptools wheel

pip install -r requirements.txt
pip install --no-build-isolation --editable .
```

To install with extras:

```bash
pip install --no-build-isolation --editable .[lookups,cuda102]
```

Run tests

spaCy comes with an extensive test suite. In order to run the tests, you'll usually want to clone the repository and build spaCy from source. This will also install the required development dependencies and test utilities defined in the requirements.txt.

Alternatively, you can run pytest on the tests from within the installed spacy package. Don't forget to also install the test utilities via spaCy's requirements.txt:

```bash
pip install -r requirements.txt
python -m pytest --pyargs spacy
```

Owner

  • Name: Explosion
  • Login: explosion
  • Kind: organization
  • Email: contact@explosion.ai
  • Location: Berlin, Germany

A software company specializing in developer tools for Artificial Intelligence and Natural Language Processing

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 14,136
  • Total Committers: 753
  • Avg Commits per committer: 18.773
  • Development Distribution Score (DDS): 0.714
Past Year
  • Commits: 103
  • Committers: 20
  • Avg Commits per committer: 5.15
  • Development Distribution Score (DDS): 0.311
Top Committers
Name Email Commits
Ines Montani i****s@i****o 4,041
Matthew Honnibal h****h@g****m 3,094
Matthew Honnibal h****l@g****m 2,186
Adriane Boyd a****d@g****m 998
svlandeg s****m@g****m 961
Henning Peters p****e@d****e 211
Matthew Honnibal m****w@h****m 197
Paul O'Leary McCann p****m@d****m 169
Jim Geovedi j****m@g****m 49
Daniël de Kok me@d****u 47
Wolfgang Seeker s****r@s****o 43
Raphael Mitsch r****h@o****m 43
github-actions[bot] 4****] 40
Jim O'Regan j****n@t****e 35
maxirmx m****v@i****g 34
maxirmx m****v@c****g 33
Marcus Blättermann m****s@e****e 33
Gyorgy Orosz o****y@g****m 30
Madeesh Kannan s****e 30
DuyguA d****2@g****m 28
Edward 4****r 24
Søren Lind Kristiansen s****n@g****k 23
Lj Miranda 1****1 23
Peter Baumgartner 5****r 21
Raphaël Bournhonesque r****l@b****u 21
Richard Hudson r****d@e****i 17
Leander Fiedler l****r 16
Explosion Bot c****t@e****i 16
Wannaphong Phatthiyaphaibun w****g@y****m 15
Roman Domrachev l****r@g****m 15
and 723 more...

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 351
  • Total pull requests: 398
  • Average time to close issues: about 2 months
  • Average time to close pull requests: about 1 month
  • Total issue authors: 307
  • Total pull request authors: 106
  • Average comments per issue: 3.25
  • Average comments per pull request: 1.16
  • Merged pull requests: 289
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 81
  • Pull requests: 79
  • Average time to close issues: 3 months
  • Average time to close pull requests: about 1 month
  • Issue authors: 78
  • Pull request authors: 40
  • Average comments per issue: 2.16
  • Average comments per pull request: 0.8
  • Merged pull requests: 27
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • nrodnova (5)
  • BLKSerene (4)
  • kormilitzin (3)
  • belalsalih (3)
  • lsmith77 (3)
  • rkatriel (3)
  • KennethEnevoldsen (3)
  • ojo4f3 (3)
  • lordsoffallen (2)
  • rs-pawanmethre (2)
  • jianlins (2)
  • erikspears (2)
  • adrianeboyd (2)
  • SHxKM (2)
  • Oumayma68 (2)
Pull Request Authors
  • adrianeboyd (90)
  • svlandeg (65)
  • danieldk (50)
  • rmitsch (39)
  • honnibal (26)
  • wjbmattingly (10)
  • thjbdvlt (7)
  • shadeMe (7)
  • lise-brinck (5)
  • victorialslocum (5)
  • BLKSerene (4)
  • samhithamuvva (4)
  • bdura (4)
  • ljvmiranda921 (4)
  • davispuh (3)
Top Labels
Issue Labels
bug (24) docs (18) third-party (14) feat / transformer (12) lang / en (11) install (11) feat / lemmatizer (10) usage (10) enhancement (10) feat / doc (10) feat / serialize (8) training (8) models (8) feat / visualizers (7) perf / accuracy (7) compat (6) duplicate (6) feat / tokenizer (6) perf / memory (6) gpu (6) feat/llm (6) feat / ner (5) feat / matcher (5) lang / de (5) feat / cli (5) feat / spancat (5) scaling (5) feat / vectors (4) feat / ux (3) resolved (3)
Pull Request Labels
docs (114) enhancement (49) bug (39) meta (34) feat/llm (34) universe (27) 🔜 v4.0 (26) tests (21) feat / cli (20) feat / pipeline (18) 🔜 v3.7 (17) feat / doc (16) 🔜 v5.0 (10) feat / visualizers (8) feat / tokenizer (7) feat / textcat (7) feat / parser (6) feat / nel (6) install (5) feat / matcher (5) v3.5 (5) 🔜 v3.6 (5) lang / fo (5) third-party (5) feat / lemmatizer (5) feat / serialize (4) feat / spancat (4) compat (4) feat / ux (4) gpu (4)

Packages

  • Total packages: 6
  • Total downloads:
    • pypi 15,434,929 last-month
  • Total docker downloads: 946,253,747
  • Total dependent packages: 1,048
    (may contain duplicates)
  • Total dependent repositories: 16,141
    (may contain duplicates)
  • Total versions: 528
  • Total maintainers: 4
pypi.org: spacy

Industrial-strength Natural Language Processing (NLP) in Python

  • Versions: 218
  • Dependent Packages: 873
  • Dependent Repositories: 15,793
  • Downloads: 15,434,929 Last month
  • Docker Downloads: 946,253,747
Rankings
Dependent packages count: 0.0%
Dependent repos count: 0.1%
Stargazers count: 0.1%
Forks count: 0.1%
Downloads: 0.1%
Average: 0.1%
Docker downloads count: 0.3%
Maintainers (3)
Last synced: 6 months ago
conda-forge.org: spacy

spaCy is a library for advanced natural language processing in Python and Cython.

  • Homepage: https://spacy.io/
  • License: MIT
  • Latest release: 3.4.3
    published over 3 years ago
  • Versions: 68
  • Dependent Packages: 92
  • Dependent Repositories: 174
Rankings
Dependent packages count: 0.8%
Stargazers count: 1.3%
Average: 1.6%
Forks count: 1.6%
Dependent repos count: 2.6%
Last synced: 6 months ago
proxy.golang.org: github.com/explosion/spacy
  • Versions: 112
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Stargazers count: 0.1%
Forks count: 0.1%
Average: 4.4%
Dependent packages count: 8.1%
Dependent repos count: 9.3%
Last synced: 6 months ago
proxy.golang.org: github.com/explosion/spaCy
  • Versions: 112
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Stargazers count: 0.1%
Forks count: 0.1%
Average: 5.1%
Dependent packages count: 9.5%
Dependent repos count: 10.8%
Last synced: 6 months ago
pypi.org: spacy-wheel

Reupload of SpaCy 3.4.4 with Global Wheel

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Stargazers count: 0.1%
Forks count: 0.1%
Dependent packages count: 4.6%
Average: 9.0%
Dependent repos count: 31.2%
Maintainers (1)
Last synced: 8 months ago
anaconda.org: spacy

spaCy is a library for advanced Natural Language Processing in Python and Cython. It's built on the very latest research, and was designed from day one to be used in real products.

  • Homepage: https://spacy.io/
  • License: MIT
  • Latest release: 3.8.2
    published over 1 year ago
  • Versions: 17
  • Dependent Packages: 83
  • Dependent Repositories: 174
Rankings
Stargazers count: 3.2%
Forks count: 5.1%
Dependent repos count: 15.2%
Average: 16.1%
Dependent packages count: 41.0%
Last synced: 6 months ago

Dependencies

.github/workflows/explosionbot.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
.github/workflows/gputests.yml actions
  • buildkite/trigger-pipeline-action v1.2.0 composite
.github/workflows/issue-manager.yml actions
  • tiangolo/issue-manager 0.4.0 composite
.github/workflows/lock.yml actions
  • dessant/lock-threads v4 composite
.github/workflows/slowtests.yml actions
  • actions/checkout v3 composite
  • buildkite/trigger-pipeline-action v1.2.0 composite
.github/workflows/spacy_universe_alert.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
website/Dockerfile docker
  • node 11.15.0 build
website/package-lock.json npm
  • 1039 dependencies
website/package.json npm
  • @codemirror/lang-python ^6.1.0
  • @docsearch/react ^3.3.0
  • @jupyterlab/services ^3.2.1
  • @lezer/highlight ^1.1.3
  • @mapbox/rehype-prism ^0.8.0
  • @mdx-js/loader ^2.1.5
  • @mdx-js/react ^2.1.5
  • @next/mdx ^13.0.2
  • @rehooks/online-status ^1.1.2
  • @types/node 18.11.9
  • @types/react 18.0.25
  • @types/react-dom 18.0.8
  • @uiw/codemirror-themes ^4.19.3
  • @uiw/react-codemirror ^4.19.3
  • acorn ^8.8.1
  • browser-monads ^1.0.0
  • classnames ^2.3.2
  • eslint 8.27.0
  • eslint-config-next 13.0.2
  • html-to-react ^1.5.0
  • jinja-to-js ^3.2.3
  • md-attr-parser ^1.3.0
  • next 13.0.2
  • next-mdx-remote ^4.2.0
  • next-plausible ^3.6.5
  • next-pwa ^5.6.0
  • next-sitemap ^3.1.32
  • node-fetch ^2.6.7
  • parse-numeric-range ^1.3.0
  • prettier ^2.7.1
  • prismjs ^1.29.0
  • prop-types ^15.8.1
  • react 18.2.0
  • react-dom 18.2.0
  • react-github-btn ^1.4.0
  • react-inlinesvg ^3.0.1
  • react-intersection-observer ^9.4.0
  • remark ^14.0.2
  • remark-gfm ^3.0.1
  • remark-react ^9.0.1
  • remark-smartypants ^2.0.0
  • remark-unwrap-images ^3.0.1
  • sass ^1.56.1
  • typescript 4.8.4
  • unist-util-visit ^4.1.1
  • ws ^8.11.0
requirements.txt pypi
  • black >=22.0,<23.0
  • catalogue >=2.0.6,<2.1.0
  • cymem >=2.0.2,<2.1.0
  • cython >=0.25,<3.0
  • flake8 >=3.8.0,<6.0.0
  • hypothesis >=3.27.0,<7.0.0
  • jinja2 *
  • langcodes >=3.2.0,<4.0.0
  • ml_datasets >=0.2.0,<0.3.0
  • mock >=2.0.0,<3.0.0
  • murmurhash >=0.28.0,<1.1.0
  • mypy >=0.990,<0.1000
  • numpy >=1.15.0
  • packaging >=20.0
  • pathy >=0.10.0
  • pre-commit >=2.13.0
  • preshed >=3.0.2,<3.1.0
  • pydantic >=1.7.4,
  • pytest >=5.2.0,
  • pytest-timeout >=1.3.0,<2.0.0
  • requests >=2.13.0,<3.0.0
  • setuptools *
  • smart-open >=5.2.1,<7.0.0
  • spacy-legacy >=3.0.11,<3.1.0
  • spacy-loggers >=1.0.0,<2.0.0
  • srsly >=2.4.3,<3.0.0
  • thinc >=8.1.0,<8.2.0
  • tqdm >=4.38.0,<5.0.0
  • typer >=0.3.0,<0.8.0
  • types-dataclasses >=0.1.3
  • types-mock >=0.1.1
  • types-requests *
  • types-setuptools >=57.0.0
  • typing_extensions >=3.7.4.1,<4.5.0
  • wasabi >=0.9.1,<1.2.0
website/setup/requirements.txt pypi
  • jinja2 >=3.1.0
  • srsly *
.github/workflows/tests.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
.github/workflows/universe_validation.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
pyproject.toml pypi
setup.py pypi