Recent Releases of https://github.com/explosion/spacy-transformers

https://github.com/explosion/spacy-transformers - v1.3.8: Fix wheels on v1.3.7 release

v1.3.7 updates the transformers pin to allow use of new models such as ModernBERT. However, there was an error in the wheel publication; this release corrects it.

- Python
Published by github-actions[bot] about 1 year ago

https://github.com/explosion/spacy-transformers - v1.3.6

✨ New features and improvements

  • Bump the transformers pin to allow up to v4.41.x
  • Updates to the test suite

👥 Contributors

@danieldk, @honnibal, @ines, @svlandeg

- Python
Published by github-actions[bot] about 1 year ago

https://github.com/explosion/spacy-transformers - v1.3.5

  • Relax the spaCy constraint to <4.1.0 to support spaCy v4.
  • Fix a compilation issue with recent Cython 3 versions (#415).

- Python
Published by danieldk almost 2 years ago

https://github.com/explosion/spacy-transformers - v1.3.4

  • Extend support for transformers to v4.36.

- Python
Published by adrianeboyd about 2 years ago

https://github.com/explosion/spacy-transformers - v1.3.3

  • Drop support for Python 3.6.
  • Extend support for transformers to v4.35.

- Python
Published by adrianeboyd over 2 years ago

https://github.com/explosion/spacy-transformers - v1.3.2

  • Extend support for transformers to v4.34.

- Python
Published by adrianeboyd over 2 years ago

https://github.com/explosion/spacy-transformers - v1.3.1

This release addresses incompatibilities related to saving and loading models across a range of transformers versions following changes in transformers v4.31.

  • Extend support to transformers v4.33 (#402).
  • Add fallback to load state_dict with strict=False, due to incompatibilities related to state_dict keys between transformers v4.30 and v4.31 (#398).
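
The strict=False fallback above is essentially PyTorch's non-strict state_dict loading. A minimal sketch of the idea (not the library's actual code, and the helper name is made up):

```python
import torch

def load_with_fallback(model: torch.nn.Module, state_dict: dict) -> None:
    """Try strict loading first, then fall back to strict=False on key mismatches."""
    try:
        model.load_state_dict(state_dict)  # strict=True by default
    except RuntimeError:
        # Non-strict loading skips missing/unexpected keys (such as the keys
        # renamed between transformers v4.30 and v4.31) instead of raising.
        result = model.load_state_dict(state_dict, strict=False)
        print("Missing keys:", result.missing_keys)
        print("Unexpected keys:", result.unexpected_keys)
```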

Options for improved pipeline package compatibility

If you are packaging models with spacy package and want your users to avoid the strict=False fallback loading and the related state_dict key warnings, you may want to add a stricter transformers requirement that corresponds to the version used during training.

If you're training with the newest supported transformers version (v4.33.x), you could currently add transformers>=4.31 to your package requirements. Note that this would restrict your users to Python 3.8+ and PyTorch 1.9+ due to transformers requirements.

Create meta.json with your additional requirements:

```json
{
  "requirements": ["transformers>=4.31"]
}
```

Package with spacy package:

```shell
spacy package --meta meta.json input_dir output_dir
```

The requirement transformers>=4.31 will be added to any other package requirements that are automatically determined based on your pipeline config.
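
To double-check what ended up in the built package, you can inspect the pipeline meta after installing and loading it. A small sketch (the package name en_my_pipeline is hypothetical):

```python
import spacy

nlp = spacy.load("en_my_pipeline")  # hypothetical package name
# nlp.meta mirrors the package's meta.json; "requirements" should contain the
# added transformers pin alongside the automatically determined requirements.
print(nlp.meta.get("requirements", []))
```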

- Python
Published by adrianeboyd over 2 years ago

https://github.com/explosion/spacy-transformers - v1.3.0

NOTE: This release has been yanked on PyPI because of incompatibilities between pipelines saved with transformers v4.30 and v4.31. Please use spacy-transformers v1.2.x instead.

  • Update build constraints to use numpy v1.25+ for Python 3.9+ (#394).
  • Extend support for transformers up to v4.31 (#395).

⚠️ After upgrading to transformers v4.31+ you may not be able to load certain older saved pipelines. All older saved models can still be loaded with transformers<4.31, or you can retrain your models so they are compatible with newer versions of transformers (#395).

If you have created packages with spacy package using spacy-transformers v1.2 or earlier, those package requirements should be restricted to compatible versions of transformers and should not require updates.
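
If you are unsure which side of the boundary an environment is on, a quick way to check the installed transformers version (a sketch using the packaging library, not something this release ships):

```python
from packaging.version import Version

import transformers

if Version(transformers.__version__) >= Version("4.31.0"):
    print("transformers >= 4.31: pipelines saved with transformers < 4.31 may "
          "not load; pin transformers < 4.31 or retrain the pipeline.")
else:
    print("transformers < 4.31: previously saved pipelines should load as before.")
```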

- Python
Published by adrianeboyd over 2 years ago

https://github.com/explosion/spacy-transformers - v1.2.5

  • Extend support for transformers up to v4.30.x.

- Python
Published by adrianeboyd over 2 years ago

https://github.com/explosion/spacy-transformers - v1.2.4

  • Extend support for transformers up to v4.29.x.

- Python
Published by adrianeboyd almost 3 years ago

https://github.com/explosion/spacy-transformers - v1.2.3

  • Extend support for transformers up to v4.28.x.
  • Implement coalesced pooling over entire batches (#368).

- Python
Published by adrianeboyd almost 3 years ago

https://github.com/explosion/spacy-transformers - v1.2.2

  • Transformer.predict: do not broadcast to listeners, requires spacy>=3.5.0 (#345)
  • Correct and clarify the handling of empty/zero-length Docs during training and inference (#365)
  • Remove superfluous datatype and device conversions, requires torch>=1.8.0 (#369)
  • Fix memory leak in offsets mapping alignment for fast tokenizers (#373)

- Python
Published by adrianeboyd about 3 years ago

https://github.com/explosion/spacy-transformers - v1.2.1

  • Extend support for transformers up to v4.26.x.

- Python
Published by adrianeboyd about 3 years ago

https://github.com/explosion/spacy-transformers - v1.2.0

  • For fast tokenizers, use the offset mapping provided by the tokenizer (#338).

Using the offset mapping instead of the heuristic alignment from spacy-alignments resolves unexpected and missing alignments such as those discussed in https://github.com/explosion/spaCy/discussions/6563, https://github.com/explosion/spaCy/discussions/10794 and https://github.com/explosion/spaCy/discussions/12023.

⚠️ Slow and fast tokenizers will no longer give identical results due to potential differences in the alignments between transformer tokens and spaCy tokens. We recommend retraining all models with fast tokenizers for use with spacy-transformers v1.2.

  • Serialize the tokenizer use_fast setting (#339).
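
As an illustration of where the use_fast setting lives, a minimal pipeline-construction sketch (roberta-base is only an example model; settings not listed here fall back to the component defaults):

```python
import spacy

nlp = spacy.blank("en")
nlp.add_pipe(
    "transformer",
    config={
        "model": {
            "name": "roberta-base",
            # use_fast selects the fast tokenizer; as of v1.2 this setting is
            # serialized together with the pipeline.
            "tokenizer_config": {"use_fast": True},
        }
    },
)
# nlp.initialize() would download the model weights and tokenizer at this point.
```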

- Python
Published by adrianeboyd about 3 years ago

https://github.com/explosion/spacy-transformers - v1.1.9

  • Extend support for transformers up to v4.25.x.
  • Add support for Python 3.11 (currently limited to Linux due to the platforms supported by PyTorch v1.13.x).

- Python
Published by adrianeboyd about 3 years ago

https://github.com/explosion/spacy-transformers - v1.1.7

  • Extend support for transformers up to v4.20.x.
  • Convert all transformer outputs to XP arrays at once (#330).
  • Support alternate model loaders in HFShim and HFWrapper (#332).

- Python
Published by adrianeboyd over 3 years ago

https://github.com/explosion/spacy-transformers - v1.1.8

  • Extend support for transformers up to v4.21.x.
  • Support MPS device in HFShim (#328).
  • Track seen docs during alignment to improve speed (#337).
  • Don't require examples in Transformer.initialize (#341).

- Python
Published by adrianeboyd over 3 years ago

https://github.com/explosion/spacy-transformers - v1.1.6

  • Extend support for transformers up to v4.19.x.
  • Fix issue #324: Skip backprop for transformer if not available, for example if the transformer is frozen.

- Python
Published by adrianeboyd over 3 years ago

https://github.com/explosion/spacy-transformers - v1.1.5

✨ New features and improvements

  • Extend support for transformers up to v4.17.x.

👥 Contributors

@adrianeboyd

- Python
Published by adrianeboyd almost 4 years ago

https://github.com/explosion/spacy-transformers - v1.1.4

✨ New features and improvements

  • Extend support for transformers up to v4.15.x.

👥 Contributors

@adrianeboyd, @danieldk

- Python
Published by adrianeboyd about 4 years ago

https://github.com/explosion/spacy-transformers - v1.1.3

✨ New features and improvements

  • Extend support for transformers up to v4.12.x.

👥 Contributors

@adrianeboyd

- Python
Published by adrianeboyd about 4 years ago

https://github.com/explosion/spacy-transformers - v1.1.2

🔴 Bug fixes

  • Fix #315: Enable loading of v1.0.x pipelines on Windows.

👥 Contributors

@adrianeboyd, @ryndaniels, @svlandeg

- Python
Published by adrianeboyd over 4 years ago

https://github.com/explosion/spacy-transformers - v1.1.1

🔴 Bug fixes

  • Fix #309: Fix parameter ordering and defaults for new parameters in TransformerModel architectures.
  • Fix #310: Fix config and model issues when replacing listeners.

👥 Contributors

@adrianeboyd, @svlandeg

- Python
Published by adrianeboyd over 4 years ago

https://github.com/explosion/spacy-transformers - v1.1.0

✨ New features and improvements

  • Refactor and improve transformer serialization for better support of inline transformer components and replacing listeners.
  • Provide the transformer model output as ModelOutput instead of tuples in TransformerData.model_output and FullTransformerBatch.model_output. For backwards compatibility, the tuple format remains available under TransformerData.tensors and FullTransformerBatch.tensors. See more details in the transformer API docs.
  • Add support for transformer_config settings such as output_attentions. Additional output is stored under TransformerData.model_output (see the sketch after this list). More details in the TransformerModel docs.
  • Add support for mixed-precision training.
  • Improve training speed by streamlining allocations for tokenizer output.
  • Extend support for transformers up to v4.11.x.
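
To make the transformer_config and model_output changes concrete, a small sketch (roberta-base is only an example; nlp.initialize() downloads the model weights):

```python
import spacy

nlp = spacy.blank("en")
nlp.add_pipe(
    "transformer",
    config={
        "model": {
            "name": "roberta-base",
            # Extra settings passed to the Hugging Face forward pass; their
            # additional output is stored under TransformerData.model_output.
            "transformer_config": {"output_attentions": True},
        }
    },
)
nlp.initialize()

doc = nlp("Attention weights are now part of the stored output.")
trf_data = doc._.trf_data
print(type(trf_data.model_output))   # a transformers ModelOutput
print(trf_data.model_output.keys())  # should include "attentions" with the setting above
print(len(trf_data.tensors))         # backwards-compatible tuple view
```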

🔴 Bug fixes

  • Fix support for GPT2 models.

⚠️ Backwards incompatibilities

  • The serialization format for transformer components has changed in v1.1 and is not compatible with spacy-transformers v1.0.x. Pipelines trained with v1.0.x can be loaded with v1.1.x, but pipelines saved with v1.1.x cannot be loaded with v1.0.x.
  • TransformerData.tensors and FullTransformerBatch.tensors return a tuple instead of a list.

👥 Contributors

@adrianeboyd, @bryant1410, @danieldk, @honnibal, @ines, @KennethEnevoldsen, @svlandeg

- Python
Published by adrianeboyd over 4 years ago

https://github.com/explosion/spacy-transformers - v1.0.6: Bugfix for replacing listeners

  • Fix copying of grad_factor when replacing listeners.

- Python
Published by ines over 4 years ago

https://github.com/explosion/spacy-transformers - v1.0.5: Bugfix for replacing listeners

  • Fix replacing listeners: #277
  • Require spaCy 3.1.0 or higher

- Python
Published by svlandeg over 4 years ago

https://github.com/explosion/spacy-transformers - v1.0.4

  • Extend transformers support to <4.10.0
  • Enable pickling of span getters and annotation setters, which is required for multiprocessing with spawn
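
For context on why the pickling matters, multiprocessing with the spawn start method looks roughly like this (a sketch; en_core_web_trf is just an example pipeline and must be installed separately):

```python
import spacy

if __name__ == "__main__":  # required when the start method is spawn
    nlp = spacy.load("en_core_web_trf")
    texts = ["First document.", "Second document."] * 50
    # The worker processes receive the pipeline components, so the span
    # getters and annotation setters they contain have to be picklable.
    for doc in nlp.pipe(texts, n_process=2, batch_size=16):
        print(len(doc), doc[0].text)
```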

- Python
Published by adrianeboyd over 4 years ago

https://github.com/explosion/spacy-transformers - v1.0.3

  • Allow spaCy 3.1
  • Extend transformers support to <4.7.0

- Python
Published by svlandeg over 4 years ago

https://github.com/explosion/spacy-transformers - v1.0.2

✨ New features and improvements

  • Add support for transformers v4.3-v4.5
  • Add extra for CUDA 11.2

🔴 Bug fixes

  • Fix #264, #265: Improve handling of empty docs
  • Fix #269: Add trf_data extension in Transformer.__call__ and Transformer.pipe to support distributed processing

👥 Contributors

Thanks to @bryant1410 for the pull requests and contributions!

- Python
Published by adrianeboyd almost 5 years ago

https://github.com/explosion/spacy-transformers - v1.0.1

🔴 Bug fixes

  • Fix listener initialization when model width is unset.

- Python
Published by ines about 5 years ago

https://github.com/explosion/spacy-transformers - v1.0.0: Rewrite for spaCy v3, Transformer component and TransformerListener, plus more functions

This release requires spaCy v3.

✨ New features and improvements

  • Rewrite library from scratch for spaCy v3.0.
  • Transformer component for easy pipeline integration.
  • TransformerListener to share transformer weights between components.
  • Built-in registered functions that are available in spaCy if spacy-transformers is installed in the same environment.
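
A minimal usage sketch of the new component (assumes the separately installed en_core_web_trf pipeline; any transformer-based pipeline works the same way):

```python
import spacy

nlp = spacy.load("en_core_web_trf")
doc = nlp("The pipeline components share one transformer via TransformerListener.")
# The Transformer component stores its output on the Doc as doc._.trf_data.
print(len(doc._.trf_data.tensors))      # raw transformer output arrays
print(doc._.trf_data.tensors[0].shape)  # e.g. (spans, wordpieces, width)
```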

- Python
Published by ines about 5 years ago

https://github.com/explosion/spacy-transformers - v1.0.0rc2

🌙 This release is a pre-release and requires spaCy v3 (nightly).

✨ New features and improvements

  • Add support for Python 3.9
  • Add support for transformers v4

🔴 Bug fixes

  • Fix #230: Add upstream argument to TransformerListener.v1
  • Fix #238: Skip special tokens during alignment
  • Fix #246: Raise error if model max length exceeded

- Python
Published by adrianeboyd about 5 years ago

https://github.com/explosion/spacy-transformers - v1.0.0rc0

🌙 This release is a pre-release and requires spaCy v3 (nightly).

✨ New features and improvements

  • Rewrite library from scratch for spaCy v3.0.
  • Transformer component for easy pipeline integration.
  • TransformerListener to share transformer weights between components.
  • Built-in registered functions that are available in spaCy if spacy-transformers is installed in the same environment.

- Python
Published by ines over 5 years ago

https://github.com/explosion/spacy-transformers - v0.6.2: Bug fix for Tok2Vec speed

  • Fix issue #204: Store model_type in tok2vec config to fix speed degradation

- Python
Published by adrianeboyd over 5 years ago

https://github.com/explosion/spacy-transformers - v0.6.1: Updates for spaCy v2.3 and transformers AutoConfig

⚠️ This release requires downloading new models.

  • Update spacy-transformers for spaCy v2.3
  • Update and extend supported transformers versions to >=2.4.0,<2.9.0
  • Use transformers.AutoConfig to support loading pretrained models from https://huggingface.co/models
  • #123: Fix alignment algorithm using pytokenizations

Thanks to @tamuhey for the pull request!

- Python
Published by adrianeboyd over 5 years ago

https://github.com/explosion/spacy-transformers - v0.5.3: Bug fixes for truncation

Bug fixes related to alignment and truncation:

  • #191: Reset max_len in case of alignment error
  • #196: Fix wordpiecer truncation to be per sentence

Enhancement:

  • #162: Let nlp.update handle Doc type inputs

Thanks to @ZhuoruLin for the pull requests and helping us debug issues related to batching and truncation!

- Python
Published by adrianeboyd over 5 years ago

https://github.com/explosion/spacy-transformers - v0.6.0: Update to transformers v2.5.0

Update to a newer version of transformers.

This library is being rewritten for spaCy v3, in order to improve its flexibility and performance and to make it easier to stay up to date with new transformer models. See here for details: https://github.com/explosion/spacy-transformers/pull/173

- Python
Published by honnibal over 5 years ago

https://github.com/explosion/spacy-transformers - v0.5.2: Bug fixes to alignment

Fix various alignment and preprocessing bugs.

- Python
Published by honnibal over 5 years ago

https://github.com/explosion/spacy-transformers - v0.5.1

  • Downgrade version pin of importlib_metadata to prevent conflict.
  • Fix issue #92: Fix index error when calculating doc.tensor.

Thanks to @ssavvi for the pull request!

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.5.0

⚠️ This release requires downloading new models. Also note the new model names that specify trf (transformers) instead of pytt (PyTorch transformers).

  • Rename package from spacy-pytorch-transformers to spacy-transformers.
  • Update to spacy>=2.2.0.
  • Upgrade to latest transformers.
  • Improve code and repo organization.

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.4.0

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.3.0

  • Add out-of-the-box support for RoBERTa.
  • Add pre-packaged RoBERTa model.
  • Update to pytorch-transformers v1.1.
  • Fix serialization when model was saved from GPU.

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.2.0

  • Add support for GLUE benchmark tasks.
  • Support text-pair classification. The specifics of this are likely to change, but you can see run_glue.py for current usage.
  • Improve reliability of tokenization and alignment.
  • Add support for segment IDs to the PyTT_Wrapper class. These can now be passed in as a second column of the RaggedArray input. See the model_registry.get_word_pieces function for example usage.
  • Set default maximum sequence length to 128.
  • Fix bug that caused settings not to be passed into PyTT_TextCategorizer on model initialization.
  • Fix serialization of XLNet model.

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.1.1

  • Handle unaligned tokens in extension attributes.

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.1.0

⚠️ This version requires downloading new models.

  • Fix issue #15: Fix serialization of config and make models load correctly offline.
  • Improve accuracy of textcat by passing hyper-parameters correctly (Adam epsilon, L2).
  • Support pooler output for BERT model.
  • Add fine_tune_pooler_output model architecture option for pytt_textcat.
  • Add GLUE benchmark script in examples/tasks/run_glue.py.
  • Improve overall stability.

- Python
Published by ines over 6 years ago

https://github.com/explosion/spacy-transformers - v0.0.1

- Python
Published by ines over 6 years ago