lasagne

Lightweight library to build and train neural networks in Theano

https://github.com/lasagne/lasagne

Science Score: 20.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: zenodo.org
  • Committers with academic emails
    5 of 72 committers (6.9%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.4%) to scientific vocabulary

Keywords

deep-learning-library neural-networks python theano

Keywords from Contributors

optimizing-compiler tensors aesara automatic-differentiation symbolic-computation term-rewriting-system transpiler deep-neural-networks distributed autograd
Last synced: 6 months ago

Repository

Lightweight library to build and train neural networks in Theano

Basic Info
Statistics
  • Stars: 3,862
  • Watchers: 214
  • Forks: 941
  • Open Issues: 139
  • Releases: 0
Topics
deep-learning-library neural-networks python theano
Created over 11 years ago · Last pushed almost 4 years ago
Metadata Files
Readme Changelog Contributing License

README.rst

.. image:: https://readthedocs.org/projects/lasagne/badge/
    :target: http://lasagne.readthedocs.org/en/latest/

.. image:: https://travis-ci.org/Lasagne/Lasagne.svg
    :target: https://travis-ci.org/Lasagne/Lasagne

.. image:: https://img.shields.io/coveralls/Lasagne/Lasagne.svg
    :target: https://coveralls.io/r/Lasagne/Lasagne

.. image:: https://img.shields.io/badge/license-MIT-blue.svg
    :target: https://github.com/Lasagne/Lasagne/blob/master/LICENSE

.. image:: https://zenodo.org/badge/16974/Lasagne/Lasagne.svg
    :target: https://zenodo.org/badge/latestdoi/16974/Lasagne/Lasagne

Lasagne
=======

Lasagne is a lightweight library to build and train neural networks in Theano.
Its main features are:

* Supports feed-forward networks such as Convolutional Neural Networks (CNNs),
  recurrent networks including Long Short-Term Memory (LSTM), and any
  combination thereof
* Allows architectures of multiple inputs and multiple outputs, including
  auxiliary classifiers
* Many optimization methods including Nesterov momentum, RMSprop and Adam
* Freely definable cost function and no need to derive gradients due to
  Theano's symbolic differentiation
* Transparent support of CPUs and GPUs due to Theano's expression compiler
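
In practice, Theano compiles these symbolic update rules to CPU or GPU code.
Numerically, the Nesterov momentum update mentioned above can be sketched in
plain numpy (one common formulation of the rule, not Lasagne's exact
bookkeeping in ``lasagne.updates.nesterov_momentum``):

.. code-block:: python

  import numpy as np

  def nesterov_momentum_step(param, velocity, grad,
                             learning_rate=0.01, momentum=0.9):
      # velocity accumulates a decaying sum of gradient steps; the
      # parameter takes a look-ahead step along the new velocity
      velocity = momentum * velocity - learning_rate * grad
      param = param + momentum * velocity - learning_rate * grad
      return param, velocity

  # minimizing f(x) = x**2 (gradient 2*x) drives x towards zero
  x, v = 5.0, 0.0
  for _ in range(200):
      x, v = nesterov_momentum_step(x, v, 2 * x)

Lasagne applies the same rule element-wise to every trainable parameter tensor.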

Its design is governed by six principles:

* Simplicity: Be easy to use, easy to understand and easy to extend, to
  facilitate use in research
* Transparency: Do not hide Theano behind abstractions, directly process and
  return Theano expressions or Python / numpy data types
* Modularity: Allow all parts (layers, regularizers, optimizers, ...) to be
  used independently of Lasagne
* Pragmatism: Make common use cases easy, do not overrate uncommon cases
* Restraint: Do not obstruct users with features they decide not to use
* Focus: "Do one thing and do it well"


Installation
------------

In short, you can install a known compatible version of Theano and the latest
Lasagne development version via:

.. code-block:: bash

  pip install -r https://raw.githubusercontent.com/Lasagne/Lasagne/master/requirements.txt
  pip install https://github.com/Lasagne/Lasagne/archive/master.zip

For more details and alternatives, please see the Installation instructions
in the documentation.


Documentation
-------------

Documentation is available online: http://lasagne.readthedocs.org/

For support, please refer to the lasagne-users mailing list.


Example
-------

.. code-block:: python

  import lasagne
  import theano
  import theano.tensor as T

  # create Theano variables for input and target minibatch
  input_var = T.tensor4('X')
  target_var = T.ivector('y')

  # create a small convolutional neural network
  from lasagne.nonlinearities import leaky_rectify, softmax
  network = lasagne.layers.InputLayer((None, 3, 32, 32), input_var)
  network = lasagne.layers.Conv2DLayer(network, 64, (3, 3),
                                       nonlinearity=leaky_rectify)
  network = lasagne.layers.Conv2DLayer(network, 32, (3, 3),
                                       nonlinearity=leaky_rectify)
  network = lasagne.layers.Pool2DLayer(network, (3, 3), stride=2, mode='max')
  network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                      128, nonlinearity=leaky_rectify,
                                      W=lasagne.init.Orthogonal())
  network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                      10, nonlinearity=softmax)

  # create loss function
  prediction = lasagne.layers.get_output(network)
  loss = lasagne.objectives.categorical_crossentropy(prediction, target_var)
  loss = loss.mean() + 1e-4 * lasagne.regularization.regularize_network_params(
          network, lasagne.regularization.l2)

  # create parameter update expressions
  params = lasagne.layers.get_all_params(network, trainable=True)
  updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=0.01,
                                              momentum=0.9)

  # compile training function that updates parameters and returns training loss
  train_fn = theano.function([input_var, target_var], loss, updates=updates)

  # train network (assuming you've got some training data in numpy arrays)
  for epoch in range(100):
      loss = 0
      for input_batch, target_batch in training_data:
          loss += train_fn(input_batch, target_batch)
      print("Epoch %d: Loss %g" % (epoch + 1, loss / len(training_data)))

  # use trained network for predictions
  test_prediction = lasagne.layers.get_output(network, deterministic=True)
  predict_fn = theano.function([input_var], T.argmax(test_prediction, axis=1))
  print("Predicted class for first test input: %r" % predict_fn(test_data[0]))
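
The training loop above assumes ``training_data`` yields
``(input_batch, target_batch)`` pairs; Lasagne does not ship a batch iterator,
so a small helper is typically written by hand. A minimal numpy sketch (a
hypothetical ``iterate_minibatches`` that shuffles once per epoch and drops
the last partial batch):

.. code-block:: python

  import numpy as np

  def iterate_minibatches(inputs, targets, batch_size, shuffle=True):
      assert len(inputs) == len(targets)
      indices = np.arange(len(inputs))
      if shuffle:
          np.random.shuffle(indices)
      # step through full batches only; a partial tail batch is dropped
      for start in range(0, len(inputs) - batch_size + 1, batch_size):
          excerpt = indices[start:start + batch_size]
          yield inputs[excerpt], targets[excerpt]

  # shapes matching the network above: float32 images, int32 class labels
  X = np.zeros((100, 3, 32, 32), dtype=np.float32)
  y = np.zeros(100, dtype=np.int32)
  batches = list(iterate_minibatches(X, y, batch_size=32))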

For a fully functional example, see ``examples/mnist.py`` in the repository,
and check the Tutorial in the documentation for an in-depth explanation of the
same example. More examples, code snippets and reproductions of recent
research papers are maintained in the separate Lasagne Recipes repository.
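
Numerically, the regularized loss built in the example above (mean categorical
cross-entropy plus ``1e-4`` times the summed squared parameters) can be checked
with a plain numpy sketch. This is an illustration under stated assumptions,
not Lasagne's implementation; the helper names are hypothetical:

.. code-block:: python

  import numpy as np

  def categorical_crossentropy(predictions, targets, eps=1e-7):
      # mean negative log-probability of the target class;
      # predictions: (batch, classes) rows summing to 1, targets: class ids
      p = np.clip(predictions, eps, 1.0)
      return -np.mean(np.log(p[np.arange(len(targets)), targets]))

  def l2_penalty(params):
      # sum of squared values over all parameter tensors
      return sum(np.sum(p ** 2) for p in params)

  preds = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.8, 0.1]])
  targets = np.array([0, 1])
  params = [np.ones((2, 3)), np.ones(3)]
  loss = categorical_crossentropy(preds, targets) + 1e-4 * l2_penalty(params)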


Citation
--------

If you find Lasagne useful for your scientific work, please consider citing it
in resulting publications. We provide a ready-to-use BibTeX entry for citing
Lasagne.


Development
-----------

Lasagne is a work in progress; input is welcome.

Please see the Contribution instructions for details on how you can
contribute!

Owner

  • Name: Lasagne
  • Login: Lasagne
  • Kind: organization

GitHub Events

Total
  • Watch event: 28
  • Pull request event: 1
  • Fork event: 4
Last Year
  • Watch event: 28
  • Pull request event: 1
  • Fork event: 4

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 825
  • Total Committers: 72
  • Avg Commits per committer: 11.458
  • Development Distribution Score (DDS): 0.781
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Jan Schlüter j****r@o****t 181
Sander Dieleman s****n@g****m 135
Colin Raffel c****l@g****m 103
Eben Olson e****n@g****m 66
skaae s****y@g****m 64
Daniel Nouri d****i@g****m 49
Daniel Maturana d****a@c****u 30
Martin Thoma i****o@m****e 24
Geoffrey French f****5@g****m 18
Eric Battenberg e****g@g****m 16
Eric Battenberg e****g@g****m 10
Jack Kelly j****t@x****k 8
Søren Kaae Sønderby s****y@b****l 8
JeffreyDF J****F@G****m 6
Reyhane Askari r****t@g****m 5
Geoffrey French b****3@g****m 5
Botev b****g@g****m 4
Diogo Moitinho de Almeida d****9@g****m 4
Hendrik Weideman w****h@r****u 4
Michael Heilman m****n@c****m 4
Mikhail Korobov k****4@g****m 4
sentient07 v****5@g****m 4
danstowell d****b@g****m 3
Kai Li 1****1@q****m 3
Joshua Chin j****n@g****m 3
Brian McFee b****e@n****u 3
Kaixhin d****n@k****m 3
wuaalb m****w@g****m 2
joncrall e****c@g****m 2
Alexander Mathews a****3@g****m 2
and 42 more...

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 69
  • Total pull requests: 33
  • Average time to close issues: 3 months
  • Average time to close pull requests: 9 months
  • Total issue authors: 48
  • Total pull request authors: 22
  • Average comments per issue: 5.26
  • Average comments per pull request: 3.3
  • Merged pull requests: 15
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 1
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 1
  • Average comments per issue: 0
  • Average comments per pull request: 0.0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • f0k (7)
  • guoxuesong (5)
  • saitarslanboun (3)
  • botev (3)
  • ghost (3)
  • rahashwini (3)
  • twiecki (2)
  • yzhl0929 (2)
  • justheuristic (2)
  • radarsat1 (1)
  • MartinThoma (1)
  • Erotemic (1)
  • ciaua (1)
  • quintendewilde (1)
  • ashish2811 (1)
Pull Request Authors
  • f0k (6)
  • Sentient07 (6)
  • ReyhaneAskari (2)
  • skaae (2)
  • digantamisra98 (1)
  • vvmurthy (1)
  • JasonnnW3000 (1)
  • SimonKohl (1)
  • luizgh (1)
  • TobyPDE (1)
  • jonashen (1)
  • rp2872 (1)
  • wafuwafu13 (1)
  • christopher-beckham (1)
  • pleabargain (1)
Top Labels
Issue Labels
easy (3) documentation (2) enhancement (1)
Pull Request Labels
documentation (1)

Packages

  • Total packages: 2
  • Total downloads:
    • pypi 1,145 last-month
  • Total dependent packages: 2
    (may contain duplicates)
  • Total dependent repositories: 43
    (may contain duplicates)
  • Total versions: 3
  • Total maintainers: 1
pypi.org: lasagne

A lightweight library to build and train neural networks in Theano

  • Versions: 2
  • Dependent Packages: 1
  • Dependent Repositories: 42
  • Downloads: 1,145 Last month
Rankings
Stargazers count: 1.1%
Forks count: 1.3%
Dependent repos count: 2.3%
Average: 3.5%
Downloads: 5.7%
Dependent packages count: 7.3%
Maintainers (1)
Last synced: 6 months ago
conda-forge.org: lasagne

A lightweight library to build and train neural networks in Theano

  • Versions: 1
  • Dependent Packages: 1
  • Dependent Repositories: 1
Rankings
Forks count: 4.9%
Stargazers count: 5.8%
Average: 15.9%
Dependent repos count: 24.1%
Dependent packages count: 28.9%
Last synced: 6 months ago

Dependencies

requirements-dev.txt pypi
  • Jinja2 ==2.7.3 development
  • Sphinx ==1.2.3 development
  • mock * development
  • numpydoc * development
  • pep8 ==1.6.2 development
  • pytest * development
  • pytest-cov * development
  • pytest-pep8 * development
  • sphinx_rtd_theme * development
requirements.txt pypi
  • Theano >=0.8.2
setup.py pypi
  • Theano *
  • numpy *