layer_norm_expressivity_role

Code for the paper "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL'2023)

https://github.com/tech-srl/layer_norm_expressivity_role

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (8.5%) to scientific vocabulary

Keywords

attention layer-normalization layernorm transformers
Last synced: 7 months ago

Repository

Code for the paper "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL'2023)

Basic Info
  • Host: GitHub
  • Owner: tech-srl
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 748 KB
Statistics
  • Stars: 46
  • Watchers: 6
  • Forks: 3
  • Open Issues: 1
  • Releases: 0
Topics
attention layer-normalization layernorm transformers
Created almost 3 years ago · Last pushed over 1 year ago
Metadata Files
Readme Citation

README.md

On the Expressivity Role of LayerNorm in Transformers' Attention

This repository contains the code for reproducing the results from "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL'2023) [PDF].

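As a quick, hedged illustration of the property the paper analyzes (this snippet is not part of the repository's code): LayerNorm without learnable affine parameters subtracts the mean, which projects a vector onto the hyperplane orthogonal to the all-ones vector, and divides by the standard deviation, which rescales it to a fixed norm of √d.

```python
import numpy as np

def layer_norm(x):
    # LayerNorm without learnable affine parameters (gamma=1, beta=0):
    # subtracting the mean projects x onto the hyperplane orthogonal to
    # the all-ones vector; dividing by the std rescales it to norm sqrt(d).
    return (x - x.mean()) / x.std()

d = 8
x = np.random.default_rng(0).normal(size=d)
y = layer_norm(x)

ones = np.ones(d)
print(np.isclose(y @ ones, 0.0))                  # orthogonal to the ones vector
print(np.isclose(np.linalg.norm(y), np.sqrt(d)))  # fixed norm sqrt(d)
```

Both checks hold for any nonconstant input vector, which is the geometric picture the paper builds on.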

Setup

Make sure you have a wandb.ai account and that you are logged in on your machine.

Install the required python packages: pip install -r requirements.txt

Gurobi is needed to find unselectable keys, and requires a license. See here.

Hardware

In general, all experiments can run on either a GPU or a CPU.

Code Structure

  1. The majority subdirectory contains the files needed to reproduce the results of the Majority task (Figure 1a, 1b, 2, 3).
  2. The unselectable subdirectory contains the files needed to reproduce the results of the unselectable experiments (Figure 1c, 1d, 4, Table 1, 2).

Citation

@article{brody2023expressivity,
  title={On the Expressivity Role of LayerNorm in Transformers' Attention},
  author={Brody, Shaked and Alon, Uri and Yahav, Eran},
  journal={arXiv preprint arXiv:2305.02582},
  year={2023}
}

Owner

  • Name: tech-srl
  • Login: tech-srl
  • Kind: organization

Citation (CITATION.cff)

@article{brody2023expressivity,
  title={On the Expressivity Role of LayerNorm in Transformers' Attention},
  author={Brody, Shaked and Alon, Uri and Yahav, Eran},
  journal={arXiv preprint arXiv:2305.02582},
  year={2023}
}

GitHub Events

Total
  • Watch event: 11
  • Fork event: 1
Last Year
  • Watch event: 11
  • Fork event: 1

Issues and Pull Requests

Last synced: 12 months ago

All Time
  • Total issues: 1
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 1
  • Total pull request authors: 0
  • Average comments per issue: 2.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 1
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 1
  • Pull request authors: 0
  • Average comments per issue: 2.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • liveck (1)
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels