how_attentive_are_gats
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
Science Score: 54.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ○ DOI references
- ✓ Academic publication links: links to arxiv.org
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.8%) to scientific vocabulary
Keywords
Repository
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
Basic Info
Statistics
- Stars: 340
- Watchers: 11
- Forks: 41
- Open Issues: 4
- Releases: 0
Topics
Metadata Files
README.md
How Attentive are Graph Attention Networks?
This repository is the official implementation of How Attentive are Graph Attention Networks?.
January 2022: the paper was accepted to ICLR'2022!

Using GATv2
GATv2 is now available as part of the PyTorch Geometric library!
from torch_geometric.nn.conv.gatv2_conv import GATv2Conv
https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#torch_geometric.nn.conv.GATv2Conv
It is also available in the main directory of this repository.
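The following is a minimal usage sketch of the PyTorch Geometric layer on a toy graph; the 4-node graph, feature sizes, and head count are illustrative choices, not taken from this repository:

```python
import torch
from torch_geometric.nn.conv.gatv2_conv import GATv2Conv

# Toy graph: 4 nodes with 16-dimensional features and a [2, E] edge index.
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 0, 3, 2]], dtype=torch.long)

# One GATv2 layer with 4 attention heads; head outputs are concatenated,
# so each node ends up with 4 * 8 = 32 output channels.
conv = GATv2Conv(in_channels=16, out_channels=8, heads=4)
out = conv(x, edge_index)
print(out.shape)  # torch.Size([4, 32])
```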
GATv2 is now available as part of the DGL library!
from dgl.nn.pytorch import GATv2Conv
https://docs.dgl.ai/en/latest/api/python/nn.pytorch.html#gatv2conv
It is also available in this repository.
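Similarly, a minimal sketch with the DGL layer; the toy graph and dimensions are again illustrative:

```python
import dgl
import torch
from dgl.nn.pytorch import GATv2Conv

# Toy graph: 4 nodes with bidirectional edges 0<->1 and 2<->3,
# plus 16-dimensional node features.
g = dgl.graph((torch.tensor([0, 1, 2, 3]), torch.tensor([1, 0, 3, 2])))
feat = torch.randn(4, 16)

# One GATv2 layer with 4 heads; DGL keeps the head dimension separate,
# so the output shape is (num_nodes, num_heads, out_feats).
conv = GATv2Conv(in_feats=16, out_feats=8, num_heads=4)
out = conv(g, feat)
print(out.shape)  # torch.Size([4, 4, 8])
```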
GATv2 is now available as part of Google's TensorFlow GNN library!
from tensorflow_gnn.graph.keras.layers.gat_v2 import GATv2Convolution
Code Structure
Since our experiments (Section 4) are based on different frameworks, this repository is divided into several sub-projects:
1. The subdirectory arxiv_mag_products_collab_citation2_noise contains the needed files to reproduce the results of
Node-Prediction, Link-Prediction, and Robustness to Noise (Tables 2a, 3 and Figure 4).
2. The subdirectory proteins contains the needed files to reproduce the results of ogbn-proteins in Node-Prediction (Table 2b).
3. The subdirectory dictionary_lookup contains the needed files to reproduce the results of the DictionaryLookup benchmark (Figure 3).
4. The subdirectory tf-gnn-samples contains the needed files to reproduce the results of the VarMisuse and QM9 datasets
(Table 1 and Table 4).
Requirements
Each subdirectory contains its own requirements and dependencies.
Generally, all subdirectories depend on PyTorch 1.7.1 and PyTorch Geometric version 1.7.0 (proteins depends on DGL version 0.6.0).
The subdirectory tf-gnn-samples (VarMisuse and QM9) depends on TensorFlow 1.13.
Hardware
In general, all experiments can run on either GPU or CPU.
Citation
How Attentive are Graph Attention Networks?
@inproceedings{
brody2022how,
title={How Attentive are Graph Attention Networks?},
author={Shaked Brody and Uri Alon and Eran Yahav},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=F72ximsx7C1}
}
Owner
- Name: tech-srl
- Login: tech-srl
- Kind: organization
- Repositories: 25
- Profile: https://github.com/tech-srl
Citation (CITATION.cff)
@inproceedings{
brody2022how,
title={How Attentive are Graph Attention Networks?},
author={Shaked Brody and Uri Alon and Eran Yahav},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=F72ximsx7C1}
}
GitHub Events
Total
- Issues event: 1
- Watch event: 40
- Fork event: 5
Last Year
- Issues event: 1
- Watch event: 40
- Fork event: 5
Committers
Last synced: 8 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| urialon | u****1@g****m | 17 |
| Shaked Brody | s****r@g****m | 9 |
| Uri Alon | u****n@t****h | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 8
- Total pull requests: 0
- Average time to close issues: 2 months
- Average time to close pull requests: N/A
- Total issue authors: 7
- Total pull request authors: 0
- Average comments per issue: 2.25
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 1
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 0
- Average comments per issue: 0.0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- adrianjav (2)
- wenhaozheng-nju (1)
- LeadBeetle (1)
- JiaYingjuan (1)
- ALEX13679173326 (1)
- Jonbroad15 (1)
- Zhayuanhe (1)
Pull Request Authors
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- ogb >=1.3.1
- torch >=1.7.1
- torch-geometric >=1.7.0
- torch-scatter >=2.0.4
- torch-sparse >=0.6.0
- tqdm >=4.42.1
- attrdict ==2.0.1
- seaborn *
- sklearn *
- torch >=1.7.1
- torch-geometric >=1.7.0
- torch-scatter >=2.0.4
- torch-sparse >=0.6.0
- torchvision *
- dgl ==0.6.0
- ogb >=1.3.1
- torch >=1.7.1
- tqdm >=4.42.1
- docopt *
- dpu-utils >=0.1.30
- numpy *
- tensorflow-gpu >=1.13.1