toxic-video-games-gnn

Toxic video game classification with graph neural networks

https://github.com/thebv/toxic-video-games-gnn

Science Score: 36.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (6.3%) to scientific vocabulary

Keywords

annotation classification gnns graphs tf2 toxcity
Last synced: 6 months ago

Repository

Toxic video game classification with graph neural networks

Basic Info
  • Host: GitHub
  • Owner: TheBv
  • License: lgpl-3.0
  • Language: Jupyter Notebook
  • Default Branch: master
  • Homepage:
  • Size: 57.7 MB
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 1
Topics
annotation classification gnns graphs tf2 toxcity
Created about 2 years ago · Last pushed over 1 year ago
Metadata Files
Readme License Citation

README.MD


Identifying Toxic Video Game Matches with GNN

Repository for the bachelor thesis "Identifying toxic behaviour in online games". The thesis introduces a way to represent a video game match as an event graph and uses Graph Neural Networks (GNNs) to train a model that detects toxic behaviour in a given match.

More specifically, we achieve this by projecting a video game match, which can itself be understood as a temporal network, into an event graph.
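
As a rough illustration of the idea (not the thesis's actual implementation), a match could be turned into an event graph by making every timestamped event a node and connecting events that follow each other within a short time window; the event fields and the window used below are made up:

```python
# Hypothetical sketch: project a temporal list of match events into an event
# graph. Each event becomes a node; edges connect events that occur within a
# short time window of each other. The event schema here is illustrative.
import networkx as nx

events = [
    {"id": 0, "t": 1.0, "player": "A", "type": "kill"},
    {"id": 1, "t": 2.5, "player": "B", "type": "chat"},
    {"id": 2, "t": 3.0, "player": "A", "type": "chat"},
    {"id": 3, "t": 9.0, "player": "B", "type": "death"},
]

def build_event_graph(events, time_window=5.0):
    g = nx.DiGraph()
    for e in events:
        g.add_node(e["id"], t=e["t"], player=e["player"], type=e["type"])
    # connect temporally adjacent events within the window
    for a in events:
        for b in events:
            if a["id"] != b["id"] and 0 < b["t"] - a["t"] <= time_window:
                g.add_edge(a["id"], b["id"], dt=b["t"] - a["t"])
    return g

g = build_event_graph(events)
print(g.number_of_nodes(), g.number_of_edges())
```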

We can then enhance this graph with additional information, such as a graph connecting players who frequently play with each other.
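
A minimal sketch of such a player graph, assuming hypothetical match rosters: each edge weight counts how often two players appeared in the same match.

```python
# Sketch of a weighted player graph; edge weights count co-occurrences of two
# players in the same match. The rosters are made-up examples.
from itertools import combinations
import networkx as nx

matches = [
    {"A", "B", "C"},
    {"A", "B"},
    {"B", "C", "D"},
]

player_graph = nx.Graph()
for roster in matches:
    for p, q in combinations(sorted(roster), 2):
        if player_graph.has_edge(p, q):
            player_graph[p][q]["weight"] += 1
        else:
            player_graph.add_edge(p, q, weight=1)

print(player_graph.edges(data=True))
```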

We can then apply various GNNs to this graph to train a model. Specifically, we chose a simple GNN based on Principal Neighbourhood Aggregation (PNA).
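
For intuition, here is a small NumPy sketch of a single PNA-style aggregation step, combining several neighbourhood aggregators with degree-based scalers as described in the PNA paper. The repository itself uses TF2, so this is only an illustration of the mechanism, not the repository's code:

```python
# PNA-style aggregation: for each node, aggregate neighbour features with
# mean/max/min/std and scale the result by degree-based scalers
# (identity, amplification, attenuation). A learned linear layer would follow.
import numpy as np

def pna_aggregate(x, adj, delta):
    """x: (N, F) node features, adj: (N, N) 0/1 adjacency, delta: avg log-degree."""
    n, f = x.shape
    out = []
    for i in range(n):
        nbrs = np.where(adj[i] > 0)[0]
        if len(nbrs) == 0:
            out.append(np.zeros(3 * 4 * f))
            continue
        h = x[nbrs]
        aggs = [h.mean(0), h.max(0), h.min(0), h.std(0)]
        d = len(nbrs)
        scalers = [1.0, np.log(d + 1) / delta, delta / np.log(d + 1)]
        out.append(np.concatenate([s * a for s in scalers for a in aggs]))
    return np.stack(out)  # (N, 12 * F)

adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
x = np.random.default_rng(0).normal(size=(3, 4))
delta = np.mean(np.log(adj.sum(1) + 1))  # average log-degree of the graph
print(pna_aggregate(x, adj, delta).shape)  # (3, 48)
```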

Results

Type | Dataset | ROC-AUC
--- | --- | ---
Multiclass | Detoxify | 0.6134
Multiclass | Annotation | 0.6957
Multiclass | Annotation-Enhanced | 0.7237

Datasets

Detoxify: Dataset of 10,000 matches, labelled as toxic based on the NLP tool Detoxify (a scoring sketch follows below).

Annotation: Dataset based on roughly 1,000 human-annotated matches.

Annotation-Enhanced: Dataset based on the human-annotated matches, enhanced with a player graph whose edge weights represent the number of times players play with each other.
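
The scoring sketch referenced above: a minimal example of how chat lines could be labelled with the Detoxify package. The 0.5 threshold and the "any toxic line makes the match toxic" rule are assumptions for illustration, not the thesis's exact labelling procedure.

```python
# Hypothetical sketch of scoring match chat with Detoxify (pip install detoxify).
# Threshold and per-match aggregation are illustrative assumptions.
from detoxify import Detoxify

chat_lines = ["gg well played", "you are useless, uninstall"]  # made-up examples

model = Detoxify("original")
scores = model.predict(chat_lines)          # dict of per-line scores
toxic = [s > 0.5 for s in scores["toxicity"]]
match_is_toxic = any(toxic)
print(scores["toxicity"], match_is_toxic)
```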

Citation

@misc{Schrottenbacher2024,
  author      = {Patrick Schrottenbacher},
  title       = {Identifying toxic behaviour in online games},
  institution = {Informatik und Mathematik},
  type        = {bachelorthesis},
  pages       = {35},
  year        = {2024},
  url         = {https://publikationen.ub.uni-frankfurt.de/files/81676/Toxic_video_game_classification.pdf},
  repository  = {https://github.com/TheBv/toxic-video-games-gnn}
}

Owner

  • Name: Patrick Schrottenbacher
  • Login: TheBv
  • Kind: user

GitHub Events

Total
  • Push event: 2
Last Year
  • Push event: 2