043-temporally-and-distributionally-robust-optimization-for-cold-start-recommendation

https://github.com/szu-advtech-2024/043-temporally-and-distributionally-robust-optimization-for-cold-start-recommendation

Science Score: 23.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.9%) to scientific vocabulary

Scientific Fields

Computer Science / Artificial Intelligence and Machine Learning - 40% confidence
Last synced: 4 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: SZU-AdvTech-2024
  • Default Branch: main
  • Size: 0 Bytes
Statistics
  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created 12 months ago · Last pushed 12 months ago
Metadata Files
Citation

https://github.com/SZU-AdvTech-2024/043-Temporally-and-Distributionally-Robust-Optimization-for-Cold-Start-Recommendation/blob/main/

# Temporally and Distributionally Robust Optimization for Cold-start Recommendation
:bulb: This is the PyTorch implementation of our paper:
> [Temporally and Distributionally Robust Optimization for Cold-start Recommendation](https://arxiv.org/pdf/2312.09901.pdf)
>
> Xinyu Lin, Wenjie Wang, Jujia Zhao, Yongqi Li, Fuli Feng, Tat-Seng Chua

## Environment
- Anaconda 3
- Python 3.7.11
- PyTorch 1.10.0
- NumPy 1.21.4
- kmeans_pytorch

## Usage

### Data
The experimental data are in the './data' folder, covering three datasets: Amazon, Micro-video, and Kwai.

### :red_circle: Training 
```
python main.py --model_name=$1 --data_path=$2 --batch_size=$3 --l_r=$4 --reg_weight=$5 --num_group=$6 --num_period=$7 --mu=$8 --eta=$9 --lam=${10} --split_mode=${11} --log_name=${12} --gpu=${13}
```
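Note that the parameters from `$10` onward must be written with braces, as in the command above: in POSIX shell, `$10` expands to the first argument followed by a literal `0`, not to the tenth argument. A quick demonstration:

```shell
# Why ${10} is required: $10 is parsed as ${1} followed by "0".
set -- a b c d e f g h i j   # positional parameters $1..${10}
echo "$10"    # prints "a0" (first argument + literal 0)
echo "${10}"  # prints "j" (the tenth argument)
```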
or use run.sh
```
sh run.sh
```
- The log file will be in the './code/log/' folder. 
- The explanation of hyper-parameters can be found in './code/main.py'. 
- The default hyper-parameter settings are detailed in './code/hyper-parameters.txt'.

:star2: TDRO is a model-agnostic training framework and can be applied to any cold-start recommender model. Simply create your cold-start recommender model script in the './code' folder, following the structure of "model_CLCRec.py". Alternatively, you may adapt the function ``train_TDRO`` in "Train.py" into your own code to train your cold-start recommender model with TDRO.
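To illustrate the group-reweighting idea behind distributionally robust training at a high level, here is a minimal self-contained sketch of an exponentiated-gradient weight update over per-group losses. This is an assumption-laden illustration only: `update_group_weights` is a hypothetical helper, not the repository's actual ``train_TDRO`` API, and it omits the temporal (trend-aware) component of TDRO; see the paper and "Train.py" for the real method.

```python
import math

def update_group_weights(weights, group_losses, eta=0.2):
    """Exponentiated-gradient step used in group-DRO-style training:
    groups with higher loss receive proportionally larger weight.
    (Hypothetical helper for illustration; not the repo's train_TDRO.)"""
    new = [w * math.exp(eta * l) for w, l in zip(weights, group_losses)]
    total = sum(new)
    return [w / total for w in new]  # renormalize to a distribution

# Example: 3 groups, the last group currently has the highest loss.
weights = [1 / 3, 1 / 3, 1 / 3]
losses = [0.5, 0.7, 1.2]
for _ in range(10):
    weights = update_group_weights(weights, losses)
print(weights)  # weight mass shifts toward the high-loss group
```

In a full training loop, the weighted sum of group losses would then be backpropagated, so the model focuses on the worst-performing groups rather than the average.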

### :large_blue_circle: Inference
Get the results of TDRO by running inference.py:

```
python inference.py --inference --data_path=$1 --ckpt=$2 --gpu=$3
```
or use inference.sh
```
sh inference.sh dataset gpu
```

### :white_circle: Examples
1. Train on Amazon dataset
```
cd ./code
sh run.sh TDRO amazon 1000 0.001 0.001 5 5 0.2 0.2 0.3 global log 0
```
2. Inference 
```
cd ./code
sh inference.sh amazon 0
```
## Citation
If you find our work useful for your research, please consider citing:
```
@inproceedings{lin2023temporally,
  title={Temporally and Distributionally Robust Optimization for Cold-start Recommendation},
  author={Lin, Xinyu and Wang, Wenjie and Zhao, Jujia and Li, Yongqi and Feng, Fuli and Chua, Tat-Seng},
  booktitle={AAAI},
  year={2024}
}
```

## License

NUS [NExT++](https://www.nextcenter.org/)

Owner

  • Name: SZU-AdvTech-2024
  • Login: SZU-AdvTech-2024
  • Kind: organization

GitHub Events

Total
  • Push event: 2
  • Create event: 3
Last Year
  • Push event: 2
  • Create event: 3