AdaptiveBatchHE
An adaptive batch homomorphic encryption framework for cross-device Federated Learning, which determines cost-efficient and sufficiently secure encryption strategies for clients with heterogeneous data and system capabilities.
Science Score: 41.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file (found)
- ✓ codemeta.json file (found)
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links (scholar.google, ieee.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low: 9.1%)
Keywords
Repository
Basic Info
Statistics
- Stars: 5
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
AdaptiveBatchHE
This repository provides the implementation of the paper "Adaptive Batch Homomorphic Encryption for Joint Federated Learning in Cross-Device Scenarios", published in the IEEE Internet of Things Journal. In the paper, we propose an adaptive batch HE framework for cross-device FL that determines cost-efficient and sufficiently secure encryption strategies for clients with heterogeneous data and system capabilities. The framework achieves accuracy comparable to plain HE (i.e., encryption applied per gradient) while reducing training time by 3×-31× and communication cost by 45×-66×.
[Figure: four panels comparing training time over 100 epochs, testing accuracy over epochs, communication cost in one epoch, and cost efficiency under various HE key sizes]
Our framework consists of the following three key components:
1. Clustering of Clients based on Sparsity of CNNs
The code in the CNN Sparisty directory determines the sparsity vectors of the clients.
federated_main.py is the entry point.
Its input is the path to the dataset.
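For intuition, a per-layer sparsity vector can be thought of as the fraction of near-zero weights in each layer. The threshold and the exact definition in this sketch are illustrative assumptions; the paper defines sparsity precisely.

```python
import numpy as np

def sparsity_vector(layer_weights, tol=1e-3):
    # Fraction of near-zero weights in each layer; `tol` is an
    # illustrative threshold, not necessarily the paper's definition.
    return [float(np.mean(np.abs(w) < tol)) for w in layer_weights]

# Two toy "layers": one half zeros, one fully dense.
layers = [np.array([0.0, 0.5, 0.0, 1.2]), np.array([0.7, -0.3])]
print(sparsity_vector(layers))  # [0.5, 0.0]
```

Clients with similar sparsity vectors can then be grouped with any standard clustering method.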
2. Selection of HE Key Size for Each Client based on Fuzzy Logic
The code in the fuzzy logic directory determines the HE key size for each client.
fuzzy_logic_main.py is the entry point.
It takes three inputs, input_NS, input_TR, and input_CC, each with a value between 0 and 1.
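As a rough sketch of how three normalized inputs could be fuzzified and mapped to a key size: the membership functions, rule base, and the interpretation of input_NS, input_TR, and input_CC below are illustrative guesses, not the system implemented in fuzzy_logic_main.py.

```python
def select_key_size(input_NS, input_TR, input_CC):
    # Map the three normalized inputs (each in [0, 1]) to a Paillier
    # key size. Memberships and rules here are illustrative stand-ins.
    low  = lambda x: max(1.0 - 2.0 * x, 0.0)             # full at 0, gone by 0.5
    mid  = lambda x: max(1.0 - abs(2.0 * x - 1.0), 0.0)  # peak at 0.5
    high = lambda x: max(2.0 * x - 1.0, 0.0)             # gone below 0.5

    # One min-AND rule per output level (NS read as "need for security",
    # TR as "time restriction", CC as "communication cost"; these
    # interpretations are assumptions).
    r_small = min(low(input_NS), max(high(input_TR), high(input_CC)))
    r_mid   = mid(input_NS)
    r_large = min(high(input_NS), low(input_TR))

    sizes = (1024, 2048, 3072)
    strengths = (r_small, r_mid, r_large)
    return sizes[strengths.index(max(strengths))]  # strongest rule wins

print(select_key_size(0.9, 0.1, 0.5))  # 3072: high risk, relaxed deadline
print(select_key_size(0.1, 0.9, 0.5))  # 1024: low risk, tight deadline
```

A real implementation would use skfuzzy's control module for the membership functions and defuzzification.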
3. Accuracy-lossless Batch Encryption and Aggregation
The code in the batch encryption directory performs accuracy-lossless batch encryption and aggregation of the model parameters during FL training.
federated_experiment_main.py is the entry point.
It requires a properly chosen hyperparameter K to run correctly; the reason is explained in detail in the paper. The default value is K = 4. For specific settings, see the comments in the code.
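The batching idea can be sketched as follows: quantize K gradient values, pack them into one large integer with fixed-width slots, and encrypt the packed integer once. Because adding Paillier ciphertexts adds their plaintexts, per-slot sums survive aggregation as long as each slot has headroom. A minimal plaintext-only sketch (the real code would encrypt the packed integer with the phe library; the slot widths, scaling, and non-negative-value restriction here are illustrative simplifications):

```python
def pack(values, slot_bits=16, scale=256):
    # Quantize each value and pack K of them into one integer,
    # one fixed-width slot per value (non-negative values only in
    # this sketch; the paper's encoding handles the general case).
    packed = 0
    for v in values:
        q = round(v * scale)
        assert 0 <= q < (1 << slot_bits), "slot overflow"
        packed = (packed << slot_bits) | q
    return packed

def unpack(packed, k, slot_bits=16, scale=256):
    # Invert pack(): peel slots from the low end, then restore order.
    mask = (1 << slot_bits) - 1
    out = []
    for _ in range(k):
        out.append((packed & mask) / scale)
        packed >>= slot_bits
    return out[::-1]

a = pack([0.5, 1.25, 2.0])   # one client's batched gradients
b = pack([0.25, 0.75, 1.0])  # another client's
# Under Paillier, "adding" two ciphertexts adds the packed plaintexts,
# so per-slot sums come out intact as long as no slot overflows:
print(unpack(a + b, 3))  # [0.75, 2.0, 3.0]
```

This is why K must be chosen carefully: larger K amortizes encryption cost over more gradients, but shrinks the per-slot headroom available for aggregation.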
Prerequisites
To run the code, the following libraries are needed:
- Python >= 3.8
- Pytorch >= 1.10
- torchvision >= 0.11
- phe >= 1.5
- skfuzzy >= 0.4
Check environment.yaml for details.
Citing
If you use this repository, please cite:
```bibtex
@article{han2023adaptiveBatchHE,
  title={Adaptive Batch Homomorphic Encryption for Joint Federated Learning in Cross-Device Scenarios},
  author={Han, Junhao and Yan, Li},
  journal={IEEE Internet of Things Journal},
  volume={11},
  number={6},
  pages={9338--9354},
  year={2023},
}
```
List of publications that cite this work: Google Scholar
Owner
- Login: liyan2015
- Kind: user
- Repositories: 2
- Profile: https://github.com/liyan2015
Citation (CITATION.bib)
```bibtex
@article{han2023adaptiveBatchHE,
  title={Adaptive Batch Homomorphic Encryption for Joint Federated Learning in Cross-Device Scenarios},
  author={Han, Junhao and Yan, Li},
  journal={IEEE Internet of Things Journal},
  volume={11},
  number={6},
  pages={9338--9354},
  year={2023},
}
```
GitHub Events
Total
- Watch event: 3
- Push event: 2
- Fork event: 1
Last Year
- Watch event: 3
- Push event: 2
- Fork event: 1
Dependencies
- absl-py ==1.0.0
- astunparse ==1.6.3
- cachetools ==5.0.0
- chainmap ==1.0.3
- charset-normalizer ==2.0.12
- combomethod ==1.0.12
- cycler ==0.11.0
- flatbuffers ==2.0
- fonttools ==4.33.3
- gast ==0.5.3
- gmpy2 ==2.1.2
- google-auth ==2.6.6
- google-auth-oauthlib ==0.4.6
- google-pasta ==0.2.0
- grpcio ==1.46.1
- h5py ==3.6.0
- idna ==3.3
- importlib-metadata ==4.11.3
- joblib ==1.1.0
- keras ==2.8.0
- keras-preprocessing ==1.1.2
- kiwisolver ==1.4.2
- libclang ==14.0.1
- llvmlite ==0.38.1
- markdown ==3.3.7
- matplotlib ==3.5.2
- nulltype ==2.3.1
- numba ==0.55.1
- numpy ==1.21.6
- oauthlib ==3.2.0
- opt-einsum ==3.3.0
- packaging ==21.3
- phe ==1.5.0
- pillow ==9.1.0
- protobuf ==3.20.1
- pyasn1 ==0.4.8
- pyasn1-modules ==0.2.8
- pyparsing ==3.0.9
- pysnooper ==1.1.1
- python-dateutil ==2.8.2
- requests ==2.27.1
- requests-oauthlib ==1.3.1
- rsa ==4.8
- scipy ==1.8.1
- six ==1.12.0
- tensorboard ==2.8.0
- tensorboard-data-server ==0.6.1
- tensorboard-plugin-wit ==1.8.1
- tensorboardx ==2.5
- tensorflow ==2.8.0
- tensorflow-io-gcs-filesystem ==0.25.0
- termcolor ==1.1.0
- tf-estimator-nightly ==2.8.0.dev2021122109
- torch ==1.10.0
- torchaudio ==0.10.0
- torchvision ==0.11.1
- tqdm ==4.64.0
- typing-extensions ==4.2.0
- urllib3 ==1.26.9
- werkzeug ==2.1.2
- wrapt ==1.14.1
- zipp ==3.8.0



