shufflebench
A benchmark for generic, large-scale shuffle operations on continuous streams of data, implemented with state-of-the-art stream processing frameworks.
Science Score: 57.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 3 DOI reference(s) in README
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (10.4%) to scientific vocabulary
Repository
Basic Info
Statistics
- Stars: 13
- Watchers: 6
- Forks: 2
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
ShuffleBench
A benchmark for generic, large-scale shuffle operations on continuous streams of data, implemented with state-of-the-art stream processing frameworks.
Currently, we provide implementations for the following frameworks:
- Apache Flink
- Apache Spark (Structured Streaming)
- Apache Kafka Streams
- Hazelcast (with its Jet engine)
Additionally, a load generator and a tool for measuring and exporting latency are provided.
Usage
The most straightforward way to run experiments with ShuffleBench is to use the Theodolite benchmarking framework. This allows you to run experiments on Kubernetes clusters in a fully automated, reproducible way, including setting up the stream processing application, starting the load generator, measuring performance metrics, and collecting the results.
Theodolite benchmark specifications for ShuffleBench can be found in kubernetes. There, you can also find detailed instructions on how to run the benchmarks.
For lower-level control, you can also run the benchmark implementations and the load generator manually using the Kubernetes manifests in kubernetes, or run the provided container images or Java applications directly.
Build and Package Project
Gradle is used to build, test, and package the benchmark implementations, the load generator, and the latency exporter tool. To build all subprojects, run:
```sh
./gradlew build
```
Build and Publish Images
Except for the ShuffleBench implementation for Spark, all implementations can be packaged as container images and pushed to a registry using Jib by running:
```sh
ORG_GRADLE_PROJECT_imageRepository=<your.registry.com>/shufflebench ./gradlew jib
```
For Spark, we have to build and push the image manually (e.g., using the Docker daemon):
```sh
docker build -t <your.registry.com>/shufflebench/shufflebench-spark shuffle-spark/
docker push <your.registry.com>/shufflebench/shufflebench-spark
```
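The Spark image reference should mirror the layout Jib uses for the other images: one `shufflebench-<framework>` image under the shared repository prefix. A minimal sketch of composing that reference (the registry host below is a placeholder, not from the repository):

```shell
# Placeholder registry host; substitute your own (assumption, not from the repo).
REGISTRY="your.registry.com/shufflebench"
# The same reference is passed to both `docker build -t` and `docker push`.
SPARK_IMAGE="${REGISTRY}/shufflebench-spark"
echo "${SPARK_IMAGE}"  # → your.registry.com/shufflebench/shufflebench-spark
```

Keeping the prefix in a single variable avoids the build and push commands drifting apart.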
How to Cite
If you use ShuffleBench in your research, please cite:
Sören Henning, Adriano Vogel, Michael Leichtfried, Otmar Ertl, and Rick Rabiser. 2024. ShuffleBench: A Benchmark for Large-Scale Data Shuffling Operations with Distributed Stream Processing Frameworks. In Proceedings of the 15th ACM/SPEC International Conference on Performance Engineering (ICPE '24). Association for Computing Machinery, New York, NY, USA, 2–13. DOI: 10.1145/3629526.3645036
Owner
- Name: Dynatrace Research
- Login: dynatrace-research
- Kind: organization
- Email: research@dynatrace.com
- Website: https://research.dynatrace.com
- Repositories: 14
- Profile: https://github.com/dynatrace-research
This organization contains Open Source projects maintained by Dynatrace. If not stated differently, these projects are not officially supported.
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you use ShuffleBench, please cite it using these metadata."
authors:
  - family-names: Henning
    given-names: "Sören"
    orcid: "https://orcid.org/0000-0001-6912-2549"
  - family-names: Vogel
    given-names: Adriano
    orcid: "https://orcid.org/0000-0003-3299-2641"
  - family-names: Leichtfried
    given-names: Michael
    orcid: "https://orcid.org/0000-0002-4415-6694"
  - family-names: Ertl
    given-names: Otmar
    orcid: "https://orcid.org/0000-0001-7322-6332"
  - family-names: Rabiser
    given-names: Rick
    orcid: "https://orcid.org/0000-0003-3862-1112"
title: ShuffleBench
repository-code: "https://github.com/dynatrace-research/ShuffleBench"
license: "Apache-2.0"
doi: "10.1145/3629526.3645036"
preferred-citation:
  type: conference-paper
  authors:
    - family-names: Henning
      given-names: "Sören"
      orcid: "https://orcid.org/0000-0001-6912-2549"
    - family-names: Vogel
      given-names: Adriano
      orcid: "https://orcid.org/0000-0003-3299-2641"
    - family-names: Leichtfried
      given-names: Michael
      orcid: "https://orcid.org/0000-0002-4415-6694"
    - family-names: Ertl
      given-names: Otmar
      orcid: "https://orcid.org/0000-0001-7322-6332"
    - family-names: Rabiser
      given-names: Rick
      orcid: "https://orcid.org/0000-0003-3862-1112"
  conference:
    name: "15th ACM/SPEC International Conference on Performance Engineering (ICPE '24)"
    date-start: 2024-05-07
    date-end: 2024-05-11
  collection-title: "Proceedings of the 15th ACM/SPEC International Conference on Performance Engineering"
  title: "ShuffleBench: A Benchmark for Large-Scale Data Shuffling Operations with Distributed Stream Processing Frameworks"
  year: 2024
  doi: "10.1145/3629526.3645036"
  status: "in-press"
```
GitHub Events
Total
- Issues event: 1
- Watch event: 2
- Delete event: 1
- Issue comment event: 1
- Push event: 5
- Fork event: 1
Last Year
- Issues event: 1
- Watch event: 2
- Delete event: 1
- Issue comment event: 1
- Push event: 5
- Fork event: 1
Dependencies
- actions/checkout v4 composite
- actions/setup-java v3 composite
- gradle/gradle-build-action v2 composite
- openjdk 11 build
- com.dynatrace.hash4j:hash4j 0.11.0 implementation
- org.slf4j:slf4j-api 1.7.25 implementation
- io.micrometer:micrometer-registry-dynatrace latest.release implementation
- io.micrometer:micrometer-registry-prometheus latest.release implementation
- io.smallrye.config:smallrye-config 3.2.1 implementation
- org.apache.kafka:kafka-clients 3.5.0 implementation
- org.apache.logging.log4j:log4j-api 2.19.0 implementation
- org.apache.logging.log4j:log4j-core 2.19.0 implementation
- org.apache.logging.log4j:log4j-slf4j-impl 2.19.0 implementation
- com.dynatrace.hash4j:hash4j 0.11.0 implementation
- io.smallrye.config:smallrye-config 3.2.1 implementation
- org.apache.kafka:kafka-clients 3.5.0 implementation
- org.apache.logging.log4j:log4j-api 2.19.0 implementation
- org.apache.logging.log4j:log4j-core 2.19.0 implementation
- org.apache.logging.log4j:log4j-slf4j-impl 2.19.0 implementation
- org.apache.flink:flink-clients ${flinkVersion} implementation
- org.apache.flink:flink-streaming-java ${flinkVersion} implementation
- org.apache.logging.log4j:log4j-api 2.19.0 runtimeOnly
- org.apache.logging.log4j:log4j-core 2.19.0 runtimeOnly
- org.apache.logging.log4j:log4j-slf4j-impl 2.19.0 runtimeOnly
- com.dynatrace.hash4j:hash4j 0.9.0 implementation
- com.hazelcast.jet:hazelcast-jet-kafka 5.3.1 implementation
- com.hazelcast:hazelcast 5.3.1 implementation
- com.hazelcast:hazelcast-kubernetes 2.2.3 implementation
- io.smallrye.config:smallrye-config 3.2.1 implementation
- org.apache.logging.log4j:log4j-api 2.19.0 implementation
- org.apache.logging.log4j:log4j-core 2.19.0 implementation
- org.apache.logging.log4j:log4j-slf4j-impl 2.19.0 implementation
- com.dynatrace.hash4j:hash4j 0.9.0 implementation
- io.smallrye.config:smallrye-config 3.2.1 implementation
- org.apache.kafka:kafka-streams 3.5.0 implementation
- org.apache.logging.log4j:log4j-api 2.19.0 implementation
- org.apache.logging.log4j:log4j-core 2.19.0 implementation
- org.apache.logging.log4j:log4j-slf4j-impl 2.19.0 implementation
- com.dynatrace.hash4j:hash4j 0.9.0 implementation
- com.fasterxml.jackson.core:jackson-databind 2.14.3 implementation
- com.typesafe:config 1.4.1 implementation
- io.smallrye.config:smallrye-config 3.2.1 implementation
- org.apache.kafka:kafka-clients 3.5.0 implementation
- org.apache.logging.log4j:log4j-api 2.19.0 implementation
- org.apache.logging.log4j:log4j-core 2.19.0 implementation
- org.apache.logging.log4j:log4j-slf4j-impl 2.19.0 implementation
- org.apache.spark:spark-core_2.12 ${sparkVersion} implementation
- org.apache.spark:spark-sql-kafka-0-10_2.12 ${sparkVersion} implementation
- org.apache.spark:spark-sql_2.12 ${sparkVersion} implementation
- org.apache.spark:spark-streaming_2.12 ${sparkVersion} implementation