serverless-benchmarks

SeBS: serverless benchmarking suite for automatic performance analysis of FaaS platforms.

https://github.com/spcl/serverless-benchmarks

Science Score: 85.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 4 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, acm.org
  • Committers with academic emails
    1 of 26 committers (3.8%) from academic institutions
  • Institutional organization owner
    Organization spcl has institutional domain (spcl.inf.ethz.ch)
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.1%) to scientific vocabulary

Keywords

benchmark-framework benchmarking faas performance-analysis serverless
Last synced: 6 months ago

Repository


Basic Info
Statistics
  • Stars: 173
  • Watchers: 7
  • Forks: 84
  • Open Issues: 56
  • Releases: 2
Topics
benchmark-framework benchmarking faas performance-analysis serverless
Created about 6 years ago · Last pushed 7 months ago
Metadata Files
Readme License Citation

README.md


SeBS: Serverless Benchmark Suite

FaaS benchmarking suite for serverless functions with automatic build, deployment, and measurements.

Overview of SeBS features and components.

SeBS is a diverse suite of FaaS benchmarks that allows automatic performance analysis of commercial and open-source serverless platforms. We provide a suite of benchmark applications and experiments and use them to test and evaluate different components of FaaS systems. See the installation instructions to learn how to configure SeBS to use selected commercial and open-source serverless systems. Then, take a look at usage instructions to see how SeBS can automatically launch serverless functions and entire experiments in the cloud!

SeBS supports automatic deployment and invocation of benchmarks on the commercial, black-box platforms AWS Lambda, Azure Functions, and Google Cloud Functions. Furthermore, we support the open-source platform OpenWhisk and offer a custom, Docker-based local evaluation platform. See the documentation on cloud providers for details on configuring each platform in SeBS. The documentation describes the design and implementation of our tool in detail; see the modularity section to learn how SeBS can be extended with new platforms, benchmarks, and experiments. Find out more about our project in our paper summary.

Do you have further questions that were not answered by our documentation? Did you encounter trouble installing and using SeBS? Or do you want to use SeBS in your work and you need new features? Join our community on Slack or open a GitHub issue.

For more information on how to configure, use, and extend SeBS, see our documentation.

Publication

When using SeBS, please cite our Middleware '21 paper. An extended version of the paper is available on arXiv, and you can find more details about our research in this paper summary. You can also cite our software repository, using the citation button on the right.

@inproceedings{copik2021sebs,
  author    = {Copik, Marcin and Kwasniewski, Grzegorz and Besta, Maciej and Podstawski, Michal and Hoefler, Torsten},
  title     = {SeBS: A Serverless Benchmark Suite for Function-as-a-Service Computing},
  year      = {2021},
  isbn      = {9781450385343},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3464298.3476133},
  doi       = {10.1145/3464298.3476133},
  booktitle = {Proceedings of the 22nd International Middleware Conference},
  pages     = {64--78},
  numpages  = {15},
  keywords  = {benchmark, serverless, FaaS, function-as-a-service},
  location  = {Qu\'{e}bec city, Canada},
  series    = {Middleware '21}
}

Installation

Requirements:
  • Docker (at least version 19)
  • Python 3.7+ with:
      • pip
      • venv
  • libcurl and its headers, required to install pycurl
  • Standard Linux tools and zip

... and that should be all. We currently support Linux and other POSIX systems with Bash available. On Windows, we recommend using WSL.

To install the benchmarks with a support for all platforms, use:

./install.py --aws --azure --gcp --openwhisk --local

This creates a virtual environment in python-venv and installs the necessary Python and third-party dependencies. To use SeBS, you must first activate the new Python virtual environment:

. python-venv/bin/activate

Now you can deploy serverless experiments :-)

The installation of additional platforms is controlled with the --{platform} and --no-{platform} switches. Currently, the default behavior for install.py is to install only the local environment.
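For example, a minimal installation targeting a single platform could look as follows; this is an illustrative sketch of the `--{platform}` switches described above, not a prescribed workflow:

```shell
# Install SeBS with support for AWS only; other platforms can be added
# later by re-running install.py with the corresponding --{platform} flag.
./install.py --aws

# Activate the virtual environment that install.py created.
. python-venv/bin/activate
```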

To verify the correctness of installation, you can use our regression testing.

[!WARNING] Please do not use SeBS with sudo. There is no requirement to use any superuser permissions. Make sure that your Docker daemon is running and your user has sufficient permissions to use it (see Docker documentation on configuring your user to have non-sudo access to containers). Otherwise, you might see many "Connection refused" and "Permission denied" errors when using SeBS.
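The non-sudo Docker setup referenced above typically comes down to group membership. A minimal sketch, assuming a Linux system where the Docker socket is owned by the `docker` group (see the official Docker post-installation documentation for your distribution):

```shell
# Add the current user to the docker group so the Docker socket is
# accessible without sudo (requires logging out and back in, or newgrp).
sudo usermod -aG docker "$USER"

# Verify that the daemon is reachable without superuser permissions.
docker info > /dev/null && echo "Docker is accessible without sudo"
```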

[!WARNING] We use libcurl to make HTTP requests. pycurl will attempt to build its bindings and needs headers for that - make sure you have all development packages installed. If you see an error like this one: src/pycurl.h:206:13: fatal error: gnutls/gnutls.h: No such file or directory, it means that you are missing some of the dependencies.
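On Debian or Ubuntu, the headers missing in the error above are usually provided by the GnuTLS flavor of the libcurl development package; package names vary by distribution, so treat this as a sketch:

```shell
# Debian/Ubuntu: install libcurl headers (GnuTLS variant, matching the
# gnutls/gnutls.h path in the error message above).
sudo apt-get install -y libcurl4-gnutls-dev

# Fedora/RHEL equivalent (assumption): sudo dnf install libcurl-devel
```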

Authors

Owner

  • Name: SPCL
  • Login: spcl
  • Kind: organization

Citation (CITATION.cff)

# YAML 1.2
---
authors: 
  -
    family-names: Copik
    given-names: Marcin
    orcid: "https://orcid.org/0000-0002-7606-5519"
cff-version: "1.1.0"
date-released: 2021-07-29
message: "If you use this software, please cite our Middleware '21 paper, and cite the software repository using this metadata."
repository-code: "https://github.com/spcl/serverless-benchmarks"
title: "SeBS: serverless benchmarks suite"
version: "1.0"
...

GitHub Events

Total
  • Issues event: 18
  • Watch event: 24
  • Delete event: 6
  • Issue comment event: 75
  • Push event: 88
  • Pull request review event: 121
  • Pull request review comment event: 238
  • Pull request event: 28
  • Fork event: 16
  • Create event: 11
Last Year
  • Issues event: 18
  • Watch event: 24
  • Delete event: 6
  • Issue comment event: 75
  • Push event: 88
  • Pull request review event: 121
  • Pull request review comment event: 238
  • Pull request event: 28
  • Fork event: 16
  • Create event: 11

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 1,329
  • Total Committers: 26
  • Avg Commits per committer: 51.115
  • Development Distribution Score (DDS): 0.079
Past Year
  • Commits: 215
  • Committers: 9
  • Avg Commits per committer: 23.889
  • Development Distribution Score (DDS): 0.06
Top Committers
Name Email Commits
Marcin Copik m****k@g****m 1,224
Mateusz Knapik m****6@g****m 16
Mateusz Szarek m****0@g****m 11
Michał Podstawski m****i@g****m 11
Paweł Żuk p****k@m****l 11
Kacper Janda k****a@c****l 7
root m****4@g****m 6
Nico Graf n****f@g****m 5
prajinkhadka 3****a 5
Abhishek Kumar a****2@g****m 4
sborkows b****8@g****m 4
JmmCz j****i@g****m 3
Sascha Kehrli s****i@g****m 3
milowedo m****o@g****m 3
Laurin Brandner m****l@l****h 2
Physix 6****t 2
Rafal Mucha m****4@g****m 2
Zeno Berkhan z****n@d****e 2
Amit Aryeh Levy a****t@a****m 1
Muhammad Mahad m****y@g****m 1
Ojas 9****6 1
Saadat Nursultan 3****t 1
Zisen Liu 2****l 1
aidenh6307 1****7 1
lawrence910426 l****6@g****m 1
qdelamea-aneo 1****o 1

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 114
  • Total pull requests: 96
  • Average time to close issues: about 1 year
  • Average time to close pull requests: 3 months
  • Total issue authors: 21
  • Total pull request authors: 27
  • Average comments per issue: 2.89
  • Average comments per pull request: 1.77
  • Merged pull requests: 48
  • Bot issues: 0
  • Bot pull requests: 5
Past Year
  • Issues: 14
  • Pull requests: 26
  • Average time to close issues: 22 days
  • Average time to close pull requests: 19 days
  • Issue authors: 4
  • Pull request authors: 7
  • Average comments per issue: 1.93
  • Average comments per pull request: 2.23
  • Merged pull requests: 11
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • mcopik (76)
  • jchigu (7)
  • nervermore2 (5)
  • nurSaadat (4)
  • HelloWorldGitHubUser (3)
  • prajinkhadka (2)
  • DaY1zz (1)
  • YasminBZ (1)
  • Subashkatel (1)
  • im-rikesh (1)
  • rabbull (1)
  • alevy (1)
  • Charitra1 (1)
  • ryuxin (1)
  • wowu (1)
Pull Request Authors
  • mcopik (45)
  • octonawish-akcodes (9)
  • prajinkhadka (8)
  • dependabot[bot] (5)
  • oanarosca (5)
  • mahlashrifi (4)
  • qdelamea-aneo (4)
  • MahadMuhammad (4)
  • adhinneupane (4)
  • Kaleab-git (4)
  • lbrndnr (3)
  • ojninja16 (2)
  • zeno420 (2)
  • micpod (2)
  • rabbull (2)
Top Labels
Issue Labels
enhancement (43) aws (30) good first issue (21) azure (19) local (16) bug (16) gcp (12) critical (9) needs-reproduction (8) python (5) dormant (4) nodejs (4) experiments (4) paper (3) refactor (3) documentation (3) stale (2) next-release (2) benchmark (1) release (1)
Pull Request Labels
dependencies (5)

Dependencies

benchmarks/100.webapps/110.dynamic-html/nodejs/package.json npm
  • mustache ^3.2.1
benchmarks/100.webapps/120.uploader/nodejs/package.json npm
  • request ^2.88.0
benchmarks/200.multimedia/210.thumbnailer/nodejs/package.json npm
  • sharp ^0.25
docker/local/nodejs/package.json npm
  • minio ^7.0.13
  • strftime ^0.10.0
  • uuid ^3.4.0
benchmarks/100.webapps/110.dynamic-html/python/requirements.txt pypi
  • jinja2 >=2.10.3
benchmarks/500.scientific/501.graph-pagerank/python/requirements.txt pypi
  • python-igraph ==0.7.1.post6
benchmarks/500.scientific/502.graph-mst/python/requirements.txt pypi
  • python-igraph ==0.7.1.post6
benchmarks/500.scientific/503.graph-bfs/python/requirements.txt pypi
  • python-igraph ==0.7.1.post6
benchmarks/500.scientific/504.dna-visualisation/python/requirements.txt pypi
  • squiggle ==0.3.1
requirements.aws.txt pypi
  • boto3 *
  • boto3-stubs *
  • urllib3 *
requirements.azure.txt pypi
  • azure-storage-blob ==12.10.0
requirements.gcp.txt pypi
  • google-api-python-client ==1.12.5
  • google-api-python-client-stubs *
  • google-cloud-logging ==2.0.0
  • google-cloud-monitoring ==2.0.0
  • google-cloud-storage ==1.32.0
  • grpcio *
requirements.local.txt pypi
  • minio ==5.0.10
requirements.txt pypi
  • black *
  • click >=7.1.2
  • docker >=4.2.0
  • flake8 *
  • flake8-black *
  • flake8-boto3 *
  • mypy *
  • numpy *
  • pandas >=1.1.3
  • pycurl >=7.43
  • scipy *
  • testtools >=2.4.0
  • types-pycurl *
  • types-requests *
  • tzlocal >=2.1