spruce

Enrichment pipeline for CUR reports that adds energy and carbon data, allowing you to report on and reduce the impact of your cloud usage.

https://github.com/digitalpebble/spruce

Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.9%) to scientific vocabulary

Keywords

apache-spark aws carbon-emissions climate cloud greenops greensoftware sustainability
Last synced: 6 months ago

Repository

Enrichment pipeline for CUR reports that adds energy and carbon data, allowing you to report on and reduce the impact of your cloud usage.

Basic Info
  • Host: GitHub
  • Owner: DigitalPebble
  • License: apache-2.0
  • Language: Java
  • Default Branch: main
  • Homepage:
  • Size: 2.09 MB
Statistics
  • Stars: 6
  • Watchers: 0
  • Forks: 2
  • Open Issues: 6
  • Releases: 2
Topics
apache-spark aws carbon-emissions climate cloud greenops greensoftware sustainability
Created 7 months ago · Last pushed 6 months ago
Metadata Files
Readme Contributing Funding License Code of conduct

README.md

Spruce


Spruce helps estimate the environmental impact of your cloud usage. By leveraging open source models and data, it enriches usage reports generated by cloud providers and allows you to build reports and visualisations. Having the greenops and finops data in the same place makes it easier to expose your costs and impacts side by side.

Spruce uses Apache Spark to read and write the usage reports (typically in Parquet format) in a scalable way and, thanks to its modular approach, splits the enrichment of the data into configurable stages.

A typical sequence of stages would be:

- estimation of embedded emissions from resources used
- estimation of energy used
- application of PUE and other overheads
- application of carbon intensity factors
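The staged approach above can be sketched as a chain of record transformations. This is an illustrative sketch only: the `Row` type, the stage list, and the PUE and carbon-intensity figures (1.2 and 400 g/kWh) are assumptions for demonstration, not Spruce's actual API or defaults.

```java
import java.util.List;
import java.util.function.UnaryOperator;

public class StagesSketch {
    // A usage row carrying running energy/carbon estimates (hypothetical shape).
    static final class Row {
        double energyKwh; // estimated energy use in kWh
        double co2eqG;    // estimated emissions in grams CO2-eq
    }

    public static void main(String[] args) {
        // Stages applied in order, mirroring the sequence described above.
        List<UnaryOperator<Row>> stages = List.<UnaryOperator<Row>>of(
            r -> { r.energyKwh = 2.0; return r; },           // estimate energy used (assumed value)
            r -> { r.energyKwh *= 1.2; return r; },          // apply PUE overhead (assumed PUE of 1.2)
            r -> { r.co2eqG = r.energyKwh * 400; return r; } // carbon intensity (assumed 400 g/kWh)
        );

        Row row = new Row();
        for (UnaryOperator<Row> stage : stages) {
            row = stage.apply(row);
        }
        System.out.println(row.energyKwh + " kWh, " + row.co2eqG + " gCO2eq");
    }
}
```

Because each stage only consumes the output of the previous one, stages can be reordered, swapped, or configured independently, which is the benefit the modular design aims for.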

Please note that this is currently a prototype which handles only CUR reports from AWS. Not all AWS services are covered.

One of the benefits of using Apache Spark is that you can use EMR on AWS to enrich the CURs at scale without having to export or expose any of your data.

Prerequisites

You will need CUR reports as inputs. These are generated via Data Exports and stored on S3 as Parquet files.

Local install

With Apache Maven, Java and Apache Spark installed locally and added to $PATH:

```shell
mvn clean package
spark-submit --class com.digitalpebble.spruce.SparkJob --driver-memory 8g ./target/spruce-1.0.jar ./curs ./output
```

Docker

Build the Docker image with:

```shell
docker build -t digitalpebble/spruce:1.0 .
```

The command below processes the data locally by mounting the directories containing the CURs and the output as volumes:

```shell
docker run -it -v ./curs:/curs -v ./output:/output digitalpebble/spruce:1.0 \
  /opt/spark/bin/spark-submit \
  --class com.digitalpebble.spruce.SparkJob \
  --driver-memory 4g \
  --master 'local[*]' \
  /usr/local/lib/spruce-1.0.jar \
  /curs /output/enriched
```

Explore the output

Using DuckDB locally or Athena on AWS:

```sql
create table enriched_curs as select * from 'output/**/*.parquet';

select line_item_product_code, product_servicecode,
       round(sum(operational_emissions_co2eq_g), 2) as co2_usage_g,
       round(sum(energy_usage_kwh), 2) as energy_usage_kwh
from enriched_curs
where operational_emissions_co2eq_g > 0.01
group by line_item_product_code, product_servicecode
order by co2_usage_g desc;
```

should give an output similar to

| line_item_product_code | product_servicecode | co2_usage_g | energy_usage_kwh |
|------------------------|---------------------|-------------|------------------|
| AmazonS3               | AWSDataTransfer     | 659.2       | 3.31             |
| AmazonRDS              | AWSDataTransfer     | 361.59      | 1.09             |
| AmazonEC2              | AWSDataTransfer     | 162.59      | 1.43             |
| AmazonECR              | AWSDataTransfer     | 88.75       | 0.8              |
| AmazonVPC              | AWSDataTransfer     | 40.55       | 0.38             |
| AWSELB                 | AWSDataTransfer     | 6.3         | 0.06             |

To measure the proportion of the costs for which emissions were calculated:

```sql
select round(covered * 100 / "total costs", 2) as percentage_costs_covered
from (
  select sum(line_item_unblended_cost) as "total costs",
         sum(line_item_unblended_cost) filter (where operational_emissions_co2eq_g is not null) as covered
  from enriched_curs
  where line_item_line_item_type like '%Usage'
);
```

License

Licensed under the Apache License, Version 2.0: http://www.apache.org/licenses/LICENSE-2.0

Owner

  • Name: DigitalPebble Ltd
  • Login: DigitalPebble
  • Kind: organization
  • Email: github@digitalpebble.com
  • Location: Bristol, UK

GitHub Events

Total
  • Create event: 6
  • Issues event: 19
  • Watch event: 4
  • Delete event: 5
  • Issue comment event: 8
  • Push event: 22
  • Public event: 1
  • Pull request review comment event: 15
  • Pull request review event: 10
  • Pull request event: 16
  • Gollum event: 6
  • Fork event: 1
Last Year
  • Create event: 6
  • Issues event: 19
  • Watch event: 4
  • Delete event: 5
  • Issue comment event: 8
  • Push event: 22
  • Public event: 1
  • Pull request review comment event: 15
  • Pull request review event: 10
  • Pull request event: 16
  • Gollum event: 6
  • Fork event: 1

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 40
  • Total Committers: 1
  • Avg Commits per committer: 40.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 40
  • Committers: 1
  • Avg Commits per committer: 40.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Julien Nioche j****n@d****m 40
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 16
  • Total pull requests: 11
  • Average time to close issues: 20 days
  • Average time to close pull requests: 2 days
  • Total issue authors: 1
  • Total pull request authors: 3
  • Average comments per issue: 0.63
  • Average comments per pull request: 1.0
  • Merged pull requests: 7
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 16
  • Pull requests: 11
  • Average time to close issues: 20 days
  • Average time to close pull requests: 2 days
  • Issue authors: 1
  • Pull request authors: 3
  • Average comments per issue: 0.63
  • Average comments per pull request: 1.0
  • Merged pull requests: 7
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • jnioche (16)
Pull Request Authors
  • jnioche (8)
  • nikhiln64 (2)
  • de9uch1 (1)
Top Labels
Issue Labels
enhancement (5) help wanted (4) good first issue (3) documentation (1)
Pull Request Labels
enhancement (6)

Dependencies

.github/workflows/maven.yml actions
  • actions/checkout v4 composite
  • actions/setup-java v4 composite
  • advanced-security/maven-dependency-submission-action 571e99aab1055c2e71a1e2309b9691de18d6b7d6 composite
Dockerfile docker
  • apache/spark 4.0.0-java21 build
  • maven 3.9.9-eclipse-temurin-21 build
pom.xml maven
  • org.apache.spark:spark-sql_2.13 4.0.0 provided
  • org.junit.jupiter:junit-jupiter-api 5.13.1 test
  • org.junit.jupiter:junit-jupiter-engine 5.13.1 test