Recent Releases of https://github.com/SETL-Framework/setl

https://github.com/SETL-Framework/setl - SETL-1.0.0-RC2

BREAKING CHANGE:
- Changed the group id to io.github.setl-framework
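Projects depending on SETL need to update their dependency coordinates to the new group id. A minimal sbt sketch (the artifact name `setl_2.12` and the Scala binary version are illustrative assumptions, not taken from these notes):

```scala
// sbt build definition: only the group id change is confirmed by the release
// notes; the artifact name and version below are assumptions.
libraryDependencies += "io.github.setl-framework" % "setl_2.12" % "1.0.0-RC2"
```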

- Scala
Published by qxzzxq almost 5 years ago

https://github.com/SETL-Framework/setl - SETL-1.0.0-RC1

New Features:
- Added Spark 3.0 support

Fixes:
- Fixed save mode in DynamoDB Connector

- Scala
Published by qxzzxq over 5 years ago

https://github.com/SETL-Framework/setl - SETL-0.4.3

Changes:
- Updated spark-cassandra-connector from 2.4.2 to 2.5.0
- Updated spark-excel-connector from 0.12.4 to 0.13.1
- Updated spark-dynamodb-connector from 1.0.1 to 1.0.4
- Updated scalatest (scope test) from 3.1.0 to 3.1.2
- Updated postgresql (scope test) from 42.2.9 to 42.2.12

New Features:
- Added pipeline dependency check before starting the Spark job
- Added default Spark job group and description
- Added StructuredStreamingConnector
- Added DeltaConnector
- Added ZipArchiver that can zip files/directories

Fixes:
- Fixed path separator in FileConnectorSuite that caused test failures
- Fixed Setl.hasExternalInput, which always returned false
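The notes above mention a ZipArchiver that can zip files/directories, but SETL's actual API is not shown here. As a generic, SETL-independent sketch of the underlying technique with `java.util.zip` (the `zipFile` helper below is hypothetical, not SETL's interface):

```scala
import java.io.{FileInputStream, FileOutputStream}
import java.nio.file.Paths
import java.util.zip.{ZipEntry, ZipOutputStream}

// Hypothetical helper: writes a single source file into a zip archive.
// SETL's ZipArchiver also handles directories; this sketch does not.
def zipFile(src: String, dest: String): Unit = {
  val zos = new ZipOutputStream(new FileOutputStream(dest))
  try {
    // One entry, named after the source file
    zos.putNextEntry(new ZipEntry(Paths.get(src).getFileName.toString))
    val in = new FileInputStream(src)
    try {
      val buf = new Array[Byte](4096)
      var n = in.read(buf)
      while (n >= 0) { zos.write(buf, 0, n); n = in.read(buf) }
    } finally in.close()
    zos.closeEntry()
  } finally zos.close()
}
```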

- Scala
Published by qxzzxq over 5 years ago

https://github.com/SETL-Framework/setl - SETL-0.4.2

Fixes:
- Fixed cross-compile issue (#111)

- Scala
Published by qxzzxq about 6 years ago

https://github.com/SETL-Framework/setl - SETL-0.4.1

Changes:
- Changed benchmark unit of time to seconds (#88)

Fixes:
- The master URL of SparkSession can now be overwritten in a local environment (#74)
- FileConnector now lists paths correctly for nested directories (#97)

New features:
- Added Mermaid diagram generation to Pipeline (#51)
- Added showDiagram() method to Pipeline that prints the Mermaid code and generates the live editor URL 🎩🐰✨ (#52)
- Added Codecov report and Scala API doc
- Added delete method in JDBCConnector (#82)
- Added drop method in DBConnector (#83)
- Added support for both of the following Spark configuration styles in the SETL builder (#86):

```hocon
setl.config {
  spark {
    spark.app.name = "my_app"
    spark.sql.shuffle.partitions = "1000"
  }
}

setl.config2 {
  spark.app.name = "myapp"
  spark.sql.shuffle.partitions = "1000"
}
```

Others:
- Improved test coverage

- Scala
Published by qxzzxq about 6 years ago

https://github.com/SETL-Framework/setl - v0.4.0

Changes:
- BREAKING CHANGE: Renamed DCContext to Setl
- Changed the default application environment config path to setl.environment
- Changed the default context config path to setl.config

Fixes:
- Fixed issue of DynamoDBConnector not taking user configuration into account
- Fixed CompoundKey annotation: SparkRepository now correctly handles columns with multiple compound keys (#36)

New features:
- Added support for private variable delivery (#24)
- Added empty SparkRepository as a placeholder (#30)
- Added Benchmark annotation that can be used on methods of an AbstractFactory (#35)

Others:
- Optimized DeliverableDispatcher
- Optimized PipelineInspector (#33)

- Scala
Published by qxzzxq about 6 years ago