https://github.com/robert-forrest/cerebral

Tool for creating multi-output deep ensemble neural-networks

Science Score: 10.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: rsc.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.2%) to scientific vocabulary

Keywords

deep-learning ensemble-learning machine-learning materials-science neural-networks
Last synced: 6 months ago

Repository

Tool for creating multi-output deep ensemble neural-networks

Statistics
  • Stars: 3
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Topics
deep-learning ensemble-learning machine-learning materials-science neural-networks
Created almost 4 years ago · Last pushed about 3 years ago
Metadata Files
Readme · License

README.md

cerebral

Tool for creating multi-output deep ensemble neural-networks for alloy property modelling.

See our paper "Machine-learning improves understanding of glass formation in metallic systems" for discussion of the model, its architecture, and performance.

Installation

The cerebral package can be installed from PyPI using pip:

```
pip install cerebral
```

Cerebral makes heavy use of the metallurgy package to manipulate and approximate properties of alloys. Cerebral can be used with the evomatic package to perform alloy searching.
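As a minimal illustration of that dependency, the metallurgy package can approximate alloy properties directly from composition strings. The `mg.calculate` call and the "price" property below are taken from the examples later in this README; importing the package as `mg` is an assumption of these examples:

```python
import metallurgy as mg  # alloy property approximation package used by cerebral

# Linear-mixture approximation of the "price" property for pure copper,
# matching the reference value quoted at the end of this README.
print(mg.calculate("Cu100", "price"))  # 6.0
```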

Usage

Cerebral can be used to create multi-input, multi-output deep neural networks for modelling arbitrary alloy properties.

The following example shows the configuration of cerebral to predict the "price" property of an alloy, based on atomic percentages alone. Cerebral is configured to load data for this problem from the tests directory; this data is for demonstration and testing only, synthetically created by the metallurgy package for the Cu-Zr binary alloy system.

```python
import cerebral as cb

cb.setup(
    {
        "targets": [{"name": "price"}],
        "input_features": ["percentages"],
        "data": {"files": ["tests/CuZr_prices.csv"]},
    }
)

data = cb.features.load_data()
```

```
data
      composition      price  Cu_percentage  Zr_percentage
0           Cu100   6.000000          1.000          0.000
1     Cu99.9Zr0.1   6.044626          0.999          0.001
2     Cu99.7Zr0.3   6.133763          0.997          0.003
3     Cu99.6Zr0.4   6.178273          0.996          0.004
4     Cu99.4Zr0.6   6.267177          0.994          0.006
..            ...        ...            ...            ...
662   Zr99.4Cu0.6  36.969779          0.006          0.994
663   Zr99.5Cu0.5  36.991515          0.005          0.995
664   Zr99.7Cu0.3  37.034949          0.003          0.997
665   Zr99.8Cu0.2  37.056646          0.002          0.998
666         Zr100  37.100000          0.000          1.000
```
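The object returned by load_data is a pandas DataFrame (it is used as one in the next step), so the usual pandas checks apply. A small sanity check, assuming the column names printed above:

```python
# The atomic percentages of the two elements should sum to 1 for every alloy.
fractions = data[["Cu_percentage", "Zr_percentage"]].sum(axis=1)
assert (fractions.round(6) == 1.0).all()

# Basic statistics of the synthetic price data.
print(data["price"].describe())
```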

Once a DataFrame of alloy compositions, input features, and prediction targets is available, it can be used to train a model. The following example takes the DataFrame created above and trains a neural network to reproduce the target features, for a maximum of 200 training epochs. The resulting model is a standard Keras / TensorFlow model.

```python
model, history, train_data, test_data = cb.models.train_model(
    data, max_epochs=200
)

model  # a standard Keras / TensorFlow model object

history.history["loss"]
# [22.522766767894105, 21.966949822959215, ...]
```
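Because the returned model is a plain Keras object, the standard TensorFlow / Keras API applies to it. A sketch, assuming a recent TensorFlow, and noting that reloading may require any custom objects defined by cerebral to be registered:

```python
import tensorflow as tf

model.summary()  # inspect the multi-input / multi-output architecture

# Persist and reload with the regular Keras serialisation API.
model.save("cerebral_price_model.keras")
reloaded = tf.keras.models.load_model("cerebral_price_model.keras")
```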

Once a model has been created, cerebral provides automation for evaluating its performance against the training and test datasets. Since the pricing data is based on a very simple linear mixture, the model learns the relationship between the percentages of Cu and Zr and the price quite well.

```python
(
    train_predictions,
    train_errors,
    test_predictions,
    test_errors,
    metrics,
) = cb.models.evaluate_model(
    model,
    train_data["dataset"],
    train_data["labels"],
    test_ds=test_data["dataset"],
    test_labels=test_data["labels"],
    train_compositions=train_data["compositions"],
    test_compositions=test_data["compositions"],
)

metrics
# {
#     'price': {
#         'train': {'Rsq': 0.9994298579318788, 'RMSE': 0.21407108083268242, 'MAE': 0.16591635524599488},
#         'test': {'Rsq': 0.9994089218056131, 'RMSE': 0.21349478924250365, 'MAE': 0.1721696906690461},
#     }
# }
```
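The metrics dictionary is keyed by target and then by dataset split, as shown above, so it can be summarised with a few lines of plain Python:

```python
# Print one line per target and split, using the structure shown above.
for target, splits in metrics.items():
    for split, values in splits.items():
        print(
            f"{target} ({split}): "
            f"Rsq={values['Rsq']:.4f}, RMSE={values['RMSE']:.3f}, MAE={values['MAE']:.3f}"
        )
```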

Further, the model can be used to generate predictions for arbitrary alloys, as long as the required input features are supplied. Here, the simple example model predicts a price for pure copper that is close to the value originally calculated by linear mixture:

```python
import metallurgy as mg  # used below for the linear-mixture reference value

cb.models.predict(model, "Cu100")
# {'price': array([6.60157898])}

mg.calculate("Cu100", "price")
# 6.0
```
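The same two calls can be looped over a handful of compositions to compare the model against the linear-mixture reference. The intermediate composition "Cu50Zr50" is a hypothetical example, and the dict-of-arrays return shape of cb.models.predict is assumed from the output above:

```python
import metallurgy as mg

# Compare model predictions with the linear-mixture values the dataset was built from.
for composition in ["Cu100", "Cu50Zr50", "Zr100"]:
    predicted = cb.models.predict(model, composition)["price"][0]
    reference = mg.calculate(composition, "price")
    print(f"{composition}: model={predicted:.3f}, linear mixture={reference:.3f}")
```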

Documentation

Documentation is available here.

Owner

  • Name: Robert Forrest
  • Login: Robert-Forrest
  • Kind: user
  • Location: UK

Full-stack Engineering. Computational Materials Science + ML

Committers

Last synced: almost 3 years ago

All Time
  • Total Commits: 164
  • Total Committers: 1
  • Avg Commits per committer: 164.0
  • Development Distribution Score (DDS): 0.0
Top Committers
  • Robert Forrest (r****t@l****m): 164 commits

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 0
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 0
  • Total pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 70 last month
  • Total dependent packages: 2
  • Total dependent repositories: 1
  • Total versions: 8
  • Total maintainers: 1
pypi.org: cerebral

Tool for creating multi-output deep ensemble neural-networks

  • Versions: 8
  • Dependent Packages: 2
  • Dependent Repositories: 1
  • Downloads: 70 last month
Rankings
  • Dependent packages count: 3.2%
  • Average: 20.7%
  • Dependent repos count: 21.6%
  • Downloads: 23.8%
  • Stargazers count: 25.1%
  • Forks count: 29.8%
Maintainers (1)
Last synced: 6 months ago

Dependencies

docs/requirements.txt pypi
  • sphinx *
  • sphinx-autoapi *
  • sphinx_math_dollar *
  • sphinx_mdinclude *
.github/workflows/publish-to-pypi.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
  • pypa/gh-action-pypi-publish release/v1 composite
.github/workflows/tests.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite