Nempy
Nempy: A Python package for modelling the Australian National Electricity Market dispatch procedure - Published in JOSS (2022)
Science Score: 96.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 8 DOI reference(s) in README and JOSS metadata
- ✓ Academic publication links: links to joss.theoj.org
- ○ Committers with academic emails
- ✓ Institutional organization owner: organization unsw-ceem has institutional domain (ceem.unsw.edu.au)
- ✓ JOSS paper metadata: published in Journal of Open Source Software
Keywords from Contributors
Scientific Fields
Repository
A Python package for modelling the Australian National Electricity Market dispatch procedure
Basic Info
Statistics
- Stars: 62
- Watchers: 10
- Forks: 33
- Open Issues: 6
- Releases: 15
Metadata Files
README.md
Nempy
Table of Contents
Introduction
Nempy is a Python package for modelling the dispatch procedure of the Australian National Electricity Market (NEM). The idea is that you can start simple and grow the complexity of your model by adding features such as ramping constraints, interconnectors, FCAS markets and more. See the examples below.
*Figure: Dispatch price results from the New South Wales region for 1000 randomly selected intervals in the 2019 calendar year. The actual prices, prior to scaling or capping, are also shown for comparison. Results from two Nempy models are shown: one with a full set of dispatch features, and one without FCAS markets or generic (network and security) constraints. Actual prices, results from the full-featured model, and results from the simpler model are plotted in descending order of actual price; results from the simpler model are also shown re-sorted.*
For further details, refer to the documentation.
For a brief introduction to the NEM, refer to this document.
Installation
Installing Nempy to use in your project is easy.
```bash
pip install nempy
```
Documentation
A more detailed introduction to Nempy, examples, and reference documentation can be found on the readthedocs page.
Community
Nempy is open-source and we welcome all forms of community engagement.
Support
You can seek support for using Nempy via the discussion tab on GitHub, by checking the issues register, or by contacting Nick directly (n.gorman at unsw.edu.au).
If you cannot find a pre-existing issue related to your enquiry, you can submit a new one via the issues register. Issue submissions do not need to adhere to any particular format.
Future support and maintenance
CEEM continues to support and maintain Nempy! If Nempy is useful to your work, research, or business, please reach out and inform us so we can consider your use case and needs.
Contributing
Contributions via pull requests are welcome. Contributions should:
- Follow the PEP8 style guide (with exception of line length up to 120 rather than 80)
- Ensure that all existing automated tests continue to pass (unless you are explicitly changing intended behaviour; if you are, please highlight this in your pull request description)
- Implement automated tests for new features (a rough sketch of what such a test can look like is shown after this list)
- Provide doc strings for public interfaces
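As a rough, illustrative sketch only (not taken from the existing test suite), a test of dispatch behaviour can reuse the two-unit setup from the simple example further down this page; the file name `test_simple_dispatch.py` and the exact assertions are hypothetical:

```python
# test_simple_dispatch.py - illustrative only; reuses the README's two-unit example.
import pandas as pd
import pytest
from nempy import markets


def test_two_unit_energy_dispatch_and_price():
    # Two units bidding three bands each into the NSW region.
    volume_bids = pd.DataFrame({
        'unit': ['A', 'B'],
        '1': [20.0, 50.0],   # MW
        '2': [20.0, 30.0],   # MW
        '3': [5.0, 10.0]})   # MW
    price_bids = pd.DataFrame({
        'unit': ['A', 'B'],
        '1': [50.0, 50.0],   # $/MW
        '2': [60.0, 55.0],   # $/MW
        '3': [100.0, 80.0]})
    unit_info = pd.DataFrame({
        'unit': ['A', 'B'],
        'region': ['NSW', 'NSW']})
    demand = pd.DataFrame({
        'region': ['NSW'],
        'demand': [120.0]})  # MW

    market = markets.SpotMarket(unit_info=unit_info, market_regions=['NSW'])
    market.set_unit_volume_bids(volume_bids)
    market.set_unit_price_bids(price_bids)
    market.set_demand_constraints(demand)
    market.dispatch()

    # Expected results as documented in the README's simple example.
    dispatch = market.get_unit_dispatch().set_index('unit')['dispatch']
    assert dispatch['A'] == pytest.approx(40.0)
    assert dispatch['B'] == pytest.approx(80.0)

    prices = market.get_energy_prices().set_index('region')['price']
    assert prices['NSW'] == pytest.approx(60.0)
```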
Installation for development
To install Nempy for development:
- Clone or fork the repo
- Install uv
- Install nempy using uv by running `uv sync` in the project directory
- uv will create a `.venv`, which you can configure your IDE to use, or you can use uv explicitly to run a Python file with `uv run your_code.py` (the steps are sketched below)
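Putting those steps together, and assuming uv is already installed and you are cloning the upstream repository rather than a fork, the workflow looks roughly like this:

```bash
# Clone the repository (use your fork's URL if you plan to open a pull request).
git clone https://github.com/UNSW-CEEM/nempy.git
cd nempy

# Create .venv and install nempy with its dependencies.
uv sync

# Run a script inside the uv-managed environment.
uv run your_code.py
```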
Author
Nempy's development was led by Nick Gorman as part of his PhD candidature at the Collaboration on Energy and Environmental Markets (https://www.ceem.unsw.edu.au/) at the University of New South Wales' School of Photovoltaics and Renewable Energy Engineering.
Citation
If you use Nempy, please cite the package via the JOSS paper (suggested citation below):
Gorman et al., (2022). Nempy: A Python package for modelling the Australian National Electricity Market dispatch procedure. Journal of Open Source Software, 7(70), 3596, https://doi.org/10.21105/joss.03596
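For reference managers, a BibTeX entry matching the suggested citation might look like the following (the entry key is arbitrary, and the full author list should be taken from the JOSS page):

```bibtex
@article{Gorman2022nempy,
  author  = {Gorman, Nicholas and others},
  title   = {Nempy: A Python package for modelling the Australian National
             Electricity Market dispatch procedure},
  journal = {Journal of Open Source Software},
  year    = {2022},
  volume  = {7},
  number  = {70},
  pages   = {3596},
  doi     = {10.21105/joss.03596}
}
```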
License
Nempy was created by Nicholas Gorman. It is licensed under the terms of the BSD 3-Clause Licence.
Examples
A simple example
```python
import pandas as pd
from nempy import markets

# Volume of each bid, number of bands must equal number of bands in price_bids.
volume_bids = pd.DataFrame({
    'unit': ['A', 'B'],
    '1': [20.0, 50.0],  # MW
    '2': [20.0, 30.0],  # MW
    '3': [5.0, 10.0]  # More bid bands could be added.
})

# Price of each bid, bids must be monotonically increasing.
price_bids = pd.DataFrame({
    'unit': ['A', 'B'],
    '1': [50.0, 50.0],  # $/MW
    '2': [60.0, 55.0],  # $/MW
    '3': [100.0, 80.0]  # . . .
})

# Other unit properties
unit_info = pd.DataFrame({
    'unit': ['A', 'B'],
    'region': ['NSW', 'NSW'],
})

# The demand in the region(s) being dispatched
demand = pd.DataFrame({
    'region': ['NSW'],
    'demand': [120.0]  # MW
})

# Create the market model
market = markets.SpotMarket(unit_info=unit_info,
                            market_regions=['NSW'])
market.set_unit_volume_bids(volume_bids)
market.set_unit_price_bids(price_bids)
market.set_demand_constraints(demand)

# Calculate dispatch and pricing
market.dispatch()

# Return the total dispatch of each unit in MW.
print(market.get_unit_dispatch())
#   unit service  dispatch
# 0    A  energy      40.0
# 1    B  energy      80.0

# Return the price of energy in each region.
print(market.get_energy_prices())
#   region  price
# 0    NSW   60.0
```

In this dispatch, the 120 MW of demand is met by the cheapest bids first: 70 MW clears at $50/MWh (20 MW from unit A and 50 MW from unit B), unit B's 30 MW band at $55/MWh is then fully dispatched, and the final 20 MW comes from unit A's $60/MWh band, which sets the NSW energy price at $60/MWh.
A detailed example
The example demonstrates the broad range of market features that can be implemented with Nempy and the use of auxiliary modelling tools for accessing historical market data published by AEMO and preprocessing it for compatibility with Nempy.

> [!WARNING]
> This example downloads approximately 54 GB of data from AEMO.

```python
# Notice:
# - This script downloads large volumes of historical market data (~54 GB) from AEMO's nemweb
#   portal. You can also reduce the data usage by restricting the time window given to the
#   xml_cache_manager and in the get_test_intervals function. The boolean on line 22 can
#   also be changed to prevent this happening repeatedly once the data has been downloaded.

import sqlite3
from datetime import datetime, timedelta
import random
import pandas as pd
from nempy import markets
from nempy.historical_inputs import loaders, mms_db, \
    xml_cache, units, demand, interconnectors, constraints

con = sqlite3.connect('D:/nempy_2024_07/historical_mms.db')
mms_db_manager = mms_db.DBManager(connection=con)

xml_cache_manager = xml_cache.XMLCacheManager('D:/nempy_2024_07/xml_cache')

# The second time this example is run on a machine this flag can
# be set to false to save downloading the data again.
download_inputs = True

if download_inputs:
    # This requires approximately 4 GB of storage.
    mms_db_manager.populate(start_year=2024, start_month=7,
                            end_year=2024, end_month=7)
    # This requires approximately 50 GB of storage.
    xml_cache_manager.populate_by_day(start_year=2024, start_month=7, start_day=1,
                                      end_year=2024, end_month=8, end_day=1)

raw_inputs_loader = loaders.RawInputsLoader(
    nemde_xml_cache_manager=xml_cache_manager,
    market_management_system_database=mms_db_manager)


# A list of intervals we want to recreate historical dispatch for.
def get_test_intervals(number=100):
    start_time = datetime(year=2024, month=7, day=1, hour=0, minute=0)
    end_time = datetime(year=2024, month=8, day=1, hour=0, minute=0)
    difference = end_time - start_time
    difference_in_5_min_intervals = difference.days * 12 * 24
    random.seed(1)
    intervals = random.sample(range(1, difference_in_5_min_intervals), number)
    times = [start_time + timedelta(minutes=5 * i) for i in intervals]
    times_formatted = [t.isoformat().replace('T', ' ').replace('-', '/') for t in times]
    return times_formatted


# List for saving outputs to.
outputs = []

c = 0
# Create and dispatch the spot market for each dispatch interval.
for interval in get_test_intervals(number=100):
    c += 1
    print(str(c) + ' ' + str(interval))
    raw_inputs_loader.set_interval(interval)
    unit_inputs = units.UnitData(raw_inputs_loader)
    interconnector_inputs = interconnectors.InterconnectorData(raw_inputs_loader)
    constraint_inputs = constraints.ConstraintData(raw_inputs_loader)
    demand_inputs = demand.DemandData(raw_inputs_loader)

    unit_info = unit_inputs.get_unit_info()
    market = markets.SpotMarket(market_regions=['QLD1', 'NSW1', 'VIC1',
                                                'SA1', 'TAS1'],
                                unit_info=unit_info)

    # Set bids
    volume_bids, price_bids = unit_inputs.get_processed_bids()
    market.set_unit_volume_bids(volume_bids)
    market.set_unit_price_bids(price_bids)

    # Set bid in capacity limits
    unit_bid_limit = unit_inputs.get_unit_bid_availability()
    cost = constraint_inputs.get_constraint_violation_prices()['unit_capacity']
    market.set_unit_bid_capacity_constraints(unit_bid_limit, violation_cost=cost)

    # Set limits provided by the unconstrained intermittent generation
    # forecasts. Primarily for wind and solar.
    unit_uigf_limit = unit_inputs.get_unit_uigf_limits()
    cost = constraint_inputs.get_constraint_violation_prices()['uigf']
    market.set_unconstrained_intermittent_generation_forecast_constraint(
        unit_uigf_limit, violation_cost=cost
    )

    # Set unit ramp rates.
    ramp_rates = unit_inputs.get_bid_ramp_rates()
    scada_ramp_rates = unit_inputs.get_scada_ramp_rates()
    fast_start_profiles = unit_inputs.get_fast_start_profiles_for_dispatch()
    cost = constraint_inputs.get_constraint_violation_prices()['ramp_rate']
    market.set_unit_ramp_rate_constraints(
        ramp_rates, scada_ramp_rates, fast_start_profiles,
        run_type="fast_start_first_run", violation_cost=cost
    )

    # Set unit FCAS trapezium constraints.
    unit_inputs.add_fcas_trapezium_constraints()
    cost = constraint_inputs.get_constraint_violation_prices()['fcas_max_avail']
    fcas_availability = unit_inputs.get_fcas_max_availability()
    market.set_fcas_max_availability(fcas_availability, violation_cost=cost)
    cost = constraint_inputs.get_constraint_violation_prices()['fcas_profile']
    regulation_trapeziums = unit_inputs.get_fcas_regulation_trapeziums()
    market.set_energy_and_regulation_capacity_constraints(regulation_trapeziums,
                                                          violation_cost=cost)
    scada_ramp_rates = unit_inputs.get_scada_ramp_rates(inlude_initial_output=True)
    market.set_joint_ramping_constraints_reg(
        scada_ramp_rates, fast_start_profiles,
        run_type="fast_start_first_run", violation_cost=cost
    )
    contingency_trapeziums = unit_inputs.get_contingency_services()
    market.set_joint_capacity_constraints(contingency_trapeziums, violation_cost=cost)

    # Set interconnector definitions, limits and loss models.
    interconnectors_definitions = \
        interconnector_inputs.get_interconnector_definitions()
    loss_functions, interpolation_break_points = \
        interconnector_inputs.get_interconnector_loss_model()
    market.set_interconnectors(interconnectors_definitions)
    market.set_interconnector_losses(loss_functions,
                                     interpolation_break_points)

    # Add generic constraints and FCAS market constraints.
    fcas_requirements = constraint_inputs.get_fcas_requirements()
    cost = constraint_inputs.get_violation_costs()
    market.set_fcas_requirements_constraints(fcas_requirements, violation_cost=cost)
    generic_rhs = constraint_inputs.get_rhs_and_type_excluding_regional_fcas_constraints()
    market.set_generic_constraints(generic_rhs, violation_cost=cost)
    unit_generic_lhs = constraint_inputs.get_unit_lhs()
    market.link_units_to_generic_constraints(unit_generic_lhs)
    interconnector_generic_lhs = constraint_inputs.get_interconnector_lhs()
    market.link_interconnectors_to_generic_constraints(interconnector_generic_lhs)

    # Set the operational demand to be met by dispatch.
    regional_demand = demand_inputs.get_operational_demand()
    cost = constraint_inputs.get_constraint_violation_prices()['regional_demand']
    market.set_demand_constraints(regional_demand, violation_cost=cost)

    # Set tiebreak constraint to equalise dispatch of equally priced bids.
    cost = constraint_inputs.get_constraint_violation_prices()['tiebreak']
    market.set_tie_break_constraints(cost)

    # Get unit dispatch without fast start constraints and use it to
    # make fast start unit commitment decisions.
    market.dispatch()
    dispatch = market.get_unit_dispatch()
    cost = constraint_inputs.get_constraint_violation_prices()['fast_start']
    fast_start_profiles = unit_inputs.get_fast_start_profiles_for_dispatch(dispatch)
    cols = ['unit', 'end_mode', 'time_in_end_mode', 'mode_two_length',
            'mode_four_length', 'min_loading']
    fsp = fast_start_profiles.loc[:, cols]
    market.set_fast_start_constraints(fsp, violation_cost=cost)

    ramp_rates = unit_inputs.get_bid_ramp_rates()
    scada_ramp_rates = unit_inputs.get_scada_ramp_rates()
    cols = ['unit', 'end_mode', 'time_since_end_of_mode_two', 'min_loading']
    fsp = fast_start_profiles.loc[:, cols]
    cost = constraint_inputs.get_constraint_violation_prices()['ramp_rate']
    market.set_unit_ramp_rate_constraints(
        ramp_rates, scada_ramp_rates, fsp,
        run_type="fast_start_second_run", violation_cost=cost
    )

    cost = constraint_inputs.get_constraint_violation_prices()['fcas_profile']
    scada_ramp_rates = unit_inputs.get_scada_ramp_rates(inlude_initial_output=True)
    market.set_joint_ramping_constraints_reg(
        scada_ramp_rates, fsp,
        run_type="fast_start_second_run", violation_cost=cost
    )

    # If AEMO historically used the over constrained dispatch rerun
    # process then allow it to be used in dispatch. This is needed
    # because sometimes the conditions for over constrained dispatch
    # are present but the rerun process isn't used.
    if constraint_inputs.is_over_constrained_dispatch_rerun():
        market.dispatch(allow_over_constrained_dispatch_re_run=True,
                        energy_market_floor_price=-1000.0,
                        energy_market_ceiling_price=17500.0,
                        fcas_market_ceiling_price=1000.0)
    else:
        # The market price ceiling and floor are not needed here
        # because they are only used for the over constrained
        # dispatch rerun process.
        market.dispatch(allow_over_constrained_dispatch_re_run=False)

    # Save prices from this interval
    prices = market.get_energy_prices()
    prices['time'] = interval

    # Getting historical prices for comparison. Note, ROP price, which is
    # the regional reference node price before the application of any
    # price scaling by AEMO, is used for comparison.
    historical_prices = mms_db_manager.DISPATCHPRICE.get_data(interval)

    prices = pd.merge(prices, historical_prices,
                      left_on=['time', 'region'],
                      right_on=['SETTLEMENTDATE', 'REGIONID'])

    outputs.append(
        prices.loc[:, ['time', 'region', 'price', 'ROP']])

con.close()

outputs = pd.concat(outputs)

outputs['error'] = outputs['price'] - outputs['ROP']

outputs.to_csv("bdu_prices.csv")

print('\n Summary of error in energy price volume weighted average price. \n'
      'Comparison is against ROP, the price prior to \n'
      'any post dispatch adjustments, scaling, capping etc.')
print('Mean price error: {}'.format(outputs['error'].mean()))
print('Median price error: {}'.format(outputs['error'].quantile(0.5)))
print('5% percentile price error: {}'.format(outputs['error'].quantile(0.05)))
print('95% percentile price error: {}'.format(outputs['error'].quantile(0.95)))

# Summary of error in energy price volume weighted average price.
# Comparison is against ROP, the price prior to
# any post dispatch adjustments, scaling, capping etc.
# Mean price error: 0.13818277307210394
# Median price error: 0.0
# 5% percentile price error: -0.13335830516772942
# 95% percentile price error: 0.013533539900288811
```
Owner
- Name: Collaboration on Energy and Environmental Markets (CEEM)
- Login: UNSW-CEEM
- Kind: organization
- Location: Sydney Australia
- Website: http://ceem.unsw.edu.au/
- Repositories: 27
- Profile: https://github.com/UNSW-CEEM
JOSS Publication
Nempy: A Python package for modelling the Australian National Electricity Market dispatch procedure
Authors
School of Photovoltaics and Renewable Energy Engineering, University of New South Wales, Australia, Collaboration on Energy and Environmental Markets (CEEM), University of New South Wales, Australia
School of Photovoltaics and Renewable Energy Engineering, University of New South Wales, Australia, Collaboration on Energy and Environmental Markets (CEEM), University of New South Wales, Australia
School of Electrical Engineering and Telecommunications, University of New South Wales, Australia, Collaboration on Energy and Environmental Markets (CEEM), University of New South Wales, Australia
Tags
electricity markets, economic dispatch, Australian National Electricity Market, NEM, dispatch
GitHub Events
Total
- Create event: 6
- Release event: 6
- Issues event: 3
- Watch event: 10
- Delete event: 4
- Issue comment event: 11
- Push event: 23
- Pull request event: 9
- Fork event: 8
Last Year
- Create event: 6
- Release event: 6
- Issues event: 3
- Watch event: 10
- Delete event: 4
- Issue comment event: 11
- Push event: 23
- Pull request event: 9
- Fork event: 8
Committers
Last synced: 5 months ago
Top Committers
| Name | Email (masked) | Commits |
|---|---|---|
| nick-gorman | 4****n | 378 |
| prakaa | a****7@g****m | 22 |
| Michael Lee | m****c@g****m | 6 |
| yueXiao2 | 5****2 | 2 |
| dependabot[bot] | 4****] | 2 |
| Zhou32 | z****5@g****m | 1 |
| Ben Elliston | b****e@a****u | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 4 months ago
All Time
- Total issues: 14
- Total pull requests: 22
- Average time to close issues: 2 months
- Average time to close pull requests: 3 months
- Total issue authors: 9
- Total pull request authors: 10
- Average comments per issue: 2.43
- Average comments per pull request: 1.18
- Merged pull requests: 12
- Bot issues: 0
- Bot pull requests: 6
Past Year
- Issues: 3
- Pull requests: 7
- Average time to close issues: 4 months
- Average time to close pull requests: 8 days
- Issue authors: 3
- Pull request authors: 3
- Average comments per issue: 1.33
- Average comments per pull request: 0.43
- Merged pull requests: 3
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- nick-gorman (4)
- mlee94 (3)
- spyj-1989 (1)
- andrewhn (1)
- ghackebeil (1)
- dehorsley (1)
- jpolyy (1)
- MattAmos (1)
- dec-heim (1)
Pull Request Authors
- dependabot[bot] (10)
- nick-gorman (4)
- yueXiao2 (2)
- mlee94 (2)
- Zhou32 (2)
- ghackebeil (2)
- cchristiansen (2)
- prakaa (1)
- bje- (1)
- dec-heim (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 3
- Total downloads: pypi 2,571 last-month
- Total dependent packages: 1 (may contain duplicates)
- Total dependent repositories: 1 (may contain duplicates)
- Total versions: 50
- Total maintainers: 1
proxy.golang.org: github.com/UNSW-CEEM/nempy
- Documentation: https://pkg.go.dev/github.com/UNSW-CEEM/nempy#section-documentation
- License: bsd-3-clause
- Latest release: v3.0.3+incompatible (published 10 months ago)
Rankings
proxy.golang.org: github.com/unsw-ceem/nempy
- Documentation: https://pkg.go.dev/github.com/unsw-ceem/nempy#section-documentation
- License: bsd-3-clause
- Latest release: v3.0.3+incompatible (published 10 months ago)
Rankings
pypi.org: nempy
A flexible tool kit for modelling Australia's National Electricity Market dispatch procedure.
- Documentation: https://nempy.readthedocs.io/
- License: bsd-3-clause
- Latest release: 3.0.3 (published 10 months ago)
Rankings
Maintainers (1)
Dependencies
- autodocsumm ==0.1.13
- actions/checkout v2 composite
- actions/setup-python v1 composite
- alabaster 0.7.13
- autodocsumm 0.2.11
- babel 2.12.1
- certifi 2023.7.22
- cffi 1.15.1
- charset-normalizer 3.2.0
- colorama 0.4.6
- docutils 0.19
- exceptiongroup 1.1.3
- idna 3.4
- imagesize 1.4.1
- importlib-metadata 6.8.0
- iniconfig 2.0.0
- jinja2 3.1.2
- markupsafe 2.1.3
- mip 1.15.0
- numpy 1.24.4
- numpy 1.25.2
- packaging 23.1
- pandas 2.0.3
- pluggy 1.3.0
- pycparser 2.21
- pygments 2.16.1
- pytest 7.4.2
- python-dateutil 2.8.2
- pytz 2023.3.post1
- requests 2.31.0
- six 1.16.0
- snowballstemmer 2.2.0
- sphinx 5.3.0
- sphinxcontrib-applehelp 1.0.4
- sphinxcontrib-devhelp 1.0.2
- sphinxcontrib-htmlhelp 2.0.1
- sphinxcontrib-jsmath 1.0.1
- sphinxcontrib-qthelp 1.0.3
- sphinxcontrib-serializinghtml 1.1.5
- tomli 2.0.1
- tzdata 2023.3
- urllib3 2.0.4
- xmltodict 0.12.0
- zipp 3.16.2
- mip >=1.11.0,<2.0.0
- pandas <3.0.0
- python >=3.8,<3.12
- requests >=2.0.0, <3.0.0
- xmltodict ==0.12.0
