sweep_python

an AntennaCAT compatible sweep optimizer

https://github.com/lc-linkous/sweep_python

Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 5 DOI reference(s) in README
  • Academic publication links
    Links to: zenodo.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.3%) to scientific vocabulary

Keywords

antennacat grid-based-search random-search-optimization sweep-optimization
Last synced: 6 months ago

Repository

an AntennaCAT compatible sweep optimizer

Basic Info
  • Host: GitHub
  • Owner: LC-Linkous
  • License: gpl-2.0
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 146 MB
Statistics
  • Stars: 0
  • Watchers: 2
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Topics
antennacat grid-based-search random-search-optimization sweep-optimization
Created over 1 year ago · Last pushed 8 months ago
Metadata Files
Readme License Citation

README.md

sweep_python

DOI

Simple sweep optimizer written in Python.

The approaches in this repository are exhaustive searches through a combination of hyperparameters (the inputs for the feasible decision space of the objective function). They're not the fastest, but they're kind of fun to watch.

Table of Contents

Sweep Optimization

Grid-based Sweep

A grid-based sweep optimizer, often referred to as grid search, is a simple yet effective optimization technique commonly used for hyperparameter tuning in machine learning models. This method systematically explores a specified subset of the hyperparameter space by evaluating the performance of a model with all possible combinations of the provided hyperparameter values.
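As a minimal illustration of the technique described above (a standalone sketch with hypothetical names, not the sweep_python API):

```python
# Minimal grid-search sketch: exhaustively evaluate every combination
# of the provided hyperparameter values and keep the best score.
from itertools import product

def grid_search(objective, grid):
    """Evaluate objective at every combination of per-parameter values."""
    best_params, best_score = None, float("inf")
    for combo in product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        score = objective(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Example: minimize (x - 0.3)^2 + (y - 0.7)^2 over a coarse grid
grid = {"x": [0.0, 0.25, 0.5], "y": [0.5, 0.75, 1.0]}
best, score = grid_search(lambda p: (p["x"] - 0.3) ** 2 + (p["y"] - 0.7) ** 2,
                          grid)
# best -> {"x": 0.25, "y": 0.75}
```

Note that the number of evaluations grows as the product of the per-parameter grid sizes, which is why grid search is exhaustive but slow.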

Random Search

Random search is an optimization method where solutions are randomly sampled from a defined space, evaluated, and iteratively improved based on the evaluations, aiming to find an optimal or near-optimal solution. Random search is generally not as efficient as more advanced optimization algorithms like gradient-based methods or evolutionary algorithms, especially in problems where the search space is structured or the objective function has a particular shape that can be exploited.
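A minimal sketch of this sampling loop (illustrative only; the names are hypothetical and not the sweep_python API):

```python
# Minimal random-search sketch: sample uniformly within the bounds
# and keep the best evaluation seen so far.
import random

def random_search(objective, lb, ub, n_samples=1000, seed=0):
    """Sample points uniformly in [lb, ub] and track the best evaluation."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(l, u) for l, u in zip(lb, ub)]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Example: minimize the sum of squares over [-5, 5]^2
best_x, best_f = random_search(lambda x: sum(v * v for v in x),
                               [-5, -5], [5, 5])
```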

Requirements

This project requires numpy, pandas, and matplotlib for the full demos. To run the optimizer without visualization, only numpy and pandas are required.

Use 'pip install -r requirements.txt' to install the following dependencies:

```
contourpy==1.2.1
cycler==0.12.1
fonttools==4.51.0
importlib_resources==6.4.0
kiwisolver==1.4.5
matplotlib==3.8.4
numpy==1.26.4
packaging==24.0
pandas==2.2.3
pillow==10.3.0
pyparsing==3.1.2
python-dateutil==2.9.0.post0
pytz==2025.1
six==1.16.0
tzdata==2025.1
zipp==3.18.1
```

Optionally, requirements can be installed manually with:

```
pip install matplotlib numpy pandas
```

This is an alternative if the requirements.txt file causes issues; the remaining dependencies are pulled in automatically when these libraries install.

Implementation

Initialization

```python
# Constant variables
NO_OF_PARTICLES = 3       # Number of independent agents searching the space
MIN_RES = [0.02]          # min resolution for search
MAX_RES = [1.1]           # max resolution for search
TOL = 10 ** -6            # Convergence Tolerance
MAXIT = 50000             # Maximum allowed iterations
SEARCH_METHOD = 1         # int search. 1 = basic_grid, 2 = random_search

# Objective function dependent variables
LB = func_configs.LB                    # Lower boundaries, [[0.21, 0, 0.1]]
UB = func_configs.UB                    # Upper boundaries, [[1, 1, 0.5]]
IN_VARS = func_configs.IN_VARS          # Number of input variables (x-values)   
OUT_VARS = func_configs.OUT_VARS        # Number of output variables (y-values)
TARGETS = func_configs.TARGETS          # Target values for output

func_F = func_configs.OBJECTIVE_FUNC  # objective function
constr_F = func_configs.CONSTR_FUNC   # constraint function

best_eval = 3          # set higher than normal because of the potential for missing the target

parent = None            # Optional parent class for swarm 
                                    # (Used for passing debug messages or
                                    # other information that will appear 
                                    # in GUI panels)

evaluate_threshold = True # use target or threshold. True = THRESHOLD, False = EXACT TARGET
suppress_output = True    # Suppress the console output of particle swarm
allow_update = True       # Allow objective call to update state 
                            # (can be set on each iteration)

# Constant variables
opt_params = {'NO_OF_PARTICLES': [NO_OF_PARTICLES],     # Number of independent agents searching the space
            'SEARCH_METHOD': [SEARCH_METHOD],           # int search 1 = basic_grid, 2 = random_search
            'MIN_RES': [MIN_RES],                       # min resolution for search
            'MAX_RES': [MAX_RES]}                       # max resolution for search


opt_df = pd.DataFrame(opt_params)
mySweep = sweep(LB, UB, TARGETS, TOL, MAXIT,
                        func_F, constr_F,
                        opt_df,
                        parent=parent)  

# arguments should take the form: 
# sweep([[float, float, ...]], [[float, float, ...]], [[float, ...]], float, int,
# func, func,
# dataFrame,
# class obj) 
#  
# opt_df contains class-specific tuning parameters
# NO_OF_PARTICLES: int
# SEARCH_METHOD: int. 1 = basic_grid, 2 = random_search
# MIN_RES: [[float, ...]]
# MAX_RES: [[float, ...]]
```

State Machine-based Structure

This optimizer uses a state machine structure to control the movement of the particles, call to the objective function, and the evaluation of current positions. The state machine implementation preserves the initial algorithm while making it possible to integrate other programs, classes, or functions as the objective function.

A controller with a while loop to check the completion status of the optimizer drives the process. Completion status is determined by at least 1) a set MAX number of iterations, and 2) the convergence to a given target using the L2 norm. Iterations are counted by calls to the objective function.

Within this while loop are three function calls to control the optimizer class:
* **complete**: the complete function checks the status of the optimizer and whether it has met the convergence or stop conditions.
* **step**: the step function takes a boolean variable (suppress_output) as an input to control detailed printouts of the current particle (or agent) status. This function moves the optimizer one step forward.
* **call_objective**: the call_objective function takes a boolean variable (allow_update) to control whether the objective function is able to be called. In most implementations, this value will always be true. However, there may be cases where the controller or a program running the state machine needs to assert control over this function without stopping the loop.

Additionally, get_convergence_data can be used to preview the current status of the optimizer, including the current best evaluation and the iteration count.

The code below is an example of this process:

```python
while not myOptimizer.complete():
    # step through optimizer processing
    # this will update particle or agent locations
    myOptimizer.step(suppress_output)
    # call the objective function, control
    # when it is allowed to update and return
    # control to optimizer
    myOptimizer.call_objective(allow_update)
    # check the current progress of the optimizer
    # iter: the number of objective function calls
    # eval: current 'best' evaluation of the optimizer
    iter, eval = myOptimizer.get_convergence_data()
    if (eval < best_eval) and (eval != 0):
        best_eval = eval

    # optional. if the optimizer is not printing out detailed 
    # reports, preview by checking the iteration and best evaluation

    if suppress_output:
        if iter%100 ==0: #print out every 100th iteration update
            print("Iteration")
            print(iter)
            print("Best Eval")
            print(best_eval)

```

Constraint Handling

Users must create their own constraint function for their problems, if there are constraints beyond the problem bounds. This is then passed into the constructor. If the default constraint function is used, it always returns true (which means there are no constraints).

Search Types

More search types will be added, but for initial deployment, a standard grid search is used.

basic_grid

The basic grid search uses the current position of a particle (or agent), and increments it one step towards the upper bounds based on the defined problem space. It can use 1 or more particles (or agents) to search a space. If one particle is used, it will start at the lower bound of the decision space, and increment based on the minimum resolution until the particle reaches the maximum boundary limit.

Resolution is a multi-dimensional vector to allow for tuning in all dimensions of the input space.

This method does not tend to converge with a small error tolerance.
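The stepping behavior described above can be sketched as an odometer-style increment, where each dimension advances by its resolution and wraps into the next dimension at the upper bound (illustrative only; the names are hypothetical, not sweep_python internals):

```python
# Illustrative grid-step sketch: advance the first dimension by its
# resolution; on reaching the upper bound, reset it and carry into
# the next dimension, like an odometer.
def grid_step(position, lb, ub, res):
    """Return the next grid position, or None when the space is exhausted."""
    position = list(position)
    for i in range(len(position)):
        position[i] += res[i]
        if position[i] <= ub[i]:
            return position
        position[i] = lb[i]  # wrap this dimension, carry into the next
    return None  # every dimension wrapped: search space exhausted

pos = grid_step([0.0, 0.0], [0.0, 0.0], [1.0, 1.0], [0.5, 0.5])
# pos -> [0.5, 0.0]
```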

random_search

The random search generates NO_OF_PARTICLES agents in order to search the defined problem space. Each agent is independent and does not move from its initial generated position.

Multi-Objective Optimization

The no-preference method of multi-objective optimization is used, but a Pareto front is not calculated. Instead, the best choice (smallest norm of the output vectors) is listed as the output.
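The smallest-norm selection can be sketched as follows (a standalone illustration, not the sweep_python internals):

```python
# Pick the "best" multi-objective result as the output vector with
# the smallest L2 norm (no Pareto front construction).
import numpy as np

def best_by_norm(output_vectors):
    """Return the output vector with the smallest L2 norm."""
    norms = np.linalg.norm(output_vectors, axis=1)
    return output_vectors[np.argmin(norms)]

outputs = np.array([[0.3, 0.4], [0.1, 0.1], [1.0, 0.0]])
best = best_by_norm(outputs)
# best -> [0.1, 0.1]  (norm ~0.141, smaller than 0.5 and 1.0)
```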

Objective Function Handling

The optimizer minimizes the absolute value of the difference of the target outputs and the evaluated outputs. Future versions may include options for function minimization when target values are absent.

Creating a Custom Objective Function

Custom objective functions can be used by creating a directory with the following files:
* configs_F.py
* constr_F.py
* func_F.py

configs_F.py contains lower bounds, upper bounds, the number of input variables, the number of output variables, the target values, and a global minimum if known. This file is used primarily for unit testing and evaluation of accuracy. If these values are not known, or are dynamic, then they can be included experimentally in the controller that runs the optimizer's state machine.

constr_F.py contains a function called constr_F that takes in an array, X, of particle positions to determine if the particle or agent is in a valid or invalid location.

func_F.py contains the objective function, func_F, which takes two inputs. The first input, X, is the array of particle or agent positions. The second input, NO_OF_OUTS, is the integer number of output variables, which is used to set the array size. In included objective functions, the default value is hardcoded to work with the specific objective function.

Below are examples of the format for these files.

configs_F.py:
```python
OBJECTIVE_FUNC = func_F
CONSTR_FUNC = constr_F
OBJECTIVE_FUNC_NAME = "one_dim_x_test.func_F"    # format: FUNCTION NAME.FUNCTION
CONSTR_FUNC_NAME = "one_dim_x_test.constr_F"     # format: FUNCTION NAME.FUNCTION

# problem dependent variables
LB = [[0]]          # Lower boundaries
UB = [[1]]          # Upper boundaries
IN_VARS = 1         # Number of input variables (x-values)
OUT_VARS = 1        # Number of output variables (y-values)
TARGETS = [0]       # Target values for output
GLOBAL_MIN = []     # Global minima sample, if they exist.
```

constr_F.py, with no constraints:
```python
def constr_F(x):
    F = True
    return F
```

constr_F.py, with constraints:
```python
def constr_F(X):
    F = True
    # objective function/problem constraints
    if (X[2] > X[0]/2) or (X[2] < 0.1):
        F = False
    return F
```

func_F.py:
```python
import numpy as np
import time

def func_F(X, NO_OF_OUTS=1):
    F = np.zeros((NO_OF_OUTS))
    noErrors = True
    try:
        x = X[0]
        F = np.sin(5 * x**3) + np.cos(5 * x) * (1 - np.tanh(x**2))
    except Exception as e:
        print(e)
        noErrors = False

    return [F], noErrors
```

Internal Objective Function Example

There are three functions included in the repository:
1) Himmelblau's function, which takes 2 inputs and has 1 output
2) A multi-objective function with 3 inputs and 2 outputs (see lundquist_3var)
3) A single-objective function with 1 input and 1 output (see one_dim_x_test)

Each function has four files in a directory:
1) configs_F.py - contains imports for the objective function and constraints, CONSTANT assignments for functions and labeling, boundary ranges, the number of input variables, the number of output values, and the target values for the output
2) constr_F.py - contains a function with the problem constraints, both for the function and for error handling in the case of under/overflow.
3) func_F.py - contains a function with the objective function.
4) graph.py - contains a script to graph the function for visualization.

Other multi-objective functions can be applied to this project by following the same format (several compatible functions have been collected into a library and will be released in a separate repo).

Himmelblau's function

Plotted Himmelblaus Function with 3D Plot on the Left, and a 2D Contour on the Right

```math
f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2
```

| Global Minima | Boundary | Constraints |
|----------|----------|----------|
| f(3, 2) = 0 | $-5 \leq x,y \leq 5$ | |
| f(-2.805118, 3.131312) = 0 | $-5 \leq x,y \leq 5$ | |
| f(-3.779310, -3.283186) = 0 | $-5 \leq x,y \leq 5$ | |
| f(3.584428, -1.848126) = 0 | $-5 \leq x,y \leq 5$ | |
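The four known minima can be checked numerically with a short standalone snippet (not part of the repository):

```python
# Verify that Himmelblau's four known minima all evaluate to ~0.
def himmelblau(x, y):
    return (x**2 + y - 11) ** 2 + (x + y**2 - 7) ** 2

minima = [(3.0, 2.0), (-2.805118, 3.131312),
          (-3.779310, -3.283186), (3.584428, -1.848126)]
errors = [himmelblau(x, y) for x, y in minima]  # all very close to 0
```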

Function Feasible Decision Space and Objective Space with Pareto Front

Plotted Multi-Objective Function Feasible Decision Space and Objective Space with Pareto Front

```math
\text{minimize}: \begin{cases} f_{1}(\mathbf{x}) = (x_1-0.5)^2 + (x_2-0.1)^2 \\ f_{2}(\mathbf{x}) = (x_3-0.2)^4 \end{cases}
```

| Num. Input Variables | Boundary | Constraints |
|----------|----------|----------|
| 3 | $0.21 \leq x_1 \leq 1$ <br> $0 \leq x_2 \leq 1$ <br> $0.1 \leq x_3 \leq 0.5$ | $x_3 > \frac{x_1}{2}$ or $x_3 < 0.1$ |

Function Feasible Decision Space and Objective Space with Pareto Front

Plotted Single Input, Single-objective Function Feasible Decision Space and Objective Space with Pareto Front

```math
f(\mathbf{x}) = \sin(5x^3) + \cos(5x)(1 - \tanh(x^2))
```

| Num. Input Variables | Boundary | Constraints |
|----------|----------|----------|
| 1 | $0 \leq x \leq 1$ | $0 \leq x \leq 1$ |

A local minimum is at $(0.444453, -0.0630916)$

The global minimum is at $(0.974857, -0.954872)$
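The stated global minimum can be checked numerically with a short standalone snippet (not part of the repository):

```python
# Evaluate f(x) = sin(5x^3) + cos(5x)(1 - tanh(x^2)) at the stated
# global minimizer and confirm the function value matches.
import math

def f(x):
    return math.sin(5 * x**3) + math.cos(5 * x) * (1 - math.tanh(x**2))

value = f(0.974857)  # ~ -0.954872, matching the stated global minimum
```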

Target vs. Threshold Configuration

As of April 2025, users can toggle between TARGET and THRESHOLD evaluation for the optimized values. The key variables for this are:

```python
# Boolean. Use target or threshold. True = THRESHOLD, False = EXACT TARGET
evaluate_threshold = True

# Array of target values
TARGETS = func_configs.TARGETS   # Target values for output from function configs
# OR:
TARGETS = [0, 0, 0]              # manually set BASED ON PROBLEM DIMENSIONS

# THRESHOLD has the same dims as TARGETS
# 0 = use target value as actual target. value should EQUAL target
# 1 = use as threshold. value should be LESS THAN OR EQUAL to target
# 2 = use as threshold. value should be GREATER THAN OR EQUAL to target

# DEFAULT THRESHOLD
THRESHOLD = np.zeros_like(TARGETS)
# OR
THRESHOLD = [0, 1, 2]            # can be any mix of TARGET and THRESHOLD
```

To implement this, the original self.Flist objective function calculation has been replaced with the function objective_function_evaluation, which returns a numpy array.

The original calculation:
```python
self.Flist = abs(self.targets - self.Fvals)
```
Where self.Fvals is a re-arranged and error-checked returned value from the passed-in function from func_F.py (see the examples for the internal objective function or creating a custom objective function).

When using a THRESHOLD, the Flist value corresponding to the target is set to epsilon (the smallest system value) if the evaluated func_F value meets the threshold condition for that target item. If the threshold is not met, the absolute value of the difference of the target output and the evaluated output is used. With a THRESHOLD configuration, each value in the numpy array is evaluated individually, so some values can be 'greater than or equal to' the target while others are 'equal' or 'less than or equal to' the target.
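A minimal sketch of this per-element threshold logic (illustrative only; the names follow the description above, not necessarily the repository internals):

```python
# Per-element error: set to machine epsilon when a threshold condition
# is met, otherwise fall back to |target - value|.
import sys
import numpy as np

def objective_function_evaluation(Fvals, targets, thresholds):
    """Sketch of threshold-aware error calculation described above."""
    Fvals = np.asarray(Fvals, dtype=float)
    targets = np.asarray(targets, dtype=float)
    Flist = np.abs(targets - Fvals)
    for i, mode in enumerate(thresholds):
        if mode == 1 and Fvals[i] <= targets[i]:    # "<=" threshold met
            Flist[i] = sys.float_info.epsilon
        elif mode == 2 and Fvals[i] >= targets[i]:  # ">=" threshold met
            Flist[i] = sys.float_info.epsilon
    return Flist

Flist = objective_function_evaluation([0.5, 0.2], [1.0, 0.0], [1, 0])
# first element meets its "<=" threshold; second is |0.0 - 0.2| = 0.2
```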

Example Implementations

Basic Sweep Example

main_test.py provides a sample use case of the optimizer.

Detailed Messages

main_test_details.py provides an example using a parent class, and the self.suppress_output flag to control error messages that are passed back to the parent class to be printed with a timestamp. This implementation sets up the hooks for integration with AntennaCAT in order to provide the user feedback of warnings and errors.

Realtime Graph

Example Grid Sweep Convergence (Attempt)

Grid Search. Left: particle search locations, Right: fitness function results (open circles), and target (red star)



Example Random Sweep Convergence (Attempt)

Random Search. Left: particle search locations, Right: fitness function results (open circles), and target (red star)



main_test_graph.py provides an example using a parent class and the self.suppress_output flag to control error messages that are passed back to the parent class to be printed with a timestamp. Additionally, a realtime graph shows particle locations at every step.

The figures above are snapshots of the search. The left side shows all of the search locations of a single particle (NOTE: toggle the 'clear' boolean to turn this feature off), and the right side shows the target (marked by a star) and the fitness function locations (the open circles). While the fitness of the particle is very close to the target, it does not come closer than the 10^-6 tolerance, so the search does not converge.

NOTE: if you close the graph as the code is running, the code will continue to run, but the graph will not re-open.

References

This repo does not currently reference any code or papers for the sweep algorithm.

Related Publications and Repositories

This software works as a stand-alone implementation, and as one of the optimizers integrated into AntennaCAT.

Licensing

The code in this repository has been released under GPL-2.0

How to Cite

The pre-May 2025 code can be referenced using the following DOI:

10.5281/zenodo.15048577

In IEEE format:

L. Linkous, "sweep_python". GitHub, 2024. [Software]. https://github.com/LC-Linkous/sweep_python. DOI: 10.5281/zenodo.15048577

Owner

  • Name: L. Linkous
  • Login: LC-Linkous
  • Kind: user
  • Location: Richmond
  • Company: VCU

Electrical Engineering Ph.D student

GitHub Events

Total
  • Push event: 6
Last Year
  • Push event: 6