LC-Linkous/sweep_python: an AntennaCAT-compatible sweep optimizer.
Simple sweep optimizer written in Python.
The approaches in this repository are exhaustive searches through a combination of hyperparameters (the inputs for the feasible decision space of the objective function). They're not the fastest, but they're kind of fun to watch.
- Sweep Optimization
- Requirements
- Implementation
- Error Handling
- Example Implementations
- References
- Related Publications and Repositories
- Licensing
- How to Cite
A grid-based sweep optimizer, often referred to as grid search, is a simple yet effective optimization technique commonly used for hyperparameter tuning in machine learning models. This method systematically explores a specified subset of the hyperparameter space by evaluating the performance of a model with all possible combinations of the provided hyperparameter values.
Random search is an optimization method where solutions are randomly sampled from a defined space, evaluated, and iteratively improved based on the evaluations, aiming to find an optimal or near-optimal solution. Random search is generally not as efficient as more advanced optimization algorithms like gradient-based methods or evolutionary algorithms, especially in problems where the search space is structured or the objective function has a particular shape that can be exploited.
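To make the contrast concrete, here is a small, generic sketch (not this repository's API; the objective and parameter names are made up for illustration): grid search enumerates every combination of listed values, while random search samples candidate points from the space.

```python
import itertools
import random

# hypothetical objective for illustration: smaller is better
def evaluate(params):
    return (params["lr"] - 0.01) ** 2 + (params["depth"] - 4) ** 2

# grid search: try every combination of the listed values
grid = {"lr": [0.001, 0.01, 0.1], "depth": [2, 4, 8]}
grid_best = min(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=evaluate,
)

# random search: sample candidates from ranges instead of enumerating them
random_best = min(
    ({"lr": random.uniform(0.001, 0.1), "depth": random.randint(2, 8)}
     for _ in range(20)),
    key=evaluate,
)

print(grid_best, random_best)
```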
This project requires numpy, pandas, and matplotlib for the full demos. To run the optimizer without visualization, only numpy and pandas are required.
Use `pip install -r requirements.txt` to install the following dependencies:
```
contourpy==1.2.1
cycler==0.12.1
fonttools==4.51.0
importlib_resources==6.4.0
kiwisolver==1.4.5
matplotlib==3.8.4
numpy==1.26.4
packaging==24.0
pandas==2.2.3
pillow==10.3.0
pyparsing==3.1.2
python-dateutil==2.9.0.post0
pytz==2025.1
six==1.16.0
tzdata==2025.1
zipp==3.18.1
```
Optionally, requirements can be installed manually with:
```
pip install matplotlib numpy pandas
```
This is an alternative for cases where the requirements.txt file has given you trouble; sometimes libraries are packaged together.
```python
import pandas as pd
# assumes the project-specific imports below; paths are illustrative
# import one_dim_x_test.configs_F as func_configs   # objective function configuration
# from sweep import sweep                            # the sweep optimizer class

# Constant variables
NO_OF_PARTICLES = 3       # Number of independent agents searching the space
MIN_RES = [0.02]          # min resolution for search
MAX_RES = [1.1]           # max resolution for search
TOL = 10 ** -6            # Convergence Tolerance
MAXIT = 50000             # Maximum allowed iterations
SEARCH_METHOD = 1         # int search 1 = basic_grid, 2 = random_search

# Objective function dependent variables
LB = func_configs.LB                    # Lower boundaries, [[0.21, 0, 0.1]]
UB = func_configs.UB                    # Upper boundaries, [[1, 1, 0.5]]
IN_VARS = func_configs.IN_VARS          # Number of input variables (x-values)
OUT_VARS = func_configs.OUT_VARS        # Number of output variables (y-values)
TARGETS = func_configs.TARGETS          # Target values for output
func_F = func_configs.OBJECTIVE_FUNC    # objective function
constr_F = func_configs.CONSTR_FUNC     # constraint function

best_eval = 3             # set higher than normal because of the potential for missing the target
parent = None             # Optional parent class for swarm
                          # (Used for passing debug messages or
                          # other information that will appear
                          # in GUI panels)
evaluate_threshold = True # use target or threshold. True = THRESHOLD, False = EXACT TARGET
suppress_output = True    # Suppress the console output of particle swarm
allow_update = True       # Allow objective call to update state
                          # (can be set on each iteration)

# Constant variables
opt_params = {'NO_OF_PARTICLES': [NO_OF_PARTICLES],  # Number of independent agents searching the space
              'SEARCH_METHOD': [SEARCH_METHOD],      # int search 1 = basic_grid, 2 = random_search
              'MIN_RES': [MIN_RES],                  # min resolution for search
              'MAX_RES': [MAX_RES]}                  # max resolution for search

opt_df = pd.DataFrame(opt_params)
mySweep = sweep(LB, UB, TARGETS, TOL, MAXIT, func_F, constr_F, opt_df, parent=parent)

# arguments should take the form:
# sweep([[float, float, ...]], [[float, float, ...]], [[float, ...]], float, int,
#       func, func,
#       dataFrame,
#       class obj)
#
# opt_df contains class-specific tuning parameters
# NO_OF_PARTICLES: int
# SEARCH_METHOD: int
# MIN_RES: [[float, ...]]
# MAX_RES: [[float, ...]]
```
This optimizer uses a state machine structure to control the movement of the particles, call to the objective function, and the evaluation of current positions. The state machine implementation preserves the initial algorithm while making it possible to integrate other programs, classes, or functions as the objective function.
A controller with a while loop that checks the completion status of the optimizer drives the process. Completion status is determined by at least 1) reaching a set MAX number of iterations, and 2) convergence to a given target using the L2 norm. Iterations are counted by calls to the objective function.
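As a rough sketch of those two stop conditions (illustrative only; the optimizer's internal check may differ in detail):

```python
import numpy as np

def is_complete(iteration, MAXIT, F_evaluated, TARGETS, TOL):
    # 1) hard stop after the maximum allowed number of objective function calls
    if iteration >= MAXIT:
        return True
    # 2) converged when the evaluated output is within TOL of the target (L2 norm)
    return np.linalg.norm(np.asarray(F_evaluated) - np.asarray(TARGETS)) < TOL
```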
Within this while loop are three function calls to control the optimizer class:
- complete: the complete function checks the status of the optimizer and whether it has met the convergence or stop conditions.
- step: the step function takes a boolean variable (suppress_output) as an input to control detailed printouts of the current particle (or agent) status. This function moves the optimizer one step forward.
- call_objective: the call_objective function takes a boolean variable (allow_update) to control whether the objective function can be called. In most implementations, this value will always be true. However, there may be cases where the controller or a program running the state machine needs to assert control over this function without stopping the loop.
Additionally, get_convergence_data can be used to preview the current status of the optimizer, including the current best evaluation and the number of iterations.
The code below is an example of this process:
```python
while not myOptimizer.complete():
    # step through optimizer processing
    # this will update particle or agent locations
    myOptimizer.step(suppress_output)

    # call the objective function, control
    # when it is allowed to update and return
    # control to optimizer
    myOptimizer.call_objective(allow_update)

    # check the current progress of the optimizer
    # iter: the number of objective function calls
    # eval: current 'best' evaluation of the optimizer
    iter, eval = myOptimizer.get_convergence_data()
    if (eval < best_eval) and (eval != 0):
        best_eval = eval

    # optional. if the optimizer is not printing out detailed
    # reports, preview by checking the iteration and best evaluation
    if suppress_output:
        if iter % 100 == 0:  # print out every 100th iteration update
            print("Iteration")
            print(iter)
            print("Best Eval")
            print(best_eval)
```
Users must create their own constraint function for their problem if there are constraints beyond the problem bounds. This function is then passed into the constructor. If the default constraint function is used, it always returns true (meaning there are no constraints).
More search types will be added, but for initial deployment, a standard grid search is used.
The basic grid search uses the current position of a particle (or agent), and increments it one step towards the upper bounds based on the defined problem space. It can use 1 or more particles (or agents) to search a space. If one particle is used, it will start at the lower bound of the decision space, and increment based on the minimum resolution until the particle reaches the maximum boundary limit.
Resolution is a multi-dimensional vector to allow for tuning in all dimensions of the input space.
This method does not tend to converge with a small error tolerance.
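As a simplified sketch of that stepping behavior (illustrative only, not the class internals), a single agent in one dimension advances from the lower bound by the minimum resolution until it passes the upper bound:

```python
# illustrative 1-D sweep with a single agent
lower_bound, upper_bound = 0.0, 1.0
min_resolution = 0.02

position = lower_bound
while position <= upper_bound:
    # the objective function would be evaluated at 'position' here
    position += min_resolution  # increment one step toward the upper bound
```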
The random search generates NO_OF_PARTICLES agents in order to search the defined problem space. Each agent is independent and does not move from its initial generated position.
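A minimal sketch of that sampling (illustrative bounds, not the repository's implementation):

```python
import numpy as np

NO_OF_PARTICLES = 3
LB = np.array([0.21, 0.0, 0.1])   # example lower bounds
UB = np.array([1.0, 1.0, 0.5])    # example upper bounds

# each agent is drawn uniformly inside the bounds and then stays at that position
agent_positions = LB + np.random.rand(NO_OF_PARTICLES, len(LB)) * (UB - LB)
```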
This follows the no-preference method of multi-objective optimization, but a Pareto front is not calculated. Instead, the best choice (the smallest norm of the output vectors) is reported as the output.
The optimizer minimizes the absolute value of the difference of the target outputs and the evaluated outputs. Future versions may include options for function minimization when target values are absent.
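Conceptually, and consistent with the Flist calculation shown later in the TARGET vs THRESHOLD section, the per-output error and a scalar fitness can be sketched as:

```python
import numpy as np

targets = np.array([0.0, 0.0])        # desired outputs
evaluated = np.array([0.12, -0.05])   # outputs returned by the objective function

Flist = np.abs(targets - evaluated)   # per-output absolute error
fitness = np.linalg.norm(Flist)       # scalar distance used to rank candidate positions
```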
Custom objective functions can be used by creating a directory with the following files:
- configs_F.py
- constr_F.py
- func_F.py
configs_F.py contains lower bounds, upper bounds, the number of input variables, the number of output variables, the target values, and a global minimum if known. This file is used primarily for unit testing and evaluation of accuracy. If these values are not known, or are dynamic, then they can be included experimentally in the controller that runs the optimizer's state machine.
constr_F.py contains a function called constr_F that takes in an array, X, of particle positions to determine if the particle or agent is in a valid or invalid location.
func_F.py contains the objective function, func_F, which takes two inputs. The first input, X, is the array of particle or agent positions. The second input, NO_OF_OUTS, is the integer number of output variables, which is used to set the array size. In the included objective functions, the default value is hardcoded to work with the specific objective function.
Below are examples of the format for these files.
configs_F.py:
```python
OBJECTIVE_FUNC = func_F
CONSTR_FUNC = constr_F
OBJECTIVE_FUNC_NAME = "one_dim_x_test.func_F"   # format: FUNCTION NAME.FUNCTION
CONSTR_FUNC_NAME = "one_dim_x_test.constr_F"    # format: FUNCTION NAME.FUNCTION

# problem dependent variables
LB = [[0]]        # Lower boundaries
UB = [[1]]        # Upper boundaries
IN_VARS = 1       # Number of input variables (x-values)
OUT_VARS = 1      # Number of output variables (y-values)
TARGETS = [0]     # Target values for output
GLOBAL_MIN = []   # Global minima sample, if they exist.
```
constr_F.py, with no constraints:
```python
def constr_F(x):
    F = True
    return F
```
constr_F.py, with constraints:
```python
def constr_F(X):
    F = True
    # objective function/problem constraints
    if (X[2] > X[0] / 2) or (X[2] < 0.1):
        F = False
    return F
```
func_F.py:
```python
import numpy as np
import time

def func_F(X, NO_OF_OUTS=1):
    F = np.zeros((NO_OF_OUTS))
    noErrors = True
    try:
        x = X[0]
        F = np.sin(5 * x**3) + np.cos(5 * x) * (1 - np.tanh(x**2))
    except Exception as e:
        print(e)
        noErrors = False
    return [F], noErrors
```
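For reference, calling the example func_F above with a single position might look like this (a usage sketch; the position value is arbitrary):

```python
import numpy as np

# hypothetical single-agent position in [0, 1]; func_F is the function defined above
[F], no_errors = func_F(np.array([0.5]), NO_OF_OUTS=1)
print(F, no_errors)
```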
There are three functions included in the repository:
- Himmelblau's function, which takes 2 inputs and has 1 output
- A multi-objective function with 3 inputs and 2 outputs (see lundquist_3_var)
- A single-objective function with 1 input and 1 output (see one_dim_x_test)
Each function has four files in a directory:
- configs_F.py - contains imports for the objective function and constraints, CONSTANT assignments for functions and labeling, boundary ranges, the number of input variables, the number of output values, and the target values for the output
- constr_F.py - contains a function with the problem constraints, both for the function and for error handling in the case of under/overflow.
- func_F.py - contains a function with the objective function.
- graph.py - contains a script to graph the function for visualization.
Other multi-objective functions can be applied to this project by following the same format. Several have been collected into a compatible library and will be released in a separate repository.
Plotted Himmelblau’s Function with 3D Plot on the Left, and a 2D Contour on the Right
| Global Minima | Boundary | Constraints |
|---|---|---|
| f(3, 2) = 0 | | |
| f(-2.805118, 3.121212) = 0 | | |
| f(-3.779310, -3.283186) = 0 | | |
| f(3.584428, -1.848126) = 0 | | |
Plotted Multi-Objective Function Feasible Decision Space and Objective Space with Pareto Front
| Num. Input Variables | Boundary | Constraints |
|---|---|---|
| 3 | | |
Plotted Single Input, Single-objective Function Feasible Decision Space and Objective Space with Pareto Front
| Num. Input Variables | Boundary | Constraints |
|---|---|---|
| 1 | | |
Local minima at
Global minima at
An April 2025 feature is the user ability to toggle TARGET and THRESHOLD evaluation for the optimized values. The key variables for this are:
```python
import numpy as np  # needed for np.zeros_like below

# Boolean. use target or threshold. True = THRESHOLD, False = EXACT TARGET
evaluate_threshold = True

# array
TARGETS = func_configs.TARGETS  # Target values for output from function configs
# OR:
TARGETS = [0, 0, 0]             # manually set BASED ON PROBLEM DIMENSIONS

# threshold is same dims as TARGETS
# 0 = use target value as actual target. value should EQUAL target
# 1 = use as threshold. value should be LESS THAN OR EQUAL to target
# 2 = use as threshold. value should be GREATER THAN OR EQUAL to target

# DEFAULT THRESHOLD
THRESHOLD = np.zeros_like(TARGETS)
# OR
THRESHOLD = [0, 1, 2]           # can be any mix of TARGET and THRESHOLD
```
To implement this, the original self.Flist objective function calculation has been replaced with the function objective_function_evaluation, which returns a numpy array.
The original calculation:
```python
self.Flist = abs(self.targets - self.Fvals)
```
Here self.Fvals is a re-arranged and error-checked value returned from the function passed in from func_F.py (see the examples for the internal objective function or for creating a custom objective function).
When using a THRESHOLD, theFlist value corresponding to the target is set to epsilon (the smallest system value) if the evaluatedfunc_F value meets the threshold condition for that target item. If the threshold is not met, the absolute value of the difference of the target output and the evaluated output is used. With a THRESHOLD configuration, each value in the numpy array is evaluated individually, so some values can be 'greater than or equal to' the target while others are 'equal' or 'less than or equal to' the target.
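A hedged sketch of that per-target logic (the actual objective_function_evaluation method may differ in detail; the helper name below is made up):

```python
import numpy as np

def threshold_aware_error(Fvals, targets, thresholds):
    # illustrative reimplementation of the behavior described above
    Fvals = np.asarray(Fvals, dtype=float)
    targets = np.asarray(targets, dtype=float)
    Flist = np.abs(targets - Fvals)   # default: exact-target absolute error
    eps = np.finfo(float).eps         # entries that meet their threshold collapse to epsilon
    for i, mode in enumerate(thresholds):
        if mode == 1 and Fvals[i] <= targets[i]:    # LESS THAN OR EQUAL threshold met
            Flist[i] = eps
        elif mode == 2 and Fvals[i] >= targets[i]:  # GREATER THAN OR EQUAL threshold met
            Flist[i] = eps
    return Flist

# example: output 0 must equal 0, output 1 must be <= 0.5, output 2 must be >= 1.0
print(threshold_aware_error([0.2, 0.3, 1.4], [0.0, 0.5, 1.0], [0, 1, 2]))
```

With these example inputs, only the first output still contributes error; the other two meet their thresholds and collapse to epsilon.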
main_test.py provides a sample use case of the optimizer.
main_test_details.py provides an example using a parent class and the self.suppress_output flag to control error messages that are passed back to the parent class to be printed with a timestamp. This implementation sets up the hooks for integration with AntennaCAT in order to provide the user with feedback on warnings and errors.
Grid Search. Left: particle search locations, Right: fitness function results (open circles), and target (red star)
Random Search. Left: particle search locations, Right: fitness function results (open circles), and target (red star)
main_test_graph.py provides an example using a parent class and the self.suppress_output flag to control error messages that are passed back to the parent class to be printed with a timestamp. Additionally, a real-time graph shows particle locations at every step.
The figures above are snapshots of the search. The left side shows all of the search locations of a single particle (NOTE: toggle the 'clear' boolean to turn this feature off), and the right side shows the target (marked by a star) and the fitness function locations (the open circles). While the fitness of the particle is very close to the target, it does not come closer than the 10^-6 tolerance, so the search does not converge.
NOTE: if you close the graph as the code is running, the code will continue to run, but the graph will not re-open.
This repo does not currently reference any code or papers for the sweep algorithm.
This software works as a stand-alone implementation, and as one of the optimizers integrated into AntennaCAT.
The code in this repository has been released under GPL-2.0.
The pre-May 2025 code can be referenced using the following DOI:
10.5281/zenodo.15048577
In IEEE format:
L. Linkous, "sweep_python". GitHub, 2024. [Software]. https://github.com/LC-Linkous/sweep_python. DOI: 10.5281/zenodo.15048577