daps
Dimensionally adaptive prime search for discontinuous, non-smooth, and multi-modal landscapes where traditional methods might fail.
Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (16.1%) to scientific vocabulary
Repository
Dimensionally adaptive prime search for discontinuous, non-smooth, and multi-modal landscapes where traditional methods might fail.
Basic Info
Statistics
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 2
- Releases: 0
Metadata Files
README.md
DAPS - Dimensionally Adaptive Prime Search

A high-performance global optimization algorithm for 1D, 2D, and 3D functions, implemented in C++ with Python bindings.
How It Works
DAPS uses prime number-based grid sampling to avoid aliasing problems common in regular grid search methods. It dynamically adapts resolution and shrinks the search domain around promising regions. It assumes a measurable loss function at every evaluation. Primes are treated as resolution knobs and can be increased or decreased depending on the degree of accuracy needed.
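The adaptive prime-grid idea can be sketched in plain NumPy (an illustrative toy, not the library's C++ core; the function names and the short prime list here are assumptions for the sketch): evaluate the objective on a p × p grid with prime p, keep the best point, shrink the box around it, and repeat with the next prime.

```python
import numpy as np

# Prime grid resolutions: consecutive primes share no common factor with each
# other or with any fixed period in the objective, which is the anti-aliasing argument.
PRIMES = [5, 7, 11, 13, 17, 19, 23, 29, 31]

def prime_grid_minimize(f, lo, hi, shrink=0.5, tol=1e-12):
    """Toy 2D prime-grid search: evaluate on a p x p grid, then shrink around the best point."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    best_x, best_val = None, np.inf
    for p in PRIMES:
        xs = np.linspace(lo[0], hi[0], p)
        ys = np.linspace(lo[1], hi[1], p)
        X, Y = np.meshgrid(xs, ys)
        vals = f(X, Y)
        i, j = np.unravel_index(np.argmin(vals), vals.shape)
        if vals[i, j] < best_val:
            best_val, best_x = vals[i, j], np.array([X[i, j], Y[i, j]])
        span = (hi - lo) * shrink          # shrink the box around the current best
        lo, hi = best_x - span / 2, best_x + span / 2
        if best_val < tol:
            break
    return best_x, best_val

# Sphere function: the global minimum is 0 at the origin.
best_x, best_val = prime_grid_minimize(lambda x, y: x**2 + y**2, [-5, -5], [5, 5])
```

The real algorithm adapts the prime up or down per iteration rather than walking a fixed list, but the grid-evaluate-shrink loop is the same shape.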
For theoretical details, see the research paper.
🎧 DAPS Podcast Episode
Listen to the introduction of Dimensionally Adaptive Prime Search (DAPS) — the story, the math, and the future: Podcast
Overview
DAPS efficiently finds global minima of complex functions using a prime number-based adaptive grid search strategy. It excels at navigating complex landscapes with multiple local minima, valleys, and discontinuities.
Key Features
- Multi-Dimensional: Optimize functions in 1D, 2D, or 3D spaces
- High Performance: C++ core with Cython bindings
- Global Optimization: Designed to escape local minima
- Adaptive Resolution: Dynamically adjusts search precision
- SciPy Compatible: Familiar interface for easy integration
Quick Start
```bash
# Install from PyPI
pip install daps

# Or install from source
git clone https://github.com/sethuiyer/DAPS.git
cd DAPS
pip install -e .
```
Basic Usage
```python
from daps import daps_minimize

# 1D optimization example
result = daps_minimize(
    'sphere_function',
    bounds=[-5, 5],
    options={'dimensions': 1, 'maxiter': 50}
)

print(f"Optimal solution: {result['x']}, value: {result['fun']}")
```
Custom Functions
```python
from daps import daps_minimize, DAPSFunction
import numpy as np

# Define a custom 2D function
def himmelblau(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

# Wrap in a DAPSFunction with metadata
func = DAPSFunction(
    func=himmelblau,
    name="Himmelblau",
    bounds=[-5, 5, -5, 5],
    dimensions=2,
    description="Classic test function with four identical local minima"
)

# Optimize
result = daps_minimize(func, options={'maxiter': 80})
print(f"Optimal point: ({result['x'][0]:.4f}, {result['x'][1]:.4f})")
```
⚠️ Development Status
The pure Python implementation (base.py) is fully functional. C++/Cython integration and packaging are under active development.
Interactive Demo
```bash
cd interactive
./run_demo.sh   # Linux/Mac
# or
run_demo.bat    # Windows
```
Here’s a PyTorch-compatible DAPS optimizer:
- Starts at prime=97
- Never drops below prime=2
- Works for n‑dimensional functions in batch (GPU‑ready)
- Adapts prime resolution, shrinks domain, and clamps to your original bounds
```python
import torch
from sympy import primerange

class DAPSOptimizerTorch:
    def __init__(self, bounds, device='cpu', prime_start=97):
        primes = list(primerange(2, 500))
        self.prime_list = primes
        self.prime_idx = primes.index(prime_start)
        self.min_idx = 0
        self.max_idx = len(primes) - 1
        self.device = torch.device(device)
        # float dtype so the domain can be updated in place below
        self.bounds = torch.tensor(bounds, dtype=torch.float32,
                                   device=self.device).view(-1, 2)

    def optimize(self, func, maxiter=10, samples=1000, shrink=0.5, tol=1e-6):
        domain = self.bounds.clone()
        best_val, best_x = float('inf'), None
        for _ in range(maxiter):
            p = self.prime_list[self.prime_idx]  # current prime (resolution knob)
            # batch-evaluate random points in the current domain (GPU-ready)
            pts = domain[:, 0] + (domain[:, 1] - domain[:, 0]) * torch.rand(
                samples, self.bounds.size(0), device=self.device)
            vals = func(pts).flatten()
            idx = torch.argmin(vals)
            val, x = vals[idx].item(), pts[idx]
            if val < best_val:
                best_val, best_x = val, x.clone()
                self.prime_idx = min(self.prime_idx + 1, self.max_idx)  # refine
            else:
                self.prime_idx = max(self.prime_idx - 1, self.min_idx)  # coarsen
            # shrink the domain around the best point, clamped to original bounds
            span = domain[:, 1] - domain[:, 0]
            domain[:, 0] = torch.max(self.bounds[:, 0], best_x - span * shrink / 2)
            domain[:, 1] = torch.min(self.bounds[:, 1], best_x + span * shrink / 2)
            if best_val < tol:
                break
        return best_x.cpu().numpy(), best_val
```
🔥 Usage Example
```python
import torch

# 3D Rosenbrock as a torch batch function
def rosenbrock_batch(X):
    x, y, z = X[:, 0], X[:, 1], X[:, 2]
    return (100 * (y - x**2)**2 + (1 - x)**2 + 100 * (z - y**2)**2).unsqueeze(1)

bounds = [-5, 5, -5, 5, -5, 5]
opt = DAPSOptimizerTorch(bounds, device='cpu', prime_start=97)
best_point, best_val = opt.optimize(rosenbrock_batch, maxiter=20, samples=2000)
print(best_point, best_val)
```
That’s your n‑dimensional, GPU‑ready, prime‑adaptive optimizer.
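The prime stepping in `optimize` (move toward a finer resolution after an improvement, coarser after a miss, clamped to both ends of the list) can be exercised in isolation. `step_prime_idx` below is a hypothetical helper mirroring that rule, not part of the class, and the prime list is truncated for the sketch:

```python
# Primes around the default start of 97 (the class uses primerange(2, 500)).
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43,
          47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97, 101, 103]

def step_prime_idx(idx, improved):
    """One adaptation step: up on improvement, down otherwise, clamped to the list."""
    if improved:
        return min(idx + 1, len(PRIMES) - 1)
    return max(idx - 1, 0)

idx = PRIMES.index(97)             # start at prime=97
idx = step_prime_idx(idx, True)    # an improvement refines the resolution
assert PRIMES[idx] == 101
idx = step_prime_idx(idx, False)   # a miss steps it back down
assert PRIMES[idx] == 97
assert PRIMES[step_prime_idx(0, False)] == 2  # never drops below prime=2
```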
Citation
```bibtex
@article{iyerpreprintprime,
  title={Prime-Adaptive Search (PAS): A Novel Method for Efficient Optimization in Discontinuous Landscapes},
  author={Iyer, Sethu},
  year={2025},
  url={https://github.com/sethuiyer/DAPS},
}
```
License
MIT License - See LICENSE file for details.
Owner
- Name: Sethu Iyer
- Login: sethuiyer
- Kind: user
- Repositories: 26
- Profile: https://github.com/sethuiyer
Data Scientist at Reliance Jio. Previously R&D Engineer at Amelia. BITS Pilani Math and CSE
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you use this software, please cite it using the metadata from this file."
title: "DAPS: Dimensionally Adaptive Prime Search Optimizer"
authors:
  - family-names: "Iyer"
    given-names: "Sethu"
    orcid: "https://orcid.org/0000-0000-0000-0000"  # Replace with actual ORCID if available
repository-code: "https://github.com/sethuiyer/DAPS"
abstract: "A high-performance global optimization algorithm for 3D functions, implemented in C++ with Python bindings via Cython."
keywords:
  - optimization
  - global-optimization
  - 3d-optimization
  - cython
  - c++
license: MIT
version: 0.1.0
date-released: "2023-01-01"  # Replace with actual release date
type: software
preferred-citation:
  type: software
  authors:
    - family-names: "Iyer"
      given-names: "Sethu"
      orcid: "https://orcid.org/0000-0000-0000-0000"  # Replace with actual ORCID if available
  title: "DAPS: Dimensionally Adaptive Prime Search Optimizer"
  year: 2023
  repository-code: "https://github.com/sethuiyer/DAPS"
  abstract: "DAPS (Dimensionally Adaptive Prime Search) is a novel optimization algorithm designed for efficiently finding global minima of complex 3D functions. It utilizes a unique prime number-based grid search strategy with adaptive refinement to navigate complex objective function landscapes with multiple local minima, valleys, and cliffs."
  license: MIT
  version: 0.1.0
```
GitHub Events
Total
- Watch event: 1
- Public event: 1
- Push event: 17
- Pull request event: 2
- Create event: 2
Last Year
- Watch event: 1
- Public event: 1
- Push event: 17
- Pull request event: 2
- Create event: 2
Issues and Pull Requests
Last synced: 10 months ago
All Time
- Total issues: 0
- Total pull requests: 2
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 1
- Average comments per issue: 0
- Average comments per pull request: 0.0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 2
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 1
- Average comments per issue: 0
- Average comments per pull request: 0.0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
- sethuiyer (4)