
Commit

[FEAT][FIREFLY]
Your Name committed Sep 14, 2024
1 parent 9a0ca50 commit 59ff0b2
Showing 8 changed files with 299 additions and 27 deletions.
50 changes: 49 additions & 1 deletion README.md
@@ -199,6 +199,27 @@ print(output)

### Firefly

An exploration of the Firefly algorithm (a generalization of particle swarm optimization) in PyTorch. We are particularly interested in hybrid <a href="https://academic.oup.com/jcde/article/9/2/706/6566441">firefly + genetic algorithms</a> and in <a href="https://www.sciencedirect.com/science/article/abs/pii/S0957417423005298">gender-based</a> variants. This code was adapted from lucidrains.

```python
from swarms_torch.firefly import FireflyOptimizer
from torch import Tensor


def rosenbrock(x: Tensor) -> Tensor:
    # Classic Rosenbrock benchmark: non-convex, with a global minimum of 0 at x = (1, ..., 1).
    return (
        100 * (x[..., 1:] - x[..., :-1] ** 2) ** 2 + (1 - x[..., :-1]) ** 2
    ).sum(dim=-1)


if __name__ == "__main__":
optimizer = FireflyOptimizer(cost_function=rosenbrock)
optimizer.optimize()
best_solution = optimizer.get_best_solution()
print(f"Best solution: {best_solution}")

```




@@ -239,4 +260,31 @@ Help us accelerate our backlog by supporting us financially! Note, we're an open
<a href="https://polar.sh/kyegomez"><img src="https://polar.sh/embed/fund-our-backlog.svg?org=kyegomez" /></a>

# License
MIT


## Citations

```bibtex
@article{Yang2018WhyTF,
title = {Why the Firefly Algorithm Works?},
author = {Xin-She Yang and Xingshi He},
journal = {ArXiv},
year = {2018},
volume = {abs/1806.01632},
url = {https://api.semanticscholar.org/CorpusID:46940737}
}
```

```bibtex
@article{ElShorbagy2022Hybrid,
    author = {El-Shorbagy, M. and Elrefaey, Adel},
    title = {A hybrid genetic-firefly algorithm for engineering design problems},
    journal = {Journal of Computational Design and Engineering},
    volume = {9},
    number = {2},
    pages = {706--730},
    year = {2022},
    month = {04},
    doi = {10.1093/jcde/qwac013}
}
```
157 changes: 157 additions & 0 deletions docs/swarms/firefly.md
@@ -0,0 +1,157 @@
# FireflyOptimizer

```python
class FireflyOptimizer(cost_function, steps=5000, species=4, population_size=1000, dimensions=10, lower_bound=-4.0, upper_bound=4.0, mix_species_every=25, beta0=2.0, gamma=1.0, alpha=0.1, alpha_decay=0.995, use_genetic_algorithm=False, breed_every=10, tournament_size=100, num_children=500, use_cuda=True, verbose=True)
```

The `FireflyOptimizer` class implements the Firefly Algorithm to minimize a given objective function. It simulates the flashing behavior of fireflies to explore the search space efficiently.
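
The parameters below map onto the classic firefly update rule: each firefly moves toward brighter (lower-cost) fireflies with attractiveness `beta0 * exp(-gamma * r^2)` for distance `r`, plus a random perturbation scaled by `alpha` that is annealed by `alpha_decay`. The snippet below is a minimal, illustrative sketch of that rule in PyTorch; it is not the class's actual internals, which operate over multiple species and the full population.

```python
import torch


def firefly_step(positions, costs, beta0=2.0, gamma=1.0, alpha=0.1):
    """One illustrative firefly update over an (n, d) swarm (not the class internals)."""
    n, d = positions.shape
    new_positions = positions.clone()
    for i in range(n):
        for j in range(n):
            if costs[j] < costs[i]:  # firefly j is "brighter" (lower cost), so i moves toward it
                r2 = torch.sum((positions[j] - positions[i]) ** 2)
                beta = beta0 * torch.exp(-gamma * r2)  # attractiveness decays with squared distance
                new_positions[i] += beta * (positions[j] - positions[i])
        new_positions[i] += alpha * (torch.rand(d) - 0.5)  # random walk term, shrunk over steps via alpha_decay
    return new_positions
```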

## Parameters

- **cost_function** (callable):
The objective function to minimize. Should accept a `torch.Tensor` and return a `torch.Tensor` of costs.

- **steps** (int, optional):
Number of optimization steps. Default: `5000`.

- **species** (int, optional):
Number of species in the population. Default: `4`.

- **population_size** (int, optional):
Number of fireflies in each species. Default: `1000`.

- **dimensions** (int, optional):
Dimensionality of the search space. Default: `10`.

- **lower_bound** (float, optional):
Lower bound of the search space. Default: `-4.0`.

- **upper_bound** (float, optional):
Upper bound of the search space. Default: `4.0`.

- **mix_species_every** (int, optional):
Interval (in steps) to mix species. Default: `25`.

- **beta0** (float, optional):
Base attractiveness coefficient. Default: `2.0`.

- **gamma** (float, optional):
Light absorption coefficient controlling intensity decay. Default: `1.0`.

- **alpha** (float, optional):
Randomness scaling factor. Default: `0.1`.

- **alpha_decay** (float, optional):
Decay rate of `alpha` per step. Default: `0.995`.

- **use_genetic_algorithm** (bool, optional):
Whether to include genetic algorithm operations. Default: `False`.

- **breed_every** (int, optional):
Steps between breeding operations when using genetic algorithm. Default: `10`.

- **tournament_size** (int, optional):
Number of participants in each tournament selection. Default: `100`.

- **num_children** (int, optional):
Number of offspring produced during breeding. Default: `500`.

- **use_cuda** (bool, optional):
Use CUDA for computations if available. Default: `True`.

- **verbose** (bool, optional):
Print progress messages during optimization. Default: `True`.

## Attributes

| Attribute | Type | Description |
|--------------------|-----------------|--------------------------------------------------------|
| `fireflies` | `torch.Tensor` | Positions of the fireflies in the search space. |
| `device` | `torch.device` | Device used for computations (`cpu` or `cuda`). |
| `current_alpha` | `float` | Current value of `alpha` during optimization. |
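
For example, assuming an `optimizer` constructed as in the usage example at the end of this page, the state can be inspected directly (the exact shape of `fireflies` depends on `species`, `population_size`, and `dimensions`):

```python
print(optimizer.device)           # cuda if use_cuda=True and CUDA is available, otherwise cpu
print(optimizer.current_alpha)    # starts at `alpha` and shrinks by `alpha_decay` after each step
print(optimizer.fireflies.shape)  # determined by species, population_size, and dimensions
```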

## Methods

### `optimize()`

Runs the optimization loop for the specified number of steps.

**Example:**

```python
optimizer.optimize()
```

### `get_best_solution()`

Retrieves the best solution found by the optimizer.

**Returns:**

- **best_firefly** (`torch.Tensor`):
The best solution vector found.

**Example:**

```python
best_solution = optimizer.get_best_solution()
print(f"Best solution: {best_solution}")
```

### `generate()`

Generates a new set of fireflies, reinitializing their positions.

**Returns:**

- **fireflies** (`torch.Tensor`):
The new set of fireflies.

**Example:**

```python
optimizer.generate()
```

### `reset()`

Resets the optimizer to its initial state, including `alpha` and firefly positions.

**Example:**

```python
optimizer.reset()
```

---

**Note:** The Firefly Algorithm is inspired by the flashing behavior of fireflies and is suited to continuous optimization problems. This implementation supports multiple species and optional genetic-algorithm operations (tournament selection and breeding) that can improve exploration.
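
When `use_genetic_algorithm=True`, the population is additionally refreshed every `breed_every` steps via tournament selection and breeding. The exact operators are internal to the class, so the sketch below is only an illustration of that general pattern, reusing the `tournament_size` and `num_children` parameter names:

```python
import torch


def breed(positions, costs, tournament_size=100, num_children=500):
    """Illustrative tournament selection + uniform crossover (not the class's actual operators)."""
    n, d = positions.shape

    def pick_parent():
        contenders = torch.randint(0, n, (tournament_size,))  # sample contenders with replacement
        winner = contenders[torch.argmin(costs[contenders])]  # lowest-cost contender wins
        return positions[winner]

    children = torch.empty(num_children, d)
    for c in range(num_children):
        mom, dad = pick_parent(), pick_parent()
        mask = torch.rand(d) < 0.5  # uniform crossover: take each coordinate from either parent
        children[c] = torch.where(mask, mom, dad)
    return children
```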

**Example Usage:**

```python
from swarms_torch.firefly import FireflyOptimizer
from torch import Tensor


def rosenbrock(x: Tensor) -> Tensor:
return (
100 * (x[..., 1:] - x[..., :-1] ** 2) ** 2 + (1 - x[..., :-1]) ** 2
).sum(dim=-1)


if __name__ == "__main__":
optimizer = FireflyOptimizer(
cost_function=rosenbrock,
steps=100,
species=10,
population_size=100,
dimensions=10,
lower_bound=-4,
upper_bound=4,
# Many more parameters can be set, see the documentation for more details
)
optimizer.optimize()
best_solution = optimizer.get_best_solution()
print(f"Best solution: {best_solution}")
```
24 changes: 24 additions & 0 deletions examples/fire_fly_example.py
@@ -0,0 +1,24 @@
from swarms_torch.firefly import FireflyOptimizer
from torch import Tensor


def rosenbrock(x: Tensor) -> Tensor:
return (
100 * (x[..., 1:] - x[..., :-1] ** 2) ** 2 + (1 - x[..., :-1]) ** 2
).sum(dim=-1)


if __name__ == "__main__":
optimizer = FireflyOptimizer(
cost_function=rosenbrock,
steps=100,
species=10,
population_size=100,
dimensions=10,
lower_bound=-4,
upper_bound=4,
# Many more parameters can be set, see the documentation for more details
)
optimizer.optimize()
best_solution = optimizer.get_best_solution()
print(f"Best solution: {best_solution}")
2 changes: 2 additions & 0 deletions swarms_torch/__init__.py
@@ -12,6 +12,7 @@
    Particle,
    TransformerParticleSwarmOptimization,
)
from swarms_torch.firefly import FireflyOptimizer
from swarms_torch.structs import * # noqa

__all__ = [
@@ -28,4 +29,5 @@
"TransformerParticleSwarmOptimization",
"HivemindSwarm",
"MixtureOfMambas",
"FireflyOptimizer",
]
2 changes: 2 additions & 0 deletions swarms_torch/drone_swarm.py
@@ -309,10 +309,12 @@ def forward(
        final_neighborhood_embedding = self.neighbor_mlp(obs_neighbors)
        return final_neighborhood_embedding


@dataclass
class SwarmMultiHeadAttentionEncoder(nn.Module):
    dim: int


@dataclass
class QuadSingleHeadAttentionEncoderSim2Real(SwarmMultiHeadAttentionEncoder):
    obs_space: int