Add results

Peter Vacho 2025-03-16 22:10:12 +01:00
parent 16d16c7559
commit 3538efa66e
Signed by: school
GPG key ID: 8CFC3837052871B4
13 changed files with 114 additions and 0 deletions

PROTOCOL.md (new file, +114 lines)

@@ -0,0 +1,114 @@
<!--
I decided to write this protocol in simple markdown, as I find it the easiest.
To view it with the images rendered, you will need a markdown viewer on your system.
Alternatively, you can see it rendered on my Forgejo page for this repository:
https://git.itsdrike.com/ap8mi/task2/src/branch/main/PROTOCOL.md
You can also look at the images directly; they're in the `imgs` directory, and their
file names do a fairly good job of explaining the settings used.
-->
# Protocol
In this task, we were supposed to implement two optimization algorithms in an effort to find the minimum value of the
functions that we worked with in the last task. The two algorithms were Local Search (LS) and Stochastic Hill
Climbing (SHC). Both of these are quite similar; the only difference between them is that LS includes the center
(current) point when picking the minimum, while SHC only considers the newly sampled points.
From my testing, SHC usually performed worse with the same number of iterations, because LS stayed focused on
refining the lowest point it had found, while SHC often jumped away to other regions instead.
That said, this same behavior means SHC actually performed better in cases where LS got stuck in a local minimum, as
LS can't recover from that, while SHC can.
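To make the difference concrete, below is a minimal sketch of the two algorithms as described above. This is not the
actual code from the repository; the `search` function, its parameters and the bounds handling are illustrative,
assuming neighbors are sampled from a normal distribution centered on the current point.
```python
import numpy as np

def search(f, x0, bounds, iterations, num_neighbors, std_dev, include_center):
    """Generic neighborhood-search loop.

    include_center=True  -> Local Search (LS): the current point competes with its
                            neighbors, so the best value found never gets worse.
    include_center=False -> Stochastic Hill Climbing (SHC): only the newly sampled
                            neighbors are considered, so the search can move to a
                            worse point and thereby escape a local minimum.
    """
    low, high = bounds
    x = np.asarray(x0, dtype=float)
    history = [f(x)]
    for _ in range(iterations):
        # Sample neighbors from a normal distribution centered on the current point.
        neighbors = np.random.normal(loc=x, scale=std_dev, size=(num_neighbors, x.size))
        neighbors = np.clip(neighbors, low, high)  # keep neighbors inside the search space
        candidates = np.vstack([x, neighbors]) if include_center else neighbors
        values = np.apply_along_axis(f, 1, candidates)
        x = candidates[np.argmin(values)]  # move to the best candidate
        history.append(f(x))
    return x, history
```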
## Output (graphs)
Below are a few examples of the runs I performed:
### Sphere
I first ran the algorithm on the sphere function, with the following settings:
![img](./imgs/1_config.png)
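For reference, the sphere function being minimized is the standard one (shown below as a sketch, in case the exact
implementation in the repository differs slightly):
```python
def sphere(x):
    # Sum of squares; the global minimum is 0 at the origin.
    return sum(xi ** 2 for xi in x)
```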
For Local Search, this was the resulting graph:
![img](./imgs/1_ls.png)
For Stochastic Hill Climbing, the result was this graph:
![img](./imgs/1_shc.png)
### Rosenbrock
Next, I tried the Rosenbrock function, with the following settings:
![img](./imgs/2_config.png)
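For reference again, this is the standard Rosenbrock function, assuming the usual constants a=1 and b=100 (the
repository's implementation may differ slightly):
```python
def rosenbrock(x):
    # Standard Rosenbrock function (a=1, b=100); the global minimum is 0 at (1, ..., 1).
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))
```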
For LS, I got:
![img](./imgs/2_ls.png)
For SHC, I got:
![img](./imgs/2_shc.png)
### Further experimenting with Rosenbrock
I also tried adjusting some of the settings with the Rosenbrock function, to compare how they affect the LS and SHC
algorithms. Each time, I modified only a single setting, with the rest matching the original Rosenbrock configuration
(already shown above).
#### 1000 iterations
I first tried increasing the number of iterations to 1000 (from 30).
For LS, I got:
![img](./imgs/2_ls_iterations-1000.png)
For SHC, I got:
![img](./imgs/2_shc_iterations-1000.png)
#### 100 neighbors
I then tried increasing the amount of neighbor points to 100 (from 3).
For LS, I got:
![img](./imgs/2_ls_neighbors-100.png)
For SHC, I got:
![img](./imgs/2_shc_neighbors-100.png)
#### Standard deviation of 30
Finally, I tried increasing the standard deviation of the normal distribution to 30 (from 1.5).
For LS, I got:
![img](./imgs/2_ls_stddev-30.png)
For SHC, I got:
![img](./imgs/2_shc_stddev-30.png)
### Summary
I tested a number of additional cases locally, but I didn't want to clutter this report with all of those images.
From that testing, I found that in most cases (a small sketch comparing the configurations above follows this list):
- The standard deviation should generally be fairly small, so that the search stays precise around the minimum, but
  not so small that the algorithm cannot explore enough of the search space in a single iteration. It's necessary to
  find a value that balances these two concerns.
- The number of iterations only matters up to a certain point. For LS, once a point close to a local minimum is found,
additional iterations may only refine the solution slightly, as the probability of finding a better point gets
progressively smaller. For SHC, increasing iterations generally improves the chances of escaping local minima and
potentially finding a better solution, especially when the standard deviation allows for exploration.
- The number of neighbors is a key factor in performance. Generally, increasing the number of neighbors allows the
algorithm to explore more options and find the optimal point faster. However, there are diminishing returns beyond a
certain point, where adding more neighbors increases computational cost without improving performance significantly,
especially with a small standard deviation.
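As a rough illustration of the configurations compared above, here's how they could be run against the `search` and
`rosenbrock` sketches from earlier. The search bounds and the starting point below are made up for the example; the
real values are in the configuration screenshots.
```python
# Baseline Rosenbrock configuration from the report, plus the three single-setting variants.
bounds = (-5.0, 5.0)                          # illustrative bounds, not the real config
x_start = np.random.uniform(*bounds, size=2)  # random starting point shared by all runs

baseline = dict(iterations=30, num_neighbors=3, std_dev=1.5)
variants = {
    "baseline": baseline,
    "iterations-1000": {**baseline, "iterations": 1000},
    "neighbors-100": {**baseline, "num_neighbors": 100},
    "stddev-30": {**baseline, "std_dev": 30.0},
}

for name, params in variants.items():
    best_ls, _ = search(rosenbrock, x_start, bounds, include_center=True, **params)
    best_shc, _ = search(rosenbrock, x_start, bounds, include_center=False, **params)
    print(f"{name}: LS -> {rosenbrock(best_ls):.4f}, SHC -> {rosenbrock(best_shc):.4f}")
```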

BIN imgs/1_config.png (new file, 21 KiB)
BIN imgs/1_ls.png (new file, 31 KiB)
BIN imgs/1_shc.png (new file, 31 KiB)
BIN imgs/2_config.png (new file, 22 KiB)
BIN imgs/2_ls.png (new file, 35 KiB)
BIN imgs/2_ls_iterations-1000.png (new file, 33 KiB)
BIN imgs/2_ls_neighbors-100.png (new file, 33 KiB)
BIN imgs/2_ls_stddev-30.png (new file, 37 KiB)
BIN imgs/2_shc.png (new file, 36 KiB)
BIN imgs/2_shc_iterations-1000.png (new file, 33 KiB)
BIN imgs/2_shc_neighbors-100.png (new file, 32 KiB)
BIN imgs/2_shc_stddev-30.png (new file, 37 KiB)