# Search-Based Software Engineering Course WS23/24

This repository contains the code examples and content of the lectures. I
will be uploading rendered versions as PDF files to StudIP, and will include
rendered Markdown versions in this repository. If you want to run the
notebooks yourself, you will need to install [Jupyter](https://jupyter.org/install).
If you need help setting up Jupyter, here is a tutorial on [how to install Jupyter Notebook on your machine](https://www.dataquest.io/blog/jupyter-notebook-tutorial/).

## Chapter 1: Random and Local Search

The first chapter covers the coding examples from the first two weeks, on basic random search and local search algorithms.
[Markdown Export](rendered/Random%20and%20Local%20Search.md)
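
To give a flavour of the kind of algorithm covered (this is not the notebook's own code), here is a minimal hill-climbing sketch on a toy bit-string problem; the one-max objective, the representation, and all parameter values are illustrative assumptions.

```python
import random

def fitness(bits):
    # Toy "one-max" objective: maximise the number of 1-bits.
    return sum(bits)

def hill_climb(n_bits=20, max_steps=1000):
    # Start from a random bit string and greedily accept improving neighbours.
    current = [random.randint(0, 1) for _ in range(n_bits)]
    for _ in range(max_steps):
        neighbour = list(current)
        pos = random.randrange(n_bits)
        neighbour[pos] = 1 - neighbour[pos]  # flip a single bit
        if fitness(neighbour) >= fitness(current):
            current = neighbour
    return current, fitness(current)

print(hill_climb())
```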

## Chapter 2: Evolutionary Search (Part 1)

This chapter covers basic evolutionary strategies and genetic algorithms.
[Markdown Export](rendered/Evolutionary%20Search%20-%20Part%201.md)
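
As a minimal sketch of the evolutionary strategy idea (assuming a toy continuous objective, the sphere function, rather than anything from the lecture): a (1+1) strategy keeps a single parent and accepts a Gaussian-mutated offspring whenever it is at least as good.

```python
import random

def sphere(x):
    # Toy objective: minimise the sum of squares.
    return sum(xi * xi for xi in x)

def one_plus_one_es(dim=5, sigma=0.1, generations=500):
    # (1+1) strategy: one parent, one mutated offspring, keep the better of the two.
    parent = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(generations):
        offspring = [xi + random.gauss(0, sigma) for xi in parent]
        if sphere(offspring) <= sphere(parent):
            parent = offspring
    return parent, sphere(parent)

print(one_plus_one_es())
```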

## Chapter 3: Evolutionary Search (Part 2)

This chapter looks into the various search operators of a genetic algorithm:
survivor selection, parent selection, crossover, and mutation, as well as the
role of the population itself.
[Markdown Export](rendered/Evolutionary%20Search%20-%20Part%202.md)
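
For illustration only, here is one common choice for each operator, sketched for a bit-string encoding (tournament parent selection, single-point crossover, and per-bit mutation); the notebook may use different variants and parameter settings.

```python
import random

def tournament_selection(population, fitness, k=3):
    # Parent selection: pick the fittest of k randomly sampled individuals.
    contestants = random.sample(population, k)
    return max(contestants, key=fitness)

def single_point_crossover(parent1, parent2):
    # Cut both parents at a random point and swap the tails.
    point = random.randrange(1, len(parent1))
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

def bitflip_mutation(individual, rate=None):
    # Flip each bit independently, by default with probability 1/length.
    rate = rate if rate is not None else 1 / len(individual)
    return [1 - bit if random.random() < rate else bit for bit in individual]

# Small usage example on a random population of bit strings.
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(8)]
parent1 = tournament_selection(population, fitness=sum)
parent2 = tournament_selection(population, fitness=sum)
child1, child2 = single_point_crossover(parent1, parent2)
print(bitflip_mutation(child1))
```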

## Chapter 4: Multi-Objective Optimisation (Part 1)

This chapter covers the basics of Pareto optimality, NSGA-II, and comparison
of multi-objective search algorithms.
[Markdown Export](rendered/Multi-Objective%20Optimisation%20-%20Part%201.md)
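
A minimal sketch of Pareto dominance and of extracting the non-dominated front, the building block behind NSGA-II's non-dominated sorting; the objective vectors below are made-up minimisation values.

```python
def dominates(a, b):
    # a Pareto-dominates b (minimisation): no worse in all objectives,
    # strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(points):
    # Return the points that are not dominated by any other point.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

objectives = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(non_dominated_front(objectives))  # [(1, 5), (2, 3), (4, 1)]
```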

## Chapter 5: Multi-Objective Optimisation (Part 2)

This chapter covers several alternative multi-objective search algorithms:
A random baseline, PAES, SPEA2, TwoArchives, and SMS-EMOA.
[Markdown Export](rendered/Multi-Objective%20Optimisation%20-%20Part%202.md)
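
Since some of these algorithms (notably SMS-EMOA) and the comparison of multi-objective algorithms rely on the hypervolume indicator, here is a hedged sketch of the two-objective case; the front and reference point are made-up minimisation values, and the function assumes a mutually non-dominated front whose points all dominate the reference point.

```python
def hypervolume_2d(front, reference):
    # Area dominated by a 2D minimisation front, bounded by a reference point.
    # Assumes the front is mutually non-dominated and dominates the reference.
    points = sorted(front)  # ascending in the first objective
    volume = 0.0
    for i, (f1, f2) in enumerate(points):
        next_f1 = points[i + 1][0] if i + 1 < len(points) else reference[0]
        volume += (next_f1 - f1) * (reference[1] - f2)
    return volume

print(hypervolume_2d([(1, 4), (2, 2), (4, 1)], reference=(5, 5)))  # 11.0
```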

## Chapter 6: Search-based Test Generation (Part 1)

This chapter looks at how the problem of test input generation can be cast
as a search problem, and how to automatically instrument programs so that
fitness values can be computed during execution.
[Markdown Export](rendered/Search-Based%20Test%20Generation%20-%20Part%201.md)
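
As an illustration of the idea (not the notebook's instrumentation code), here is a hypothetical function under test together with a branch-distance style fitness for reaching one of its branches; the example program and the |x - y| distance for equality predicates are assumptions based on the common formulation in the literature.

```python
def triangle_type(a, b, c):
    # Hypothetical function under test.
    if a == b and b == c:
        return "equilateral"
    return "other"

def branch_distance_equal(x, y):
    # Branch distance for the predicate x == y: 0 when true, |x - y| otherwise.
    return abs(x - y)

def fitness(a, b, c):
    # How close is the input to executing the "equilateral" branch?
    return branch_distance_equal(a, b) + branch_distance_equal(b, c)

print(fitness(3, 4, 8))  # 5 -> far from the target branch
print(fitness(5, 5, 5))  # 0 -> target branch is reached
```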

## Chapter 7: Search-based Test Generation (Part 2)

This chapter continues with whole test suite generation, and then moves on to
many-objective optimisation for test generation.

[Markdown Export](rendered/Search-Based%20Test%20Generation%20-%20Part%202.md)
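
A hedged sketch of the whole-suite idea, in which a single individual is an entire test suite and its fitness aggregates, per coverage goal, the best distance achieved by any test in the suite; the coverage goals and the d/(d+1) normalisation used below are illustrative assumptions.

```python
def normalise(distance):
    # Widely used normalisation d / (d + 1), mapping [0, inf) into [0, 1).
    return distance / (distance + 1)

def whole_suite_fitness(suite, goals):
    # One individual is an entire suite; its fitness sums, over all coverage
    # goals, the smallest normalised distance achieved by any test (minimise).
    return sum(min(normalise(goal(test)) for test in suite) for goal in goals)

# Hypothetical coverage goals, each expressed as a distance function over a test (x, y).
goals = [
    lambda t: abs(t[0] - t[1]),             # distance to the branch where x == y
    lambda t: 0 if t[0] > 0 else 1 - t[0],  # distance to the branch where x > 0
]

suite = [(2, 2), (-1, 5)]
print(whole_suite_fitness(suite, goals))  # 0.0 -> both goals are covered
```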

## Chapter 8: Genetic Programming

This chapter introduces classic genetic programming for scenarios assuming
type closure, and applies this to symbolic regression and spectrum-based
fault localisation. It also looks at grammatical evolution and automated program repair.

[Markdown Export](rendered/Genetic%20Programming.md)
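
A minimal, type-closed GP sketch: expression trees over +, -, * with a numeric terminal set, evaluated against a symbolic-regression error; the target function, primitive set, and tree-growth parameters are illustrative assumptions.

```python
import operator
import random

FUNCTIONS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMINALS = ["x", 1.0, 2.0]

def random_tree(depth=3):
    # Grow a random expression tree over a type-closed primitive set (all numbers).
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(FUNCTIONS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    # Recursively evaluate a tree for a given input value x.
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return FUNCTIONS[op](evaluate(left, x), evaluate(right, x))

def regression_error(tree, target=lambda x: x * x + 1):
    # Symbolic regression fitness: squared error against an assumed target function.
    return sum((evaluate(tree, x) - target(x)) ** 2 for x in range(-5, 6))

tree = random_tree()
print(tree, regression_error(tree))
```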

## Chapter 9: Neuroevolution

This chapter introduces the field of neuroevolution, in which evolutionary algorithms are used to optimise artificial
neural networks. We start with the definition of neural networks and the pole balancing problem, a popular reinforcement
learning task, which is then solved using two different neuroevolution algorithms: Symbiotic Adaptive Neuroevolution (SANE)
and Cooperative Synapse Neuroevolution (CoSyNE), which evolve a population of hidden neurons and of connection weights,
respectively.

[Markdown Export](rendered/Neuroevolution.md)
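
As a hedged sketch of the underlying idea only, here is plain weight-level neuroevolution of a tiny fixed-topology network, using XOR as a stand-in for pole balancing; this is neither SANE nor CoSyNE, and the network size, task, and parameters are illustrative assumptions.

```python
import math
import random

def forward(weights, x1, x2):
    # Tiny fixed-topology network: 2 inputs, 2 hidden tanh neurons, 1 output.
    w = weights
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def error(weights):
    # Fitness to minimise: squared error on the XOR task.
    return sum((forward(weights, *x) - y) ** 2 for x, y in XOR)

def evolve(generations=300, pop_size=20, sigma=0.5):
    # Evolve the flat weight vector directly with truncation selection
    # and Gaussian mutation.
    population = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=error)
        parents = population[: pop_size // 2]
        children = [[w + random.gauss(0, sigma) for w in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=error)

best = evolve()
print(round(error(best), 3))
```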

## Chapter 10: Parameter Tuning and Parameter Control

This chapter considers how to choose values for the many parameters that we
have introduced in our evolutionary algorithms, how to optimise these
values, and how to adapt them to new problems.

[Markdown Export](rendered/Parameter%20Control%20and%20Adaptation.md)
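
As one concrete example of parameter control (not necessarily the variant used in the lecture), here is a sketch of Rechenberg's 1/5 success rule for adapting the mutation step size of a (1+1) strategy on a toy sphere function; the adaptation interval and scaling factors are illustrative values.

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def one_fifth_rule(dim=5, generations=500, sigma=1.0):
    # Adapt the mutation step size online so that roughly one in five
    # mutations is an improvement (increase sigma if more, decrease if fewer).
    parent = [random.uniform(-5, 5) for _ in range(dim)]
    successes, trials = 0, 0
    for g in range(1, generations + 1):
        offspring = [xi + random.gauss(0, sigma) for xi in parent]
        trials += 1
        if sphere(offspring) <= sphere(parent):
            parent = offspring
            successes += 1
        if g % 20 == 0:  # adapt every 20 mutations
            rate = successes / trials
            sigma *= 1.5 if rate > 0.2 else 0.82
            successes, trials = 0, 0
    return sphere(parent), sigma

print(one_fifth_rule())
```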

## Chapter 11: Advanced Evolutionary Algorithms

This chapter considers several advanced variants of the evolutionary
algorithms discussed in the previous chapters:

- Memetic algorithms combine global and local search.
- Island model GAs divide the population of a GA into independent subpopulations.
- Estimation of distribution algorithms explicitly optimise the probability
  distribution that is otherwise only implicitly represented by the population.
- Differential evolution uses novel search operators unlike the ones used in
  standard GAs (see the sketch below).
- Hyper-heuristics combine different heuristics to adapt to the problem at hand.

[Markdown Export](rendered/Advanced%20Evolutionary%20Algorithms.md)
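
As one concrete example from this list, here is a hedged sketch of differential evolution in its common DE/rand/1/bin form on a toy sphere function; the population size, scale factor F, and crossover rate CR are illustrative values.

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def differential_evolution(dim=5, pop_size=20, F=0.8, CR=0.9, generations=200):
    # DE/rand/1/bin: build a mutant from three distinct individuals using a
    # scaled difference vector, then apply binomial crossover with the target.
    population = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i, target in enumerate(population):
            a, b, c = random.sample([p for j, p in enumerate(population) if j != i], 3)
            j_rand = random.randrange(dim)
            trial = [a[j] + F * (b[j] - c[j]) if random.random() < CR or j == j_rand
                     else target[j] for j in range(dim)]
            if sphere(trial) <= sphere(target):  # greedy one-to-one survivor selection
                population[i] = trial
    return min(population, key=sphere)

best = differential_evolution()
print(round(sphere(best), 6))
```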

## Chapter 12: Swarm Optimisation

This chapter briefly introduces two swarm optimisation techniques: ant colony
optimisation, which imitates the stigmergic communication of ants, and
particle swarm optimisation, which simulates the swarming behaviour of birds
or fish.

[Markdown Export](rendered/Swarm%20Optimisation.md)
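
A minimal particle swarm sketch on a toy sphere function; the inertia weight and acceleration coefficients are common textbook values rather than ones taken from the lecture.

```python
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def pso(dim=5, swarm_size=20, iterations=200, w=0.7, c1=1.5, c2=1.5):
    # Each particle is pulled towards its own best position and the best
    # position found by the whole swarm so far.
    positions = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm_size)]
    velocities = [[0.0] * dim for _ in range(swarm_size)]
    personal_best = [list(p) for p in positions]
    global_best = min(personal_best, key=sphere)
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (personal_best[i][d] - positions[i][d])
                                    + c2 * r2 * (global_best[d] - positions[i][d]))
                positions[i][d] += velocities[i][d]
            if sphere(positions[i]) < sphere(personal_best[i]):
                personal_best[i] = list(positions[i])
                if sphere(personal_best[i]) < sphere(global_best):
                    global_best = list(personal_best[i])
    return global_best

print(round(sphere(pso()), 6))
```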