The accuracy and generalisability of a machine learning model are largely determined by the choice of hyper-parameters. All models have hyper-parameters (even non-parametric models), and a poor choice will result in a poor fit to the data and useless predictions. Finding a good set of hyper-parameters for a model is often the most time-consuming part of machine learning, and is usually done by extensive (and sometimes exhaustive) grid searches, or by highly variable random selection. An alternative is to use evolutionary algorithms, which apply search principles analogous to those of natural evolution and have been used across a variety of problem domains. In this project you will implement a genetic algorithm and a differential evolution algorithm for hyper-parameter optimisation of machine learning models in Python, test performance under different conditions (mutation rates and numbers of generations), and compare the results (accuracy and computational efficiency) to random and grid search methods. Data sets will be provided.
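To make the genetic-algorithm approach concrete, the sketch below shows one possible shape such a search could take. It is a minimal illustration, not the project's required implementation: the two hyper-parameters (a learning rate and a number of estimators), their ranges, and the `score` function standing in for a model's validation accuracy are all hypothetical placeholders; in the actual project, `score` would train and evaluate a model on the provided data sets.

```python
import random

def score(params):
    # Hypothetical stand-in for validation accuracy: a smooth function
    # peaking at learning_rate=0.1, n_estimators=200. In practice this
    # would train a model and return its validation score.
    lr, n = params
    return -((lr - 0.1) ** 2) - ((n - 200) / 1000) ** 2

def random_individual():
    # One candidate hyper-parameter setting, sampled from assumed ranges.
    return (random.uniform(0.001, 1.0), random.randint(10, 500))

def mutate(ind, rate=0.3):
    # Perturb each gene with probability `rate`, clipped to its range.
    lr, n = ind
    if random.random() < rate:
        lr = min(1.0, max(0.001, lr + random.gauss(0, 0.05)))
    if random.random() < rate:
        n = min(500, max(10, n + random.randint(-50, 50)))
    return (lr, n)

def crossover(a, b):
    # Uniform crossover: each child takes one gene from each parent.
    return (a[0], b[1]) if random.random() < 0.5 else (b[0], a[1])

def genetic_search(pop_size=20, generations=30):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection: keep best half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=score)

random.seed(0)
best = genetic_search()
print(best)
```

The mutation rate and generation count passed to `genetic_search` are exactly the conditions the project asks you to vary; differential evolution replaces the mutate/crossover pair with a scaled difference of population vectors but keeps the same outer loop structure.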