
Machine Learning Provenance for Hyperparameter Tuning

Hyperparameters have a strong influence on the performance of machine learning models. However, many models expose numerous hyperparameters, and each can take many, sometimes infinitely many, possible values. As a result, finding a good set of hyperparameter values can be very time-consuming. Practitioners currently rely on previous experience or on methods such as grid search, but these do not always yield good results within a reasonable time.

In this project, we try to capture and visualise the provenance of hyperparameter tuning, i.e.,

  • which values have been tested for each hyperparameter
  • what performance results those values produced

to help users perform hyperparameter tuning more effectively, i.e., find values that lead to better performance in less time.
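To give a concrete idea of what such provenance might look like, below is a minimal sketch that records every hyperparameter configuration tried and its cross-validation score during a simple grid search. The `ProvenanceLog` class and its field names are illustrative assumptions, not the project's actual implementation.

```python
# Minimal, illustrative sketch of capturing hyperparameter-tuning provenance.
# ProvenanceLog and its fields are assumptions for illustration only.
import json
import time
from itertools import product

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


class ProvenanceLog:
    """Records every hyperparameter configuration tried and its result."""

    def __init__(self):
        self.trials = []

    def record(self, params, score):
        # Each trial keeps the tested values, the resulting score, and a
        # timestamp, so the tuning history can be reconstructed and visualised.
        self.trials.append({"params": params, "score": score, "time": time.time()})

    def best(self):
        return max(self.trials, key=lambda t: t["score"])


X, y = load_iris(return_X_y=True)
log = ProvenanceLog()

# A simple grid search whose every step is captured in the provenance log.
for C, gamma in product([0.1, 1.0, 10.0], [0.01, 0.1, 1.0]):
    params = {"C": C, "gamma": gamma}
    score = cross_val_score(SVC(**params), X, y, cv=5).mean()
    log.record(params, score)

print(json.dumps(log.best(), indent=2))
```

A provenance record of this kind is what the project aims to capture automatically and present visually, so that users can see which regions of the hyperparameter space have already been explored and with what outcome.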

  • Student (PhD): Michael Brobbey, Middlesex University, London, UK
  • Collaborator: Dr Sheeba Samuel, University of Jena, Germany
  • Collaborator: Sneha Mohanty, Bauhaus University Weimar, Germany
  • Collaborator: Prof Marc Streit, Johannes Kepler University Linz, Austria
  • Collaborator: Klaus Eckelt, Johannes Kepler University Linz, Austria