Population based training and federated learning frameworks for hyperparameter optimisation and ML unfairness using Ulimisana Optimisation Algorithm

Tshifhiwa Maumela, Fulufhelo Nelwamondo, Tshilidzi Marwala

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

This paper introduces the Ulimisana Optimisation Algorithm enabled Population Based Training (PBT-UOA) framework, which allows hyperparameters to be fine-tuned with a population-based meta-heuristic algorithm while the model parameters are being optimised. Models are trained to near-convergence on the updated hyperparameters, and the parameters of the best-performing model are shared to warm-start the other models in the next hyperparameter tuning iteration. In the PBT-UOA, all models are trained on the same dataset. This framework performed better than the Bayesian Optimisation algorithm. This paper also introduces the Ulimisana Optimisation Algorithm enabled Federated Learning (FL-UOA) framework, an extension of the PBT-UOA. This framework addresses the challenges of scattered datasets and privacy presented by the growing number of connected end-devices. The FL-UOA learns on local data in scattered end-devices without sending datasets to a central server. The training datasets in local end-devices are used to evaluate models trained in other end-devices, and the resulting performance metrics are used to update the Social Trust Network (STN) of the FL-UOA framework. The FL-UOA outperformed the classic Federated Learning framework. This STN updating technique was also tested on Machine Learning (ML) Unfairness to assess how well it functioned as a regularisation term, by training different models on subsets containing data from only specific sensitive groups. Results showed that by updating the hyperparameters while learning the parameters on data scattered across different devices, the FL-UOA takes advantage of diversified learning and reduces ML Unfairness for models trained on group-specific datasets.
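The population-based training loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Ulimisana Optimisation Algorithm itself is not specified in the abstract, so a simple random perturbation of the hyperparameter stands in for the UOA update, and a toy quadratic loss stands in for model training.

```python
import random

def loss(w):
    """Toy objective standing in for a model's validation loss: f(w) = (w - 3)^2."""
    return (w - 3) ** 2

def train(w, lr, steps=20):
    """Toy inner training loop: gradient descent on the objective above."""
    for _ in range(steps):
        w -= lr * 2 * (w - 3)
    return w

def pbt_loop(pop_size=4, rounds=5, seed=0):
    rng = random.Random(seed)
    # Each population member holds a hyperparameter (learning rate) and a parameter.
    population = [{"lr": rng.uniform(0.001, 0.2), "w": rng.uniform(-5.0, 5.0)}
                  for _ in range(pop_size)]
    for _ in range(rounds):
        # Train each model to near-convergence on its current hyperparameters.
        for m in population:
            m["w"] = train(m["w"], m["lr"])
        # Share the best model's parameters to warm-start every model.
        best = min(population, key=lambda m: loss(m["w"]))
        for m in population:
            m["w"] = best["w"]
            # Stand-in for the UOA hyperparameter update: a bounded random perturbation.
            m["lr"] = min(0.3, max(1e-4, m["lr"] * rng.choice([0.8, 1.2])))
    return min(loss(m["w"]) for m in population)

print(pbt_loop())
```

In the paper's FL-UOA extension, each member would instead train on its own local dataset, and cross-device evaluation scores would feed the Social Trust Network rather than a simple best-member selection.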

Original language: English
Pages (from-to): 132-150
Number of pages: 19
Journal: Information Sciences
Volume: 612
DOIs
Publication status: Published - Oct 2022

Keywords

  • Artificial Intelligence
  • Co-operative agents
  • Collaborative learning
  • Federated learning
  • Hyperparameter optimisation
  • Population based training
  • Ubuntu
  • Ulimisana/Letsema

ASJC Scopus subject areas

  • Software
  • Information Systems and Management
  • Artificial Intelligence
  • Theoretical Computer Science
  • Control and Systems Engineering
  • Computer Science Applications
