Research Output

WILDA: Wide Learning of Diverse Architectures for Classification of Large Datasets

  To address scalability issues, which can be a challenge for deep learning methods, we propose Wide Learning of Diverse Architectures (WILDA), a model that scales horizontally rather than vertically, enabling distributed learning. We propose a distributed version of a quality-diversity evolutionary algorithm (MAP-Elites) to evolve an architecturally diverse ensemble of shallow networks, each of which extracts a feature vector from the data. These features then become the input to a single shallow network, which is optimised using gradient descent to solve a classification task. The technique is shown to perform well on two benchmark classification problems (MNIST and CIFAR). Additional experiments provide insight into the role that diversity plays in the performance of the repertoire.
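The core loop described above can be sketched in code. The following is a minimal, illustrative sketch only: the genome encoding, behavioural descriptor, fitness proxy, and all hyperparameters are assumptions made for the example, not the paper's actual design, and the final gradient-descent classifier is replaced by a simple nearest-centroid readout for brevity.

```python
# Illustrative sketch of the WILDA idea: MAP-Elites maintains an archive of
# architecturally diverse shallow networks; their concatenated features would
# feed a single shallow classifier. All names and settings here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def random_genome():
    # Assumed genome: (number of hidden units, activation function).
    # These traits also serve as the behavioural descriptor for the archive.
    return (int(rng.integers(4, 32)), str(rng.choice(["relu", "tanh"])))

def make_network(genome, n_in):
    # One shallow, randomly weighted layer acting as a feature extractor.
    hidden, act = genome
    W = rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, hidden))
    f = np.tanh if act == "tanh" else (lambda z: np.maximum(z, 0.0))
    return lambda X: f(X @ W)

def fitness(extract, X, y):
    # Cheap fitness proxy: accuracy of a nearest-centroid readout on the
    # extracted features (stand-in for the gradient-trained classifier).
    F = extract(X)
    c0, c1 = F[y == 0].mean(axis=0), F[y == 1].mean(axis=0)
    pred = np.linalg.norm(F - c1, axis=1) < np.linalg.norm(F - c0, axis=1)
    return (pred == y).mean()

def map_elites(X, y, iters=100):
    archive = {}  # descriptor cell -> (fitness, genome, extractor)
    for _ in range(iters):
        g = random_genome()  # sketch: random sampling stands in for mutation
        net = make_network(g, X.shape[1])
        cell = (g[0] // 8, g[1])  # coarse behavioural descriptor
        fit = fitness(net, X, y)
        if cell not in archive or fit > archive[cell][0]:
            archive[cell] = (fit, g, net)  # keep the elite per cell
    return archive

# Toy two-class dataset in place of MNIST/CIFAR.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
archive = map_elites(X, y)
# Concatenated elite features: the input to the final shallow classifier.
features = np.hstack([net(X) for _, _, net in archive.values()])
```

Because each genome is evaluated independently, the inner loop parallelises naturally across workers, which is the horizontal-scaling property the abstract refers to.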

Citation

Pitt, J., Burth Kurka, D., Hart, E., & Cardoso, R. P. (2021). WILDA: Wide Learning of Diverse Architectures for Classification of Large Datasets. In Applications of Evolutionary Computation: EvoApplications 2021 Proceedings (pp. 649-664). https://doi.org/10.1007/978-3-030-72699-7_41

Keywords

Diversity, MAP-elites, Machine Learning, Ensemble
