Research Output
Artificial Neural Networks Training Acceleration Through Network Science Strategies
  Deep Learning opened artificial intelligence to an unprecedented number of new applications. A critical success factor is the ability to train deeper neural networks, striving for stable and accurate models. This, however, yields Artificial Neural Networks (ANNs) that become unmanageable as the number of features increases. The novelty of our approach is to employ Network Science strategies to tackle the complexity of the actual ANNs at each epoch of the training process. The work presented herein originates in our earlier publications, where we explored the acceleration effects obtained by enforcing, in turn, scale-freeness, small-worldness, and sparsity during the ANN training process. The efficiency of our approach has also been recently confirmed by independent researchers, who managed to train a million-node ANN on non-specialized laptops. Encouraged by these results, we now take a closer look at some tunable parameters of our previous approach in pursuit of a further acceleration effect. Specifically, we investigate the revise fraction parameter, to verify whether its double-checking role is actually necessary. Our method is independent of specific machine learning algorithms or datasets, since we operate merely on the topology of the ANNs. We demonstrate that the revise phase can be skipped altogether, halving the overall execution time with an almost negligible loss of quality.
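
  The abstract does not spell out the algorithm, but the revise phase it refers to behaves like the prune-and-regrow step used in sparse evolutionary training, where a tunable fraction of connections is re-evaluated at each epoch. As a rough, hypothetical sketch of the per-epoch cost that skipping this phase removes, the NumPy snippet below implements one such step on a single sparse layer; the names revise_step and revise_fraction, the magnitude-based pruning rule, and the random regrowth are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def revise_step(weights, mask, revise_fraction=0.3, rng=None):
    """Illustrative prune-and-regrow ('revise') step for one sparse layer.

    Prunes the smallest-magnitude fraction of the active connections,
    then regrows the same number at random inactive positions, so the
    layer's sparsity level stays constant across epochs. Sketch only:
    the exact rule in the paper may differ.
    """
    rng = rng or np.random.default_rng()
    active = np.flatnonzero(mask)                  # flat indices of live weights
    n_revise = int(revise_fraction * active.size)  # connections to replace
    if n_revise == 0:
        return weights, mask

    # Prune: zero out the n_revise weakest (smallest-magnitude) connections.
    order = np.argsort(np.abs(weights.flat[active]))
    pruned = active[order[:n_revise]]
    mask.flat[pruned] = False
    weights.flat[pruned] = 0.0

    # Regrow: re-activate the same number of randomly chosen dead connections.
    inactive = np.flatnonzero(~mask)
    regrown = rng.choice(inactive, size=n_revise, replace=False)
    mask.flat[regrown] = True
    weights.flat[regrown] = rng.normal(0.0, 0.01, size=n_revise)
    return weights, mask
```

  Dropping such a call from the training loop saves a sort over the active weights plus a sampling pass per layer per epoch, at the cost of freezing the sparse topology; this is the kind of time/quality trade-off the abstract reports.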

  • Date:

    14 February 2020

  • Publication Status:

    Published

  • Publisher:

    Springer International Publishing

  • DOI:

    10.1007/978-3-030-40616-5_27

  • Funders:

    Historic Funder (pre-Worktribe)

Citation

Cavallaro, L., Bagdasar, O., De Meo, P., Fiumara, G., & Liotta, A. (2020). Artificial Neural Networks Training Acceleration Through Network Science Strategies. In Numerical Computations: Theory and Algorithms (pp. 330-336). https://doi.org/10.1007/978-3-030-40616-5_27

Keywords

Network Science, Artificial Neural Networks
