
A structure optimization framework for feed-forward neural networks using sparse representation

Journal Article


Abstract


  • Traditionally, optimizing the structure of a feed-forward neural network is time-consuming and requires balancing the trade-off between network size and network performance. In this paper, a sparse-representation based framework, termed SRS, is introduced to generate a small-sized network structure without compromising the network performance. Based on the forward selection strategy, the SRS framework selects significant elements (weights or hidden neurons) from the initial network that minimize the residual output error. The main advantage of the SRS framework is that it is able to optimize the network structure and training performance simultaneously. As a result, the training error is reduced while the number of selected elements increases. The efficiency and robustness of the SRS framework are evaluated on several benchmark datasets. Experimental results indicate that the SRS framework performs favourably compared to alternative structure optimization algorithms.
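
The forward selection strategy described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a single-hidden-layer network whose hidden activations are collected in a matrix `H`, and greedily selects the hidden neurons most correlated with the current residual, refitting output weights by least squares at each step (an orthogonal-matching-pursuit-style selection). The function name and interface are illustrative only.

```python
import numpy as np

def forward_select_neurons(H, y, k):
    """Greedy forward selection of hidden neurons (illustrative sketch).

    H : (n_samples, n_hidden) hidden-layer activation matrix
    y : (n_samples,) target outputs
    k : number of neurons to keep
    Returns the selected column indices and the fitted output weights.
    """
    selected = []
    residual = y.astype(float).copy()
    for _ in range(k):
        # Score each neuron by correlation with the residual output error,
        # excluding neurons already selected.
        scores = np.abs(H.T @ residual)
        scores[selected] = -np.inf
        selected.append(int(np.argmax(scores)))
        # Refit output weights on the selected columns via least squares,
        # then update the residual.
        w, *_ = np.linalg.lstsq(H[:, selected], y, rcond=None)
        residual = y - H[:, selected] @ w
    return selected, w
```

As the abstract notes, the training error decreases monotonically as more elements are selected, since each refit can only shrink the least-squares residual.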

Publication Date


  • 2016

Citation


  • Yang, J. & Ma, J. (2016). A structure optimization framework for feed-forward neural networks using sparse representation. Knowledge-Based Systems, 109, 61-70.

Scopus EID


  • 2-s2.0-84978924728

RO Metadata URL


  • http://ro.uow.edu.au/smartpapers/178

Number Of Pages


  • 9

Start Page


  • 61

End Page


  • 70

Volume


  • 109

Place Of Publication


  • Netherlands
