Evolving Deep Neural Networks


Disclaimer: The code links provided for this paper are external links. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).

Please contact us here if you encounter a broken link.

Download
Authors Risto Miikkulainen, Jason Liang, Elliot Meyerson, Aditya Rawal, Dan Fink, Olivier Francon, Bala Raju, Hormoz Shahrzad, Arshak Navruzyan, Nigel Duffy, Babak Hodjat
Journal/Conference Name Neural Computing and Applications
Paper Category
Paper Abstract The success of deep learning depends on finding an architecture to fit the task. As deep learning has scaled up to more challenging tasks, the architectures have become difficult to design by hand. This paper proposes an automated method, CoDeepNEAT, for optimizing deep learning architectures through evolution. By extending existing neuroevolution methods to topology, components, and hyperparameters, this method achieves results comparable to the best human designs in standard benchmarks in object recognition and language modeling. It also supports building a real-world application of automated image captioning on a magazine website. Given the anticipated increases in available computing power, evolution of deep networks is a promising approach to constructing deep learning applications in the future.
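
For readers new to the idea, the sketch below illustrates the general principle the abstract describes, namely evolving network hyperparameters with mutation and elitist selection. It is not the authors' CoDeepNEAT algorithm, which also evolves topologies and reusable components; all names here (SEARCH_SPACE, fitness, evolve) are hypothetical, and the fitness function is a toy stand-in for training a network and measuring validation accuracy.

import random

# Hypothetical genome: a dict of architecture hyperparameters.
SEARCH_SPACE = {
    "num_layers": [1, 2, 3, 4],
    "units": [32, 64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def random_genome():
    # Sample one value per hyperparameter to form a genome.
    return {key: random.choice(values) for key, values in SEARCH_SPACE.items()}

def mutate(genome, rate=0.3):
    # With probability `rate`, resample each gene from its allowed values.
    child = dict(genome)
    for key, values in SEARCH_SPACE.items():
        if random.random() < rate:
            child[key] = random.choice(values)
    return child

def fitness(genome):
    # Toy stand-in: in a real system this would train the encoded network
    # and return its validation accuracy. Here we just reward a fixed
    # "target" configuration so the example runs instantly.
    return (-abs(genome["num_layers"] - 3)
            - abs(genome["units"] - 128) / 64
            - abs(genome["learning_rate"] - 1e-3) * 100)

def evolve(pop_size=20, generations=10, elite=5):
    # Elitist genetic algorithm: keep the top `elite` genomes each
    # generation and refill the population with their mutated offspring.
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(pop_size - elite)]
    return max(population, key=fitness)

if __name__ == "__main__":
    print("Best genome found:", evolve())

Running the script prints the best hyperparameter genome found after ten generations; swapping the toy fitness for real training-and-evaluation turns it into a (very slow) hyperparameter search, which is the setting where population-based methods like the paper's become worthwhile.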
Date of publication 2017
Code Programming Language Multiple
Comment
