Innovation engines: automated creativity and improved stochastic optimization via deep learning

From de_evolutionary_art_org


Nguyen, A., Yosinski, J., Clune, J.: Innovation engines: automated creativity and improved stochastic optimization via deep learning. In: Proceedings of the Genetic and Evolutionary Computation Conference (2015)



The Achilles' heel of stochastic optimization algorithms is getting trapped on local optima. Novelty Search avoids this problem by encouraging search in all interesting directions; it does so by replacing a performance objective with a reward for novel behaviors, as defined by a human-crafted, and often simple, behavioral distance function. While Novelty Search is a major conceptual breakthrough and outperforms traditional stochastic optimization on certain problems, it is not clear how to apply it to challenging, high-dimensional problems where specifying a useful behavioral distance function is difficult. For example, in the space of images, how do you encourage novelty to produce hawks and heroes instead of endless pixel static? Here we propose a new algorithm, the Innovation Engine, that builds on Novelty Search by replacing the human-crafted behavioral distance with a Deep Neural Network (DNN) that can recognize interesting differences between phenotypes. The key insight is that DNNs can recognize similarities and differences between phenotypes at an abstract level, wherein novelty means interesting novelty. For example, a novelty pressure in image space does not explore the low-level pixel space, but instead creates a pressure to produce new types of images (e.g. churches, mosques, obelisks, etc.). Here we describe the long-term vision for the Innovation Engine algorithm, which involves many technical challenges that remain to be solved. We then implement a simplified version of the algorithm that enables us to explore some of the algorithm's key motivations. Our initial results, in the domain of images, suggest that Innovation Engines could ultimately automate the production of endless streams of interesting solutions in any domain: e.g. producing intelligent software, robot controllers, optimized physical components, and art.
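The contrast the abstract draws — rewarding behavioral novelty instead of a performance objective — can be sketched as a minimal Novelty Search loop. This is a toy illustration, not the paper's implementation: the 2-D `describe` function below is a hypothetical stand-in for the behavioral descriptor that an Innovation Engine would replace with DNN features or class confidences.

```python
import random

def describe(genome):
    # Behavior characterization. In this toy example the genome (a pair of
    # floats) is its own descriptor; an Innovation Engine would instead use
    # a DNN's abstract representation of the phenotype.
    return genome

def novelty(behavior, archive, k=3):
    # Novelty score = mean Euclidean distance to the k nearest neighbors
    # in the archive of previously seen behaviors.
    if not archive:
        return float("inf")
    dists = sorted(
        sum((a - b) ** 2 for a, b in zip(behavior, other)) ** 0.5
        for other in archive
    )
    nearest = dists[:k]
    return sum(nearest) / len(nearest)

def novelty_search(generations=20, pop_size=10, threshold=0.5, seed=0):
    rng = random.Random(seed)
    population = [(rng.uniform(-1, 1), rng.uniform(-1, 1))
                  for _ in range(pop_size)]
    archive = []
    for _ in range(generations):
        scored = [(novelty(describe(g), archive), g) for g in population]
        scored.sort(reverse=True)
        # Archive sufficiently novel behaviors rather than high performers:
        # this is the substitution of novelty for a performance objective.
        for score, g in scored:
            if score > threshold:
                archive.append(describe(g))
        # Select the most novel half as parents and mutate them.
        parents = [g for _, g in scored[: pop_size // 2]]
        population = [
            tuple(x + rng.gauss(0, 0.2) for x in rng.choice(parents))
            for _ in range(pop_size)
        ]
    return archive

archive = novelty_search()
print(len(archive))
```

In image space, as the abstract notes, this low-level distance would reward pixel static; swapping `describe` for a DNN feature extractor is what turns novelty pressure into pressure for *interesting* novelty.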

Extended Abstract


@inproceedings{nguyen2015innovation,
  author = {Nguyen, Anh Mai and Yosinski, Jason and Clune, Jeff},
  title = {Innovation Engines: Automated Creativity and Improved Stochastic Optimization via Deep Learning},
  booktitle = {Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation},
  series = {GECCO '15},
  year = {2015},
  isbn = {978-1-4503-3472-3},
  location = {Madrid, Spain},
  pages = {959--966},
  numpages = {8},
  url = {},
  doi = {10.1145/2739480.2754703},
  acmid = {2754703},
  publisher = {ACM},
  address = {New York, NY, USA},
  keywords = {deep learning, deep neural networks, map-elites, novelty search},
}

Used References

1 J. E. Auerbach. Automated evolution of interesting images. In Artificial Life 13. MIT Press, 2012.

2 Yoshua Bengio, Learning Deep Architectures for AI, Now Publishers Inc., Hanover, MA, 2009

3 Y. Bengio, E. Thibodeau-Laufer, G. Alain, and J. Yosinski. Deep generative stochastic networks trainable by backprop. In Proc. of the ICML, 2014.

4 Clune, J., Lipson, H.: Evolving three-dimensional objects with a generative encoding inspired by developmental biology. In: Proceedings of the European Conference on Artificial Life, pp. 144–148 (2011)

5 Giuseppe Cuccu, Faustino Gomez, When novelty is not enough, Proceedings of the 2011 International Conference on Applications of Evolutionary Computation, April 27-29, 2011, Torino, Italy

6 A. Cully, J. Clune, and J.-B. Mouret. Robots that can adapt like natural animals. arXiv preprint arXiv:1407.3501, 2014.

7 J. Deng et al. Imagenet: A large-scale hierarchical image database. In Conference on Computer Vision and Pattern Recognition, pages 248--255. IEEE, 2009.

8 G. E. Hinton and R. R. Salakhutdinov. Reducing the dimensionality of data with neural networks. Science, 2006.

9 Yangqing Jia, Evan Shelhamer, Jeff Donahue, Sergey Karayev, Jonathan Long, Ross Girshick, Sergio Guadarrama, Trevor Darrell, Caffe: Convolutional Architecture for Fast Feature Embedding, Proceedings of the ACM International Conference on Multimedia, November 03-07, 2014, Orlando, Florida, USA

10 A. Karpathy. What I learned from competing against a ConvNet on ImageNet. Blog post, 2014.

11 M. Keane et al. Genetic programming IV: Routine human-competitive machine intelligence. 2006.

12 A. Krizhevsky, I. Sutskever, and G. E. Hinton. Imagenet classification with deep convolutional neural networks. In Advances in neural information processing systems, pages 1097--1105, 2012.

13 J. Lehman and K. O. Stanley. Exploiting open-endedness to solve problems through the search for novelty. In ALIFE, pages 329--336, 2008.

14 Joel Lehman, Kenneth O. Stanley, Abandoning objectives: Evolution through the search for novelty alone, Evolutionary Computation, v.19 n.2, p.189-223, Summer 2011

15 J. Lehman and K. O. Stanley. Novelty search and the problem with objectives. In Genetic Programming Theory and Practice IX, pages 37--56. Springer, 2011.

16 J. Li, J. Storie, and J. Clune. Encouraging creative thinking in robots improves their ability to solve challenging problems. Algorithms, 13:14.

17 A. Liapis, H. P. Martínez, J. Togelius, and G. N. Yannakakis. Transforming exploratory creativity with DeLeNoX. In Proc. of the Fourth International Conference on Computational Creativity, 2013.

18 J.-B. Mouret. Novelty-based multiobjectivization. In New Horizons in Evolutionary Robotics. Springer, 2011.

19 J.-B. Mouret and J. Clune. Illuminating search spaces by mapping elites. arXiv preprint, 2015.

20 J.-B. Mouret and S. Doncieux. Sferes v2: Evolvin' in the multi-core world. In Congress on Evolutionary Computation, pages 1--8, 2010.

21 A. Nguyen, J. Yosinski, and J. Clune. Deep neural networks are easily fooled: High confidence predictions for unrecognizable images. In Proc. of the Conference on Computer Vision and Pattern Recognition, 2015.

22 O. Russakovsky et al. Imagenet large scale visual recognition challenge. arXiv:1409.0575, 2014.

23 J. Schmidhuber. Developmental robotics, optimal artificial curiosity, creativity, music, and the fine arts. Connection Science, 18(2):173--187, 2006.

24 Jimmy Secretan, Nicholas Beato, David B. D'Ambrosio, Adelein Rodriguez, Adam Campbell, Jeremiah T. Folsom-Kovarik, Kenneth O. Stanley, Picbreeder: A case study in collaborative evolutionary exploration of design space, Evolutionary Computation, v.19 n.3, p.373-403, Fall 2011

25 Kenneth O. Stanley, Joel Lehman, Why Greatness Cannot Be Planned: The Myth of the Objective, Springer Publishing Company, Incorporated, 2015

26 Kenneth O. Stanley, Risto Miikkulainen, Evolving neural networks through augmenting topologies, Evolutionary Computation, v.10 n.2, p.99-127, Summer 2002

27 Kenneth O. Stanley, Compositional pattern producing networks: A novel abstraction of development, Genetic Programming and Evolvable Machines, v.8 n.2, p.131-162, June 2007

28 C. Szegedy et al. Going deeper with convolutions. arXiv preprint arXiv:1409.4842, 2014.

29 Brian G. Woolley, Kenneth O. Stanley, On the deleterious effects of a priori objectives on evolution and representation, Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation, July 12-16, 2011, Dublin, Ireland

