The final step consists simply of activating AutoML (see Figure 77):

Figure 77: Activating AutoML from Kaggle

Summary

The goal of AutoML is to enable domain experts who are not familiar with machine learning technologies to use ML techniques easily. The primary goal is to reduce the steep learning curve and the huge costs of handcrafting machine learning solutions by making the whole end-to-end machine learning pipeline (data preparation, feature engineering, and automatic model generation) more automated.

After reviewing the state-of-the-art solutions available at the end of 2019, we discussed how to use Cloud AutoML for text, videos, and images, achieving results comparable to those achieved with handcrafted models. AutoML is probably the fastest-growing research topic, and the interested reader can find the latest results at https://www.automl.org/.

The next chapter discusses the math behind deep learning, a rather advanced topic that is recommended if you are interested in understanding what is going on "under the hood" when you play with neural networks.
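To make the end-to-end pipeline summarized above a little more concrete, here is a minimal sketch of automatic model generation using AutoKeras, an open-source neural architecture search library. This is an illustrative assumption rather than the Cloud AutoML workflow shown in Figure 77, and the dataset, trial budget, and epoch count are arbitrary choices.

# A minimal AutoML-style sketch with AutoKeras (illustrative only; not the
# Cloud AutoML / Kaggle workflow described in this chapter).
import autokeras as ak
from tensorflow.keras.datasets import mnist

# Data preparation: load a small, well-known image dataset.
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Automatic model generation: search over a few candidate architectures.
clf = ak.ImageClassifier(overwrite=True, max_trials=2)
clf.fit(x_train, y_train, epochs=5)

# Evaluate the best model found and export it as a regular Keras model.
print(clf.evaluate(x_test, y_test))
model = clf.export_model()
model.summary()

The exported model is a standard tf.keras model, so it can be saved and deployed with the usual Keras tooling.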


