Chapter 14

The final step consists simply of activating AutoML (see Figure 77):

Figure 77: Activating AutoML from Kaggle

Summary

The goal of AutoML is to let domain experts who are not familiar with machine learning technologies apply ML techniques easily. It aims to reduce both the steep learning curve and the high cost of handcrafting machine learning solutions by automating the whole end-to-end pipeline: data preparation, feature engineering, and model generation.

After reviewing the state-of-the-art solutions available at the end of 2019, we discussed how to use Cloud AutoML for text, videos, and images, achieving results comparable to those achieved with handcrafted models. AutoML is probably one of the fastest-growing research topics, and interested readers can follow the latest results at https://www.automl.org/.

The next chapter discusses the math behind deep learning, a rather advanced topic that is recommended if you are interested in understanding what is going on "under the hood" when you play with neural networks.

[ 541 ]
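To make the idea of automated model generation concrete, here is a minimal, framework-free sketch of random search over a hyperparameter space, one of the simplest AutoML techniques (see reference 5). The search space and the scoring function are purely illustrative stand-ins; a real AutoML system would train and validate a model for each trial configuration.

```python
import random

# Hypothetical search space: each hyperparameter with its candidate values.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "num_layers": [1, 2, 3, 4],
    "units": [32, 64, 128],
}

def toy_score(config):
    # Stand-in for a validation metric: favors a moderate learning
    # rate and a mid-sized model (purely illustrative, not a real model).
    return (
        -abs(config["learning_rate"] - 1e-3)
        - abs(config["num_layers"] - 2) * 0.01
        - abs(config["units"] - 64) * 0.0001
    )

def random_search(space, score_fn, n_trials=50, seed=0):
    """Sample n_trials random configurations and return the best one."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {name: rng.choice(values) for name, values in space.items()}
        score = score_fn(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = random_search(SEARCH_SPACE, toy_score)
print(best, score)
```

Cloud AutoML and systems such as Auto-Keras (reference 6) build on far more sophisticated strategies (reinforcement learning, evolutionary algorithms, transfer of search knowledge), but the loop above captures the core contract: propose a configuration, score it, keep the best.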
An introduction to AutoML
References
1. Neural Architecture Search with Reinforcement Learning, Barret Zoph, Quoc V. Le, 2016, http://arxiv.org/abs/1611.01578.
2. Efficient Neural Architecture Search via Parameter Sharing, Hieu Pham, Melody Y. Guan, Barret Zoph, Quoc V. Le, Jeff Dean, 2018, https://arxiv.org/abs/1802.03268.
3. Transfer NAS: Knowledge Transfer between Search Spaces with Transformer Agents, Zalán Borsos, Andrey Khorlin, Andrea Gesmundo, 2019, https://arxiv.org/abs/1906.08102.
4. NSGA-Net: Neural Architecture Search using Multi-Objective Genetic Algorithm, Zhichao Lu, Ian Whalen, Vishnu Boddeti, Yashesh Dhebar, Kalyanmoy Deb, Erik Goodman, Wolfgang Banzhaf, 2018, https://arxiv.org/abs/1810.03522.
5. Random Search for Hyper-Parameter Optimization, James Bergstra, Yoshua Bengio, 2012, http://www.jmlr.org/papers/v13/bergstra12a.html.
6. Auto-Keras: An Efficient Neural Architecture Search System, Haifeng Jin, Qingquan Song, and Xia Hu, 2019, https://www.kdd.org/kdd2019/accepted-papers/view/auto-keras-an-efficient-neural-architecture-search-system.
7. Automated deep learning design for medical image classification by healthcare professionals with no coding experience: a feasibility study, Livia Faes et al., The Lancet Digital Health, Volume 1, Issue 5, September 2019, Pages e232-e242, https://www.sciencedirect.com/science/article/pii/S2589750019301086.
[ 542 ]