Conventions
There are a number of text conventions used throughout this book.
CodeInText: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. For example: "In addition, we load the true labels into Y_train and Y_test respectively and perform a one-hot encoding on them."
A block of code is set as follows:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
model = Sequential()
model.add(Dense(12, input_dim=8, kernel_initializer='random_uniform'))

When we wish to draw your attention to a particular part of a code block, the relevant lines or items are set in bold:

model = Sequential()
model.add(Dense(NB_CLASSES, input_shape=(RESHAPED,)))
model.add(Activation('softmax'))
model.summary()

Any command-line input or output is written as follows:

pip install quiver_engine

Bold: Indicates a new term, an important word, or words that you see on the screen, for example, in menus or dialog boxes. Such words appear in the text like this: "Our simple net started with an accuracy of 92.22%, which means that about eight handwritten characters out of 100 are not correctly recognized."
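To make the one-hot encoding mentioned in the example above concrete, here is a minimal sketch using plain NumPy (in the book itself this is typically done with tf.keras.utils.to_categorical; the label values and class count below are illustrative, not taken from the book):

```python
import numpy as np

# Integer class labels, like those loaded into Y_train in the quoted example.
y = np.array([0, 2, 1, 2])
nb_classes = 3

# One-hot encoding: row i has a 1 in column y[i] and zeros elsewhere.
# Indexing the identity matrix by the label vector does this in one step.
one_hot = np.eye(nb_classes)[y]
print(one_hot)
```

tf.keras.utils.to_categorical(y, num_classes=nb_classes) produces the same array in a single call.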
Preface
Warnings or important notes appear in a box like this.
Tips and tricks appear like this.
Get in touch
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, mention
the book title in the subject of your message and email us at customercare@
packtpub.com.
Errata: Although we have taken every care to ensure the accuracy of our content,
mistakes do happen. If you have found a mistake in this book, we would be grateful
if you would report it to us. Please visit www.packtpub.com/support/errata,
select your book, click on the Errata Submission Form link, and enter
the details.
Piracy: If you come across any illegal copies of our works in any form on the
Internet, we would be grateful if you would provide us with the location address or
website name. Please contact us at copyright@packt.com with a link to the material.
If you are interested in becoming an author: If there is a topic that you have
expertise in and you are interested in either writing or contributing to a book,
please visit authors.packtpub.com.
Reviews
Please leave a review. Once you have read and used this book, why not leave a
review on the site that you purchased it from? Potential readers can then see and use
your unbiased opinion to make purchase decisions, we at Packt can understand what
you think about our products, and our authors can see your feedback on their book.
Thank you!
For more information about Packt, please visit packt.com.