
Alice, make there some that in at the shills distringulf out to the Froge, and very mind to it were it?' the King was set telm, what's the old all reads talking a minuse. "Where ream put find growned his so," _you 'Fust to t

After 50 epochs of training, the model still has trouble expressing coherent thought, but has learned to spell reasonably well. What is amazing here is that the model is character-based and has no knowledge of words, yet it learns to spell words that look like they might have come from the original text:

Alice Vex her," he prope of the very managed by this thill deceed. I will ear she a much daid. "I sha?' Nets: "Woll, I should shutpelf, and now and then, cried, How them yetains, a tround her about in a shy time, I pashng round the sandle, droug" shrees went on what he seting that," said Alice. "Was this will resant again. Alice stook of in a faid.' 'It's ale. So they wentle shall kneeltie-and which herfer--the about the heald in pum little each the UKECE P@TTRUST GITE Ever been my hever pertanced to becristrdphariok, and your pringing that why the King as I to the King remark, but very only all Project Grizly: thentiused about doment,' Alice with go ould, are wayings for handsn't replied as mave about to LISTE!' (If the UULE 'TARY-HAVE BUY DIMADEANGNE'G THING NOOT,' be this plam round an any bar here! No, you're alard to be a good aftered of the sam--I canon't?" said Alice. 'It's one eye of the olleations. Which saw do it just opened hardly deat, we hastowe. 'Of coum, is tried try slowing

Generating the next character or next word in the text isn't the only thing you can do with this sort of model. Similar models have been built to make stock price predictions [3] or generate classical music [4]. Andrej Karpathy covers a few other fun examples, such as generating fake Wikipedia pages, algebraic geometry proofs, and Linux source code in his blog post [5].

The full code for this example is available in alice_text_generator.py in the source code folder for this chapter. It can be run from the command line using the following command:

$ python alice_text_generator.py

Our next example will show an implementation of a many-to-one network for sentiment analysis.
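Before moving on, the following is a minimal sketch of the greedy character-by-character sampling loop that drives this kind of generation: feed the last few characters in, take the most likely next character out, append it, and repeat. The toy vocabulary, window length, and untrained stand-in model here are assumptions for illustration only; the chapter's actual model and training loop live in alice_text_generator.py:

import numpy as np
import tensorflow as tf

# Toy character vocabulary (assumed for illustration)
vocab = sorted(set("abcdefghijklmnopqrstuvwxyz' "))
char2idx = {c: i for i, c in enumerate(vocab)}
idx2char = {i: c for c, i in char2idx.items()}
seq_len = 10   # assumed input window length

# Untrained stand-in with the same input/output shape as a char-RNN:
# (batch, seq_len) character ids in, (batch, vocab_size) logits out
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab), 32),
    tf.keras.layers.SimpleRNN(64),
    tf.keras.layers.Dense(len(vocab))
])

def generate(model, seed, num_chars=50):
    text = seed
    for _ in range(num_chars):
        # take the last seq_len characters, left-padding with spaces
        window = text[-seq_len:].rjust(seq_len)
        x = np.array([[char2idx.get(c, 0) for c in window]])
        logits = model(x, training=False).numpy()[0]
        # greedy decoding: always pick the most likely next character
        text += idx2char[int(np.argmax(logits))]
    return text

print(generate(model, "alice was "))

Since this stand-in model is untrained, its output is noise; after training, the same loop produces text like the samples shown above.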



Example – Many-to-One – Sentiment Analysis

In this example, we will use a many-to-one network that takes a sentence as input and predicts its sentiment as being either positive or negative. Our dataset is the Sentiment labeled sentences dataset on the UCI Machine Learning Repository [20], a set of 3,000 sentences from reviews on Amazon, IMDb, and Yelp, each labeled with 0 if it expresses a negative sentiment, or 1 if it expresses a positive sentiment.
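To see what "many-to-one" means concretely, the following minimal sketch shows the general shape of such a network: a whole token sequence goes in, the LSTM consumes it, and only the final hidden state feeds a single sigmoid output. This is not the model built later in this chapter; the vocabulary size, layer sizes, and sequence length are illustrative assumptions:

import tensorflow as tf

vocab_size, embed_dim, max_len = 5000, 64, 64   # assumed values

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    # return_sequences defaults to False, so the LSTM emits only
    # its last output - the "one" in many-to-one
    tf.keras.layers.LSTM(64),
    # single probability: positive (near 1) vs. negative (near 0)
    tf.keras.layers.Dense(1, activation="sigmoid")
])
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])
model.build(input_shape=(None, max_len))
model.summary()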

As usual, we will start with our imports:

import numpy as np
import os
import shutil
import tensorflow as tf
from sklearn.metrics import accuracy_score, confusion_matrix

The dataset is provided as a zip file, which expands into a folder containing three files of labeled sentences, one for each provider, with one sentence and label per line, separated by a tab character. We first download the zip file, then parse the files into a list of (sentence, label) pairs:

def download_and_read(url):
    local_file = url.split('/')[-1]
    local_file = local_file.replace("%20", " ")
    p = tf.keras.utils.get_file(local_file, url,
        extract=True, cache_dir=".")
    local_folder = os.path.join("datasets", local_file.split('.')[0])
    labeled_sentences = []
    for labeled_filename in os.listdir(local_folder):
        if labeled_filename.endswith("_labelled.txt"):
            with open(os.path.join(
                    local_folder, labeled_filename), "r") as f:
                for line in f:
                    sentence, label = line.strip().split('\t')
                    labeled_sentences.append((sentence, label))
    return labeled_sentences

labeled_sentences = download_and_read(
    "https://archive.ics.uci.edu/ml/machine-learning-databases/" +
    "00331/sentiment%20labelled%20sentences.zip")
sentences = [s for (s, l) in labeled_sentences]
labels = [int(l) for (s, l) in labeled_sentences]
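As a quick sanity check (not part of the original listing), you can confirm that all 3,000 sentences were parsed and inspect one labeled example:

print(len(sentences))           # expect 3000 (1,000 per provider)
print(sentences[0], labels[0])  # one (sentence, label) pair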

