Natural Language Processing with TensorFlow - Second Edition

By: Thushan Ganegedara

Overview of this book

Learning how to solve natural language processing (NLP) problems is an important skill to master due to the explosive growth of data combined with the demand for machine learning solutions in production. Natural Language Processing with TensorFlow, Second Edition, will teach you how to solve common real-world NLP problems with a variety of deep learning model architectures. The book starts by getting you familiar with NLP and the basics of TensorFlow, and then gradually teaches you different facets of TensorFlow 2.x. In the following chapters, you will learn how to generate powerful word vectors, classify text, generate new text, and generate image captions, among other exciting real-world NLP use cases. TensorFlow has evolved into an ecosystem that supports a machine learning workflow through ingesting and transforming data, building models, monitoring, and productionization. We will read text directly from files and perform the required transformations through a TensorFlow data pipeline, and we will also see how to use a versatile visualization tool known as TensorBoard to visualize our models. By the end of this NLP book, you will be comfortable using TensorFlow to build deep learning models with many different architectures and to efficiently ingest data. Additionally, you'll be able to confidently use TensorFlow throughout your machine learning workflow.
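
As a minimal illustration of the data-pipeline idea mentioned above, the following sketch reads text line by line from a file and applies a simple transformation with tf.data; the file name reviews.txt and the lowercasing step are illustrative assumptions rather than the book's exact pipeline:

```python
import tensorflow as tf

# Minimal sketch: stream lines of text from a file with tf.data.
# "reviews.txt" is a placeholder file name, not a dataset used in the book.
dataset = tf.data.TextLineDataset("reviews.txt")

# Apply a simple transformation (lowercasing) to every line in the pipeline.
dataset = dataset.map(lambda line: tf.strings.lower(line))

# Batch and prefetch so the model never waits on file I/O.
dataset = dataset.batch(32).prefetch(tf.data.AUTOTUNE)

for batch in dataset.take(1):
    print(batch)
```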

Tasks of Natural Language Processing

NLP has a multitude of real-world applications. A good NLP system is one that performs many NLP tasks. When you search for today’s weather on Google or use Google Translate to find out how to say, “How are you?” in French, you rely on a subset of such NLP tasks. We will list some of the most ubiquitous tasks here, and this book covers most of them:

  • Tokenization: Tokenization is the task of separating a text corpus into atomic units (for example, words or characters). Although it may seem trivial for a language like English, tokenization is an important task. For example, in the Japanese language, words are not delimited by spaces or punctuation marks. A minimal code sketch of tokenization appears after this list.
  • Word-Sense Disambiguation (WSD): WSD is the task of identifying the correct meaning of a word. For example, in the sentences, The dog barked at the mailman and Tree bark is sometimes used as a medicine, the word bark has two different meanings. WSD is critical for tasks such as question answering.
  • Named Entity Recognition (NER): NER attempts to extract entities (for example, person, location, and organization) from a given body of text or a text corpus. For example, the sentence, John gave Mary two apples at school on Monday, will be transformed to [John]name gave [Mary]name [two]number apples at [school]organization on [Monday]time. NER is an important topic in fields such as information retrieval and knowledge representation.
  • Part-of-Speech (PoS) tagging: PoS tagging is the task of assigning words to their respective parts of speech. These can either be basic tags such as noun, verb, adjective, adverb, and preposition, or more granular tags such as proper noun, common noun, phrasal verb, and so on. The Penn Treebank project, a popular project focusing on PoS, defines a comprehensive list of PoS tags at https://www.ling.upenn.edu/courses/ling001/penn_treebank_pos.html.
  • Sentence/synopsis classification: Sentence or synopsis (for example, movie review) classification has many use cases, such as spam detection, news article classification (for example, political, technology, and sport), and product review ratings (that is, positive or negative). This is achieved by training a classification model with labeled data (that is, reviews annotated by humans with either a positive or a negative label); a minimal classification sketch also appears after this list.
  • Text generation: In text generation, a learning model (for example, a neural network) is trained with text corpora (a large collection of textual documents), and it then predicts new text that follows. For example, language modeling can output an entirely new science fiction story by using existing science fiction stories for training.

Recently, OpenAI released a language model known as GPT-2, which can generate incredibly realistic text. Furthermore, language modeling plays a very important role in understanding language, which helps a downstream decision-support model get off the ground quickly. A toy sketch of this idea follows.
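
The following toy sketch shows the text-generation idea end to end at the character level; the tiny corpus, the window size, and the GRU-based model are all illustrative assumptions, not the architecture the book builds later:

```python
import numpy as np
import tensorflow as tf

# Toy corpus (invented for illustration) and a character-level vocabulary.
text = "the ship drifted through the silent stars "
chars = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(chars)}
ids = np.array([char_to_id[c] for c in text])

# Training pairs: each window of 5 characters predicts the character that follows.
window = 5
x = np.array([ids[i:i + window] for i in range(len(ids) - window)])
y = ids[window:]

# A deliberately tiny language model: embed characters, summarize with a GRU,
# and output a probability distribution over the next character.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 16),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x, y, epochs=30, verbose=0)

# Generate new text by repeatedly predicting the most likely next character.
seed = list(ids[:window])
for _ in range(20):
    probs = model.predict(np.array([seed[-window:]]), verbose=0)[0]
    seed.append(int(np.argmax(probs)))
print("".join(chars[i] for i in seed))
```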

  • Question Answering (QA): QA techniques have high commercial value, and they form the foundation of chatbots and virtual assistants (for example, Google Assistant and Apple Siri). Chatbots have been adopted by many companies for customer support, where they can answer and resolve straightforward customer concerns (for example, changing a customer’s monthly mobile plan) without human intervention. QA touches upon many other aspects of NLP, such as information retrieval and knowledge representation, which makes developing a QA system very difficult.
  • Machine Translation (MT): MT is the task of transforming a sentence/phrase from a source language (for example, German) to a target language (for example, English). This is a very challenging task, as different languages have different syntactical structures, which means that it is not a one-to-one transformation. Furthermore, word-to-word relationships between languages can be one-to-many, one-to-one, many-to-one, or many-to-many. This is known as the word alignment problem in MT literature.
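
To make the tokenization bullet above concrete, here is a minimal sketch of word-level tokenization in TensorFlow; the example sentence is invented, and whitespace splitting is only the simplest possible strategy:

```python
import tensorflow as tf

# Minimal sketch of tokenization: split a sentence into word-level tokens.
sentence = tf.constant("The dog barked at the mailman")

# tf.strings.split performs simple whitespace tokenization.
tokens = tf.strings.split(sentence)
print(tokens.numpy())  # [b'The' b'dog' b'barked' b'at' b'the' b'mailman']
```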
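
Sentence/synopsis classification can likewise be sketched in a few lines; the reviews, labels, and the tiny bag-of-embeddings model below are invented for illustration and are not the book's actual classifier:

```python
import tensorflow as tf

# Tiny invented dataset: 1 = positive review, 0 = negative review.
texts = ["a wonderful heartfelt film", "boring and far too long",
         "great acting and a clever plot", "a complete waste of time"]
labels = tf.constant([1, 0, 1, 0])

# Map raw strings to integer token IDs (tokenization + vocabulary lookup).
vectorizer = tf.keras.layers.TextVectorization(output_sequence_length=8)
vectorizer.adapt(texts)
x = vectorizer(tf.constant(texts))  # shape (4, 8), integer token IDs

# A deliberately small model: embed tokens, average them, predict a probability.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vectorizer.get_vocabulary()), 16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, labels, epochs=10, verbose=0)

# Probability that an unseen sentence is a positive review.
print(model.predict(vectorizer(tf.constant(["clever and wonderful"]))))
```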

Finally, to develop a system that can assist humans in day-to-day tasks (for example, a VA or a chatbot), many of these tasks need to be orchestrated in a seamless manner. As we saw in the previous example, where the user asks, “Can you show me a good Italian restaurant nearby?”, several different NLP tasks, such as speech-to-text conversion, semantic and sentiment analysis, question answering, and machine translation, need to be completed. In Figure 1.1, we provide a hierarchical taxonomy of different NLP tasks categorized into several different types. It is difficult to attribute an NLP task to a single classification, so you can see some tasks spanning multiple categories. We split the categories into two main types: language-based (light-colored with black text) and problem formulation-based (dark-colored with white text). The linguistic breakdown has two categories: syntactic (structure-based) and semantic (meaning-based). The problem formulation-based breakdown has three categories: preprocessing tasks (tasks that are performed on text data before it is fed to a model), discriminative tasks (tasks where we attempt to assign an input text to one or more categories from a set of predefined categories), and generative tasks (tasks where we attempt to generate a new textual output). Of course, this is only one classification among many, but it shows how difficult it is to assign a specific NLP task to a single category.

Figure 1.1: A taxonomy of the popular tasks of NLP categorized under broader categories

Having understood the various tasks in NLP, let us now move on to understand how we can solve these tasks with the help of machines. We will discuss both the traditional method and the deep learning-based approach.
