Java Deep Learning Projects

Overview of this book

Java is one of the most widely used programming languages, and with the rise of deep learning it has become a popular tool among data scientists and machine learning experts. Java Deep Learning Projects starts with an overview of deep learning concepts and then delves into advanced projects. You will build several projects using different deep neural network architectures, such as multilayer perceptrons, Deep Belief Networks, CNNs, LSTMs, and Factorization Machines. You will get acquainted with popular deep learning and machine learning libraries for Java, such as Deeplearning4j, Spark ML, and RankSys, and learn to use their features to build and deploy projects on distributed computing environments. You will then explore advanced topics such as transfer learning and deep reinforcement learning in the Java ecosystem, covering real-world domains such as healthcare, NLP, image classification, and multimedia analytics with an easy-to-follow approach. Expert reviews and tips follow every project to give you insights and hacks. By the end of this book, you will have taken your deep learning expertise in Java beyond theory and will be able to build your own advanced deep learning systems.

Summary

In this chapter, we have seen how to implement and deploy a hands-on deep learning project that classifies review texts as either positive or negative based on the words they contain. We used a large-scale movie review dataset containing 50,000 reviews (training plus testing). A combined approach was applied, using Word2Vec (a widely used word-embedding technique in NLP) for features and an LSTM network for modeling; the pre-trained Google News vectors model supplied the neural word embeddings.
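To make the embedding idea concrete without loading the multi-gigabyte Google News model, here is a minimal stdlib-only sketch: each word maps to a dense vector, and semantic closeness is measured by cosine similarity. The three-dimensional vectors below are made-up illustrative values (the real pre-trained model uses 300 dimensions), and the class name `EmbeddingSketch` is ours, not from the chapter.

```java
import java.util.HashMap;
import java.util.Map;

/** Toy illustration of neural word embeddings: each word maps to a dense
 *  vector, and similarity is measured by the cosine of the angle between
 *  vectors. The 3-d values are invented for the example; the chapter's
 *  Google News model uses 300-dimensional vectors. */
public class EmbeddingSketch {

    static final Map<String, double[]> embeddings = new HashMap<>();
    static {
        embeddings.put("good",  new double[]{ 0.9, 0.1, 0.2});
        embeddings.put("great", new double[]{ 0.8, 0.2, 0.1});
        embeddings.put("awful", new double[]{-0.7, 0.6, 0.3});
    }

    /** Cosine similarity: dot(a, b) / (|a| * |b|). */
    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na  += a[i] * a[i];
            nb  += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    public static void main(String[] args) {
        // Words with related sentiment lie closer together in vector space
        System.out.printf("good~great: %.3f%n",
                cosine(embeddings.get("good"), embeddings.get("great")));
        System.out.printf("good~awful: %.3f%n",
                cosine(embeddings.get("good"), embeddings.get("awful")));
    }
}
```

In the actual project, these lookups are handled by Deeplearning4j's word-vector utilities rather than a hand-built map, but the geometry is the same: the LSTM consumes sequences of such vectors instead of raw words.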

The training vectors, along with their labels, were then fed into the LSTM network, which classified them as positive or negative sentiment, and the trained model was evaluated on the test set. Additionally, we saw how to apply text-preprocessing techniques such as tokenization, stop-word removal, and TF-IDF, as well as word-embedding operations...
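The preprocessing steps named above can be sketched in plain Java. This is a simplified, stdlib-only illustration, not the chapter's implementation: the stop-word list, the two sample reviews, and the class name `TfIdfSketch` are all ours, and real pipelines use richer tokenizers and smoothed IDF.

```java
import java.util.*;
import java.util.stream.Collectors;

/** Minimal sketch of the preprocessing pipeline: lower-casing tokenization,
 *  stop-word removal, and TF-IDF weighting. Illustrative only. */
public class TfIdfSketch {

    // Tiny illustrative stop-word list; real lists contain hundreds of words
    static final Set<String> STOP_WORDS =
            new HashSet<>(Arrays.asList("the", "a", "is", "was", "and"));

    /** Split on non-letter runs, lower-case, and drop stop words. */
    static List<String> tokenize(String text) {
        return Arrays.stream(text.toLowerCase().split("[^a-z]+"))
                .filter(t -> !t.isEmpty() && !STOP_WORDS.contains(t))
                .collect(Collectors.toList());
    }

    /** TF-IDF of a term in one document: tf(t, d) * log(N / df(t)). */
    static double tfIdf(String term, List<String> doc, List<List<String>> corpus) {
        double tf = Collections.frequency(doc, term) / (double) doc.size();
        long df = corpus.stream().filter(d -> d.contains(term)).count();
        return tf * Math.log(corpus.size() / (double) df);
    }

    public static void main(String[] args) {
        List<List<String>> corpus = Arrays.asList(
                tokenize("The movie was great and the acting was great"),
                tokenize("The movie was awful"));
        // "great" occurs only in the first review, so it gets a positive weight there
        System.out.println(tfIdf("great", corpus.get(0), corpus));
        // "movie" occurs in every review, so its IDF (and hence TF-IDF) is zero
        System.out.println(tfIdf("movie", corpus.get(1), corpus));
    }
}
```

The effect to notice is that TF-IDF down-weights words that appear in every review (they carry no sentiment signal) and up-weights words that are distinctive to a document, which is exactly why it is paired with stop-word removal in the chapter's pipeline.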