Getting Started with Python for the Internet of Things

By: Tim Cox, Steven Lawrence Fernandes, Sai Yamanoor, Srihari Yamanoor, Prof. Diwakar Vaish

Overview of this book

This Learning Path takes you on a journey into the world of robotics and teaches you all that you can achieve with the Raspberry Pi and Python. It teaches you to harness the power of Python with the Raspberry Pi 3 and the Raspberry Pi Zero to build automation systems that can transform your business. You will learn to create text classifiers, predict sentiment in words, and develop applications with the Tkinter library. Things get more interesting when you build a human face detection and recognition system and a home automation system in Python, where different appliances are controlled using the Raspberry Pi. With such diverse robotics projects, you'll grasp the basics of robotics and its functions, and understand the integration of robotics with the IoT environment. By the end of this Learning Path, you will have covered everything from configuring a robotic controller to creating a self-driven robotic vehicle using Python.
• Raspberry Pi 3 Cookbook for Python Programmers - Third Edition by Tim Cox, Dr. Steven Lawrence Fernandes
• Python Programming with Raspberry Pi by Sai Yamanoor, Srihari Yamanoor
• Python Robotics Projects by Prof. Diwakar Vaish

Building an optical character recognizer using neural networks


This section describes how to build an optical character recognition system based on a neural network, using the neurolab library; a consolidated sketch of the full data-loading and training flow follows the numbered steps.

How to do it...

  1. Import the following packages:
import numpy as np 
import neurolab as nl 
  2. Read the input file:
in_file = 'words.data'
  3. Consider 20 data points to build the neural network based system:
# Number of datapoints to load from the input file 
num_of_datapoints = 20
  4. Represent the distinct characters:
original_labels = 'omandig' 
# Number of distinct characters 
num_of_charect = len(original_labels) 
  5. Use 90% of the data for training the neural network and the remaining 10% for testing:
train_param = int(0.9 * num_of_datapoints) 
test_param = num_of_datapoints - train_param 
  6. Define the dataset extraction parameters:
s_index = 6 
e_index = -1 
  7. Build the dataset:
information = [] 
labels = [] 
with open(in_file, 'r') as f: 
  for line in f.readlines(): 
    # Split the line tabwise 
    list_of_values = line.split('\t') 
  8. Implement an error check to confirm...
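The sketch below consolidates steps 1 to 7 and shows one way the remaining steps might be completed. It is a minimal sketch, not the book's exact code: it assumes each line of words.data is tab-separated with the character label in the second field and the pixel values (already normalized to the 0-1 range) between s_index and e_index, and the hidden-layer sizes (128 and 16 neurons) and training parameters (epochs, show, goal) are illustrative choices.

import numpy as np
import neurolab as nl

in_file = 'words.data'
num_of_datapoints = 20
original_labels = 'omandig'
num_of_charect = len(original_labels)
train_param = int(0.9 * num_of_datapoints)
test_param = num_of_datapoints - train_param
s_index = 6
e_index = -1

information = []
labels = []
with open(in_file, 'r') as f:
  for line in f.readlines():
    # Split the line tabwise
    list_of_values = line.split('\t')

    # Error check: skip lines whose label is not one of the modelled characters
    # (assumes the label sits in the second field)
    if list_of_values[1] not in original_labels:
      continue

    # One-hot encode the label
    label = np.zeros((num_of_charect, 1))
    label[original_labels.index(list_of_values[1])] = 1
    labels.append(label)

    # Extract the pixel values for this character
    current_char = np.array([float(x) for x in list_of_values[s_index:e_index]])
    information.append(current_char)

    # Stop once the requested number of data points has been collected
    if len(information) >= num_of_datapoints:
      break

# Convert the lists to NumPy arrays
information = np.asarray(information, dtype=float)
labels = np.array(labels).reshape(num_of_datapoints, num_of_charect)

# Build a feed-forward network: one input per pixel (values assumed in [0, 1]),
# two hidden layers, and one output neuron per character
num_of_dimensions = len(information[0])
neural_net = nl.net.newff([[0, 1] for _ in range(num_of_dimensions)],
                          [128, 16, num_of_charect])
neural_net.trainf = nl.train.train_gd

# Train on the first 90% of the data
error_progress = neural_net.train(information[:train_param, :],
                                  labels[:train_param, :],
                                  epochs=10000, show=100, goal=0.01)

# Test on the remaining 10%
predicted = neural_net.sim(information[train_param:, :])
for i in range(test_param):
  print('Original:', original_labels[np.argmax(labels[train_param + i])])
  print('Predicted:', original_labels[np.argmax(predicted[i])])

Switching neural_net.trainf to gradient descent is a design choice here; neurolab's default trainer can also be used if it converges acceptably on your data.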