Hands-On Application Development with PyCharm - Second Edition

By: Bruce M. Van Horn II, Quan Nguyen

Overview of this book

In the quest to develop robust, professional-grade software with Python and meet tight deadlines, it’s crucial to have the best tools at your disposal. In this second edition of Hands-On Application Development with PyCharm, you’ll learn tips and tricks to work at a speed and proficiency previously reserved only for elite developers. To achieve that, you’ll be introduced to PyCharm, the premier professional integrated development environment (IDE) for Python programmers. Regardless of how Python is utilized, whether for general automation scripting, utility creation, web applications, data analytics, machine learning, or business applications, PyCharm offers tooling that simplifies complex tasks and streamlines common ones. In this book, you’ll find everything you need to harness PyCharm’s full potential and make the most of its productivity shortcuts. The book comprehensively covers topics ranging from installation and customization to web development, database management, and data analysis pipeline development, helping you become proficient in Python application development across diverse domains. By the end of this book, you’ll have discovered the remarkable capabilities of PyCharm and reached a new level of proficiency and productivity.
Table of Contents (24 chapters)

Part 1: The Basics of PyCharm
Part 2: Improving Your Productivity
Part 3: Web Development in PyCharm
Part 4: Data Science with PyCharm
Part 5: Plugins and Conclusion

Building a Data Pipeline in PyCharm

The term data pipeline generally denotes a stepwise procedure that entails collecting, processing, and analyzing data. This term is widely used in the industry to express the need for a reliable workflow that takes raw data and converts it into actionable insights. Some data pipelines work at massive scale, such as a marketing technology (MarTech) company ingesting millions of data points from Kafka streams, storing them in large data stores such as Hadoop or ClickHouse, and then cleansing, enriching, and visualizing that data. Other times, the data is smaller but far more impactful, such as the project we’ll be working on in this chapter.
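To make the collect, process, and analyze stages concrete before we dive in, here is a minimal sketch using pandas. The inline records and the region and revenue columns are hypothetical stand-ins for a raw extract, not data from this chapter’s project:

import pandas as pd

# Collect: in a real pipeline this could be a Kafka consumer or a warehouse
# query; here a small inline DataFrame stands in for the raw extract.
raw = pd.DataFrame({
    "region":  ["north", "north", "south", None, "south"],
    "revenue": ["1200", "950", "1430", "800", "n/a"],
})

# Process: drop rows with no region, then coerce revenue to numbers and
# discard values that cannot be parsed (such as the "n/a" entry).
clean = raw.dropna(subset=["region"]).copy()
clean["revenue"] = pd.to_numeric(clean["revenue"], errors="coerce")
clean = clean.dropna(subset=["revenue"])

# Analyze: aggregate the cleaned rows into a simple per-region summary.
summary = clean.groupby("region")["revenue"].sum()
print(summary)

Each step maps directly onto the definition above: the DataFrame stands in for collection, the cleanup calls for processing, and the groupby aggregation for analysis that yields an actionable summary.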

In this chapter, we will learn about the following topics:

  • How to work with and maintain datasets
  • How to clean and preprocess data
  • How to visualize data
  • How to utilize machine learning (ML)

Throughout this chapter, you will be able to apply what you have learned about the...