Learning Python Web Penetration Testing

By: Christian Martorella

Overview of this book

Web penetration testing is the use of tools and code to attack a website or web application in order to assess its vulnerability to external threats. While there is a growing number of sophisticated, ready-made tools for scanning systems for vulnerabilities, Python lets you write system-specific scripts, or alter and extend existing testing tools, to find, exploit, and record as many security weaknesses as possible.

Learning Python Web Penetration Testing walks you through the web application penetration testing methodology, showing you how to write your own Python tools for each activity in the process. The book begins by emphasizing the importance of knowing how to write your own tools for web application penetration testing. You will then learn to interact with a web application using Python; understand the anatomy of an HTTP request, URL, headers, and message body; and create a script that performs a request and interprets the response and its headers. As you make your way through the book, you will write a web crawler using Python and the Scrapy library, and develop a tool to perform brute-force attacks against different parts of a web application. You will then learn to detect and exploit SQL injection vulnerabilities. By the end of this book, you will have created an HTTP proxy based on the mitmproxy tool.
Table of Contents (9 chapters)

Web application mapping

Recall from Chapter 1, Introduction to Web Application Penetration Testing, where we covered the penetration testing process. The second phase of that process is mapping.

In the mapping phase, we build a map or catalog of the application's resources and functionalities. As security testers, we aim to identify all the components and entry points in the application. The main components of interest are resources that take parameters as input, forms, and directories.
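As a minimal sketch of what "identifying entry points" means in practice, the following snippet uses Python's standard `urllib.parse` module to break a URL into the pieces a mapper catalogs: the path (which reveals directory structure) and the query parameters that act as input entry points. The example URL is hypothetical.

```python
from urllib.parse import urlparse, parse_qs

def extract_entry_points(url):
    """Split a URL into the path (directory structure) and the
    query parameters that serve as input entry points."""
    parts = urlparse(url)
    return {
        "path": parts.path,
        "params": parse_qs(parts.query),
    }

# Hypothetical application URL with two input parameters
info = extract_entry_points("http://example.com/search.php?q=test&page=2")
print(info["path"])    # /search.php
print(info["params"])  # {'q': ['test'], 'page': ['2']}
```

Each discovered parameter (here `q` and `page`) becomes a candidate for later testing phases, such as the brute-force and SQL injection attacks covered in later chapters.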

The mapping is mainly performed with a crawler. Crawlers are also known as spiders, and usually, they perform scraping tasks, which means that they will also extract interesting data from the application such as emails, forms, comments, hidden fields, and more.
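To illustrate the scraping side of a crawler's job, here is a small sketch using Python's standard `html.parser` module (not Scrapy, which the book uses later) that pulls links, form actions, hidden fields, and HTML comments out of a page. The sample HTML is made up for the example.

```python
from html.parser import HTMLParser

class MappingParser(HTMLParser):
    """Collect the items a mapping crawler typically records:
    links to follow, form actions, hidden fields, and comments."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.forms = []
        self.hidden_fields = []
        self.comments = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "form":
            self.forms.append(attrs.get("action", ""))
        elif tag == "input" and attrs.get("type") == "hidden":
            self.hidden_fields.append(attrs.get("name", ""))

    def handle_comment(self, data):
        # Comments sometimes leak internal notes or endpoints
        self.comments.append(data.strip())

# Hypothetical page content
page = """
<html><body>
<!-- TODO: remove debug endpoint -->
<a href="/login">Login</a>
<form action="/search" method="get">
  <input type="text" name="q">
  <input type="hidden" name="csrf" value="abc123">
</form>
</body></html>
"""
parser = MappingParser()
parser.feed(page)
print(parser.links)          # ['/login']
print(parser.forms)          # ['/search']
print(parser.hidden_fields)  # ['csrf']
print(parser.comments)       # ['TODO: remove debug endpoint']
```

A full crawler would additionally fetch each discovered link and feed the response back into the parser; this sketch only shows the extraction step.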

In order to perform application mapping, we have the following options:

  • The first technique is crawling...