Learning Python Web Penetration Testing

By: Christian Martorella

Overview of this book

Web penetration testing is the use of tools and code to attack a website or web app in order to assess its vulnerability to external threats. While there is an increasing number of sophisticated, ready-made tools to scan systems for vulnerabilities, using Python lets you write system-specific scripts, or alter and extend existing testing tools, to find, exploit, and record as many security weaknesses as possible. Learning Python Web Penetration Testing walks you through the web application penetration testing methodology, showing you how to write your own tools with Python for each activity in the process. The book begins by emphasizing the importance of knowing how to write your own tools with Python for web application penetration testing. You will then learn to interact with a web application using Python, understand the anatomy of an HTTP request (URL, headers, and message body), and create a script that performs a request and interprets the response and its headers. As you make your way through the book, you will write a web crawler using Python and the Scrapy library. The book will also help you develop a tool to perform brute-force attacks against different parts of a web application. You will then learn how to detect and exploit SQL injection vulnerabilities. By the end of this book, you will have successfully created an HTTP proxy based on the mitmproxy tool.

Automating SQLi in mitmproxy

In this section, we are going to learn how to automate a test case for SQL injection in mitmproxy by creating an inline script that uses the request handler and some of the techniques we learned in the previous sections.
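As a reminder of how an inline script hooks into mitmproxy, here is a minimal sketch of a request handler; the file name and log message are illustrative, not from the book, and older mitmproxy versions expose mitmproxy.ctx.log instead of standard Python logging.

# sqli_fuzzer.py - a minimal sketch of a mitmproxy inline script
# (illustrative file name). Load it with:  mitmproxy -s sqli_fuzzer.py
import logging

from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # mitmproxy calls this handler for every client request it intercepts.
    # We only care about URLs that carry query-string parameters.
    if flow.request.query:
        logging.info("SQLi candidate: %s", flow.request.pretty_url)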

SQLi process

The objective of this section is to create an inline script for mitmproxy that will allow us to test for SQL injection in every URL that has parameters.

The process is as follows: for every URL that has parameters, we replace one parameter value at a time with FUZZ while preserving the rest of the parameter values, rather than replacing all the values with FUZZ at once. Then, we replace the FUZZ string in each of those URLs with each value in the injections array...
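A minimal sketch of that fuzzed-URL generation is shown below, assuming the payloads live in an injections array as described; the helper names build_fuzzed_urls and build_injection_urls, and the sample payloads, are illustrative rather than taken from the book.

from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Example payload list; in practice these come from the injections array
# mentioned above.
injections = ["'", "' OR '1'='1", "';--"]

def build_fuzzed_urls(url):
    # For each parameter, replace only that parameter's value with FUZZ,
    # preserving every other parameter value.
    parsed = urlparse(url)
    params = parse_qsl(parsed.query)
    fuzzed = []
    for i, (name, _) in enumerate(params):
        new_params = list(params)
        new_params[i] = (name, "FUZZ")
        fuzzed.append(urlunparse(parsed._replace(query=urlencode(new_params))))
    return fuzzed

def build_injection_urls(fuzzed_urls):
    # Substitute each payload for the FUZZ marker in every fuzzed URL.
    return [u.replace("FUZZ", payload)
            for u in fuzzed_urls
            for payload in injections]

For http://example.com/page?id=1&cat=2, build_fuzzed_urls produces http://example.com/page?id=FUZZ&cat=2 and http://example.com/page?id=1&cat=FUZZ, and build_injection_urls then expands each of those with every payload. The resulting URLs can be replayed with an HTTP client and the responses inspected for signs of SQL injection, such as database error messages.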