Practical Business Intelligence

Overview of this book

Business Intelligence (BI) is transforming how enterprises operate: everyone wants to minimize losses and maximize profits. Thanks to Big Data and improved methodologies for analyzing it, data analysts and data scientists increasingly rely on data to make informed decisions. Knowing how to analyze data is not enough, though; you need to start thinking of data as a business asset and then perform the right analysis to build an insightful BI solution. Efficient BI strives to automate the flow of data for ease of reporting and analysis. Through this book, you will develop the ability to think along the right lines and to use more than one tool for analysis, depending on the needs of your business. We start by preparing you for data analytics, then teach you a range of techniques for fetching important information from various databases, which can be used to optimize your business. The book aims to provide a full end-to-end solution, from environment setup onward, that helps you make informed business decisions and deliver efficient, automated BI solutions to any company. It is a complete guide to implementing Business Intelligence with some of the most powerful tools on the market: D3.js, R, Tableau, QlikView, and Python.

Fusing D3 and CSV


Now that we have some background on creating D3 components from hardcoded data in variables, we can continue the process by developing D3 components against data in a CSV file. Before any development begins, two architectural matters need to be addressed:

  • Creating and exporting a CSV file to a desired location

  • Establishing a server to connect the CSV file to an HTML file to be leveraged by D3

Preparing the CSV file

In Chapter 2, Web Scraping, we scraped data from a GitHub website, exported it to a CSV file, and then uploaded it to MS SQL Server. The file was called DiscountCodebyWeek and contained the following three columns:

  • Index

  • WeekInYear

  • DiscountCode
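One detail worth anticipating: a CSV parser delivers every value as a string, so the numeric columns above must be coerced before use. A hypothetical row accessor (the exact shape of the rows is an assumption based on the column names listed) can do this, and D3 will apply it to each row when it is passed to `d3.csv`:

```javascript
// Hypothetical row accessor: CSV values arrive as strings, so the unary +
// coerces the numeric columns while DiscountCode stays as text.
function typeRow(d) {
  return {
    Index: +d.Index,              // string -> number
    WeekInYear: +d.WeekInYear,    // string -> number
    DiscountCode: d.DiscountCode, // keep as text
  };
}

// With D3 v5+ (promise-based API), assuming the file is served over HTTP
// alongside the page:
// d3.csv("DiscountCodebyWeek.csv", typeRow)
//   .then(data => { /* bind the typed rows to D3 components here */ });
```

Without this coercion, comparisons and scales operate on strings (so `"10" < "9"`), a common source of silent bugs in D3 charts.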

When the data was originally scraped using R, the contents were written to a CSV file. We can use that same CSV file as our source for this exercise, or we can copy the data from the MS SQL Server database and use that version instead; either method is fine. Once the data...