
The Future of Finance with ChatGPT and Power BI

By: James Bryant, Aloke Mukherjee
2.5 (4)

Overview of this book

In today's rapidly evolving economic landscape, the combination of finance, analytics, and artificial intelligence (AI) heralds a new era of decision-making. Finance, data analytics, and AI can no longer be treated as separate disciplines; professionals must be comfortable in all three to succeed. This book combines finance concepts, visualizations built in Power BI, and the application of AI and ChatGPT to provide a holistic perspective. After a brief introduction to finance and Power BI, you will begin with Tesla's data-driven financial tactics before moving to John Deere's AgTech strides, all through the lens of AI. Salesforce's adaptation to the AI revolution offers profound insights, while Moderna's navigation of the biotech frontier during the pandemic showcases the agility of AI-focused companies. Learn from Silicon Valley Bank's demise, and prepare for CrowdStrike's defensive maneuvers against cyber threats. With each chapter, you'll master new investing ideas and Power BI tools, and learn to integrate ChatGPT into your workflows. This book is an indispensable ally for anyone looking to thrive in the financial sector. By the end of this book, you'll be able to transform your approach to investing and trading by blending AI-driven analysis, data visualization, and real-world applications.
Table of Contents (13 chapters)

Part 1: From Financial Fundamentals to Frontier Tech: Navigating the New Paradigms of Data, EVs, and AgTech
Part 2: Pioneers and Protectors: AI Transformations in Software, Finance, Biotech, and Cybersecurity

Understanding and mitigating LLM “hallucinations” in financial analysis and data visualization

LLMs, such as OpenAI’s GPT series, can sometimes generate responses referred to as “hallucinations.” These are instances where the model’s output is factually incorrect, presents information the model could not possibly know (given that it lacks access to real-time or personalized data), or is nonsensical or highly improbable.

Let’s take a deeper look at what hallucinations are, how to identify them, and what steps can be taken to mitigate their impact, especially in contexts where accurate and reliable information is crucial, such as financial analysis, trading, or visual data presentations.
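One practical identification step is to cross-check factual claims in a model’s output against a trusted reference source before acting on them. The sketch below is a minimal, hypothetical illustration of that idea — the `TRUSTED_FACTS` table and the claim format are assumptions for demonstration, not part of any real API; in practice you would query a curated database or market-data feed instead.

```python
# Minimal sketch: flag potential LLM hallucinations by comparing
# model-supplied facts against a trusted reference source.
# TRUSTED_FACTS is a stand-in for a real curated data source.

TRUSTED_FACTS = {
    ("Apple Inc.", "founded"): "1976",
    ("Tesla, Inc.", "founded"): "2003",
}

def check_claim(entity: str, attribute: str, model_value: str) -> str:
    """Classify a model's factual claim as 'verified',
    'hallucination', or 'unverifiable'."""
    expected = TRUSTED_FACTS.get((entity, attribute))
    if expected is None:
        return "unverifiable"  # no ground truth -> route to human review
    return "verified" if model_value == expected else "hallucination"

# Example: the model claims Apple was founded in 1985
print(check_claim("Apple Inc.", "founded", "1985"))  # hallucination
print(check_claim("Apple Inc.", "founded", "1976"))  # verified
```

The key design point is the three-way result: a claim with no matching ground truth is not silently accepted but flagged as unverifiable, so a human can review it before it reaches a financial report or dashboard.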

Understanding hallucinations

Let’s look at some examples:

  • Factual inaccuracies: Suppose an LLM states that Apple Inc. was founded in 1985. This is a clear factual inaccuracy (Apple was founded in 1976)...