4. Abundant data
In the 1990s, data professionals focused on centralizing data. This involved moving it from source systems through complicated extract, transform, and load (ETL) pipelines and data quality processes, and eventually delivering that information to enterprise data warehouses. These data warehouse infrastructures were rigidly maintained and designed to be a single source of truth for data and simple analytic consumption. The frameworks that governed data warehouses lacked agility and often proved costly to maintain. Simple changes to datasets often required committee approval, and delivery rarely kept pace with data consumers' requests.
And it’s not just that those data warehouses were strictly controlled; they also lacked the capability to handle diverse sources of information. Indeed, the 80/20 rule has been applied to enterprise data for longer than people care to remember, and countless articles have been published about the hurdles to tapping into that...