They make parallel work difficult

Management usually expects us to be done building the software they sponsor by a certain date. Actually, they even expect us to be done within a certain budget as well, but let’s not complicate things here.

Aside from the fact that I have never seen “done” software in my career as a software engineer, to be “done” by a certain date usually implies that multiple people have to work in parallel.

You probably know this famous conclusion from “The Mythical Man-Month,” even if you haven’t read the book: Adding manpower to a late software project makes it later.4

4 The Mythical Man-Month: Essays on Software Engineering by Frederick P. Brooks, Jr., Addison-Wesley, 1995.

This also holds true, to a degree, in software projects that are not (yet) late. You cannot expect a large group of 50 developers to be 5 times faster than a smaller team of 10 developers. If they’re working on a very large application where they can split up into sub-teams and work on separate parts of the software, it may work, but in most contexts, they will step on each other’s toes.

But on a healthy scale, we can certainly expect to be faster with more people on the project. And management is right to expect that of us.

To meet this expectation, our architecture must support parallel work. This is not easy. And a layered architecture doesn’t really help us here.

Imagine we’re adding a new use case to our application. We have three developers available. One can add the needed features to the web layer, one to the domain layer, and the third to the persistence layer, right?

Well, it usually doesn’t work that way in a layered architecture. Since everything builds on top of the persistence layer, the persistence layer must be developed first. Then comes the domain layer and finally the web layer. So only one developer can work on the feature at a time!
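Here is a simplified sketch of what that dependency chain might look like (the class and method names are purely illustrative):

// Persistence layer - has to exist before anything else compiles.
interface AccountRepository {
  AccountJpaEntity findById(long accountId);
  void save(AccountJpaEntity account);
}

class AccountJpaEntity {
  long id;
  long balance;
}

// Domain layer - depends on the persistence layer above.
class DepositService {

  private final AccountRepository accountRepository;

  DepositService(AccountRepository accountRepository) {
    this.accountRepository = accountRepository;
  }

  void deposit(long accountId, long amount) {
    AccountJpaEntity account = accountRepository.findById(accountId);
    account.balance += amount; // domain logic working on a persistence entity
    accountRepository.save(account);
  }
}

// Web layer - depends on the domain layer, so it comes last.
class DepositController {

  private final DepositService depositService;

  DepositController(DepositService depositService) {
    this.depositService = depositService;
  }

  void handleDepositRequest(long accountId, long amount) {
    depositService.deposit(accountId, amount);
  }
}

The domain developer cannot even compile their code until the persistence developer is done, and the web developer has to wait for the domain developer, so the three effectively work in sequence.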

“Ah, but the developers can define interfaces first,” you say, “and then each developer can work against these interfaces without having to wait for the actual implementation.”

Sure, this is possible, but only if we haven’t mixed our domain and persistence logic as discussed previously. Mixing them blocks us from working on each aspect separately.
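In code, the “interfaces first” approach might look like this simplified sketch (again, the names are purely illustrative):

// Interface agreed upon up front, before any implementation exists.
interface LoadAccountPort {
  Account loadAccount(long accountId);
}

// Plain domain class without any persistence dependency.
class Account {

  private long balance;

  long balance() {
    return balance;
  }

  void deposit(long amount) {
    balance += amount;
  }
}

// One developer implements the domain logic against the interface...
class DepositService {

  private final LoadAccountPort loadAccountPort;

  DepositService(LoadAccountPort loadAccountPort) {
    this.loadAccountPort = loadAccountPort;
  }

  void deposit(long accountId, long amount) {
    loadAccountPort.loadAccount(accountId).deposit(amount);
  }
}

// ...while another developer builds the persistence implementation in parallel.
class JpaAccountAdapter implements LoadAccountPort {

  @Override
  public Account loadAccount(long accountId) {
    // Load a database entity and map it to the domain Account here.
    return new Account();
  }
}

This only works because Account and DepositService know nothing about how accounts are persisted. If domain and persistence logic are tangled together, there is no clean seam to put such an interface into.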

If we have broad services in our code base, it may even be hard to work on different features in parallel. Working on different use cases will cause the same service to be edited in parallel, which leads to merge conflicts and potentially regressions.
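A broad service might look like this (simplified, with illustrative names only):

// A broad service covering many use cases in a single class.
class AccountService {

  void registerAccount(/* ... */) {
    // use case 1
  }

  void sendMoney(/* ... */) {
    // use case 2
  }

  void closeAccount(/* ... */) {
    // use case 3
  }

  // ... dozens of other account-related use cases ...
}

Two developers implementing, say, the “send money” and “close account” features both have to modify this one class (and probably its tests), so their changes collide even though the features themselves are unrelated.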