Have you ever seen the movie The Man from Earth?
It is an American science fiction drama film. The plot focuses on John Oldman, a departing university professor who claims to be a Cro-Magnon (a Magdalenian caveman) who has secretly survived for more than 14,000 years. The entire film is set in and around Oldman's house during his farewell party and consists almost entirely of dialogue. The plot advances through intellectual arguments between Oldman and his fellow faculty members.
I really enjoyed the movie, especially the scene where one of John’s colleagues asks the following question:
“Do you have some relic, an artifact? A reminder of your early life?”
John then grabs a random pen, shows it, and replies brilliantly:
“If you lived 100, 1,000 years, would you still have this pen? What would cause you to keep it? Is it a memento of your beginning, even if you don’t have a concept of a beginning? It would be gone, lost. No… I don’t have artifacts.”
The reason John chooses not to keep artifacts is that he considers himself a moving target. In other words, because John knows he will live for many more years, the concept of value does not apply to him. An artifact that has value now did not necessarily have value when it was first created (and vice versa), and there is certainly no guarantee that it will still have value in the future…
After watching the movie, it got me thinking about architecture in a complex environment. What is the value of a perfect architecture? Is there such a thing as a perfect architecture? Perhaps more importantly, when is an architecture perfect?
If we map this to modern software development, we tend to spend many hours, days, weeks, perhaps even months coming up with an architecture. The question remains: how do you know it's perfect?
The interesting idea here is that product development itself is a moving target due to its complexity. How many times have you been confronted with a legacy codebase and thought, "What the !@%$# happened here!? Who designed this? Who approved this?" Actually, this is a very common phenomenon. One thing we tend to forget is that knowledge also evolves. If you think your current architecture or codebase is future-proof, think again... In the future, when others look at your current design with their latest knowledge, best practices, tooling, frameworks, and so on (to them it is legacy), they will probably think the same thing you are thinking right now when you look at legacy systems.
Don’t get me wrong, though. I am not saying that upfront architecture is bad. The only thing I am trying to emphasize is: don’t overdo it. Know when to stop, keep your architecture flexible, prevent vendor lock-in, and most importantly, accept the fact that things will change over time. Don't waste too much time designing the perfect architecture; instead, embrace change and let the architecture emerge alongside development.
This is an important aspect, which is why it is also embedded in the Agile Manifesto as one of its twelve principles.
"The best architectures, requirements, and designs emerge from self-organizing teams" - one of the twelve principles of the Agile Manifesto
A wise man once said, and I quote:
“Software development: creating tomorrow's legacy today.”
I remember laughing at this sentence, not realizing how true it actually is.