I’ve seen too many articles praising the medallion architecture as the go-to solution for enterprise data quality. At first glance, the structured, three-layered approach sounds like a no-brainer: organize your data into neat bronze, silver, and gold layers, and data quality seemingly improves at every step.
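To make the pattern concrete, here is a minimal PySpark sketch of the typical bronze-to-silver-to-gold flow. The table names, source path, and quality rules are illustrative assumptions, not taken from any particular platform.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land the raw source data as-is (path is illustrative).
bronze = spark.read.json("/lake/raw/orders")
bronze.write.mode("append").saveAsTable("bronze_orders")

# Silver: cleanse and conform — deduplicate, enforce types, drop invalid rows.
silver = (
    spark.table("bronze_orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)
silver.write.mode("overwrite").saveAsTable("silver_orders")

# Gold: aggregate into business-ready tables for consumption.
gold = (
    spark.table("silver_orders")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_revenue"))
)
gold.write.mode("overwrite").saveAsTable("gold_daily_revenue")
```

Note what the sketch implies: every source, however different, passes through the same three hops and the same centralized cleansing logic.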
But the closer I look, the more my reservations about this approach grow. Sure, it promises consistent, scalable, and centralized improvement of information quality. In practice, however, quality problems are fixed too late and with the same rigid toolset, regardless of context.
Enterprises are complex adaptive systems with wildly different data sources, each posing its own data quality challenges. Why impose the same rigid process on all of them? Forcing every source into one centralized quality framework leads to inefficiency and unnecessary overhead.
I want to challenge the medallion architecture as the supposed best answer to enterprise data quality problems. I’ll make the case for a more tailored, decentralized approach, one inspired by Total Quality Management (TQM) and aligned with…