
The Rise of Compound AI: A Leap Beyond Typical Federal Government System Legacy Modernization

  • Writer: Greg .
  • Feb 26
  • 2 min read

In their February 2024 article, "The Shift from Models to Compound AI Systems," researchers from the Berkeley Artificial Intelligence Research (BAIR) Lab discuss a significant trend in artificial intelligence: the movement from relying solely on monolithic models to developing compound AI systems. These systems integrate multiple interacting components, such as various models, retrieval mechanisms, and external tools, to tackle complex tasks more effectively.

The authors highlight several examples underscoring this shift. In enterprise applications, for instance, a significant portion of LLM deployments use retrieval-augmented generation (RAG) and multi-step processing chains to enhance performance. Even my simple AI model for picking against the NFL spread was not one model - it was a compound AI system.
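To make the pattern concrete, here is a minimal sketch of a compound AI system: a retrieval step composed with a generation step. The toy keyword-overlap retriever and the `generate_answer` stub are hypothetical stand-ins for a real vector store and an LLM API call - this illustrates the composition, not any particular vendor's stack.

```python
# Minimal sketch of a compound AI system: retriever + generator in a chain.
# retrieve() and generate_answer() are illustrative stand-ins.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Toy retriever: rank documents by keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(documents, key=lambda d: -len(q_terms & set(d.lower().split())))
    return scored[:k]

def generate_answer(prompt: str) -> str:
    """Stub standing in for an LLM call (e.g., a hosted model API)."""
    return f"[model response grounded in a prompt of {len(prompt)} chars]"

def rag_pipeline(query: str, documents: list[str]) -> str:
    """Step 1: retrieve context. Step 2: generate with context in the prompt."""
    context = "\n".join(retrieve(query, documents))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate_answer(prompt)

docs = [
    "Benefit claims are processed within 30 days of filing.",
    "Office hours are Monday through Friday, 9am to 5pm.",
]
print(rag_pipeline("How long are claims processed within?", docs))
```

The point is that the system's behavior comes from the pipeline - swap the retriever for a vector database or add a verification step, and performance changes without retraining any model.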


Several factors drive the adoption of compound AI systems:

  1. Enhanced Performance Through System Design

  2. Dynamic and Up-to-Date Responses

  3. Improved Control and Reliability

  4. Customization for Diverse Performance Needs


Because these Compound AI Systems are themselves software systems and integrate easily into existing software, low-code platforms, and custom applications, the world of software design is shifting rapidly toward data-centric thinking over software-centric thinking. In short, Compound AI Systems make modern system design exponentially more valuable, faster to develop, and more customizable. So valuable, so fast, and so customizable that most existing "legacy modernization" strategies in the Federal Government are outdated.


We need to embrace freeing data from legacy systems and embrace REPLACING the legacy system completely - and rapidly. It is time to forget the old code and embrace modern new services and outcomes. REPLATFORMING, REFACTORING, and REARCHITECTING are not as relevant in this new loosely coupled software and AI ecosystem. Replacing the legacy system is now the faster and more valuable modernization tool.


And look, I get it. Sometimes we have critical systems in COBOL, and a complete rewrite-and-replace doesn't flow right. In those cases, do a Blue/Green deployment: stand up the replacement alongside the legacy system, validate it against live workloads, and cut traffic over only once it proves itself. That resolves those cases.
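The Blue/Green cutover above can be sketched as a router in front of two interchangeable environments. This is a minimal illustration, assuming both systems expose the same request interface; the class and handler names are hypothetical, and a real deployment would do this at the load-balancer or DNS layer rather than in application code.

```python
# Minimal sketch of a Blue/Green cutover: one router, two environments,
# an atomic flip guarded by a health check. Names are illustrative.

class BlueGreenRouter:
    """Route all traffic to the active environment; flip atomically."""

    def __init__(self, blue, green):
        self.envs = {"blue": blue, "green": green}
        self.active = "blue"  # legacy system serves traffic initially

    def handle(self, request):
        return self.envs[self.active](request)

    def cutover(self, target: str):
        # Flip traffic only after the target passes its health check.
        if not self.envs[target]("health-check").startswith("ok"):
            raise RuntimeError(f"{target} failed health check; staying on {self.active}")
        self.active = target

legacy_cobol = lambda req: f"ok (legacy): {req}"      # stand-in for the old system
replacement = lambda req: f"ok (replacement): {req}"  # stand-in for the new system

router = BlueGreenRouter(legacy_cobol, replacement)
print(router.handle("lookup claim 123"))  # served by blue (legacy)
router.cutover("green")                   # replacement passes its health check
print(router.handle("lookup claim 123"))  # now served by green (replacement)
```

The key property for a critical system is that the old environment stays warm: if the replacement misbehaves after cutover, flipping back to blue is a one-line rollback, not a restoration project.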


Humans will still need to prioritize the outcomes of these systems, humans will still need to design them, and humans will still need to be "in the loop" for the decisions they make. What we should do is stop slowly migrating these Federal services while waiting on "legacy modernization" - and instead make the leap: free the data and replace the system.

CEO of Flamelit - a start-up Data Science and AI/ML consultancy. Formerly the Chief Technology Officer (CTO) and U.S. Digital Services Lead at the EPA. Greg was the first Executive Director and Co-Founder of 18F, a 2013 Presidential Innovation Fellow, Day One Accelerator Fellow, GSA Administrator's Award Recipient, and a Federal 100 and FedScoop 50 award recipient. He received a degree in Economics with a concentration in Business from St. Mary’s College of Maryland, a Master's in Management of IT from the University of Virginia, and is currently working on a Master's in Business Analytics and AI from NYU.




© 2024-2025 Analytics & Insights Alliance
