The Next Age of Disruption in Evidence-Based Policymaking

From his vantage point as an economist and analyst for the past four decades, Paul Decker sees the world on the cusp of a new era of policymaking driven by the unprecedented availability of data and by artificial intelligence. Exactly how those policies are developed and applied, and how they help society, will ultimately be grounded in evidence, he said.

Decker was the featured speaker at this week’s Batten Hour. He is president and CEO of Mathematica, an employee-owned data analysis, research, and evaluation firm. He joined the firm straight out of graduate school at Johns Hopkins University, and since becoming CEO in 2007, he has overseen the company’s growth to 1,900 employees and extended its engagement with partners across the globe.

From 1968 to 2008, Decker said, analysts had limited resources, technology, and data to formulate evaluations and policy prescriptions. They had to collect data from primary sources, an extremely expensive and tedious process, and the main audience for their analyses was Congress, which meant the policy stakes were high, he explained. If a program proved effective, funding was protected or increased. If not, funding ended. 

“It tended to create this situation where people feared research. Those who were running programs, advocating for programs, they saw research as the enemy,” Decker said. 

Around 2008, he said, there were two disruptions in the way analysts did their work. One was an explosion in data. “Data were ubiquitous and highly accessible. A lot of this was driven by digital transactions generating data as exhaust from those transactions,” he added. Government entities were also creating data records, which added to the wealth of information becoming available.

The second was advances in technologies for analyzing data in mass quantities, including audio, video, and photography. Moreover, the data were almost constantly being updated, often in real time.

In what Decker calls the Data Science Age, there’s room for experimentation, iteration, and continuous policy and program improvement based on a flow of data, rather than just ditching inefficient programs based on a static dataset.

“So all of this can lead to more effective evidence-based policymaking that adapts to emerging trends and challenges,” he said.

The next age is now arriving, with an unprecedented scope of data acquisition and manipulation, coupled with artificial intelligence. Notwithstanding the sheer power of AI, Decker said that human judgment, informed by intuition, morality, and lived experience, remains critical. He described this age as one of “agentic AI,” in which AI is the tool, or agent, acting on our behalf: gathering evidence, conducting analyses, and taking action.

Shifting gears to the current disruption occurring in the federal government, Decker said the changes appear to reflect policy priorities, but also a goal to modernize government functions through greater reliance on data and technology. 

“The shock we’re experiencing now might have its own silver lining, to stimulate more rapid and expansive use of new technologies to generate better evidence, and much more efficiently than in the past,” he said.

Decker gave a call-to-action to the Batten students in the audience. “Stay focused on what’s going to be critical in the future. Get as teched up and as trained as you can. Embrace the power of AI and effectively integrate it with the other skills you bring.” 

###
