The Obama 2012 campaign has been hailed as the most tech-savvy and data-driven to date, a sophisticated operation led by CTO Harper Reed. Reed had no problem finding developers eager to join the campaign — the challenge was finding people up for the formidable task before them. A presidential campaign poses unique challenges: in an environment where volatility is the only constant, developers must iterate often, yet there is little margin for error. To meet this challenge, Reed and his team placed an emphasis on people who could resolve problems and learn quickly, rather than focusing on a particular skill set.
Within this heady environment, Chris Gansen, the campaign’s Senior Software Engineer, and Jason Kunesh, Director of User Experience and Product, embraced the methodologies of agile software development.
While much attention was paid to the Obama 2012 tech team’s online voter outreach efforts, its on-the-ground campaign was equally impressive, turning information collected by volunteers knocking on their neighbors’ doors into grist for fine-grained data collection and analytics. This was undertaken by roughly “1000 neighborhood teams with tens of thousands of volunteer leaders,” according to Gansen.
A voter outreach campaign of such size and complexity demanded a bi-directional communication and data collection tool. Gansen and Kunesh were deeply involved in building the resulting product, Dashboard, which gave volunteer leaders a way to connect, organize, and report progress. Dashboard became a gold mine of voter-reported data, providing the campaign with insights and strategies to effectively communicate to voters and coordinate volunteers, at both a macro and micro scale. Gansen describes the lauded voter outreach and analytics platform as “the culmination of many years of trying to unify online and offline campaign organizing.”
Dashboard grew into one of the largest projects within the campaign, boasting 40–50 engineers divided among five separate tracks, or “Work Streams,” as they were known within the campaign.
From Agile Skepticism to Proven Success
Comfortable with the proven, tightly managed, top-down management model of campaigns past, veteran politicos were initially skeptical of agile development practices. “We fought really hard to get people in the campaign to embrace iterative development,” says Gansen. “In government, you tend to see technology coming from vendors, and measured by the quarter. When we would say, ‘we’re going to launch this and there are going to be bugs,’ a lot of people in the campaign hesitated. They hadn’t done this before, and we had to be flawlessly representing the brand of the President.”
“Software isn’t like that,” he says. “It can be messy, and so we had to show them through iterating, and being accountable to what we were doing, that things improved over time.” Despite their initial pushback, the clarity and transparency of this approach won converts among the skeptics. “We had great technologists who’d never worked on campaigns, and great campaigners who’d never participated in shaping software,” says Kunesh. “Before we understood each other’s worlds, agile was a way for us to agree on what we were building and how.”
It also allowed them to prioritize high-value goals and milestones, while being able to quickly change course when necessary. “Since the goals of the campaign change depending upon what phase of campaigning you are in, responding dynamically was a requirement,” says Kunesh. “The things that were important in February are meaningless at the end of summer, but they are the features you needed in the spring. Agile helped us stay focused on shipping the most important features.”
The campaign measured every conceivable metric: voter data, usage and performance at every level of the technology stack, and project velocity, to name a few. Gansen, Kunesh, and their teams turned to Pivotal Tracker to collaborate, track progress, and perhaps most importantly, communicate their workload and progress to other stakeholders in the campaign. Tracker’s emphasis on agile development methodology, prioritization, and collaboration was well-suited to the environment. For the Dashboard initiative, each of the five work streams was given its own Tracker project. Every feature on the Dashboard project had a corresponding Tracker story, and the team leaned heavily on release markers to represent milestones.
“With very constrained time, we had to decide quickly whether to improve a feature or axe it,” says Gansen. “In campaigns, you never make a decision that isn’t backed up with data, so being able to pair data-driven development with agile delivery helped build products that people actually used.”
Gansen found that Tracker and agile development practices helped the team make the case for decisions, clearly demonstrate what milestones had been reached, and accurately predict when upcoming milestones would be hit. “From a product perspective,” he says, “the two things that were most successful were the transparency and the accountability of quick, weekly releases. We were able to say, ‘this is what we’re going to do,’ and then a week later come back and say, ‘we did what we said we would do, and if we didn’t, here’s what came up.’”
It also eased the process of onboarding new developers, Kunesh says. “It gave us a common platform and methodology to follow,” he says. “We were building our team as we were building our products. Pivotal Tracker offered a built-in workflow that let us move engineers from team to team while still having a common process.”
Battle-tested and Unbowed
The members of that team have gone their separate ways, like the tech industry equivalent of a pack of sage cowboys riding off into the sunset. Products such as Dashboard now lie in hibernation until the next election season. Fortunately, the likes of Gansen and Kunesh are sharing the hard-won lessons of those intense 18 months.
Gansen, who now consults for the civic technology organization Smart Chicago Collaborative, has developed a healthy obsession with collecting and analyzing every bit of data available. “Measurement is key because you can’t make decisions on information you don’t have,” he says. “Measure everything, and do it early on, because even with a small amount of data, you can make decisions. Over time, the data only grows more and more valuable.”
Both Gansen and Kunesh evangelize for the value of developing and deploying transparently within an organization, and for using that openness to engage nontechnical stakeholders. It’s a matter of “communication and building trust,” Kunesh says. “Operate openly and transparently with what you’re doing with the people involved in the process,” Gansen adds. “When a technical team is interacting with a nontechnical team,” he says, such clarity and transparency “is essential.”