Note from Beth: As part of my work this year as Visiting Scholar at the David and Lucile Packard Foundation, I’m running several peer learning groups based on the ideas in “Measuring the Networked Nonprofit” and the Crawl, Walk, Run, Fly Maturity of Practice model. These peer learning groups are focused on transformational change – to help participants transform into data-informed nonprofits one small step at a time.
The Crawl, Walk, Run, Fly model has been adopted by nonprofits beyond these peer groups. I spoke to a group of nonprofits in Seattle a few weeks ago at Pyramid Communications who found the advice about taking incremental steps toward improvement very useful. The model was also used as a benchmarking framework for advanced practitioners at Social Media Week in DC last week.
As a trainer, I always like to check back with participants six months to a year after a training to see how they are doing. Have they been able to put the ideas into practice? Sure, you can deliver a lot of great content, but the real joy of being a trainer is seeing how your participants apply what they learned. That’s where learning and transformation begin! Nicole agreed to write a reflection about how she went from crawling to running in her measurement practice.
What gets measured gets better – Guest Post by Nicole Lampe
I was part of a peer learning group Beth convened when she was writing Measuring the Networked Nonprofit. I was excited and terrified, and, ultimately, did not complete my action learning project. Work intervened. Flash forward 18 months, and I’ve become a measurement evangelist. It’s still scary to set quantifiable goals for a project and report on whether we reached them. And it’s still hard to carve out time for tracking. But we can’t afford to operate on hunches when there is real data available to inform outreach efforts. Because what gets measured gets better.
There are lots of reasons I struggled to find the right measurement groove at Resource Media. As communications consultants, we often fight the thud factor. A pile of news clips makes a gratifying sound when dropped on a conference table, but did those stories reach the right people, and did they say the right thing? The social media revolution has made it possible to count dozens of new indicators that don’t necessarily demonstrate impact. But we’re in the business of creating change, not tallying likes.
Turns out, change happens when you’re paying close attention to results. We have always done this by tracking policy outcomes and media coverage. But now we have a wealth of interim measures that enable us to fine-tune tactics along the way.
Start measuring now, wherever you are
When I first met Beth and was struggling to come up with the right pilot project, I kept worrying that the timing was off. Few nonprofits are just building a Facebook page or website. We’re all midstream with our communications. We don’t have good baseline data, and we didn’t necessarily start with the clearest of goals. So how do you retrofit SMART objectives and a measurement protocol onto projects that are already up and running?
Start where you are. Maybe you don’t have enough information to set numeric goals grounded in past performance. Maybe you haven’t quite narrowed down which metrics will be most important in the long run. That’s okay—it’s a practice that can evolve over time. But you have to start somewhere.
I started with this basic tracking worksheet. I captured baseline data for all my clients at the beginning of our work together, and asked them to commit to quarterly measurement. For campaigns we manage ourselves, I do the tracking monthly.
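For readers who like to see the mechanics, here is a minimal sketch of that baseline-plus-quarterly routine, assuming a hypothetical CSV with one row per channel. The file name and columns are made up for illustration; they are not the actual worksheet.

```python
import csv

# Hypothetical tracking file (columns are illustrative, not the real worksheet):
# channel,metric,baseline,q1,q2
# Facebook,followers,1200,1350,1500
# Newsletter,subscribers,800,820,905

with open("tracking_worksheet.csv", newline="") as f:
    for row in csv.DictReader(f):
        baseline = float(row["baseline"])
        latest = float(row["q2"])  # most recent quarterly snapshot
        change = (latest - baseline) / baseline * 100
        print(f'{row["channel"]} {row["metric"]}: {change:+.1f}% vs. baseline')
```

A spreadsheet formula does the same math; the point is simply to capture the baseline before the work starts and then compare against it on a regular schedule.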
Zen and the art of routine measurement
While there are lots of slick tools out there, like BrandMentions, that automate measurement and capture much of the same data as my spreadsheet, part of the magic of measurement is in the practice itself. The numbers of followers, shares, or referrals need to be paired with observations about what is driving engagement. Does my community love political cartoons? Kitten photos? Stories from their peers? Does that move our SMART objectives forward?
This feedback loop about what’s working is the path to steady improvement. For Resource Media’s blog and e-newsletter, I pay close attention to the posts that perform, try to isolate the secret sauce, and replicate those successes. I report to colleagues monthly about what our audience responds to, and encourage my clients to do the same.
You might think routine measurement would lead to a slow decline in creativity, but instead, I’ve found the opposite to be true. It gives us license to experiment, knowing we will have hard facts to evaluate the results.
And while measurement does indeed take time, it also liberates us from ongoing communications tasks that aren’t producing the desired result. A small investment in planning and tracking adds up to a major increase in impact. But it only works if you do it, so don’t bite off more than you can regularly chew.
Right-sizing your measurement approach
A person could get lost in Google Analytics, or spend days tracking the trajectory of every link and tweet. But a measurement dashboard the size of Texas is about as useful as an encyclopedia-length communications plan. You want to take a snapshot of outreach results with just enough detail to be able to attribute spikes in activity to external events, great content, or the unpredictable internets.
On Beth’s scale of measurement maturity, I’m now comfortably jogging. The next step for me and for Resource Media is to distill our learning into something that can be shared with the field. And we’re starting this summer with a couple of pilot projects that measure how imagery impacts advocacy communications.
What aspects of your organization’s strategy have you improved with measurement? What is the benefit?
Nicole Lampe is Digital Strategy Director at Resource Media. Find her on Twitter @nicole_amber