A necessary evil
Reporting is one of the unfortunate necessities of the marketing industry. Monthly (or weekly or quarterly or however you like to do it) reports are a time-honored way to communicate progress to the clients we serve.
Done correctly, they prove to those clients that we are accomplishing valuable things on their behalf, thus building and banking trust for future endeavors. (A little more substantive a tactic than getting together and doing trust falls, although perhaps no less nerve-wracking.)
Now, as you may have noticed, here at Mack Web, we’ve gotten a little obsessed this year with conquering, er, completing with valor and perseverance our journey to integrating quantitative and qualitative measurement. Measurement is always a hot issue with digital marketers and for good reason. We measure the results of our efforts so we know what works and what to do about it. It’s a big deal.
But because it is so inextricably intertwined with measurement, reporting often gets short shrift, overlooked like the red-headed stepchild it is. (As a once-strawberry-blonde stepchild, I get to use that phrase without taking any guff, okay guys?).
The sad truth is that, though measurement is what drives our forward movement, if we can’t get the hang of reporting, it’s time to hang it all up and go home. Because reporting – though a royal pain in the neck – is not only how we convince the clients to keep paying us, but how we justify the decisions we make based on what we’ve measured.
It’s a simple equation: bad reporting = no trust = no forward movement = unemployed marketers = no rent, no loan payments, no gummy bear budget.
So, while mastering measurement is important, mastering the art of the report cannot be overlooked.
Which is why, as part of our Quest for Quanlitative Measurement, Mack Web has undertaken the Rep-volution: a long, painstaking, patience-testing evolutionary approach to our reports.
A long time ago, in a galaxy far, far away…
We won’t try your patience by going back too far. Not all the way to the long-dead days when rank in the SERPs was the all-in-all and Mack Web branding involved use of the color lavender. (You’re welcome).
Instead we’ll pick up right before we started to buckle down on this reporting thing. Jump in your time machine, grab the hand of the Ghost of Christmas (uh…Halloween?) Past, rev up the DeLorean. Do whatever you gotta do to join us in a little journey back to yesteryear.
It is October 2013. Mack Web has just completed another reporting cycle. We think we’ve got the process down but look upon the product with an eye of disfavor.
Our reports range from 11 to 16 pages. We use screenshots from Google Analytics to display the data. We have long-since dispensed with the lavender hue.
Our metrics are these:
- Unique visitors for the last 5 months
- Traffic, year over year
- Traffic trends (visitors from search and social), over the last 5 months
- Top traffic sources
- Social media followers, clicks, shares
- Social media engagement (as evidenced by retweets/comments left by influencers or peers)
- Top performing blog posts for the month
The data is presented and unpacked with spartan words. Though we conclude with our intentions moving forward, we do not explicitly connect the metrics to the actions that will follow.
The result? Constant battles for buy-in from the client. The natives are restless and something is clearly rotten in the state of Denmark.
From amoeba to…something more advanced than an amoeba
Jump forward in time to November 2013. A few key things have changed in the Mack Web universe.
We’ve finally published Arthur, our long-suffering Truly Monumental Guide to Building Online Communities, giving us a boost in confidence and motivation.
We’ve also undergone a Strat Ops scrub, a time for company self-reflection in which we concluded both that we needed our reports to work and that they were dismally failing to do so.
We discovered Avinash Kaushik’s famous See, Think, Do framework and decided to model our reports accordingly.
The result? This bad boy:
The report still clocks in at an unwieldy 12 pages, but at least we tie the metrics to specific intent and audience behavior.
See Metrics, indicating brand interest and the beginnings of engagement, include:
- Applause, Amplification, and Conversion on social media and blog posts
- New visitors
- Organic Search results
Think Metrics, reflecting an audience almost ready to commit, include:
- Bounce rate
- Page depth
- Social clickthroughs
- Branded SEO traffic
- Visit duration & pages visited
- Conversation quality
Do Metrics, indicating the visitors who had bought in, whole hog, are:
- Visitor loyalty (indicated by returning visitors)
- Conversion rate
- Form submissions
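For the structurally minded, the framework above can be sketched as a simple data structure: each metric filed under the stage of audience intent it reflects. This is a minimal illustration, not our actual tooling; the metric names and numbers are hypothetical.

```python
# Hypothetical sketch: grouping report metrics under Avinash Kaushik's
# See / Think / Do framework. All values are made up for illustration.

monthly_metrics = {
    "See": {    # brand interest, beginnings of engagement
        "social_applause": 120,       # likes/favorites
        "social_amplification": 45,   # shares/retweets
        "new_visitors": 3400,
    },
    "Think": {  # audience almost ready to commit
        "bounce_rate": 0.58,
        "pages_per_visit": 2.7,
        "branded_seo_traffic": 900,
    },
    "Do": {     # visitors who have bought in
        "returning_visitors": 760,
        "conversion_rate": 0.021,
        "form_submissions": 18,
    },
}

def report_summary(metrics):
    """Return one line per stage: the stage name plus its metrics."""
    lines = []
    for stage, values in metrics.items():
        pairs = ", ".join(f"{name}={value}" for name, value in values.items())
        lines.append(f"{stage}: {pairs}")
    return "\n".join(lines)

print(report_summary(monthly_metrics))
```

Organizing the metrics this way makes the report's logic explicit: every number answers "where is this reader in their journey?"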
We are cautiously optimistic about the result.
Minor tweaks (and a change in tense, ‘cuz that was getting weird)
Over the next few months, we kept the framework and started tinkering with it to make it more accessible and valuable. We added a little more analysis and an ‘In a Nutshell’ recap with the unique visitors over time.
Eventually, we prettied it up, matching the sleeker style we were beginning to favor.
We stopped using strictly screenshots and started tentatively creating some of our own charts and graphs:
The unfortunate upshot of these additions was that the reports kept lengthening and, consequently, our clients’ patience with the reporting process was shortening.
We knew they weren’t reading them all the way through.
Alas. What were we to do?
The next evolution fell after yet another Strat Ops scrub in which we realized something mind-boggling:
It’s more important to create a report that the client will actually read than a report that includes every relevant metric.
That doesn’t mean that we weren’t going to make sure that our clients got the most important info. But we were going to make darn sure that they couldn’t ignore it when they got it.
It was a revelation. The next report was cut in half. Clocking in at 6 pages and leading with the Do Metrics, it was designed to grab the client’s eye and say:
Look! We accomplished stuff you care about! Now, here’s how we did it.
We added new, visually-oriented charts and then pulled all those charts together into a one-page dashboard:
It was a thing of beauty and we were justifiably proud of it.
Which explains, of course, why we discarded the whole format the next month.
One small step for man…
In retrospect, it seems so obvious now. In the last year or so, Mack Web has grown famous (in our own eyes, at least) for our ‘Goals not Tools’ approach. We were telling everyone we met (seriously, friends, family, strangers on the street) that a sustainable marketing approach had to start with Goals.
Which is why it’s a little embarrassing to admit that it took us so long to get around to building our reports around the goals we were trying to accomplish.
Suddenly, the ideal framework became so beautifully clear: the metrics reflecting the performance of each campaign across all channels, organized under the goal they were meant to accomplish.
Of course, nothing is that easy. There was still the matter of choosing the right metrics and visualizing them in a way that our clients would see. It meant ensuring that the performance of each campaign was clearly connected to the strategy behind upcoming campaigns.
It also meant greater customization for each client.
Clients are not uniform. Their reports shouldn’t be either.
The client pictured above cared about form submissions as an indicator of success in…uh, increasing form submissions, and about branded mentions and site metrics as indicators of progress in the realm of thought leadership; other clients cared about other things.
Things like the individual breakdown of engagement on each social network to indicate progress toward identifying and engaging the ideal audience.
Some were more worried about their competitors than others.
If you’re serious about communicating value to your clients…be sure you’re talking to your clients. Not to some anonymized client generalization.
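As a rough sketch of what "goals first, customized per client" might look like in data terms, here's a hypothetical goal-keyed report structure. The client name, goals, metrics, and actions are all invented for illustration; the point is that every metric lives under the goal it serves, and every goal ends with a next step.

```python
# Hypothetical goals-first report: each client defines their own goals,
# and every metric is filed under the goal it's meant to advance.
# All names and numbers are made up for illustration.

client_report = {
    "client": "Example Co.",
    "goals": [
        {
            "goal": "Establish thought leadership",
            "metrics": {"branded_mentions": 34, "blog_return_visits": 410},
            "next_action": "Double down on the guest-post series",
        },
        {
            "goal": "Grow qualified leads",
            "metrics": {"form_submissions": 18, "conversion_rate": 0.021},
            "next_action": "Test a shorter form on the landing page",
        },
    ],
}

def render(report):
    """Render the report as plain text: goal, metrics, then next action."""
    lines = [f"Report for {report['client']}"]
    for g in report["goals"]:
        lines.append(f"- Goal: {g['goal']}")
        for name, value in g["metrics"].items():
            lines.append(f"    {name}: {value}")
        lines.append(f"    Next: {g['next_action']}")
    return "\n".join(lines)

print(render(client_report))
```

Because the goals list is per-client, two clients with completely different priorities get structurally identical reports with completely different content.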
Side Note: To meet or not to meet
While we’re talking about talking to the clients, here’s a thing we’ve gone back and forth on: do we send the report or present the report? Ideally, the report should stand on its own. But how can we be sure they’re really reading and understanding if we aren’t walking them through it?
For now, we meet if there are pressing issues that need to be discussed. We certainly don’t want to waste our time or theirs by reading the report out loud to them. So, if we do in fact have a meeting, we focus on the three most important things that we want them to hear. Sometimes that means an action they need to take, something we want to experiment with, or a bigger-picture look at year-over-year growth instead of just the results of the most recent quarter.
Not quite there yet
All told, we’ve come a long way in the field of reporting. Of course, we’re still refining our presentation. For example: charts or line graphs? line graphs or bar graphs?
We’ve continued to streamline the dashboards (Natalie just keeps making them prettier and prettier) so that they quickly and easily communicate progress and keep the focus on the things that are important to the client (and also coupled with the things that we really want them to hear).
And as we home in on the best quanlitative metrics to be found, our reports will continue evolving to reflect them.
But an incomplete quest is not a wasted endeavor. Rep-volution is all about learning: what causes us to thrive and what leads to a quiet extinction in the grass. And, boy, have we learned.
One lesson is that our drive for perfection, though necessary in some ways, may have also hurt us in others.
Clients like consistency. Too many major reporting overhauls can be disruptive.
Of course, the nature of evolution dictates that there will be changes and improvements going forward – new metrics to use, new clarity to share. The only constant is change, etc. But we’ve already decided that we will be more deliberate about it. We’ll try changing one variable at a time.
This is a double win for us. We ease the client into the changes, giving them the consistency they crave. And we also get to collect specific feedback on how they receive each change without muddying the waters. No muddy waters means an easy fix if it turns out the change wasn’t for the better. Fail fast, right?
No movie is complete without a few explosions
But here is possibly the most important lesson, driven home over and over again:
The best reports in the world aren’t going to build the trust you need if you never act on what you report.
We’ve hinted at this throughout our trip down memory lane, but now we’re saying it plainly: the ultimate point of reporting is to gain trust. But unless you’re prepared to take action as a result of what you’ve reported – to ride any momentum forward, to reinvigorate or give a swift death to flagging tactics, to apply gained insights into user behavior and audience triggers – you’re going to snuff out whatever trust your beautiful reports may have earned.
At Mack Web, we’ve created a regular meeting we call Catapult.
In these meetings, we check out any red flags (or green flags or gold stars or skywriting) that turned up in the reporting cycle. (To be clear: this is not the only time we monitor or respond to progress or regress. We deal with things in the moment. Catapult is to make sure the whole team is keeping an eye out for long term trends or patterns.)
We discuss the issue and decide on a response. We walk away from Catapult with a clearly defined list of action items or continued monitoring to be done.
This is the song that never ends…
Yes, it goes on and on, my friends. Metrics and KPIs are refined. Reports are tweaked. We cannot yet crown ourselves Imperial Highness of the Reporting Cycle. But we’re creeping up on that throne. And on that blessed day, we will remember the little people who aided our ascension.
Vive le Rep-volution!
Whaddya say, little people? What have you learned about reporting? What comes next in the Rep-volution?