Content Strategy Frameworks – How To Ensure They Really Work
12 July 2017, Anne Caborn
Back in the day, content strategy involved a sharp twig and a wax tablet. These days there are content strategies and content strategy tools aplenty. But how do you choose the right ones?
You may have been on a marvellous course (like one of mine, for example) and returned to the office full of inspiration, but if the approach and methodologies you deploy don’t fit the culture of the organisation, you might as well revert to the wax tablet.
Approaches that overturn traditional linear models of content creation can be really powerful (both in terms of getting the work done and changing how the broader organisation thinks about and values content).
Linear models pass de facto ownership along the content creation pipe. So, for example, the digital team may be entirely responsible for commissioning content from a writer, but might find their authority usurped by the relevant subject matter expert (SME) at the content approval stage.
New models seek to create shared ownership and control amongst content professionals AND wider stakeholders, such as subject matter experts, throughout the process – from idea and creation to deployment and maintenance.
My personal favourite content collaboration tools that help achieve this are:
The Core Model
Devised by information architect Are Halland, the Core Model is designed to bring content creators, organisation heads and subject matter experts (SMEs) together to establish the exact focus of any given piece of content.
It marries business objectives to customer tasks and strips away irrelevant content (and irrelevant actions). So rather than starting with large content elements that go through a protracted creation/approval process, there is a content heads of agreement from the outset.
I recently came across a very succinct article about deploying Core on the US DigitalGov website.
Pair writing
This is based on the pair programming way of working and involves pairing an SME with a content creator for real-time content development. They take it in turns to write, while the other person observes and asks questions about the approach being taken.
This can be really helpful for organisations where the content approval and compliance process is running out of control, and particularly for more technical topics where the SME may be reluctant to relinquish control in terms of style of delivery and terminology.
Here’s a useful article on Pair Writing.
Content capability workshops
I’m not suggesting setting these up is easy, but it pays to get everybody within an organisation understanding content and the tools at their disposal to create and sustain it. I’ve used this approach to good effect in the third sector where content teams are small – or non-existent.
The workshops break down key areas, such as getting copy print-ready or making simple changes to existing web content, into simple tasks, using easy-to-understand visuals and take-away “how to” notes. The workshops are delivered to people across an organisation, not just those in content, comms or marketing areas.
Even if recipients don’t get that hands-on, the quality of what they hand over to content professionals tends to improve noticeably.
I’ve just written a personal blog post on this based on some recent projects.
While content collaboration may seem to focus on web content, the approaches above can be used effectively on the broad spectrum of content creation and refresh projects.
These approaches will also give you a greater understanding of where the content weaknesses are in your organisation. It then becomes easier to select the strategy tools you need to overcome these. For example, is a major website overhaul really the answer when internal ownership and understanding of say, SEO and data are, well, pants?
Which brings me on to…
Measuring success – the carrot AND stick approach
Success measures, KPIs, targets… call them what you will, but you need to measure success and failure.
But that shouldn’t simply be grabbing for Google Analytics (GA) and creating a pie chart. Yes, GA is really, really important. But what aren’t you measuring? You’ll get some ideas from a previous Emarketeers blog on content measures.
But how you select what you measure should be just as collaborative as the creation of the content being measured. I have borrowed the following definition from American academic, author and consultant Kathleen A. Paris.
Measures of success should tell us whether our goals:
- achieved the results we expected
- produced results we didn’t want or expect
- should be changed
- should continue (or not)
- should be measured in other ways.
As she puts it: “The act of identifying measures of success and collecting the data creates a common language and set of shared expectations within a working group. When committees, groups, teams begin talking in concrete, measurable terms about success, it becomes very quickly apparent whether or not people are thinking in the same terms.”
You can read more useful stuff in this paper from her.
It’s also important to “do the knitting” and work out the interplay between, say, the rise and fall of web page bounce rates, support line phone calls, and the weather (which may, for example, impact footfall at the high street end).
But however you dress it up, measures should not be grabbed from the shelf or simply handed down by a higher power. They should be created collaboratively and owned collectively. Reporting on them should likewise be a collective responsibility, but with named people responsible for the upkeep and delivery of individual components.
The US Malcolm Baldrige National Quality Award programme identifies 10 reasons success measures fail.
- Imposing management measures on the performing group instead of allowing the group to establish the measures.
- Not involving owners in developing the measurement systems.
- Not sharing measurement information and trends.
- Ignoring the suggestions of those who know the most.
- Failing to recognise and reward improvement.
- Fear of exposing good or bad performance in case it disrupts the status quo.
- Poorly defining what needs to be measured.
- Spending too much time on data gathering and reporting.
- Not spending enough time on data analysis and action.
- Failing to consider customer requirements.
Certainly something to work on… collaboratively.