As a communicator, it is likely you have been challenged to find a way to do business smarter, faster and more efficiently. Such a challenge creates the perfect opportunity and motivation to rethink what you do and how you do it.
In recent years, many of the world’s most sophisticated brands and their communications teams have adopted a new method for communications analytics. Companies such as Merck, Berkshire Hathaway and Exelon, all with reputations for employing the world’s best talent and technologies, have embraced this approach. Through robust testing, they have found that it consistently generates more reliable analytics and much richer insights, making them smarter, more effective and more efficient.
Phase 1: Big books (1950-2000)
Monitoring was done manually, usually with thick clip books filled with physical copies of media coverage. It was not uncommon to wait weeks or months for metrics, and communications impact was often measured simply by the thickness of the clip book. Some organizations also relied on the now-discredited ad value equivalent, or AVE, a misguided attempt to equate news coverage with the value of ads that ran on the same pages or in the same broadcast segments.
Phase 2: Technology cure-all (2000-2018)
Communications teams gained access to digital streams of any news content they desired, all delivered in an instant via the internet. Metrics became focused, sometimes obsessively, on counting mentions and impressions regardless of quality, context or sentiment.
Part of this wave was so-called attribution analytics. The problems with this automated approach have been well documented. In a nutshell, these attribution models use technology to connect dots in ways that would not hold up to the scrutiny of even a high school statistics class.
Over time, awareness grew of the severe limits of this all-tech approach. In fact, multiple fields reached the same conclusion at the same time as corporate communicators, including software development, autonomous trucking, delivery robotics, medical diagnostics and investment services.
Phase 3: Expert-guided tech (2018- )
We are now in the third generation of media monitoring and analytics, which augments powerful technology with talented human analysts to deliver the best balance of fast data and valuable, reliable insights. Industry after industry has embraced this marriage of fast technology and the uniquely creative, analytical and introspective nature of human experts.
With this hybrid approach and the resultant quality data and deep insights, these professionals are able to make much better decisions, focus their resources and achieve consistently better results. Expert-guided tech has become the fastest-growing segment in the media monitoring and analytics sector.
Proponents of expert-guided tech regularly report that this quality data leaves them more focused, less stressed and more efficient with limited resources. They also point to newfound strength in measuring their impact, competing more effectively and proving their worth within the organization.
This newfound sophistication gives communications an equal seat at the table with more data-driven disciplines like marketing, sales and technology. And when all resources are counted, this approach is frequently less expensive than the technology cure-all approach.
Things have changed, and you need to make the move to expert-guided tech sooner rather than later – or be left behind.
Eric Koefoot is president and CEO of PublicRelay.