Digital Marketing Goals: Projects for Improvement or Business as Usual

As the year closes out, it’s time to set digital marketing goals for the year ahead. Goal setting matters especially in digital marketing because campaigns can be repetitive, and that repetition can lull trainers, consultants, and professional coaches into complacency. Because the digital marketing environment changes so rapidly, complacency can cause swift declines in results. As you set your goals, they should fall into two categories: Business as Usual (BAU) or Projects for Improvement (PFI).

The reason to break goals into these categories is to ensure that at least a couple fall into PFI. Too often, digital marketing goals look more like a checklist of current activities than a list of strategic initiatives. We don’t want the status quo to be the standard moving forward; each year should include targeted improvement.

So yes, your goals should include your website updates, email marketing outlines, social media schedules, SEO tasks, report/download call-to-action creation, etc. But these are BAU goals and should make up no more than 75% of your total goals. These goals, while important, should be a given; they are the action items required to run an effective online marketing campaign.

Put some real thought into PFI goals. What changes can be made to get better results? Does the website need an updated layout or additional sections to drive calls to action? Does the email marketing campaign need updated segments or additional content? Do social media channels need to be integrated with a particular app to refocus on a particular audience? Whichever PFI goals you choose, they should help you complete marketing tasks more quickly and efficiently or improve conversions.

PFI goals are often the most difficult to set for digital marketing because they usually involve an element of the unknown. Resist the fear of the unknown and commit to the goal. In doing so you’ll find your campaign improving rather than just happening.

Subjective Digital Marketing Analytics

Metrics are only as good as the analysis. This seems obvious, but many trainers’, consultants’, and professional coaches’ marketing campaigns revolve around putting the best face on the results. Rather than spending time “putting lipstick on a pig,” really look at the data to see which channels and efforts are generating results . . . and which aren’t. If you find a particular digital marketing element is lacking, even if it’s one you personally love, implement changes that are likely to improve it rather than tampering with the data to make it appear better than it is.

Years ago, we discussed with a client whether an all-inclusive quarterly newsletter was the best strategy for their email campaign or whether breaking the content into smaller chunks and communicating more frequently would be more advantageous. We favored less content delivered more frequently because opens and clicks on the newsletter were trending downward, and we wanted to test whether shorter communications would be more engaging.

Using the email downtrend reports as evidence, we proposed the less-content-more-frequently model. The marketing coordinator (who loved gathering and editing the articles into a publication) rejected the idea flat out. “People love our quarterly newsletter; changing it is a bad idea,” she claimed. In reviewing the numbers, she insisted that the data from the email report was incorrect and pulled up her own report, which showed numbers 60% higher than the email platform reported. After going through the reports, we realized that her report came from the website rather than the email marketing platform and showed a second jump in hits, plus several small increases, that were missing from the email report.

As we discussed the differences in the reports, the marketing coordinator revealed that she had posted each article to social media individually (which produced the series of smaller hits) and that the newsletter had been redistributed by an association (which produced the second jump in hits).

So the real problem we faced was compiling the reports into a single campaign report and analyzing which channels were performing best. The expanded data, especially the social media data, further suggested that breaking the newsletter into smaller, more focused communications and distributing them more frequently would be advantageous.
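For illustration only, here is a minimal sketch (in Python) of that kind of roll-up. The channel names, field names, and numbers are hypothetical placeholders, not figures from this campaign; they stand in for whatever your email platform, website analytics, and social tools actually export.

    # A minimal sketch of rolling per-channel exports into one campaign report.
    # The channels and numbers below are hypothetical placeholders.
    channel_reports = {
        "email platform": {"views": 610, "clicks": 85},
        "website":        {"views": 970, "clicks": 140},
        "social media":   {"views": 430, "clicks": 60},
        "association":    {"views": 380, "clicks": 45},
    }

    def campaign_summary(reports):
        """Combine per-channel numbers into one list, ranked by clicks."""
        rows = []
        for channel, data in reports.items():
            views, clicks = data["views"], data["clicks"]
            click_rate = clicks / views if views else 0.0
            rows.append((channel, views, clicks, click_rate))
        rows.sort(key=lambda row: row[2], reverse=True)  # highest engagement first
        return rows

    for channel, views, clicks, rate in campaign_summary(channel_reports):
        print(f"{channel:15} views={views:4}  clicks={clicks:3}  click rate={rate:.1%}")

However you build the roll-up, the point is that every channel’s numbers sit side by side in one report, so the comparison between channels is explicit rather than anecdotal.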

This is where subjectivity and her love of the quarterly newsletter tempted the marketing coordinator to dress up the data rather than improve the marketing campaign. Her suggestion for combining the reports was to use the email platform’s tracking links in the social media posts and in the association redistribution so that all results flowed into the email platform’s report. “After all,” she said, “it doesn’t really matter where they came from, just so long as they are reading the articles.”

Her proposal would have artificially inflated the apparent effectiveness of the email newsletter rather than testing ideas that might actually improve it. And the reason for it was completely subjective: “Our subscribers want a newsletter with meat. It should be like a magazine, not just a single article. It won’t feel special if we publish it more often.”

Thankfully, the marketing coordinator did agree to test more frequent, focused communications, and the revamped publishing schedule improved the email marketing metrics along with the other channels. Let the metrics tell their story, don’t try to skew them to fit your preferences, and then objectively build your plan from there.

Image courtesy of Stuart Miles / FreeDigitalPhotos.net