Digital Marketing Campaign Examples: Inspiration or Exaggeration?

There is no shortage of great ideas for improving your digital marketing. Looking at other campaigns is often a valuable way to see how others are leveraging tactics and technology to optimize their efforts. However, it’s important for trainers, consultants, and professional coaches to weigh the source of the information and think critically about whether a digital marketing strategy makes sense for their firm.

Beware “get rich quick” digital marketing ideas. These tend to be simplistic suggestions with promises of unbelievable returns. Digital marketing can be rewarding, but it takes focus and consistency to see results. Any promise that circumvents the need for dedicated work is unlikely to deliver reliable returns.

It’s often easy to spot exaggerated claims when the motivations for making them are obvious. If someone is promoting or selling a tool, we tend to be skeptical of the information. But what about times when the motivation for exaggerating digital marketing results is less clear? It’s much easier to get caught up in claims of wild success when the source seems unbiased.

Years ago, I encountered this situation with a sales training firm I work with. The owner of the firm had attended a conference where the owner of another firm claimed to be running events twice a month, filling the room each time, and closing eighty percent of attendees on the spot. The success of this program was attributed to a digital marketing promotional campaign and a registration process that pre-screened applicants.

My client was blown away by the results he was hearing and wanted to emulate the campaign exactly. He proposed scrapping an event schedule that we had been running with consistent success and moving to the twice-a-month plan. Based on the numbers shared at the conference, we could effectively double the number of leads we were generating from the current event schedule. I set up a digital marketing campaign modeled after the examples we were provided. After three months, we found that we had started strong but attendance dwindled after the first couple of events. Worse yet, we had half as much closed business as we had averaged running an event every six months.

As you’d expect, we went back to the original examples to see what mistakes we had made. I was concerned that offering events so rapidly was exhausting our list, so I reviewed the materials and contacted the owner who had claimed the stellar results. He agreed that our campaign seemed to have all the same critical elements as his and was at a loss to explain why we would experience such significantly different results. So I asked for metrics on his digital marketing campaigns, hoping to compare individual elements and see where we might be off base. His helpfulness ended there; he was unwilling to provide anything beyond high-level generalities.

My client and I tried to work backwards through the analytics to find an obvious deficiency, and in the process we started adding up numbers. Based on the high-level metrics the other owner had shared, we estimated his firm would be bringing in over $20 million a year on this one digital marketing campaign alone. The problem: the conference was for small and mid-size businesses and capped attending firms at $5 million in revenue.
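
For illustration, here is the kind of back-of-envelope math that exposed the problem. Only the event frequency and the eighty percent close rate come from the claim itself; the room size and average deal size below are hypothetical placeholders, since those figures were never shared.

# Back-of-envelope check of the claimed event program (Python).
# events_per_year and close_rate come from the claim; the attendee
# count and average deal size are assumptions for illustration only.
events_per_year = 2 * 12      # "twice a month"
attendees_per_event = 45      # assumption: a full room
close_rate = 0.80             # "close eighty percent of attendees"
avg_deal_size = 25_000        # assumption: a modest training engagement

closed_deals = events_per_year * attendees_per_event * close_rate
annual_revenue = closed_deals * avg_deal_size
print(f"{closed_deals:.0f} deals -> ${annual_revenue:,.0f} per year")
# 864 deals -> $21,600,000 per year

Even with conservative placeholders, the implied revenue dwarfs the conference’s attendance cap.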

That caused us to look into the other firm and its digital marketing, which revealed additional discrepancies with what had been shared. In short, the other owner was either unaware of his actual results or directly lying about them. I never followed up after we found the discrepancies, so I don’t know for sure what motivated him to exaggerate. I doubt it was malicious; I suspect he simply enjoyed looking like an expert at the conference and the accolades that brought.

The point is that my client and I wasted a lot of time and effort migrating to a model that appeared to be more productive but actually cost us conversions. Don’t make the same mistake I did. Other organizations’ digital marketing can be a great source of inspiration, but think critically about any claims of wild success. If it sounds too good to be true, it probably is, and you can waste a lot of time, money, and effort chasing those exaggerations.

Image Courtesy of maxpixel.freegreatpicture.com

Data, Not Preference, Is What Drives Digital Marketing Improvement

It’s said that statistics can be used to prove anything. That is certainly true when we allow our preferences to bias how we conduct digital marketing campaigns. Digital marketing should be data-driven, and changes should be honestly tested to see what is most effective. Dictating changes based on preference will suit your tastes and make you feel like your gut instinct is spot on, but only data will drive real performance improvement.
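
As a minimal sketch of what honest testing can look like, the Python snippet below compares the open rates of two subject-line variants with a standard two-proportion z-test. The send and open counts are invented for illustration; they are not from any client campaign.

import math

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    # Pooled z-statistic for the difference between two open rates.
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical split test: variant A (question subject) vs. variant B.
z = two_proportion_z(opens_a=220, sends_a=1000, opens_b=270, sends_b=1000)
# |z| > 1.96 roughly corresponds to p < 0.05 on a two-sided test.
print(f"z = {z:.2f}; significant at 5%: {abs(z) > 1.96}")

A test like this keeps the decision with the data: if the difference isn’t significant, neither preference has earned the win.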

The trouble with preference bias is that people often aren’t aware of it in themselves. Trainers, consultants, and professional coaches unknowingly craft experiments that make their preferences shine through as the best way of doing things.

We saw an obvious case of this recently with a client. The client attributed the success of his email marketing campaigns to putting questions in the subject line. The problem was that his open rate had been in noticeable decline over the previous twelve months. He was resistant to testing other types of subject lines because he was certain that wasn’t the problem. He had used questions in his subjects consistently and had several best-practice articles that cited questions as the best-converting subjects. In fact, he said, “I tested subject lines that weren’t questions seven months ago and the open rate was worse.”

After experimenting with some other potential causes, including changing email marketing platforms to rule out delivery problems, we reviewed the test he had run. It turned out he had run it right after adding a new list from a trade show. Many of those first-time subscribers had been lured into signing up for his email list but weren’t motivated to read his campaigns, at least not right away. The bounce rate data confirmed that the new contacts, not the subject line, were the catalyst for the open rate drop. Upon this realization he agreed to try subject lines without questions.
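
A simple way to catch this kind of confound is to segment the metrics by list source before judging the subject line. The sketch below assumes a hypothetical per-recipient send log; the file name and column names are illustrative, not from the client’s actual platform.

import pandas as pd

# Hypothetical send log: one row per delivered email, with columns
# campaign, list_source, opened (0/1), and bounced (0/1).
sends = pd.read_csv("send_log.csv")

# The blended open rate per campaign can hide a list-composition change.
overall = sends.groupby("campaign")["opened"].mean()

# Splitting by list source separates the subject-line effect from the
# dilution caused by newly added, low-intent contacts.
by_source = sends.groupby(["campaign", "list_source"])[["opened", "bounced"]].mean()

print(overall, by_source, sep="\n\n")

If the open rate among long-standing subscribers holds steady while the new cohort drags the blended number down, the subject line is not the culprit.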

While the other tests produced small or moderate changes, the updated subject lines produced the most notable improvement. Of course, this is not to say that questions make for bad subject lines. But it does mean that, in this client’s case, using them exclusively was hurting his email marketing performance.

There are many potential biases in digital marketing, and no preferred practice should be universally adopted without testing. If you are not implementing changes that run counter to your preferences on a semi-regular basis, there’s a fair chance your preferences, rather than the data, are driving your decisions.

Photo credit: Wikimedia Commons, Atlasowa