Putting a Face to Your Firm in Digital Marketing

Who’s the face of your firm?  There’s not necessarily a correct answer to this question, but it is important to define the intended face of your firm so that it can be consistently conveyed to your audience.

There are three options for selecting the face of your firm.

An Individual

Are you a sole practitioner?  Congratulations, you are the face of your firm, as long as you include personal information in your branded messages.

But that’s not the only instance where an individual is the face of the firm. Sometimes either by design or organically, one person becomes the personification of the business.  This is common for:

  • An owner who expands the firm but does not include other individuals in marketing materials.
  • A firm where one person primarily interacts with prospects and clients and delivers services.

A group of people

When a firm includes many trainers, consultants, and professional coaches, the group might be leveraged as a collective face of the firm. This is typically accomplished by rotating individuals into marketing materials or releasing content by category and assigning it to individuals according to their specialty.

Anonymous

The last option is to present the firm’s brand but leave the individuals anonymous.  This is often a personal decision by professionals who prefer to work “behind the scenes” or the result of a firm growing too large to narrow the face down to a manageable group of individuals.  However, make sure that anonymity is not keeping you from making a personal connection with your target audience. It should still be possible to find contact information, like an address or phone number, so that your firm can be verified as legitimate and approachable.

Common warning signs that the face of the firm is not being presented well include:

  • An individual face of the organization is perceived to be hogging the limelight, creating resentment among other members of the firm who feel they are not being given their due.
  • A group of people is selected as the face of the organization, but as people come and go, the group expands and contracts organically, making a personal connection with your audience difficult.
  • An anonymous face becomes “faceless” and the communications are viewed as contrived or without personality.

Make a deliberate decision on the face of your firm and consistently leverage it in your digital marketing.  When done well, it supports the firm’s brand and enhances it with the people who make up that brand.

Split Testing Through Campaign Evolution

In our last post we covered why A/B testing can be difficult for some companies to effectively implement. But that doesn’t mean the principles of testing should be abandoned completely.  An evolutionary process of consistent improvement is a more gradual way of implementing split tests.

Many trainers, consultants, and professional coaches set up a template for a marketing campaign, run it for a period of time until they get sick of it, and then do a redesign that starts the process over again.  While this keeps them up to date on new trends in marketing and technology, it does not introduce improvements while the campaign runs the way A/B testing does.

A/B testing at its best is a duplicate communication with one specific difference.  That difference can then be tested for effectiveness and the better-performing treatment adopted. Digital marketing campaigns should have some level of repetition, especially in layout and design.  These repeating elements can be leveraged as a control, updated one at a time, and compared for effectiveness over time in the same way that A/B tests are.
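
To make that comparison concrete, here is a minimal sketch in Python of scoring a control send against a send with one element changed; the totals and field names are hypothetical stand-ins for whatever your email or web analytics platform reports.

```python
def conversion_rate(conversions: int, recipients: int) -> float:
    """Conversion rate: clicks, replies, or sign-ups per recipient."""
    return conversions / recipients if recipients else 0.0

# Hypothetical totals -- substitute the numbers your platform reports.
control = {"recipients": 1200, "conversions": 54}   # last send, original element
variant = {"recipients": 1180, "conversions": 71}   # this send, one element changed

control_rate = conversion_rate(control["conversions"], control["recipients"])
variant_rate = conversion_rate(variant["conversions"], variant["recipients"])

print(f"Control: {control_rate:.1%}  Variant: {variant_rate:.1%}")
print("Adopt the change" if variant_rate > control_rate else "Keep the control")
```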

Making gradual split tests while running a digital marketing campaign avoids the common limiting factors of A/B testing but still allows for ongoing testing for gradual improvement.  However, there are a few restrictions to keep in mind.

Time

Time is the primary limiting factor in doing gradual split tests. Because the sends are more spread out, changes cannot be implemented as quickly.  Make sure you allow enough time on a single change to gather sufficient information.  For example, if you have a monthly newsletter you’ll need to run the change twice to validate its effectiveness, which means each change will take three months to validate: the baseline send plus two sends that include the change.

One Change at a Time

This is really another limiting factor of time, but subtly different.  Split testing relies on testing a single element so you know that particular change is responsible for an improvement or decline. Changing more than one thing at a time to speed up the process only serves to invalidate your test.

Same Audience

Since there is a gap of time between treatments, you need to keep the audience consistent. Too many changes to who receives the communication will invalidate the test.

Content

While many elements are repetitive in digital marketing, content often is not.  If you have small elements of recurring content, like a repeating email subject line format or commonly used social media tags, then by all means test them.  But most content variables will not repeat consistently enough to be tested in a gradual, ongoing method.

 

If you plan for these restrictions and formulate gradual split test changes around them, you can gather many of the same insights that A/B tests will provide without dedicating nearly as much time or as many resources.

Why Companies Struggle to Implement A/B Testing in Their Digital Marketing

A/B (split) testing is the most popular and often most effective way of testing multiple versions of an app, email, or webpage to see which version produces better results. However, only 27%–38% of companies actively do split testing, and of those, almost half say they do it infrequently or inaccurately. So if A/B tests offer the best opportunity to objectively improve digital marketing conversions, why do so many companies skip them entirely?  Split testing often presents technical or resource challenges that smaller companies struggle to overcome.

There are three common limiting factors that prevent trainers, consultants, and professional coaches from successfully implementing and executing A/B tests:

Time

Marketing is often done at a frenzied pace in many smaller firms.  If a marketing campaign is being produced rapidly, or worse yet as a fire drill, it’s difficult just to consistently produce communications and meet deadlines.  Making time for the additional burden of creating a separate version of a communication and reviewing the analytics to glean valuable insight is simply unrealistic.

A/B Testing Tools

There are valuable tools available to facilitate A/B testing.  Some are built into digital marketing platforms, while others can be added on to your existing platform.  However, inclusive platforms and add-on components can be technically challenging to implement and incur additional cost.  Increasing the marketing budget or meeting the requirements to leverage the testing tool is often an insurmountable barrier for smaller firms.

Sample Size

Accurate A/B testing relies on a sufficient sample size.  If a smaller firm’s website traffic or email list doesn’t generate enough raw data, the A/B test will be flawed and runs the risk of producing inaccurate results.
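
For a rough sense of what “sufficient” means, here is a minimal sketch using the standard two-proportion sample-size formula at a 5% significance level and 80% power; the baseline and target rates are assumptions, not benchmarks.

```python
from math import sqrt, ceil

def ab_sample_size(p1: float, p2: float,
                   z_alpha: float = 1.96,      # two-sided 5% significance level
                   z_beta: float = 0.84) -> int:  # 80% power
    """Approximate recipients needed per variant to detect a lift from p1 to p2."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical example: detecting an open-rate lift from 20% to 25% needs
# roughly 1,100 recipients in each version -- more than many small lists supply.
print(ab_sample_size(0.20, 0.25))
```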

If you are in the majority of companies that don’t do split testing, is it because a legitimate limitation keeps you from executing them?  If so, it doesn’t mean that you can’t objectively assess your digital marketing, but it likely does mean that you will need to go about it in a more gradual way. In our next post, we will cover a less robust form of split testing that relies on an evolving digital marketing campaign.

Roll Out Schedule: Single Launch or Phased Releases

Our site update is getting closer to completion, and you may have noticed in the last several posts that we’ve released updates in phases.  Four, to be exact: blog update, website update, content revision, and SEO element revision.  Hopefully those last two were less obvious or invisible to our visitors, but this phased rollout raises a question: why not get all the updates set up and then do a single launch? Neither a single launch nor a phased rollout is appropriate for all situations, but each offers unique advantages that trainers, consultants, and professional coaches should consider when rolling out an update.

  • Phased Release 

    Phased releases have the advantage of letting you evaluate elements of your update without the whole project going live. This is an important aspect of the Agile process and allows for intermittent testing and analysis. It also allows individual elements to launch faster rather than waiting for the whole update to be go-live ready.

  • Single Launch 

    Launching an update all at once is a more traditional method but still offers advantages. Cohesion is the biggest benefit.  For example, if you are updating the layout for an email campaign, it’s best to have the design fully fleshed out rather than launching with a half-developed concept. A single launch can also be used as a promotional tool if the update is significant enough to draw attention from your audience.

 

In our case, launching the blog update gave a badly needed refresh to our posts while allowing us to test the template before deploying it to the rest of the site.  While the interim period lacked cohesion between the site and the blog, we were sure to have a post explaining the process. Once the template was deployed site wide, it was an obvious choice to make content and SEO element updates live as they were ready because they were unlikely to be visible to our visitors.

Phased launches are often the most beneficial due to their expedited go-live process and the ability to test the results. However, a solid production schedule must be defined and followed.  If your digital marketing often gets postponed or you’ve struggled to adhere to deadlines, then a single launch might be a better fit.  A perpetual “under construction” notice or half-baked appearance gives your audience the impression that your marketing, and therefore your product or service, is not your primary focus. A phased rollout that gets stuck mid-change causes confusion, often looks unprofessional, and can negatively impact your processes.

If you can logically break up your project into multiple releases, do a phased launch.  If you can’t see any natural breaks or are uncertain of your ability to consistently move through those releases, do a single launch.

Keep Your Digital Marketing Up to Date with Technology

Digital marketing is a unique blend of communication and technology.  Both aspects need to work in tandem for effective campaigns.  While communication platforms change, the basics of communication (video, text, interaction, and design) remain fairly static.  Technology, on the other hand, changes rapidly. Don’t allow your digital marketing to be undermined by falling behind on technology.

This post is a self-criticism.  Our site, especially the blog, is in dire need of a technology update.  Obviously client projects come first but we’ve allowed this to fall so far behind that it’s impacting our SEO (because we aren’t meeting some of the responsive layout requirements that Google looks for).  It’s a good example of how falling behind in one channel can cascade down to others.

It often takes a concerted effort to get your digital marketing technology up to date.  Our blog, for instance, has the latest plugins and updates; unfortunately, the layout itself has fallen behind the times, which limits the entire site’s performance.

Don’t repeat the error we have made here.  Review the technology that your digital marketing is based on at least every other year.  That ensures that you won’t fall far behind current technology and allows you to identify elements that are out of date and map a strategy to update them.

Image courtesy of Joel Penner on Flickr.

Anticipate the Summer Slow Down

Acknowledging headwinds is the first step to overcoming them.  Most trainers, consultants, and professional coaches experience a slowdown in their digital marketing over the summer months. Anticipating and preparing for that three-month lull is critical to ensuring that you meet your marketing targets.

Hope is not a strategy.  Almost every training and consulting market will be less available in the summer months. Unless your business is growing rapidly, chances are you will see fewer visitors and a percentage decline compared to previous months due to the traditional summer slowdown.

You shouldn’t panic because of digital marketing performance drops during vacation season. Instead prepare for it in one of two ways:

  1. Pad Performance
    If you’ve experienced a summer slowdown in the past, you’re probably going to again next summer. Plan for the slowdown in your annual marketing goals: the other seasons need to produce enough leads or sales to overcome the anticipated summer deficiency. Rather than setting a standard monthly target, compare year-over-year statistics to identify what a typical summer slowdown has been for your digital marketing campaigns. Then build a lower summer conversion into your plan and set benchmarks that pad performance in the stronger months (see the sketch after this list). If summer happens to stay consistent, it’s a great opportunity to outperform annual goals.
  2. Increase Activity
    If you have the time or resources, you can increase your digital marketing activity. Essentially this means casting a wider net or increasing marketing frequency to improve your odds of connecting with the prospects that are available in the summer.  Make sure the increased activity isn’t overbearing. There’s no benefit to alienating good prospects in an effort to keep summer numbers consistent.
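
As promised above, here is a minimal sketch of the padding approach; the annual goal and the 30% summer discount are hypothetical and should be replaced with your own year-over-year estimates.

```python
# Hypothetical figures: a 240-lead annual goal and a 30% expected summer dip,
# estimated from year-over-year numbers. Substitute your own.
annual_lead_goal = 240
summer_months = {"Jun", "Jul", "Aug"}
summer_discount = 0.30   # summer months expected to produce 30% fewer leads

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

# Weight each month, then scale so the targets still sum to the annual goal;
# non-summer months absorb the difference ("padding" performance).
weights = [1 - summer_discount if m in summer_months else 1.0 for m in months]
scale = annual_lead_goal / sum(weights)
targets = {m: round(w * scale) for m, w in zip(months, weights)}

print(targets)
```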

Don’t panic when the summer slowdown hits. As long as you maintain your processes and activity, performance won’t be depressed for long.

Frankensteined Digital Marketing

Our last post explored how overusing or poorly deploying tools can limit options and complicate customization.  This same concept can be expanded into what I call Frankensteining your digital marketing components.  Piecing together too many disparate elements is a common cause of technical problems and bad user experiences.

Frankensteining happens when you introduce new elements into a digital marketing channel and there is either a technical breakdown or an unintended bad user experience.  Plugins and APIs are the chief culprits when a site goes from well developed to monstrous.

Let’s again use a website as an example.  Frankensteined websites are not that uncommon but are often referred to as “cluttered.”  I was recently on a site trying to read an article and was hit with three calls-to-action as soon as I landed on the page.  The first was a pop-up box that ghosted out the background.  As I closed that, I saw a footer bar advertising another offer.  After I scrolled down the page, a pop-up appeared from the lower right corner asking if I’d like to start a chat with the sales team.

Any one of these would have been a perfectly acceptable way of introducing a call-to-action.  But having all three pile on me right away was downright annoying. If they had squeezed something into the header the offers literally would have come at me from all angles. It was annoying enough that I dug through the code a bit to see how the page was executing.

It turned out that all three offers were from separate plugins for the site.  I’m certain the admin for the site did not intend for me to have this user experience but frankensteining the components together resulted in this unintended consequence.

It’s not difficult to fall prey to Frankensteined digital marketing.  In the above example, the chat window appeared on every page, so I’m certain an API was driving that component site-wide.  The footer bar appeared on every blog post and was likely an API defined for content pages only.  The pop-up window looked to be what the company was featuring at that time and was likely added as the call-to-action for that page via a plugin.

Frankensteining happens in all marketing channels, not just websites.  Apps and plugins can change a simple social media page into a cluttered nightmare of links and automated “features”.  Even email can get cobbled together with external components that often cause technical incompatibilities.

Be diligent in how you piece components together.  If you find that you are cobbling together a lot of components to achieve new objectives, it might be time to redesign how you deliver your digital marketing.  Often the redesign provides a fresh start that results in a cleaner and simpler solution. Piecing too many components together runs the risk of creating a monstrous problem of technical glitches and bad user experiences.

Image Courtesy of dullhunk | flickr.com

Are Your Digital Marketing Tools Limiting Options?

There is a digital marketing tool to help you through almost any task.  Some of these tools are robust, trying to tackle multiple functions, while others are designed to do one specific thing.  This multitude of tools gives the impression that anything is possible if they can be combined into a cohesive experience.  Unfortunately, there is usually a hidden problem with implementing sets of tools: doing so limits options and complicates customization.

Content Management Systems (CMS) are a good example of a robust tool.  Most CMS platforms allow for tool, template, and administrative customization.

We recently worked with a client who had a CMS set up for their website with a pre-set header, navigation, body layout, and set of installed plugins.  It was set up so that a site admin could create or edit pages by simply adding text, images, or selecting plugins. The primary focus of the layout was a responsive design; additional page elements had been removed to keep the format simple and ensure that the site displayed well on mobile devices.

The client needed a sub-site landing page created for a program they were offering and wanted the sub-site to mimic materials they had already designed.  The design specifically called for:

  • A six column layout in the body of the page.
  • A navigation element specific to the landing pages that would not appear on other pages of the site.

Neither requirement sounds unreasonable, right?

Sometimes tools turn simple-sounding tasks into complex ones.  The template had been built to support a maximum of four columns in the body.  Inserting additional columns caused significant layout problems and was not responsive when viewed on a smaller screen.  Since the navigation was pre-defined, there was no way to insert navigation elsewhere on the page or exclusively on the sub-site pages.

The result was a customized development project to not only create these elements but to also integrate them into the CMS. In this case, rather than solving a problem, the tool made the problem significantly more complex.

Carefully select the tools that you plan to use in your digital marketing and be wary of trying to piece too many together.  Think about your tools the way a craftsman does.  A plumber doesn’t show up with a full set of carpenter’s tools and vice versa.  They have a tool box specifically designed for the job they need to complete.

Some digital marketing tools appear robust but either offer poorly crafted functionality or offer functions that aren’t useful.  If you limit your tools to a core set it often makes adding options and customizations simpler because the changes don’t have to be compatible with a complex suite of settings.

Digital Marketing Campaign Examples: Inspiration or Exaggeration

There is no shortage of great ideas for improving your digital marketing.  Looking to other campaigns is often a valuable way to see how others are leveraging tactics and technology to optimize their efforts.  However, it’s important for trainers, consultants, and professional coaches to weigh the source of the information and think critically about whether a digital marketing strategy makes sense for their firm.

Beware “get rich quick” digital marketing ideas.  These tend to be simplistic suggestions with promises of unbelievable returns.  Digital marketing can be rewarding but it takes focus and consistency to see results.  Any promises that circumvent the need for dedicated work are unlikely to see reliable returns.

It’s often easy to spot exaggerated claims when the motivations for making them are obvious.  If someone is promoting or selling a tool, we often tend to be skeptical of that information.  But what about times when the motivation for exaggerating digital marketing results is less clear?  It’s easier to get caught up in claims of wild success if the source seems unbiased.

Years ago, I encountered this situation with a sales training firm that I work with.  The owner of the firm had attended a conference where the owner of another firm claimed to be running events twice a month, filling the room each time, and closing eighty percent of attendees right there.  The success of this program was attributed to a digital marketing promotional campaign and a registration process that pre-screened applicants.

My client was blown away by the results he was hearing and wanted to emulate the campaign exactly.  He proposed scrapping an event schedule that we had been running with consistent success and going to the twice-a-month plan.  Based on the numbers shared at the conference, we could effectively double the number of leads we were generating from the current event schedule. I set up a digital marketing campaign modeled after the examples we were provided. After three months we found that we had started strong, but attendance dwindled after the first couple of events.  Worse yet, we had half as much closed business as we had averaged doing an event every six months.

As you’d expect, we returned to the original examples to see what mistakes we had made.  I was concerned about list exhaustion from offering events so rapidly, so I reviewed the materials and contacted the owner who claimed to have stellar results. He agreed that our campaign seemed to have all the same critical elements his did and was at a loss to explain why we would experience such significantly different results. So I asked for some metrics on the other owner’s digital marketing campaigns to compare individual elements and see where we might be off base.  The other owner’s helpfulness ended there; he was unwilling to provide anything other than high-level general information.

My client and I tried to work backwards through the analytics to see if there was an obvious deficiency, and in the process we started adding up numbers.  Based on the high-level metrics the other owner had delivered, we estimated his firm would be bringing in over $20 million a year just on this one digital marketing campaign.  The problem with that was that the conference was for small and mid-size businesses and capped attending firms at $5 million in revenue.
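
For illustration, a back-of-the-envelope version of that arithmetic might look like the sketch below; the attendance and deal-size figures are purely hypothetical, chosen only to show how quickly the claimed cadence and close rate compound.

```python
# Purely hypothetical figures for attendance and deal size, used only to show
# how quickly the claimed numbers compound.
events_per_year = 24          # "twice a month"
attendees_per_event = 30      # assumed full room
close_rate = 0.80             # "close eighty percent of attendees"
average_deal = 35_000         # assumed engagement value

implied_revenue = events_per_year * attendees_per_event * close_rate * average_deal
print(f"${implied_revenue:,.0f} per year")   # roughly $20 million
```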

That caused us to look into the other firm and their digital marketing, which revealed additional discrepancies with what had been shared. In short, the other owner was either unaware of or directly lying about his level of success. I never followed up after we found the discrepancies, so I don’t know for sure what motivated him to exaggerate the results.  I doubt it was malicious. I suspect it was simply the appeal of looking like an expert at the conference and the accolades that brought.

The point was that my client and I had wasted a lot of time and effort migrating to a model that appeared to be more productive but actually cost us conversions.  Don’t make the same mistake I did.  Other organizations’ digital marketing can be a great source of inspiration, but think critically about any claims of wild success.  If it sounds too good to be true, it probably is. You can waste a lot of time, money, and effort chasing those exaggerations.

Image Courtesy of maxpixel.freegreatpicture.com

Data, Not Preference, Is What Drives Digital Marketing Improvement

It’s said that statistics can be used to prove anything.  That is true when we allow our preferences to bias how we conduct digital marketing campaigns.  Digital marketing should be data-driven, and changes should be honestly tested to see what is most effective. Dictating changes based on preference will suit your tastes and make you feel like your gut instinct is spot on, but data is what drives real performance improvement.

The trouble with preference bias is that people often aren’t aware of it in themselves.  Trainers, consultants, and professional coaches unknowingly craft experiments that make their preferences shine through as the best way of doing things.

We had an obvious case of this happen recently with a client. The client attributed his email marketing campaign’s success to putting questions in the subject line. The problem was that the open rate had been in a noticeable decline over the last twelve months. Our client was resistant to testing other types of subject lines because he was certain that wasn’t the problem.  He had used questions in the subject consistently and had several best-practice articles that cited questions as the best-converting subjects. In fact, he said, “I tested subject lines that weren’t questions seven months ago and the open rate was worse.”

After experimenting with some other potential causes, including changing email marketing platforms to make sure that delivery was not the problem, we reviewed the test he had run.  It turned out he had run it right after adding a new list from a trade show.  Many of those first-time subscribers had been lured into signing up for his email list but weren’t motivated to read his campaigns, at least not right away. The bounce rate data confirmed that the new contacts, not the subject line, were the catalyst for the open rate drop. Upon this realization he agreed to try subjects without questions.
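
As an illustration of that kind of review, here is a minimal sketch that segments open rate by where contacts came from; the records and field names are hypothetical and would come from your email platform’s export.

```python
from collections import defaultdict

# Hypothetical per-recipient records; export the equivalent from your platform.
sends = [
    {"source": "existing_list", "opened": True},
    {"source": "existing_list", "opened": False},
    {"source": "trade_show",    "opened": False},
    {"source": "trade_show",    "opened": False},
    # ... one record per recipient
]

totals = defaultdict(lambda: {"sent": 0, "opened": 0})
for record in sends:
    totals[record["source"]]["sent"] += 1
    totals[record["source"]]["opened"] += int(record["opened"])

for source, t in totals.items():
    print(f"{source}: {t['opened'] / t['sent']:.0%} open rate")
```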

While the other tests produced small or moderate changes, the updated subject lines produced the most notable improvements.  Of course, this is not to say that questions make for bad subject lines.  But it does mean that exclusively using them in this client’s case was negatively impacting his email marketing performance.

There are many potential biases in digital marketing, and none of them should be universally adopted without testing.  If you are not implementing changes that run counter to your preferences on a semi-regular basis, then there’s a fair chance that your preferences are driving your decisions rather than the data.

Photo credit: Wikimedia Commons, Atlasowa
