Reading Between the Lines: A 3-Step Guide to Reviewing Web Page Content

Posted by Jackie.Francis

In SEO, reviewing content is an unavoidable yet extremely important task. Content is the driving factor that brings people to a page, so best practice dictates that we do what we can to ensure the work we've invested hours and resources into creating remains impactful and relevant over time. This requires occasionally going back and re-evaluating our content to identify areas that can be improved.

That being said, if you've ever done a content review, you know how surprisingly challenging it is. The wide variety of formats and topics, alongside the difficulty of defining "good" content, makes it hard to pick out the core elements that matter. Without these universal focus areas, you may end up neglecting an element (e.g. tone of voice) in one instance but paying special attention to that same element in another.

Luckily there are certain characteristics — like good spelling, appealing layouts, and relevant keywords — that are universally associated with what we would consider “good” content. In this three-step guide, I’ll show you how to use these characteristics (or elements, as I like to call them) to define your target audience, measure the performance of your content using a scorecard, and assess your changes for quality assurance as part of a review process that can be applied to nearly all types of content across any industry.


Step 1: Know your audience

Knowing your target reader is arguably the most important step in this post: it identifies the details that should make up the foundation of your content. This includes insight into the reader's intent, the ideal look and feel of the page, and the goals your content's message should be trying to achieve.

To get to this point, however, you first need to answer these two questions:

  1. What does my target audience look like?
  2. Why are they reading my content?

What does my target audience look like?

The first question relies on general demographic information such as age, gender, education, and job title. This gives a face to the ideal audience member(s) and the kind of information that would best suit them. For example, if targeting stay-at-home mothers between the ages of 35 and 40 with two or more kids under the age of 5, we can guess that they have busy daily schedules, travel frequently for errands, and constantly need to stay vigilant over their younger children. So, a piece that is personable, quick, easy to read on the go, and includes inline imagery to reduce eye fatigue would be better received than something that is lengthy and requires a high level of focus.

Why are they reading my content?

Once you have a face to your reader, the second question must be answered to understand what that reader wants from your content and if your current product is effectively meeting those needs. For example, senior-level executives of mid- to large-sized companies may be reading to become better informed before making an important decision, to become more knowledgeable in their field, or to use the information they learn to teach others. Other questions you may want to consider asking:

  • Are they reading for leisure or work?
  • Would they want to share this with their friends on social media?
  • Where will they most likely be reading this? On the train? At home? Waiting in line at the store?
  • Are they comfortable with long blocks of text, or would inline images be best?
  • Do they prefer bite-sized information or are they comfortable with lengthy reports?

You can find the answers to these questions and collect valuable demographic and psychographic information by using a combination of internal resources, like sales scripts and surveys, and third-party audience insight tools such as Google Analytics and Facebook Audience Insights. With these results you should now have a comprehensive picture of your audience and can start identifying the parts of your content that can be improved.


Step 2: Tear apart your existing content

Now that you understand who your audience is, it's time to get to the real work: assessing your existing content. This stage requires breaking everything apart to identify the components you should keep, change, or discard. This task can be extremely challenging, however, because the performance of most components — such as tone of voice, design, and continuity — can't simply be bucketed into binary categories like "good" or "bad." Rather, they fall on a spectrum where the most reasonable level of improvement sits somewhere in the middle. You'll see what I mean by this later on, but one of the most effective ways to evaluate and measure the degree of optimization these components need is to use a scorecard. Created by my colleague Ben Estes, this straightforward, reusable, and easy-to-apply tool can help you objectively review the performance of your content.

Make a copy of the Content Review Grading Rubric

Note: The card sampled here, and the one I personally use for similar projects, is a slightly altered version of the original.

The card is divided into two categories: Writing and Design. Listed under each category are elements that are universally needed to create good content and that should be examined. Each element is graded on a scale of 1–5, with 1 being the worst score and 5 being the best.

To use the scorecard, start by choosing a part of your page to look at first. Order doesn't matter, so whether you first check "spelling and grammar" or "continuity" is up to you. Next, assign it a score on a separate Excel sheet (or mark it directly on the rubric) based on its current performance. For example, if the copy has no spelling errors but some minor grammar issues, you would rank "spelling and grammar" as a four (4).

Finally, repeat this process until all elements are graded. Remember to stay impartial to give an honest assessment.

Once you're done, look at each grade and see where it falls on the scale. Ideally, each element should score 4 or greater, though a 5 should only be given out sparingly. Tying back to my spectrum comment from earlier, a 5 is reserved exclusively for top-level work: something to strive for, but typically requiring more effort than it's worth. A grade of 4 is usually the highest and most reasonable goal to aim for.

A grade of 3 or below indicates an opportunity for improvement and that significant changes need to be made.

If you're working with multiple pieces of content at once, the grading system can also help prioritize your workload. Just calculate the average writing or design score for each page and sort the pages by that average. Pages with a lower average are performing more poorly and should be prioritized over pages with higher averages.
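If spreadsheets aren't your thing, the same prioritization takes only a few lines to script. Here's a minimal Python sketch; the page names and element scores are made up for illustration:

  import statistics

  # Hypothetical scorecard results: each page's element grades (1-5).
  scores = {
      "/blog/post-a": {"spelling and grammar": 4, "tone of voice": 3, "layout": 2},
      "/blog/post-b": {"spelling and grammar": 5, "tone of voice": 4, "layout": 4},
      "/blog/post-c": {"spelling and grammar": 3, "tone of voice": 2, "layout": 3},
  }

  # Average each page's grades, then sort ascending so the pages
  # with the poorest overall performance surface first.
  averages = {page: statistics.mean(g.values()) for page, g in scores.items()}
  for page, avg in sorted(averages.items(), key=lambda item: item[1]):
      print(f"{page}: {avg:.2f}")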

Whether you use this scorecard or make your own, adjust what you review, the span of the grading scale, and the criteria for each grade to fit your specific needs. The result should be a tool that helps you honestly assess your content across multiple applications.

Don’t forget the keywords

With most areas of your content covered by the scorecard, the last element to check before moving to the editing stage is your keywords.

Before I get flak for this, I'm aware that the general rule of creating content is to do your keyword research first. But I've found that when it comes to reviews, evaluating keywords last feels more natural and makes the process a lot smoother. When first running through a page, you're much more likely to notice spelling and design flaws before you pick up on whether a keyword is used correctly — so why not make note of those details first?

Depending on the outcomes of your audience re-evaluation and content performance review, you will notice one of two things about your currently targeted keywords:

  1. They have not been impacted by the outcomes of the prior analyses and do not need to be altered
  2. They no longer align with the goals of the page or needs of the audience and should be changed

In the first case, the keywords you originally targeted are still best suited to your content's message and no additional research is needed. Your only remaining task is to determine whether those keywords are used effectively throughout the page. This means assessing things like the title tag, image alt attributes, URL, and copy.

To stay on track, I won't go into further detail on how to optimize keywords, but if you want a little more insight, this post by Ken Lyons is a great resource.

If, however, your target keywords are no longer relevant to the goals of your content, you'll need to redo your keyword research before moving to the editing stage to identify the terms you should rank for. For insight into keyword research, this chapter of Moz's Beginner's Guide to SEO is another invaluable resource.


Step 3: Evaluate your evaluation

At this point your initial review is complete and you should be ready to edit.

That’s right. Your initial review.

The interesting thing about assessing content is that it never really ends. As you make edits, you'll tend to deviate more and more from your initial strategy. While that's not always a bad thing, you must continuously monitor these changes to ensure you stay on track to create a highly valued piece of content.

The best approach would be to reassess all your material when:

  • 50% of the edits are complete
  • 85% of the edits are complete
  • You have finished editing

At the 50% and 85% marks, keep the assessment quick and simple. Look through your revisions and ask the following questions:

  • Am I still addressing the needs of my target audience?
  • Are my target keywords properly integrated?
  • Am I using the right language and tone of voice?
  • Does it look like the information is structured correctly (hierarchically)?

If your answer is "Yes" to all four questions, you've effectively made your changes and should proceed. For any question you answer "No," go back and make the necessary corrections. The areas targeted here become more difficult to fix the closer you are to completion, so ensuring they're correct throughout this stage will save a lot of time and stress in the long run.

When you’ve finished and think you’re ready to publish, run one last comprehensive review to check the performance status of all related components. This means confirming you’ve properly addressed the needs of your audience, optimized your keywords, and improved the elements highlighted in the scorecard.


Moving forward

No two pieces of content are the same, but that doesn't mean they don't share important commonalities. Being able to identify these similarities and understand the role they play across all formats and topics will lead the way to creating your own review process for evaluating subjective material.

So, when you find yourself gearing up for your next project, give these steps a try and always keep the following in mind:

  1. Your audience is what makes or breaks you, so keep them happy
  2. Consistent quality is key! Ensure all components of your content are performing at their best
  3. Keep your keywords optimized and be prepared to do additional research if necessary
  4. Unplanned changes will happen. Just remember to stay observant to keep yourself on track


Rewriting the Beginner’s Guide to SEO, Chapter 1: SEO 101

Posted by BritneyMuller

Back in mid-November, we kicked off a campaign to rewrite our biggest piece of content: the Beginner's Guide to SEO. You offered up a huge amount of helpful advice and insight with regard to our outline, and today we're here to share our draft of the first chapter.

In many ways, the Beginner’s Guide to SEO belongs to each and every member of our community; it’s important that we get this right, for your sake. So without further ado, here’s the first chapter — let’s dive in!


Chapter 1: SEO 101

What is it, and why is it important?

Welcome! We’re excited that you’re here!

If you already have a solid understanding of SEO and why it’s important, you can skip to Chapter 2 (though we’d still recommend skimming the best practices from Google and Bing at the end of this chapter; they’re useful refreshers).

For everyone else, this chapter will help build your foundational SEO knowledge and confidence as you move forward.

What is SEO?

SEO stands for “search engine optimization.” It’s the practice of increasing both the quality and quantity of website traffic, as well as exposure to your brand, through non-paid (also known as “organic”) search engine results.

Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are searching for online, the answers they are seeking, the words they’re using, and the type of content they wish to consume. Leveraging this data will allow you to provide high-quality content that your visitors will truly value.

Here’s an example. Frankie & Jo’s (a Seattle-based vegan, gluten-free ice cream shop) has heard about SEO and wants help improving how and how often they show up in organic search results. In order to help them, you need to first understand their potential customers:

  • What types of ice cream, desserts, snacks, etc. are people searching for?
  • Who is searching for these terms?
  • When are people searching for ice cream, snacks, desserts, etc.?
    • Are there seasonality trends throughout the year?
  • How are people searching for ice cream?
    • What words do they use?
    • What questions do they ask?
    • Are more searches performed on mobile devices?
  • Why are people seeking ice cream?
    • Are individuals looking for health-conscious ice cream specifically, or just looking to satisfy a sweet tooth?
  • Where are potential customers located — locally, nationally, or internationally?

And finally — here’s the kicker — how can you help provide the best content about ice cream to cultivate a community and fulfill what all those people are searching for?

Search engine basics

Search engines are answer machines. They scour billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query.

Search engines do all of this by discovering and cataloguing all available content on the Internet (web pages, PDFs, images, videos, etc.) via a process known as “crawling and indexing.”

What are “organic” search engine results?

Organic search results are search results that aren’t paid for (i.e. not advertising). These are the results that you can influence through effective SEO. Traditionally, these were the familiar “10 blue links.”

Today, search engine results pages — often referred to as “SERPs” — are filled with both more advertising and more dynamic organic results formats (called “SERP features”) than we’ve ever seen before. Some examples of SERP features are featured snippets (or answer boxes), People Also Ask boxes, image carousels, etc. New SERP features continue to emerge, driven largely by what people are seeking.

For example, if you search for “Denver weather,” you’ll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for “pizza Denver,” you’ll see a “local pack” result made up of Denver pizza places. Convenient, right?

It's important to remember that search engines make money from advertising. Their goal is to better solve searchers' queries (within SERPs), to keep searchers coming back, and to keep them on the SERPs longer.

Some SERP features on Google are organic and can be influenced by SEO. These include featured snippets (a promoted organic result that displays an answer inside a box) and related questions (a.k.a. “People Also Ask” boxes).

It's worth noting that there are many other search features that, even though they aren't paid advertising, can't typically be influenced by SEO. These features often draw on data from proprietary sources, such as Wikipedia, WebMD, and IMDb.

Why SEO is important

While paid advertising, social media, and other online platforms can generate traffic to websites, the majority of online traffic is driven by search engines.

Organic search results cover more digital real estate, appear more credible to savvy searchers, and receive way more clicks than paid advertisements. For example, only ~2.8% of all US searches result in a click on a paid advertisement.

In a nutshell: SEO has ~20X more traffic opportunity than PPC on both mobile and desktop.

SEO is also one of the only online marketing channels that, when set up correctly, can continue to pay dividends over time. If you provide a solid piece of content that deserves to rank for the right keywords, your traffic can snowball over time, whereas advertising needs continuous funding to send traffic to your site.

Search engines are getting smarter, but they still need our help.

Optimizing your site will help deliver better information to search engines so that your content can be properly indexed and displayed within search results.

Should I hire an SEO professional, consultant, or agency?

Depending on your bandwidth, willingness to learn, and the complexity of your website(s), you could perform some basic SEO yourself. Or, you might discover that you would prefer the help of an expert. Either way is okay!

If you end up looking for expert help, it's important to know that many agencies and consultants "provide SEO services," but they vary widely in quality. Knowing how to choose a good SEO company can save you a lot of time and money, as the wrong SEO techniques can actually harm your site more than they help.

White hat vs black hat SEO

"White hat SEO" refers to SEO techniques, best practices, and strategies that abide by search engine rules, with a primary focus on providing more value to people.

“Black hat SEO” refers to techniques and strategies that attempt to spam/fool search engines. While black hat SEO can work, it puts websites at tremendous risk of being penalized and/or de-indexed (removed from search results) and has ethical implications.

Penalized websites have bankrupted businesses. It’s just another reason to be very careful when choosing an SEO expert or agency.

Search engines share similar goals with the SEO industry

Search engines want to help you succeed. They’re actually quite supportive of efforts by the SEO community. Digital marketing conferences, such as Unbounce, MNsearch, SearchLove, and Moz’s own MozCon, regularly attract engineers and representatives from major search engines.

Google assists webmasters and SEOs through their Webmaster Central Help Forum and by hosting live office hour hangouts. (Bing, unfortunately, shut down their Webmaster Forums in 2014.)

While webmaster guidelines vary from search engine to search engine, the underlying principles stay the same: Don’t try to trick search engines. Instead, provide your visitors with a great online experience.

Google webmaster guidelines

Basic principles:

  • Make pages primarily for users, not search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging.

Things to avoid:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content (i.e. copied from somewhere else)
  • Cloaking — the practice of showing search engine crawlers different content than visitors.
  • Hidden text and links
  • Doorway pages — pages created to rank well for specific searches to funnel traffic to your website.

Full Google Webmaster Guidelines version here.

Bing webmaster guidelines

Basic principles:

  • Provide clear, deep, engaging, and easy-to-find content on your site.
  • Keep page titles clear and relevant.
  • Links are regarded as a signal of popularity and Bing rewards links that have grown organically.
  • Social influence and social shares are positive signals and can have an impact on how you rank organically in the long run.
  • Page speed is important, along with a positive, useful user experience.
  • Use alt attributes to describe images, so that Bing can better understand the content.

Things to avoid:

  • Thin content: pages that mostly show ads or affiliate links, or that otherwise redirect visitors away to other sites, will not rank well.
  • Abusive link tactics that aim to inflate the number and nature of inbound links, such as buying links or participating in link schemes, can lead to de-indexing.
  • Messy URL structures. Keep URLs clean, concise, and keyword-inclusive; dynamic parameters can dirty up your URLs and cause duplicate content issues.
  • Non-descriptive URLs. Make your URLs descriptive, short, and keyword-rich when possible, and avoid non-letter characters.
  • Burying links in JavaScript/Flash/Silverlight; keep content out of these as well.
  • Duplicate content
  • Keyword stuffing
  • Cloaking — the practice of showing search engine crawlers different content than visitors.

Guidelines for representing your local business on Google

These guidelines govern what you should and shouldn’t do in creating and managing your Google My Business listing(s).

Basic principles:

  • Be sure you're eligible for inclusion in the Google My Business index. You must have a physical address (even if it's your home address), and you must serve customers face-to-face, either at your location (like a retail store) or at theirs (like a plumber).
  • Honestly and accurately represent all aspects of your local business data, including its name, address, phone number, website address, business categories, hours of operation, and other features.

Things to avoid:

  • Creation of Google My Business listings for entities that aren’t eligible
  • Misrepresentation of any of your core business information, including “stuffing” your business name with geographic or service keywords, or creating listings for fake addresses
  • Use of PO boxes or virtual offices instead of authentic street addresses
  • Abuse of the review portion of the Google My Business listing, via fake positive reviews of your business or fake negative ones of your competitors
  • Costly, novice mistakes stemming from failure to read the fine details of Google’s guidelines

Fulfilling user intent

Understanding and fulfilling user intent is critical. When a person searches for something, they have a desired outcome. Whether it’s an answer, concert tickets, or a cat photo, that desired content is their “user intent.”

If a person performs a search for “bands,” is their intent to find musical bands, wedding bands, band saws, or something else?

Your job as an SEO is to quickly provide users with the content they desire in the format in which they desire it.

Common user intent types:

Informational: Searching for information. Example: “How old is Issa Rae?”

Navigational: Searching for a specific website. Example: “HBOGO Insecure”

Transactional: Searching to buy something. Example: “where to buy ‘We got y’all’ Insecure t-shirt”

You can get a glimpse of user intent by Googling your desired keyword(s) and evaluating the current SERP. For example, if there's a photo carousel, it's very likely that people searching for that keyword want photos.

Also evaluate what content your top-ranking competitors are providing that you currently aren’t. How can you provide 10X the value on your website?

Providing relevant, high-quality content on your website will help you rank higher in search results, and more importantly, it will establish credibility and trust with your online audience.

Before you do any of that, you have to first understand your website’s goals to execute a strategic SEO plan.

Know your website/client’s goals

Every website is different, so take the time to really understand a specific site’s business goals. This will not only help you determine which areas of SEO you should focus on, where to track conversions, and how to set benchmarks, but it will also help you create talking points for negotiating SEO projects with clients, bosses, etc.

What will your KPIs (Key Performance Indicators) be to measure the return on SEO investment? More simply, what is your barometer to measure the success of your organic search efforts? You’ll want to have it documented, even if it’s this simple:

For the website ________________________, my primary SEO KPI is _______________.

Here are a few common KPIs to get you started:

  • Sales
  • Downloads
  • Email signups
  • Contact form submissions
  • Phone calls

And if your business has a local component, you’ll want to define KPIs for your Google My Business listings, as well. These might include:

  • Clicks-to-call
  • Clicks-to-website
  • Clicks-for-driving-directions

Notice how “Traffic” and “Ranking” are not on the above lists? This is because, for most websites, ranking well for keywords and increasing traffic won’t matter if the new traffic doesn’t convert (to help you reach the site’s KPI goals).

You don't want to send 1,000 people to your website a month and have only 3 convert to customers (a 0.3% conversion rate). You'd much rather send 300 people to your site a month and have 40 convert (a roughly 13% conversion rate).

This guide will help you become more data-driven in your SEO efforts. Rather than haphazardly throwing arrows all over the place (and getting lucky every once in a while), you'll put more wood behind fewer arrows.

Grab a bow (and some coffee); let’s dive into Chapter 2 (Crawlers & Indexation).


We’re looking forward to hearing your thoughts on this draft of Chapter 1. What works? Anything you feel could be added or explained differently? Let us know your suggestions, questions, and thoughts in the comments.


What the ROAS? A practical guide to improving return on ad spend

Contributor Jacob Baadsgaard explains how ROAS, or return on ad spend, can be used to gauge the effectiveness of advertising campaigns and whether they are worth the money spent on them. The post What the ROAS? A practical guide to improving return on ad spend appeared first on Search Engine Land.


It’s here! ‘Enterprise SEO Platforms: A Marketer’s Guide’ is all new for 2018.

MarTech Today’s “Enterprise SEO Platforms: A Marketer’s Guide” has been updated for 2018. Compiled from the latest research, this 55-page report is your source for the latest trends, opportunities and challenges facing the market for SEO software tools as seen by industry leaders,…


The Complete Guide to Direct Traffic in Google Analytics

Posted by tombennet

When it comes to direct traffic in Analytics, there are two deeply entrenched misconceptions.

The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.

In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.

What is direct traffic?

In short, Google Analytics will report a traffic source of “direct” when it has no data on how the session arrived at your website, or when the referring source has been configured to be ignored. You can think of direct as GA’s fall-back option for when its processing logic has failed to attribute a session to a particular source.

To properly understand the causes and fixes for direct traffic, it's important to understand exactly how GA processes traffic sources. The following sequence illustrates how sessions are bucketed — note that direct sits right at the end as a final "catch-all" group.

Broadly speaking, and disregarding user-configured overrides, GA’s processing follows this sequence of checks:

AdWords parameters > Campaign overrides > UTM campaign parameters > Referred by a search engine > Referred by another website > Previous campaign within timeout period > Direct

Note the penultimate processing step (previous campaign within timeout), which has a significant impact on the direct channel. Consider a user who discovers your site via organic search, then returns via direct a week later. Both sessions would be attributed to organic search. In fact, campaign data persists for up to six months by default. The key point here is that Google Analytics is already trying to minimize the impact of direct traffic for you.
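To make that order of checks concrete, here is a rough Python sketch of the fall-back logic. The session fields and return labels are illustrative only; this is not GA's actual code or API:

  # Rough sketch of GA's source-attribution waterfall (illustrative).
  # Each check mirrors a step in the sequence above; "direct" is the
  # catch-all when every other check fails.
  def attribute_session(session, previous_campaign=None):
      if session.get("adwords_params"):
          return "paid search (AdWords)"
      if session.get("campaign_override"):
          return session["campaign_override"]
      if session.get("utm_params"):
          return session["utm_params"]["utm_source"]
      if session.get("referrer_is_search_engine"):
          return "organic search"
      if session.get("referrer"):
          return "referral"
      if previous_campaign is not None:   # still within the campaign timeout
          return previous_campaign        # e.g. last week's organic visit
      return "direct"

Note how a session with no referrer still inherits the earlier campaign (the penultimate check) before "direct" is ever assigned.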

What causes direct traffic?

Contrary to popular belief, there are actually many reasons why a session might be missing campaign and traffic source data. Here we will run through some of the most common.

1. Manual address entry and bookmarks

The classic direct-traffic scenario, this one is largely unavoidable. If a user types a URL into their browser’s address bar or clicks on a browser bookmark, that session will appear as direct traffic.

Simple as that.

2. HTTPS > HTTP

When a user follows a link on a secure (HTTPS) page to a non-secure (HTTP) page, no referrer data is passed, meaning the session appears as direct traffic instead of as a referral. Note that this is intended behavior. It’s part of how the secure protocol was designed, and it does not affect other scenarios: HTTP to HTTP, HTTPS to HTTPS, and even HTTP to HTTPS all pass referrer data.

So, if your referral traffic has tanked but direct has spiked, it could be that one of your major referrers has migrated to HTTPS. The inverse is also true: If you’ve migrated to HTTPS and are linking to HTTP websites, the traffic you’re driving to them will appear in their Analytics as direct.

If your referrers have moved to HTTPS and you're stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data that is currently being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like LetsEncrypt, but that's not to say you should neglect the potentially significant SEO implications of a site migration. Remember, HTTPS and HTTP/2 are the future of the web.

If, on the other hand, you've already migrated to HTTPS and are concerned about your users appearing to partner websites as direct traffic, you can implement the meta referrer tag. Cyrus Shepard has written about this on Moz before, so I won't delve into it now. Suffice it to say, it's a way of telling browsers to pass some referrer data to non-secure sites, and it can be implemented as a <meta> element or HTTP header.
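As a minimal example, the element version looks like this ("origin" is just one of several possible policy values):

  <meta name="referrer" content="origin">

With this policy, browsers send only your site's origin (scheme and hostname) as the referrer, even to non-secure destinations, instead of stripping referrer data entirely.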

3. Missing or broken tracking code

Let’s say you’ve launched a new landing page template and forgotten to include the GA tracking code. Or, to use a scenario I’m encountering more and more frequently, imagine your GTM container is a horrible mess of poorly configured triggers, and your tracking code is simply failing to fire.

Users land on this page without tracking code. They click on a link to a deeper page which does have tracking code. From GA’s perspective, the first hit of the session is the second page visited, meaning that the referrer appears as your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as per default configuration), the session is bucketed as direct. This will happen even if the first URL is tagged with UTM campaign parameters.

As a short-term fix, you can try to repair the damage by simply adding the missing tracking code. To prevent it happening again, carry out a thorough Analytics audit, move to a GTM-based tracking implementation, and promote a culture of data-driven marketing.

4. Improper redirection

This is an easy one. Don’t use meta refreshes or JavaScript-based redirects — these can wipe or replace referrer data, leading to direct traffic in Analytics. You should also be meticulous with your server-side redirects, and — as is often recommended by SEOs — audit your redirect file frequently. Complex chains are more likely to result in a loss of referrer data, and you run the risk of UTM parameters getting stripped out.

Once again, control what you can: use carefully mapped (i.e. non-chained) code 301 server-side redirects to preserve referrer data wherever possible.
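For illustration, here's what a clean, single-hop, server-side 301 might look like in a small Python (Flask) app. The routes and destination URL are placeholders, not a recommendation for any particular stack:

  from flask import Flask, redirect

  app = Flask(__name__)

  # One hop, server-side, explicit 301: the old path points straight
  # at its final destination, so referrer data and any UTM parameters
  # survive the redirect.
  @app.route("/old-page")
  def old_page():
      return redirect("https://www.example.com/new-page", code=301)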

5. Non-web documents

Links in Microsoft Word documents, slide decks, or PDFs do not pass referrer information. By default, users who click these links will appear in your reports as direct traffic. Clicks from native mobile apps (particularly those with embedded “in-app” browsers) are similarly prone to stripping out referrer data.

To a degree, this is unavoidable. Much like so-called “dark social” visits (discussed in detail below), non-web links will inevitably result in some quantity of direct traffic. However, you also have an opportunity here to control the controllables.

If you publish whitepapers or offer downloadable PDF guides, for example, you should be tagging the embedded hyperlinks with UTM campaign parameters. You’d never even contemplate launching an email marketing campaign without campaign tracking (I hope), so why would you distribute any other kind of freebie without similarly tracking its success? In some ways this is even more important, since these kinds of downloadables often have a longevity not seen in a single email campaign. Here’s an example of a properly tagged URL which we would embed as a link:

https://builtvisible.com/embedded-whitepaper-url/?…_medium=offline_document&utm_campaign=201711_utm_whitepaper

The same goes for URLs in your offline marketing materials. For major campaigns it’s common practice to select a short, memorable URL (e.g. moz.com/tv/) and design an entirely new landing page. It’s possible to bypass page creation altogether: simply redirect the vanity URL to an existing page URL which is properly tagged with UTM parameters.

So, whether you tag your URLs directly, use redirected vanity URLs, or — if you think UTM parameters are ugly — opt for some crazy-ass hash-fragment solution with GTM (read more here), the takeaway is the same: use campaign parameters wherever it’s appropriate to do so.
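If you're tagging URLs in bulk, a few lines of Python can keep the parameters consistent. A minimal sketch, reusing the medium and campaign values from the example above (the helper name and the utm_source value are my own illustrations):

  from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

  # Append UTM campaign parameters to a URL, preserving any
  # query string that's already present.
  def tag_url(url, source, medium, campaign):
      parts = urlparse(url)
      query = dict(parse_qsl(parts.query))
      query.update({"utm_source": source,
                    "utm_medium": medium,
                    "utm_campaign": campaign})
      return urlunparse(parts._replace(query=urlencode(query)))

  print(tag_url("https://builtvisible.com/embedded-whitepaper-url/",
                "whitepaper", "offline_document", "201711_utm_whitepaper"))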

6. “Dark social”

This is a big one, and probably the least well understood by marketers.

The term “dark social” was first coined back in 2012 by Alexis Madrigal in an article for The Atlantic. Essentially it refers to methods of social sharing which cannot easily be attributed to a particular source, like email, instant messaging, Skype, WhatsApp, and Facebook Messenger.

Recent studies have found that upwards of 80% of consumers’ outbound sharing from publishers’ and marketers’ websites now occurs via these private channels. In terms of numbers of active users, messaging apps are outpacing social networking apps. All the activity driven by these thriving platforms is typically bucketed as direct traffic by web analytics software.

People who use the ambiguous phrase "social media marketing" are typically referring to advertising: you broadcast your message and hope people will listen. Even if you overcome consumer indifference with a well-targeted campaign, any subsequent interactions are affected by their very public nature. The privacy of dark social, by contrast, represents a potential goldmine of intimate, targeted, and relevant interactions with high conversion potential. Nebulous and difficult to track though it may be, dark social has the potential to let marketers tap into the elusive power of word of mouth.

So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.

Checklist: Minimizing direct traffic

To summarize what we’ve already discussed, here are the steps you can take to minimize the level of unnecessary direct traffic in your reports:

  1. Migrate to HTTPS: Not only is the secure protocol your gateway to HTTP/2 and the future of the web, it will also have an enormously positive effect on your ability to track referral traffic.
  2. Manage your use of redirects: Avoid chains and eliminate client-side redirection in favour of carefully-mapped, single-hop, server-side 301s. If you use vanity URLs to redirect to pages with UTM parameters, be meticulous.
  3. Get really good at campaign tagging: Even amongst data-driven marketers I encounter the belief that UTM begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing silly things like tagging internal links. Control what you can, and your ability to carry out meaningful attribution will markedly improve.
  4. Conduct an Analytics audit: Data integrity is vital, so consider this essential when assessing the success of your marketing. It's not simply a case of checking for missing tracking code: good audits involve a review of your measurement plan and rigorous testing at page and property level.

Adhere to these principles, and it's often possible to achieve a dramatic reduction in the level of direct traffic reported in Analytics. One such example involved an HTTPS migration, a GTM migration (as part of an Analytics review), and an overhaul of internal campaign tracking processes over the course of about six months.

But the saga of direct traffic doesn’t end there! Once this channel is “clean” — that is, once you’ve minimized the number of avoidable pollutants — what remains might actually be one of your most valuable traffic segments.

Analyze! Or: why direct traffic can actually be pretty cool

For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.

The number of potential avenues to explore is infinite, but here are some good starting points:

  • Build meaningful custom segments, defining a subset of your direct traffic based on their landing page, location, device, repeat visit or purchase behavior, or even enhanced e-commerce interactions.
  • Track meaningful engagement metrics using modern GTM triggers such as element visibility and native scroll tracking. Measure how your direct users are using and viewing your content.
  • Watch for correlations with your other marketing activities, and use it as an opportunity to refine your tagging practices and segment definitions. Create a custom alert which watches for spikes in direct traffic.
  • Familiarize yourself with flow reports to get an understanding of how your direct traffic is converting. By using Goal Flow and Behavior Flow reports with segmentation, it’s often possible to glean actionable insights which can be applied to the site as a whole.
  • Ask your users for help! If you’ve isolated a valuable segment of traffic which eludes deeper analysis, add a button to the page offering visitors a free downloadable ebook if they tell you how they discovered your page.
  • Start thinking about lifetime value, if you haven’t already — overhauling your attribution model or implementing User ID are good steps towards overcoming the indifference or frustration felt by marketers towards direct traffic.

I hope this guide has been useful. With any luck, you arrived looking for ways to reduce the level of direct traffic in your reports, and left with some new ideas for how to better analyze this valuable segment of users.

Thanks for reading!


How to steal the competition’s best keywords: A 3-step guide

Why invest so much into keyword research when your competitors have already done the work? Columnist Jacob Baadsgaard explains how to use competitive research to inform your paid search keyword strategy. The post How to steal the competition’s best keywords: A 3-step guide appeared first on…
