How to Find the Stories in Your Data for Compelling Reporting


Our hyper-connected digital world is defined by an overabundance of data. Everything’s measurable, trackable, and quantifiable. Want to know how many people died on screen in your favorite movie? Or how much ice cream the average American eats per year? The data’s at your fingertips.

The ready availability of data is great for marketers. It helps us optimize performance, personalize content, and prove our value to the business. But data in a vacuum isn’t informative or useful. It’s not about the facts and figures themselves; it’s about how we shape that data into compelling stories.

As an Analytics Advocate at Google, Adam Singer has years of experience finding and revealing the meaningful narrative in datasets. His presentation at Content Marketing World 2017 was all about how to create clean, informative, compelling data visualizations.

Here’s a quick visual summary of his entire presentation, courtesy of Kingman Ink:

My favorite part is the lizard that represents your limbic brain. Visuals cut straight to that reflexive part of your brain, making a point quicker than listing facts and drawing conclusions. Here’s how Adam suggests creating data-based visuals that speak directly to our inner lizards.

#1: Prepare Data for Analysis

Great data visualization starts with…well…data. More than that, it starts with a meaningful and manageable data set. The data you choose to include should be tailored to both the story you want to tell and the audience that’s going to receive it. For example, when pulling internal data, your CEO might just want to know whether marketing is contributing to revenue. By contrast, your CMO will want revenue, engagement, and sales enablement data. Adam recommends these three steps for data analysis (a scripted sketch of all three follows the list):

  1. Filtering: Make sure you’re getting high-quality data. For example, in your website analytics, exclude bot and spam traffic from your traffic reports.
  2. Sorting: Use the sorting that makes the most business sense. In most cases, a combined and weighted sort will be the most useful, organizing data along two variables.
  3. Grouping: In Google Analytics, you can group data into categories. This can help you create more specific, focused visualizations.
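
If you work with exported analytics data rather than inside the Google Analytics interface, the same three steps translate directly to a short script. Below is a minimal pandas sketch; the file name, column names, and sort weights are hypothetical stand-ins, there only to illustrate the filter, weighted sort, and group operations.

```python
import pandas as pd

# Hypothetical export: one row per traffic source, with an is_bot flag.
df = pd.read_csv("ga_export.csv")  # columns: source, medium, sessions, conversions, is_bot

# 1. Filtering: keep only high-quality rows by excluding bot/spam traffic.
df = df[~df["is_bot"]]

# 2. Sorting: a combined, weighted sort across two variables.
#    The 70/30 weights are illustrative; pick what makes business sense.
df["weighted_score"] = 0.7 * df["sessions"] + 0.3 * df["conversions"]
df = df.sort_values("weighted_score", ascending=False)

# 3. Grouping: roll the data up into channel categories for a focused chart.
by_channel = df.groupby("medium")[["sessions", "conversions"]].sum()
print(by_channel)
```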

#2: Tell Your Data Story

With the data in hand, you can create a visualization. Aim to create an image so simple, specific, and clean that it’s readable at a glance. In other words, the opposite of this:

Notice how your eyes flick back and forth between the legend and the chart, trying to make sense of it all. Compare that chart to this one:

There’s a mountain of data behind that visualization, but you can instantly grasp the point: vaccines eliminate diseases. Such a stunning visual doesn’t happen by accident. It takes careful planning.

Adam recommends “storyboarding” your visualizations before you even pull the data in. Nail down who you’re talking to, what questions you’re answering, and the story you’re telling before you create a single chart.

#3: Best Practices for Compelling Data Reporting

As with any kind of storytelling, the best way to visualize your data depends on your audience and your story. But there are some consistent best practices to follow. Adam recommends following these guidelines for visualizations in your internal reporting, regardless of audience or intent:

  1. Keep charts and graphs simple. Don’t graph every data point, just enough to show the trend. Focus on what matters most to your story (see the chart sketch after this list).
  2. Tell the user what the point is. Your audience shouldn’t have to guess at the conclusion you want them to draw: Put it right in the title of your visualization.
  3. Don’t spin the data. Ever. The point of data visualization is to get at the facts, not obscure them. Don’t abuse your audience’s trust with misleading visuals.
  4. Make reporting part of your process. It’s easy to think of reporting as something tacked on to the end of a campaign, a final housekeeping task. Better to see reporting as vital to our ongoing marketing efforts and approach it with dedication and enthusiasm.
  5. Use the right data for the right stakeholder. Make sure you personalize your reports for different audiences, sticking with only the most relevant data for each.
  6. Be creative and have fun. Solutions like Google Data Studio make it easy to pull in data and play with visualizations. Don’t be afraid to experiment!
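
To make guidelines 1 and 2 concrete, here is a minimal matplotlib sketch: it plots just enough points to show the trend and states the conclusion in the chart title. The monthly figures are invented for illustration.

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
leads = [120, 135, 150, 180, 210, 260]  # hypothetical monthly leads

fig, ax = plt.subplots()
ax.plot(months, leads, marker="o")

# Guideline 2: tell the user what the point is, right in the title.
ax.set_title("Blog-driven leads have more than doubled since January")
ax.set_ylabel("Leads per month")

# Guideline 1: keep it simple; strip chart clutter.
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

plt.tight_layout()
plt.show()
```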

A Picture Is Worth a Thousand Data Points

When done properly, a single chart or graph can convey paragraphs of information at a glance. Choose your data carefully, keep your visualizations simple and purposeful, and you can create a report far more compelling than a list of stats and figures could ever be. Speaking of beautiful data visualization, have you seen our interactive influencer marketing infographic?


Google Maps Android app adds ‘find parking’ feature to show you nearest parking garage

Starting today, Google Maps users can tap the “find parking” button on the Android app to see a list of parking garages and lots in 25 US cities.


How to Find Your Competitor’s Backlinks – Next Level

Posted by BrianChilds

Welcome to the newest installment of our educational Next Level series! In our last episode, Brian Childs equipped copywriters with the tools they need to succeed with SEO. Today, he’s back to share how to use Open Site Explorer to find linking opportunities based on your competitors’ external inbound links. Read on and level up!


In Moz’s SEO training classes, we discuss how to identify and prioritize sources of backlinks using a mix of tools. One tactic to quickly find high Domain Authority sites that have a history of linking to pages discussing your topic is to study your competitors’ backlinks. This process is covered in-depth during the SEO Link Building Bootcamp.

In this article, I’ll show how to create and export a list of your competitor’s backlinks that you can use for targeting activities. This assumes you’ve already completed keyword research and have identified competitors that rank well in the search results for these queries. Use those competitors for the following analysis.


How to check the backlinks of a site

Step 1: Navigate to Open Site Explorer

Open Site Explorer is a tool used to research the link profile of a website. It will show you the quality of inbound links using metrics like Page Authority, Domain Authority, and Spam Score. You can do a good amount of research with the free version, but to enjoy all its capabilities you’ll need full access; you can get that access for free with a 30-day trial of Moz Pro.

Step 2: Enter your competitor’s domain URL

I suggest opening your competitor’s site in a browser window and then copying the URL. This reduces spelling errors and the chance of mistyping the domain name. A common error is adding “www” to the URL when that’s not how the site actually renders it.

Step 3: Navigate to the “Inbound Links” tab

The Inbound Links tab will display all of the pages that link to your competitor’s website. In order to identify sources of links that are delivering link equity, I set the parameters above the list as follows: Target This – Root Domain, Links Source – Only External, and Link Type – Link Equity. This will show all external links providing link equity to any page on your competitor’s site.

Step 4: Export results into .csv

Most reports in Open Site Explorer will allow you to export to .csv. Save these results and then repeat for your other competitors.

Step 5: Compile .csv results from all competitors

Once you have Open Site Explorer exports from the top 5–10 competitors, compile them into one spreadsheet.

Step 6: Sort all results by Page Authority

Page Authority is a 1–100 scale developed by Moz that estimates how likely a page is to rank in a search result, based on our understanding of essential ranking factors. Higher numbers suggest the page is more authoritative and therefore more likely to rank. Pages with higher Page Authority will also deliver more link equity to your competitor’s site. Use Page Authority as your sorting criterion.
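
Steps 5 and 6 are easy to script if you’d rather not compile the exports by hand. Here is a minimal pandas sketch; the folder name and column labels (“URL,” “Page Authority”) are assumptions, so check them against the header row of your own Open Site Explorer exports.

```python
import glob

import pandas as pd

# Step 5: compile every competitor's .csv export into one table.
frames = [pd.read_csv(path) for path in glob.glob("ose_exports/*.csv")]
all_links = pd.concat(frames, ignore_index=True)

# Drop linking pages that appear in more than one competitor's export.
all_links = all_links.drop_duplicates(subset=["URL"])

# Step 6: sort by Page Authority, highest first.
all_links = all_links.sort_values("Page Authority", ascending=False)
all_links.to_csv("competitor_backlinks_prioritized.csv", index=False)
```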

Step 7: Review all linking sites for opportunities

Now you have a large list of sites linking to your competitors for keywords you are targeting. Go down the list of high Page Authority links and look for sites or authors that show up regularly. Use your preferred outreach strategy to contact these sites and begin developing a relationship.


Want to learn more SEO processes?

If you like these step-by-step SEO processes, you’ll likely enjoy the SEO training classes provided by Moz. These live, instructor-led webinars show you how to use a variety of tools to implement SEO. If you’re in need of comprehensive SEO training, you can save 20% by purchasing the 5-class bundle:

Sign up for the Bootcamp Bundle


New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news: as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though: if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back-end crawler, but we quickly realized that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler was that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.
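
If you do take the CSV route, the same slicing works in a few lines of pandas. This is a minimal sketch; the column names (“URL,” “Status Code,” “Fetch Time”) are assumptions, so confirm them against your own export’s header row.

```python
import pandas as pd

crawl = pd.read_csv("site_crawl_export.csv")

# Every crawled page under /blog that returned a non-200 status code.
blog_issues = crawl[
    crawl["URL"].str.contains("/blog")
    & (crawl["Status Code"] != 200)
]

# Slowest offenders first, using the per-page fetch times in the export.
print(blog_issues.sort_values("Fetch Time", ascending=False).head(10))
```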

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what’s critical for one site is barely a nuisance for another. For example, let’s say I don’t care about a handful of overly dynamic URLs (for many sites, it’s a minor issue). With the new Site Crawl, I can just select those issues and then “Ignore” them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.
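
The clustering itself happens inside Site Crawl, but if you want a rough stand-in for your own page data, grouping pages by a hash of their normalized body text approximates exact-duplicate detection. Everything in this sketch (the sample URLs and text) is hypothetical.

```python
import hashlib
from collections import defaultdict

def content_fingerprint(body_text: str) -> str:
    """Hash of lowercased, whitespace-normalized text; identical pages collide."""
    normalized = " ".join(body_text.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical URL -> body text, e.g. from your own crawl.
pages = {
    "/pricing": "Plans and pricing for every team size.",
    "/pricing?ref=nav": "Plans and pricing for every team size.",
    "/about": "We build SEO software.",
}

clusters = defaultdict(list)
for url, body in pages.items():
    clusters[content_fingerprint(body)].append(url)

duplicate_groups = [urls for urls in clusters.values() if len(urls) > 1]
print(duplicate_groups)  # [['/pricing', '/pricing?ref=nav']]
```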

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:
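
One way to reason about that balance yourself is to weight each issue type by severity and scale it by how many pages it affects, log-scaled so huge counts don’t drown out rare-but-critical errors. The weights and counts below are invented for illustration; this is not Site Crawl’s actual formula.

```python
import math

# Illustrative severity weights for the five buckets above.
SEVERITY = {"critical": 5, "warning": 3, "redirect": 2, "metadata": 1, "content": 1}

issues = [
    ("5XX errors", "critical", 3),
    ("duplicate pages", "content", 10_000),
    ("missing meta descriptions", "metadata", 450),
]

def score(bucket: str, affected_pages: int) -> float:
    # Severity times log-scaled scope: one 500 error still outranks a
    # handful of duplicates, but 10,000 duplicates climbs the list.
    return SEVERITY[bucket] * math.log10(affected_pages + 1)

for name, bucket, count in sorted(issues, key=lambda i: -score(i[1], i[2])):
    print(f"{name}: score {score(bucket, count):.1f} across {count:,} pages")
```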

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and a webinar on Friday at 9am Pacific.
