Do iPhone Users Spend More Online Than Android Users?

Posted by MartyMeany

Apple has just launched its latest flagship phones, and later this year it will release its uber-flagship: the iPhone X. The iPhone X is the most expensive iPhone yet, at a cool $999. With so many other smartphones on the market offering similar functionality, it raises the question: do iPhone users simply spend more money than everyone else?

At Wolfgang Digital, we love a bit of data, so we’ve trawled through a massive dataset of 31 million iPhone and Android sessions to finally answer this question. Of course, we’ve got some actionable nuggets of digital marketing strategy at the end, too!

Why am I asking this question?

Way back when, before joining the online marketing world, I sold mobile phones. I couldn’t get my head around why people bought iPhones. They’re more expensive than their Android counterparts, which usually offer the same, if not increased, functionality (though you could argue the latter is subjective).

When I moved into the e-commerce department of the same phone retailer, my team would regularly grab a coffee and share little nuggets of interesting e-commerce trends we’d found. My personal favorite was a tale about Apple users spending more than PC users. The story I read told of a hotel that raised prices for people booking while using an Apple device. Even with the increased prices, conversion rates didn’t budge, and the hotel raked in extra cash.

I’ve always said this story was anecdotal because I simply never saw the data to back it up. Still, it fascinated me.

Finding an answer

Fast forward a few years and I’m sitting in Wolfgang Digital behind the huge dataset that powered our 2017 E-Commerce Benchmark KPI Study. It occurred to me that this data could answer some of the great online questions I’d heard over the years. What better place to start than that tale of Apple users spending more money online than others?

The online world has changed a little since I first asked myself this question, so let’s take a fresh 2017 approach.

Do iPhone users spend more than Android users?

When this hypothesis first appeared, people were comparing Mac desktop users and PC desktop users, but the game has changed since then. To give the hypothesis a fresh 2017 look, we’re going to ask whether iPhone users spend more than Android users. After looking through the 31 million sessions on both the iOS and Android operating systems, then filtering the data by mobile, it didn’t take long to find the answer to the question that had followed me around for years. The results were astonishing:

On average, Android users spend $11.54 per transaction. iPhone users, on the other hand, spend a whopping $32.94 per transaction. That means iPhone users will spend almost three times as much as Android users when visiting an e-commerce site.

Slightly smug that I’d finally answered my question, I turned to the next one: how do we turn this from an interesting nugget of information into an actionable insight?

What does this mean for digital marketers?

As soon as you read about iPhone users spending three times more than Android users, I’m sure you started thinking about targeting users specifically based on their operating system. If iOS users are spending more money than their Android counterparts, doesn’t it make sense to shift your spend and targeting towards iOS users?

You’re right. In both Facebook and AdWords, you can use this information to your advantage.

Targeting operating systems within Facebook

Of the “big two” ad platforms, Facebook offers the most direct form of operating system targeting. When creating your ads, Facebook’s Ad Manager will give you the option to target “All Mobile Devices,” “iOS Devices Only,” or “Android Devices Only.” These options mean you can target those high average order value-generating iPhone users.

Targeting operating systems within AdWords

AdWords will allow you to target operating systems for both Display Campaigns and Video Campaigns. When it comes to Search, you can’t target a specific operating system. You can, however, create an OS-based audience using Google Analytics. Once this audience is built, you can remarket to an iOS audience with “iPhone”-oriented ad texts. Speaking at Wolfgang Essentials this year, Wil Reynolds showed clips of people talking through their decision to click in SERPs. It’s incredible to see people skipping over year-old content before clicking an article that mentions “iPhone.” Why? Because that user has an iPhone. That’s the power of relevancy.

You’ll also be able to optimize and personalize your bids in Search, safe in the knowledge that iPhone users are more likely to spend big than Android users.

There you have it. Don’t let those mad stories you hear pass you by. You might just learn something!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

How to Turn Low-Value Content Into Neatly Organized Opportunities – Next Level

Posted by jocameron

Welcome to the newest installment of our educational Next Level series! In our last post, Brian Childs offered up a beginner-level workflow to help discover your competitor’s backlinks. Today, we’re welcoming back Next Level veteran Jo Cameron to show you how to find low-quality pages on your site and decide their new fate. Read on and level up!


With an almost endless succession of Google updates shaking up the search results, it’s pretty clear that substandard content just won’t cut it.

I know, I know — we can’t all keep up with the latest algorithm updates. We’ve got businesses to run, clients to impress, and a strong social media presence to maintain. After all, you haven’t seen a huge drop in your traffic. It’s probably OK, right?

So what’s with the nagging sensation down in the pit of your stomach? It’s not just that giant chili taco you had earlier. Maybe it’s that feeling that your content might be treading on thin ice. Maybe you watched Rand’s recent Whiteboard Friday (How to Determine if a Page is “Low Quality” in Google’s Eyes) and just don’t know where to start.

In this edition of Next Level, I’ll show you how to start identifying your low-quality pages in a few simple steps with Moz Pro’s Site Crawl. Once identified, you can decide whether to merge, shine up, or remove the content.

A quick recap of algorithm updates

The latest big fluctuations in the search results were said to be caused by King Fred: enemy of low-quality pages and champion of the people’s right to find and enjoy content of value.

Fred took the fight to affiliate sites, and low-value commercial sites were also affected.

The good news is that even if this isn’t directed at you, and you haven’t taken a hit yourself, you can still learn from this update to improve your site. After all, why not stay on the right side of the biggest index of online content in the known universe? You’ll come away with a good idea of what content is working for your site, and you may just take a ride to the top of the SERPs. Knowledge is power, after all.

Be a Pro

It’s best if we just accept that Google updates are ongoing; they happen all.the.time. But with a site audit tool in your toolkit like Moz Pro’s Site Crawl, they don’t have to keep you up at night. Our shiny new Rogerbot crawler is the new kid on the block, and it’s hungry to crawl your pages.

If you haven’t given it a try, sign up for a free trial for 30 days:

Start a free trial

If you’ve already had a free trial that has expired, write to me and I’ll give you another, just because I can.

Set up your Moz Pro campaign — it takes 5 minutes tops — and Rogerbot will be unleashed upon your site like a caffeinated spider.

Rogerbot hops from page to page following links to analyze your website. As Rogerbot hops along, it builds a beautiful database of pages, flagging issues you can use to find those laggers. What a hero!

First stop: Thin content

Site Crawl > Content Issues > Thin Content

Thin content could be damaging your site. If it’s deemed to be malicious, it could result in a penalty. Things like zero-value pages with ads, or spammy doorway pages (little traps set to funnel visitors to other pages), are bad news.

First off, let’s find those pages. Moz Pro Site Crawl will flag a page as “thin content” if it has fewer than 50 words (excluding navigation and ads).
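To make the 50-word threshold concrete, here’s a minimal sketch of how such a check might work: strip the HTML tags, count the remaining words, and flag anything under the threshold. This is an illustration only, not Moz’s actual crawler logic, and the sample page is invented.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def word_count(html: str) -> int:
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

def is_thin(html: str, threshold: int = 50) -> bool:
    # Mirrors the fewer-than-50-words rule described above.
    return word_count(html) < threshold

page = "<html><body><p>Contact us today.</p></body></html>"
print(is_thin(page))  # a 3-word page is well under the threshold
```

A real crawler would also exclude navigation and ad markup, as noted above; this sketch only skips script and style blocks.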

Now is a good time to familiarize yourself with Google’s Quality Guidelines. Think long and hard about whether you may be doing this, intentionally or accidentally.

You’re probably not straight-up spamming people, but you could do better and you know it. Our mantra is (repeat after me): “Does this add value for my visitors?” Well, does it?

Ok, you can stop chanting now.

For most of us, thin content is less of a penalty threat and more of an opportunity. By finding pages with thin content, you have the opportunity to figure out if they’re doing enough to serve your visitors. Pile on some Google Analytics data and start making decisions about improvements that can be made.

Using moz.com as an example, I’ve found 3 pages with thin content. Ta-da!

I’m not too concerned about the login page or the password reset page. I am, however, interested to see how the local search page is performing. Maybe we can find an opportunity to help people who land on this page.

Go ahead and export your thin content pages from Moz Pro to CSV.

We can then grab some data from Google Analytics to give us an idea of how well this page is performing. You may want to look at comparing monthly data and see if there are any trends, or compare similar pages to see if improvements can be made.

I am by no means a Google Analytics expert, but I know how to get what I want. Most of the time that is, except when I have to Google it, which is probably every second week.

Firstly: Behavior > Site Content > All Pages > Paste in your URL

  • Pageviews – the number of times the page has been viewed, including repeat views
  • Avg. Time on Page – how long people spend on your page
  • Bounce Rate – the percentage of single-page sessions with no interaction

For my example page, Bounce Rate is very interesting. This page lives to be interacted with. Its only joy in life is allowing people to search for a local business in the UK, US, or Canada. It is not an informational page at all. It doesn’t provide a contact phone number or an answer to a query that may explain away a high bounce rate.

I’m going to add Pageviews and Bounce Rate to a spreadsheet so I can track this over time.

I’ll also add some keywords that I want that page to rank for to my Moz Pro Rankings. That way I can make sure I’m targeting searcher intent and driving organic traffic that is likely to convert.

I’ll also know if I’m being outranked by my competitors. How dare they, right?

As we’ve found with this local page, not all thin content is bad content. Another example may be if you have a landing page with an awesome video that’s adding value and is performing consistently well. In this case, hold off on making sweeping changes. Track the data you’re interested in; from there, you can look at making small changes and track the impact, or split test some ideas. Either way, you want to make informed, data-driven decisions.

Action to take for tracking thin content pages

Export to CSV so you can track how these pages are performing alongside GA data. Make incremental changes and track the results.
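As a sketch of that tracking step, you could join the thin-content export with an Analytics export on URL, so each flagged page carries its pageviews and bounce rate. The column names and file contents below are assumptions for illustration; your real exports will differ.

```python
import csv
import io

# Stand-ins for the two exported files (hypothetical columns and URLs).
thin_csv = "URL\n/local-search\n/password-reset\n"
ga_csv = "URL,Pageviews,Bounce Rate\n/local-search,1200,0.81\n/signup,300,0.40\n"

# Index the Analytics rows by URL for a quick lookup.
ga = {row["URL"]: row for row in csv.DictReader(io.StringIO(ga_csv))}

# Attach metrics to each thin-content page; mark pages GA didn't report.
merged = []
for row in csv.DictReader(io.StringIO(thin_csv)):
    metrics = ga.get(row["URL"], {})
    merged.append({
        "URL": row["URL"],
        "Pageviews": metrics.get("Pageviews", "n/a"),
        "Bounce Rate": metrics.get("Bounce Rate", "n/a"),
    })
print(merged)
```

Rerunning this against fresh exports each month gives you the trend data the post recommends tracking.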

Second stop: Duplicate title tags

Site Crawl > Content Issues > Duplicate Title Tags

Title tags show up in the search results to give human searchers a taste of what your content is about. They also help search engines understand and categorize your content. Without question, you want these to be well considered, relevant to your content, and unique.

Moz Pro Site Crawl flags any pages with matching title tags for your perusal.

Duplicate title tags are unlikely to get your site penalized, unless you’ve masterminded an army of pages that target irrelevant keywords and provide zero value. Once again, for most of us, it’s a good way to find a missed opportunity.

Digging around your duplicate title tags is a lucky dip of wonder. You may find pages with repeated content that you want to merge, or redundant pages that may be confusing your visitors, or maybe just pages for which you haven’t spent the time crafting unique title tags.
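If you want to do this kind of digging on your own export, a sketch like the following groups URLs by their title text and reports any title shared by more than one page. The URL/title pairs are made up for illustration; this is not how Moz Pro computes the report.

```python
from collections import defaultdict

def duplicate_titles(pages):
    """pages: iterable of (url, title) pairs -> {title: [urls]} for duplicates."""
    by_title = defaultdict(list)
    for url, title in pages:
        # Normalize so "Contact Us" and "contact us " count as the same title.
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

crawl = [
    ("https://example.com/contact", "Contact Us"),
    ("https://example.com/help/contact", "Contact Us"),
    ("https://example.com/blog", "Blog"),
]
print(duplicate_titles(crawl))
# {'contact us': ['https://example.com/contact', 'https://example.com/help/contact']}
```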

Take this opportunity to review your title tags, make them interesting, and always make them relevant. Because I’m a Whiteboard Friday friend, I can’t not link to this title tag hack video. Turn off Netflix for 10 minutes and enjoy.

Pro tip: To view the other duplicate pages, make sure you click on the little triangle icon to open that up like an accordion.

Hey now, what’s this? Filed away under duplicate title tags I’ve found these cheeky pages.

These are the contact forms we have in place to contact our help team. Yes, me included — hi!

I’ve got some inside info for you all. We’re actually in the process of redesigning our Help Hub, and these tool-specific pages definitely need a rethink. For now, I’m going to summon the powerful and mysterious rel=canonical tag.

This tells search engines that all those other pages are copies of the one true page to rule them all. Search engines like this, they understand it, and they bow down to honor the original source, as well they should. Visitors can still access these pages, and they won’t ever know they’ve hit a page with an original source elsewhere. How very magical.
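In the page source, that canonical declaration is just a link element in the head, e.g. link rel="canonical" href="…". This hedged sketch, using only the standard library, pulls the declared canonical URL out of a page so you can confirm a set of near-duplicate pages all point at the same original; the markup below is illustrative.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def canonical_url(html: str):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

page = '<head><link rel="canonical" href="https://example.com/contact"></head>'
print(canonical_url(page))  # https://example.com/contact
```

Running this across each duplicate-titled page and checking that every result matches the one true URL is a quick way to verify the tag was deployed correctly.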

Action to take for duplicate title tags on similar pages

Use the rel=canonical tag to tell search engines that http://ift.tt/2vLlcAU is the original source.

Review visitor behavior and perform user testing on the Help Hub. We’ll use this information to make a plan for redirecting those pages to one main page and adding a tool type drop-down.

More duplicate titles within my subfolder-specific campaign

Because at Moz we’ve got a heck of a lot of pages, I’ve got another Moz Pro campaign set up to track the URL moz.com/blog. I find this handy if I want to look at issues on just one section of my site at a time.

You just have to enter your subfolder and limit your campaign when you set it up.

Just remember we won’t crawl any pages outside of the subfolder. Make sure you have an all-encompassing, all-access campaign set up for the root domain as well.

Not enough allowance to create a subfolder-specific campaign? You can filter by URL from within your existing campaign.

In my Moz Blog campaign, I stumbled across these little fellows:

http://ift.tt/2xw5RBL

http://ift.tt/2i18byE

This is a classic case of new content usurping the old content. Instead of telling search engines, “Yeah, so I’ve got a few pages and they’re kind of the same, but this one is the one true page,” like we did with the rel=canonical tag before, this time I’ll use the big cousin of the rel=canonical, the queen of content canonicalization, the 301 redirect.

All the link power is passed to the page you’re redirecting to, along with all the actual human visitors.

Action to take for duplicate title tags with outdated/updated content

Check the traffic and authority for both pages, then add a 301 redirect from one to the other. Consolidate and rule.
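One lightweight way to serve the 301 described above is a redirect map consulted before normal routing. This is a generic WSGI sketch, not how moz.com actually implements redirects, and the paths are invented.

```python
# Hypothetical old-URL -> new-URL map; in practice this often lives in
# server config (e.g. nginx or .htaccess) rather than application code.
REDIRECTS = {"/old-post": "/updated-post"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in REDIRECTS:
        # 301 tells search engines the move is permanent, so link equity
        # is consolidated onto the destination page.
        start_response("301 Moved Permanently",
                       [("Location", REDIRECTS[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]
```

The same shape works in any web framework: look the path up in the map, answer 301 with a Location header if it matches, otherwise fall through to the normal handler.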

It’s also a good opportunity to refresh the content and check whether it’s… what? I can’t hear you — adding value to my visitors! You got it.

Third stop: Duplicate content

Site Crawl > Content Issues > Duplicate Content

When the code and content on a page look the same as the code and content on another page of your site, it will be flagged as “Duplicate Content.” Our crawler will flag any pages with 90% or more overlapping content or code as having duplicate content.
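One simple way to approximate that “90% or more overlapping” check is to compare two pages’ source with a sequence matcher and flag them when the similarity ratio crosses 0.9. This is an illustrative stand-in, not Moz’s actual algorithm, and the sample pages are invented.

```python
from difflib import SequenceMatcher

def overlap_ratio(page_a: str, page_b: str) -> float:
    """Similarity of the two pages' raw source, between 0.0 and 1.0."""
    return SequenceMatcher(None, page_a, page_b).ratio()

def is_duplicate(page_a: str, page_b: str, threshold: float = 0.9) -> bool:
    # Mirrors the 90%-overlap threshold described above.
    return overlap_ratio(page_a, page_b) >= threshold

a = "<html><body><p>Our SEO guide covers crawling and indexing.</p></body></html>"
b = "<html><body><p>Our SEO guide covers crawling and ranking.</p></body></html>"
print(is_duplicate(a, b))
```

Note that the comparison runs over the whole source, code included, which is exactly why the thin pages discussed later can trip this flag: with barely any copy, the shared template dominates the ratio.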

Officially, in the wise words of Google, duplicate content doesn’t incur a penalty. However, it can be filtered out of the index, so still not great.

Having said that, the trick is in the fine print. One bot’s duplicate content is another bot’s thin content, and thin content can get you penalized. Let me refer you back to our old friend, the Quality Guidelines.

Are you doing one of these things intentionally or accidentally? Do you want me to make you chant again?

If you’re being hounded by duplicate content issues and don’t know where to start, then we’ve got more information on duplicate content on our Learning Center.

I’ve found some pages that clearly have different content on them, so why are these duplicate?

So friends, what we have here is thin content that’s being flagged as duplicate.

There is basically not enough content on the page for bots to distinguish them from each other. Remember that our crawler looks at all the page code, as well as the copy that humans see.

You may find this frustrating at first: “Like, why are they duplicates?? They’re different, gosh darn it!” But once you pass through all seven stages of duplicate content and arrive at acceptance, you’ll see the opportunity you have here. Why not pop those topics on your content schedule? Why not use the “queen” again and 301 redirect them to a similar resource, combining the power of both? Or maybe, just maybe, you could use them in a blog post about duplicate content — just like I have.

Action to take for duplicate pages with different content

Before you make any hasty decisions, check the traffic to these pages. Maybe dig a bit deeper and track conversions and bounce rate, as well. Check out our workflow for thin content earlier in this post and do the same for these pages.

From there you can figure out if you want to rework content to add value or redirect pages to another resource.

There’s an awesome video in the ever-impressive Whiteboard Friday series that talks about republishing. Seriously, you’ll kick yourself if you don’t watch it.

Broken URLs and duplicate content

Another dive into Duplicate Content has turned up two Help Hub URLs that point to the same page.

These are no good to man or beast. They are especially no good for our analytics — blurgh, data confusion! No good for our crawl budget — blurgh, extra useless page! User experience? Blurgh, nope, no good for that either.

Action to take for messed-up URLs causing duplicate content

Zap this time-waster with a 301 redirect. For me this is an easy decision: add a 301 to the long, messed up URL with a PA of 1, no discussion. I love our new Learning Center so much that I’m going to link to it again so you can learn more about redirection and build your SEO knowledge.

It’s the most handy place to check if you get stuck with any of the concepts I’ve talked about today.

Wrapping up

While it may feel scary at first to have your content flagged as having issues, the real takeaway here is that these are actually neatly organized opportunities.

With a bit of tenacity and some extra data from Google Analytics, you can start to understand the best way to fix your content and make your site easier to use (and more powerful in the process).

If you get stuck, just remember our chant: “Does this add value for my visitors?” Your content has to be for your human visitors, so think about them and their journey. And most importantly: be good to yourself and use a tool like Moz Pro that compiles potential issues into an easily digestible catalogue.

Enjoy your chili taco and your good night’s sleep!


Listen to MozPod, the Free SEO Podcast from Moz

Posted by BrianChilds

We’re marketers. We know from firsthand experience that there aren’t enough hours in the day to do everything that needs to get done. And that’s even more true once you commit to leveling up and learning new skills.

The learning curve for developing digital marketing skills can be steep, and staying informed as things evolve and change (thanks, Google) can feel like a full-time job. Our Moz Training has classes to help accelerate the learning process, but as startup folks ourselves, we understand the importance of multitasking.

Learn SEO on the go

We’re thrilled to introduce MozPod, an SEO podcast focused on sharing lessons from digital marketing experts. Episodes are led by instructors from Moz Academy, and we discuss a wide variety of digital marketing concepts, from common terminology to recent changes and best practices.

Check it out on iTunes

Where can I listen in?


Upcoming episodes

Our first series covers conversion rate optimization, PageRank, and link building:

Ep. 1: The Science of Crawling and Indexing

Guest: Neil Martinsen-Burrell of Moz

Dr. Neil Martinsen-Burrell shares his perspective as a statistician on the development of Page Authority and Domain Authority. Great data and interesting stats.

Ep. 2: What’s a Good Conversion Rate?

Guest: Carl Schmidt of Unbounce

Carl discusses the Unbounce Conversion Rate Benchmark Report and what SEOs can learn from an analysis of over 74 million landing page visitors. Great for content writers.

Ep. 3: Link Building Fundamentals

Guest: The PageOnePower team

MozPod interviews PageOnePower about how search engines place value on links. Collin, Cody, and Nicholas share the personal wisdom they’ve gained from working at a link building company.


Want to be a guest on MozPod?

If you’d like to share your recent SEO analysis or have a topic you think MozPod listeners would find valuable, please send us your ideas! MozPod is a place for our community of SEOs and digital marketers to learn. We’d love to hear from you.

Simply fill out this form to share your idea: Be on MozPod


Give it a listen and let us know what topics you’d like to hear about in the comments!

Listen to MozPod on iTunes


Yes, Competitors Can Edit Your Listing on Google My Business

Posted by JoyHawkins

I decided to write this article in response to a recent article that was published over at CBSDFW. The article was one of many stories about how spammers update legitimate information on Google as a way to send more leads somewhere else. This might shock some readers, but it was old news to me since spam of this nature on Google Maps has been a problem for almost a decade.

What sparked my interest in this article was Google’s response. Google stated:

Merchants who manage their business listing info through Google My Business (which is free to use), are notified via email when edits are suggested. Spammers and others with negative intent are a problem for consumers, businesses, and technology companies that provide local business information. We use automated systems to detect for spam and fraud, but we tend not to share details behind our processes so as not to tip off spammers or others with bad intent.

Someone might read that and feel safe, believing that they have nothing to worry about. However, some of us who have been in this space for a long time know that there are several incorrect and misleading statements in that paragraph. I’m going to point them out below.


“Merchants are notified by email”

  1. Google just started notifying users by email last month. Their statement makes it sound like this has been going on for ages. Before September 2017, there were no emails going to people about edits made to their listings.
  2. Not everyone gets an email about edits that have been made. To test this, I had several people submit an update to a listing I own to change the phone number. When the edit went live, the Google account that was the primary owner on the listing got an email; the Google account that was a manager on the listing did not.

Similarly, I am a manager on over 50 listings and 7 of them currently show as having updates in the Google My Business dashboard. I haven’t received a single email since they launched this feature a month ago.

“Notified […] when edits are suggested”

Merchants are not notified when edits are “suggested.” Any time I’ve ever heard of an email notification in the last month, it went out after the edit was already live.

Here’s a recent case on the Google My Business forum. This business owner got an email when his name was updated because the edit was already live. He currently has a pending edit on his listing to change the hours of operation. Clearly this guy is on top of things, so why hasn’t he denied it? Because he wouldn’t even know about it since it’s pending.

The edit isn’t live yet, so he’s not receiving a notification — either by email or inside the Google My Business dashboard.

Edits show up in the Google My Business dashboard as “Updates from Google.” Many people think that if they don’t “accept” these edits in the Google My Business dashboard, the edits won’t go live. The reality is that by “accepting” them, you’re just confirming something that’s already live on Google. If you “don’t accept,” you actually need to edit the listing to revert it back (there is no “deny” button).

Here’s another current example of a listing I manage inside Google My Business. The dashboard doesn’t show any updates to the website field, yet there’s a pending edit that I can see on the Google Maps app. A user has suggested that the proper website is a different page on the website than what I currently have. The only way to see all types of pending edits is via Check the Facts on Google Maps. No business owner I’ve ever spoken to has any clue what this is, so I think it’s safe to say they wouldn’t be checking there.

Here’s how I would edit that original response from Google to make it more factually correct:

Merchants who manage their business listing info through Google My Business (which is free to use) are notified when edits made by others are published on Google. Sometimes they are notified by email and the updates are also shown inside the Google My Business dashboard. Google allows users (other than the business owner) to make edits to listings on Google, but the edits are reviewed by either automated systems or, in some cases, actual human beings. Although the system isn’t perfect, Google is continually making efforts to keep the map free from spam and malicious editing.


Do you manage listings that have been edited by competitors? What’s your experience been? Share your story in the comments below!


Getting SEO Value from rel=”nofollow” Links – Whiteboard Friday

Posted by randfish

Plenty of websites that make it easy for you to contribute don’t make it easy to earn a followed link from those contributions. While rel=nofollow links reign in the land of social media profiles, comments, and publishers, there are a few ways around it. In today’s Whiteboard Friday, Rand shares five tactics to help you earn equity-passing followed links from traditionally nofollow-only platforms.

http://ift.tt/2wBCte0

http://ift.tt/1SsY8tZ

How to get SEO value from rel="nofollow" links

Click on the whiteboard image above to open a high-resolution version in a new tab!

http://ift.tt/2yu975X

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about how you can get SEO value from nofollowed links. So in the SEO world, there are followed links. These are the normal ones that you find on almost every website. But then you can have nofollowed links, which you’ll see in the HTML code of a website. The normal thing you’ll see is an a href="somewebsite" in there. If you see rel=nofollow alongside it, that means that the search engines — Google, Bing, Yahoo, etc. — will not count that link as passing link equity, at least certainly not in the same way that a followed link would.

So when you see these, you can spot them by looking in the source code yourself, or by turning on the MozBar and using its option to highlight nofollowed links on the page.
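The check Rand describes doing by eye in the source code can also be sketched in a few lines: walk a page’s anchors and separate followed links from ones carrying rel="nofollow". The sample markup is invented, and real audits would fetch live pages instead.

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Splits a page's anchors into followed and nofollowed hrefs."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "href" in a:
            # rel can hold several space-separated tokens, e.g. "nofollow noopener".
            rels = (a.get("rel") or "").lower().split()
            if "nofollow" in rels:
                self.nofollowed.append(a["href"])
            else:
                self.followed.append(a["href"])

page = '''
<a href="https://example.com/a">followed</a>
<a href="https://example.com/b" rel="nofollow">nofollowed</a>
'''
audit = LinkAudit()
audit.feed(page)
print(audit.followed)    # ['https://example.com/a']
print(audit.nofollowed)  # ['https://example.com/b']
```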

What sort of links use rel=nofollow?

But the basic story is that you’re not getting the same SEO value from them. There are ways to get it, though. Recently you might have seen in the SEO news that Inc., Forbes, and a few other sites like them (last year it was Huffington Post) started applying nofollow tags to all the links that belong to articles from contributors. So if I go and write an article for Inc. today, the links I point out from my bio and my snippet on there aren’t going to pass any value, because they have this nofollow applied.

A) Social media links (Facebook, Twitter, LinkedIn, etc.)

There are a bunch of types of links that use this. Social media, so Facebook, Twitter, and LinkedIn, which is one of the reasons why you can’t just boost your link profile by going to these places and leaving a bunch of links around.

B) Comments (news articles, blogs, forums, etc.)

Comments, so from news articles or blogs or forums where there’s discussion, or Q&A sites: all the links you leave in those comments are, again, nofollowed.

C) Open submission content (Quora, Reddit, YouTube, etc.)

Open submission content, so places like Quora where you could write a post, or Reddit where you could write a post, or YouTube where you could upload a video and have a link: most of those, in fact almost all of them, now have nofollows, as do the associated profile links. Your Instagram account, for example, would be a social media one, but it’s not just the pictures you post on Instagram. Your profile link is one of the only places in the Instagram platform where you actually get a real URL that you can send people to, and that is nofollowed on the web.

D) Some publishers with less stringent review systems (Forbes, Buzzfeed, LinkedIn Pulse, etc.)

Some publishers now with these less stringent publishing review systems, so places like Inc., Forbes, BuzzFeed in some cases with their sponsored posts, Huffington Post, LinkedIn’s Pulse platform, and a bunch of others all use this rel=nofollow.

Basic evaluation formula for earning followed links from the above sources


The basic formula that we need to go to here is: How do you contribute to all of these places in ways that will ultimately result in followed links and that will provide you with SEO value? So we’re essentially saying I’m going to do X. I know that’s going to bring a nofollowed link, but that nofollowed link will result in this other thing happening that will then lead to a followed link.

Do X → Get rel=nofollow link → Results in Y → Leads to followed link

5 examples/tactics to start

This other thing can be a bunch of different things. It could be something indirect. You post something with your site on one of these places, and it includes a nofollow link. Someone finds it (we’ll call this guy over here our friendly editor, who works for a publication) and says, “Hmm, that link was actually quite useful,” or the information it pointed to was useful, the article was useful, your new company seems useful, whatever it is. Later, as that editor is writing, they will link over to your site, and this will be a followed link. Thus you’re getting the SEO value. You’ve indirectly gained SEO value, essentially through amplification of what you were sharing through your link.

Google likes this. They want you to use all of these places to show stuff, and then they’re hoping that if people find it truly valuable, they’ll pick it up, they’ll link to it, and then Google can reward that.

So some examples of places where you might attempt this in the early stages. These are a very small subset of what you could do, and it’s going to be different for every industry and every endeavor.

1. Quora contributions

Quora contributions, especially those where you have relevant, high-value credentials or very unique, specific experiences, will often get picked up by the online press. There are lots of editors, journalists, and publications of all kinds that rely on interesting answers to Quora questions to use in their journalism, and then they’ll cite you as a source, ask you to contribute, ask you for a quote, point to your website, all that kind of stuff.

2. Early comments on low-popularity blogs

Early comments especially in, I know this is going to sound odd, but low-popularity blogs, rather than high-popularity ones. Why low popularity? Because you will stand out. You’re less likely to be seen as a spammer, especially if you’re an authentic contributor. You don’t get lost in the noise. You can create intrigue, give value, and that will often lead to that writer or that blogger picking you up with followed links in subsequent posts. If you want more on this tactic, by the way, check out our Whiteboard Friday on comment marketing from last year. That was a deep dive into this topic.

3. Following and engaging with link targets on Twitter

Number three, following and engaging with your link targets on Twitter, especially if your link targets are heavily invested in Twitter, like journalists, B2B bloggers and contributors, and people who write for lots of different publications. It doesn’t have to be a published author; it can just be a writer who contributes to lots of online pieces. Then share your related content with them, or just via your Twitter account. If you’re engaging with them a lot, chances are good you can get a follow back, and that will often lead to followed links with a citation.

4. Link citations from Instagram images

Instagram accounts. When you post images on Instagram, use hashtags (hashtag marketing is one of the only ways to get exposure on Instagram) that you know journalists, writers, editors, and publications of any kind in your field are picking up and need, especially travel, activities, current events, stuff that’s in the news, or conferences and events. Many times folks will pick up those images and ask you for permission to use them. If you’re willing to give it, you can earn link citations. That’s another important reason to associate your URL with your account, so that people can get in touch with you.

5. Amplify content published on your site by republishing on other platforms

If you’re using some of these platforms that are completely nofollowed, or platforms that are open contribution and have followed links but where we suspect Google probably doesn’t count them, Medium being one of the biggest, you can use republishing tactics. Essentially, you write on your own website first, then republish on some of these other places.

I’m going to go to Forbes and publish my column there. I’m going to go to Medium and publish in my Medium account. I’m going to contribute the same piece to Huffington Post. I’m republishing across these multiple platforms. It’s technically duplicate content, but not the kind that’s going to be bothersome for search engines, because these places are all pointing back to your original, so you’re not hurting yourself.

You’re essentially using these the same way you would use your Twitter or Facebook or LinkedIn, where you are pushing it out as a way to say, “Here, check this out if you’re on these platforms, and here’s the original back here.” You can do that with the full article, just like you would do full content in RSS or full content for email subscribers. Then use those platforms for sharing and amplification to get into the hands of people who might link later.


So nofollowed links: not a direct impact, but potentially a very powerful, indirect way to get lots of good links and lots of good SEO value.

All right, everyone, hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

New Findings Show Google Organic Clicks Shifting to Paid

Posted by Brian_W

On the Wayfair SEO team, we keep track of our non-branded click curves: the average click-through rate (CTR) for each ranking position. This helps us accurately evaluate the potential opportunity of keyword clusters.

Over the last two years, the total share of organic clicks on page one of our e-commerce SERPs has dropped 25% on desktop and 55% on mobile.

For the ad-heavy non-local SERPs that we work in, paid ads are likely now earning nearly the same percentage of clicks as organic results — a staggering change from most of the history of Google.

Organic CTR loses 25% of click share on desktop, 55% on mobile

Looking at 2015 vs 2017 data for all keywords ranking organically on the first page, we’ve seen a dramatic change in CTR. Below we’ve normalized our actual CTR on a 1–10 scale, representing a total drop of 25% of click share on desktop and 55% on mobile.

Organic receives 25% less desktop CTR and 55% less mobile CTR compared to two years ago.
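To make that normalization concrete, here is a minimal sketch in Python. The click curves below are illustrative assumptions, not Wayfair's actual data; the point is only to show how a raw click curve gets scaled to a 1–10 index and how a total click-share drop is computed:

```python
# Sketch of normalizing a click curve and measuring page-one click-share loss.
# The CTR values are made up for illustration; they are not Wayfair's data.

def normalize_click_curve(ctrs):
    """Scale raw CTRs for positions 1-10 so the top position scores 10."""
    top = max(ctrs)
    return [round(ctr / top * 10, 2) for ctr in ctrs]

def click_share_drop(old_ctrs, new_ctrs):
    """Percent drop in total page-one click share between two periods."""
    old_total, new_total = sum(old_ctrs), sum(new_ctrs)
    return round((old_total - new_total) / old_total * 100, 1)

# Hypothetical desktop click curves (positions 1-10), 2015 vs. 2017:
desktop_2015 = [0.28, 0.14, 0.09, 0.06, 0.05, 0.04, 0.03, 0.03, 0.02, 0.02]
desktop_2017 = [0.20, 0.11, 0.07, 0.05, 0.04, 0.03, 0.02, 0.02, 0.02, 0.02]

print(normalize_click_curve(desktop_2015))  # position 1 becomes 10.0
print(click_share_drop(desktop_2015, desktop_2017))
```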

The much larger drop on mobile is particularly relevant because we’ve seen large traffic shifts to mobile over the last two years as well. The overall percentage drop plays out somewhat similarly across the first page of results; however, the top four were most heavily impacted.

The first four organic results were most heavily impacted by the CTR shift from organic to paid.

About the data

It’s important to note that this type of CTR change is not true for every SERP. This data is only applicable to e-commerce intent search queries, where ads and PLAs are on nearly every query.

We gather the impression, click, and rank data from Search Console. While Search Console data isn’t quantitatively correct, it does appear to be directionally correct for us: if we see clicks double in Search Console, we also see organic Google traffic double in our analytics; site improvements that lead to meaningful CTR gains appear to be reflected in Search Console; we can roughly verify impressions via ad data; and we can confirm the accuracy of rank. For the purposes of this data pull, we excluded any keywords that Search Console reported as having a non-integer rank (such as ranking 1.2). We have thousands of page-one keywords, including many large head terms comprising millions of combined clicks, which gives us a lot of data for each ranking position.
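A sketch of that filtering step, applied to a Search Console export, might look like the following. The column names ("query", "position") are assumptions about the export format, not a documented schema:

```python
# Sketch of the data-pull filtering described above: keep page-one keywords
# with a whole-number average rank, and drop branded queries. Column names
# ("query", "position") are assumptions about the export format.
import csv

def filter_keywords(rows, branded_terms):
    kept = []
    for row in rows:
        position = float(row["position"])
        if position != int(position):   # exclude blended ranks like 1.2
            continue
        if position > 10:               # page one only
            continue
        if any(term in row["query"].lower() for term in branded_terms):
            continue                    # branded queries skew click curves
        kept.append(row)
    return kept

# e.g.: rows = list(csv.DictReader(open("search_console_export.csv")))
```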

We remove all branded queries from the data, since branded traffic hugely skews click curves.

It’s important to note that paid ads are not capturing all of the clicks that organic has lost. In addition to the small number of people who click beyond the first page, a surprising number do not click at all. Our best guess is that all ads combined now get about the same percentage of clicks (for our results) as all organic results combined.

Why is this happening?

It’s no secret to SEOs who work on transactional keywords why we no longer gain as large a share of clicks for our best rankings. We suspect the primary causes are the following:

  • Ads serving on more queries
  • More ads per query
  • Larger ads, with more space given to each ad
  • Google Shopping results (which show up on more queries, list more products per query, and take up more space)
  • Subtler ad labeling, making it less obvious that an ad is an ad

At Wayfair, we’ve seen Google Shopping results appear on more and more search queries over the last year. Using Stat Search Analytics, we can track the growth in queries serving Google Shopping results (modified by search volume to give a qualitative visibility score) across the 25,000 keywords we track daily on mobile and desktop. The overall share of voice of Google Shopping has grown nearly 60% in the last year.

Number of transactional queries serving Google Shopping has grown nearly 60% in the last year.

On top of this, we’re often seeing four PPC ads for a typical non-branded commercial term, in addition to the Google Shopping results.

And with the expanded size of ads on mobile, almost none of our queries show anything other than ads without scrolling:

This great image from Edwords shows the steady growth in percent of the desktop page consumed by ads for a query that has only three ad results. We go from seeing five organic results above the scroll, to just one. In more recent years we’ve seen this size growth explode on mobile as well.

At the same time that ads have grown, the labeling of ads has become increasingly subtle. In a 2015 study, Ofcom found that half of adults didn’t recognize ads in Google, and about 70% of teenagers didn’t recognize Google ads — and ad labeling has become substantially less obvious since then. For most of its history, Google labeled ads with a large colored block that was intuitively separate from the non-ad results, though it was sometimes not visible on monitors with a higher brightness setting.

2000 – Shaded background around all ads:

2010 – Shaded background still exists around ads:

2014 – No background; yellow box label next to each ad (and ads take up a lot more space):

2017 – Yellow box changed to green, the same color as the URL it’s next to (and ads take up even more space):

2017 – Green box changed to a thin green outline the same color as the URL:

What to do about it

The good news is that this is impacting everyone in e-commerce equally, and all those search clicks are still happening — in other words, those users haven’t gone away. The growth in the number of searches each year means that you probably aren’t seeing huge losses in organic traffic; instead, it will show as small losses or anemic growth. The bad news is that it will cost you — as well as your competitors — more money to capture the same overall share of search traffic.

A strong search marketing strategy has always involved organic, paid search, and PLA combined. Sites optimizing for all search channels are already well-positioned to capture search traffic regardless of ad changes to the SERPs: if SEO growth slows, then PLA and paid search growth speeds up. As real estate for one channel shrinks, real estate for others grows.

If you haven’t been strongly invested in search ads or PLAs, then the Chinese proverb on the best time to plant a tree applies perfectly:

The best time to plant a tree was 20 years ago. The second best time is now.

With a similar percentage of clicks going to paid and organic, your investment in each should be similar (unless, of course, you have some catching up to do with one channel).


Special Notes for SABs Amid Decreased Local Search Visibility

Posted by MiriamEllis

One of the most common complaints I hear from service area business owners, like plumbers, locksmiths, and housekeepers, is that Google has always treated them as an afterthought. If you’re in charge of the digital marketing for these business models, it’s vital to understand just how accurate this complaint is so that you can both empathize with SAB brand owners and create a strategy that honors limitations while also identifying opportunities.

In marketing SABs, you’ve got to learn to make the best of a special situation. In this post, I want to address two of the realities these companies are facing right now that call for careful planning: the unique big picture of SAB local listing management, and the rise of Google’s Home Service Ads.

Let’s talk listings, Moz Local, and SABs

I was fascinated by my appliance repairman — an older German ex-pat with a serious demeanor — the first time he looked at my wall heater and pronounced,

“This puppy is no good.”

Our family went on to form a lasting relationship with this expert who has warned me about everything from lint fires in dryers to mis-branded appliances slapped together in dubious factories. I’m an admiring fan of genuinely knowledgeable service people who come to my doorstep, crawl under my house where possums dwell, ascend to my eerie attic despite spiders, and are professionally dedicated to keeping my old house livable. I work on a computer, surrounded by comforts; these folks know what real elbow grease is all about:

It’s because of my regard for these incredibly hard-working SAB owners and staffers that I’ve always taken issue with the fact that the local Internet tends to treat them in an offhand manner. They do some of the toughest jobs, and I’d like their marketing opportunities to be boundless. But the reality is, the road has been rocky and the limits are real.

Google goofed first

When Google invested heavily in developing their mapped version of the local commercial scene, there was reportedly internal disagreement as to whether a service area business is actually a “place” deserving of inclusion in Google’s local index. You couldn’t add service area businesses to the now-defunct MapMaker, but you could create local listings for them (clear as mud, right?). At a 2008 SMX event, faced with the question of how SABs could be accurately represented in the local results, a Google rep really goofed by first suggesting that they all get PO boxes, only to have this specific practice subsequently outlawed by Google’s guidelines.

Confusion and spam flowed in

For the record,

  • Both SABs and brick-and-mortar businesses are currently eligible for Google My Business listings if they serve customers face-to-face.
  • SABs must have some form of legitimate street address, even if it’s a home address, to be included.
  • Only brick-and-mortar businesses are supposed to have visible addresses on their listings, but Google’s shifting messaging and inconsistent guideline enforcement have created confusion.

Google has shown little zeal for suspending listings that violate the hide-address guidelines, with one notable exception recently mentioned to me by Joy Hawkins of Sterling Sky: SABs who click the Google My Business dashboard box stating that they serve clients at the business’ location, in order to get themselves out of no man’s land at the bottom of the Google Home Service ad unit, are being completely removed from the map by Google if caught.

Meanwhile, concern has been engendered by past debate over whether hiding the address of a business lowered its local pack rankings. The 2017 Local Search Ranking Factors survey is still finding this to be the #18 negative local pack ranking factor, which might be worthy of further discussion.

All of these factors have created an environment in which legitimate SABs have accidentally incorrectly listed themselves on Google and in which spammers have thrived, intentionally creating multiple listings at non-physical addresses and frequently getting away with it to the detriment of search results uniformity and quality. In this unsatisfactory environment, the advent of Google’s Home Service Ads program may have been inevitable, and we’ll take a look at that in a minute.

Limits made clear in listing options for SABs

Whether the risk of suspension or impact on rankings is great or small, hiding your address on SAB Google My Business listings is the only Google-approved practice. If you want to play it totally safe, you’ll play by the rules, but this doesn’t automatically overcome every challenge.

Google is one of the few high-level local business indexes requiring hidden SAB addresses. And it’s this stance that creates some problems for SABs taking advantage of the efficiencies provided by automated location data management tools like Moz Local. There are three main things that have confused our own customers:

  1. Because our SAB customers are required by Google to hide their address, Moz Local can’t then verify the address because… well, it’s hidden. This means that customers need to have a Facebook listing with a visible address on it to get started using Moz Local. Facebook doesn’t require SAB addresses to be hidden.
  2. Once the customer gets started, their ultimate consistency score will generally be lower than what a brick-and-mortar business achieves, again because their hidden GMB listing address can’t be matched to all of the other complete listings Moz Local builds for them. It reads like an inconsistency, and while this in no way impacts their real-world performance, it’s a little sad not to be able to aim for a nifty 100% dashboard metric within Moz Local. It’s important to mention here that a 100% score isn’t achievable for multi-location business models, either, given that Facebook’s guidelines require adding a modifier to the business name of each branch, rendering it inconsistent. This is in contrast to Google’s policy, which defines the needless addition of keywords or geo-modifiers to the business name as spam! When Google and Facebook fundamentally disagree on a guideline, a small measure of inconsistency is part and parcel of the scenario, and not something worth worrying about.
  3. Finally, for SABs who don’t want their address published anywhere on the Internet, automated citation management simply may not be a good match. Some partners in our network won’t accept address-less distribution from us, viewing it as incomplete data. If an SAB isn’t looking for complete NAP distribution because they want their address to be kept private, automation just isn’t ideal.

So how can SABs use something like Moz Local?

The Moz Local team sides with SABs — we’re not totally satisfied with the above state of affairs and are actively exploring better support options for the future. Given our admiration for these especially hard-working businesses, we feel SABs really deserve to have needless burdens lifted from their shoulders, which is exactly what Moz Local is designed to do. The task of manual local business listing publication and ongoing monitoring is a hefty one — too hefty in so many cases. Automation does the heavy lifting for you. We’re examining better solutions, but right now, what options for automation are open to the SAB?

Option #1: If your business is okay with your address being visible in multiple places, then simply be sure your Facebook listing shows your address and you can sign up for Moz Local today, no problem! We’ll push your complete NAP to the major aggregators and other partners, but know that your Moz Local dashboard consistency score won’t be 100%. This is because we won’t be able to “see” your Google My Business listing with its hidden address, and because choosing service-related categories will also hide your address on Citysearch, Localeze, and sometimes, Bing. Also note that one of our partners, Factual, doesn’t support locksmiths, bail bondsmen or towing companies. So, in using an automated solution like Moz Local, be prepared for a lower score in the dashboard, because it’s “baked into” the scenario in which some platforms show your full street address while others hide it. And, of course, be aware that many of your direct local competitors are in the same boat, facing the same limitations, thus leveling the playing field.

Option #2: If your business can budget for it, consider transitioning from an SAB to a brick-and-mortar business model, and get a real-world office that’s staffed during stated business hours. As Mike Blumenthal and Mary Bowling discuss in this excellent video chat, smaller SABs need to be sure they can still make a profit after renting an office space, and that may largely be based on rental costs in their part of the country. Very successful virtual brands are exploring traditional retail options and traditional brick-and-mortar business models are setting up virtual showrooms; change is afoot. Having some customers come to the physical location of a typical SAB may require some re-thinking of service. A locksmith could grind keys on-site, a landscaper could virtually showcase projects in the comfort of their office, but what could a plumber do? Any ideas? If you can come up with a viable answer, and can still see profits factoring in the cost of office space, transitioning to brick-and-mortar effectively removes any barriers to how you represent yourself on Google and how fully you can use software like Moz Local.

If neither option works for you, and you need to remain an SAB with a hidden address, you’ll either need to a) build citations manually on sites that support your requirements, like these ones listed out by Phil Rozek, while having a plan for regularly monitoring your listings for emerging inconsistencies, duplicates and incoming reviews or b) hire a company to do the manual development and monitoring for you on the platforms that support hiding your address.

I wish the digital marketing sky could be the limit for SABs, but we’ve got to do the most we can working within parameters defined by Google and other location data platforms.

Now comes HSA: Google’s next SAB move

As a service area business owner or marketer, you can’t be faulted for feeling that Google hasn’t handled your commercial scenario terribly well over the years. As we’ve discussed, Google has wobbled on policy and enforcement. Not yet mentioned is that they’ve never offered an adequate solution to the reality that a plumber located in City A equally services Cities B, C, and D, but is almost never allowed to rank in the local packs for these service cities. Google’s historic bias toward physical location doesn’t meet the reality of business models that go to clients to serve. And it’s this apparent lack of interest in SAB needs that may be adding a bit of sting to Google’s latest move: the Home Service Ads (HSA) program.

You’re not alone if you don’t feel totally comfortable with Google becoming a lead gen agent between customers and, to date:

  • Plumbers
  • House cleaners
  • Locksmiths
  • Handymen
  • Contractors
  • Electricians
  • Painters
  • Garage door services
  • HVAC companies
  • Roadside assistance services
  • Auto glass services

in a rapidly increasing number of cities.

Suddenly, SABs have moved to the core of Google’s consciousness, and an unprecedented challenge for these business models is that, while you can choose whether or not to opt into the program, there’s no way to opt out of the impacts it is having on all affected local results.

An upheaval in SAB visibility

If HSA has come to your geo-industry, and you don’t buy into the program, you will find yourself relegated to the bottom of the new HSA ad unit which appears above the traditional 3-pack in the SERPs:


Additionally, even if you were #1 in the 3-pack prior to HSA coming to town, if you lack a visible address, your claimed listing appears to have vanished from the pack and finder views.


I must tip my hat again to Joy Hawkins for helping me understand why that last example hasn’t vanished from the packs — it’s unclaimed. Honestly, this blip tempts me to unclaim an SAB listing and “manage” it via community edits instead of the GMB dashboard to see if I could maintain its local finder visibility… but this might be an overreaction!

If you’re marketing an SAB, have been relegated to the bottom of the HSA ad unit, and have vanished from the local pack/finder view, please share with our community how this has impacted your traffic and conversions. My guess would be that things are not so good.

So, what can SABs do in this new landscape?

I don’t have all of the answers to this question, but I do have these suggestions:

  1. Obviously, if you can budget for it, opt into HSA.
  2. But, bizarrely, understand that in some ways, Google has just made your GMB listing less important. If you have to hide your address and won’t be shown in HSA-impacted local packs and finder views because of this guideline compliance, your GMB listing is likely to become a less important source of visibility for your business.
  3. Be sure, then, that all of your other local business listings are in apple-pie order. If you’re okay with your address being published, you can automate this necessary work with software like Moz Local. If you need to keep your address private, put in the time to manually get listed everywhere you can. A converted lead from CitySearch or Foursquare may even feel like more of a victory than one from Google.
  4. Because diversification has just become a great deal more important, alternatives like those offered by visibility on Facebook are now more appealing than ever. And ramp up your word-of-mouth marketing and review management strategies like never before. If I were marketing an SAB, I’d be taking a serious new look at companies like ZipSprout, which helps establish real-world local relationships via sponsorships, and GetFiveStars, which helps with multiple aspects of managing reviews.
  5. Know that organic visibility is now more of a prize than previously. If you’re not in the packs, you’ve got to show up below them. This means clearly defining local SEO and traditional SEO as inextricably linked, and doing the customary work of keyword research, content development, and link management that have fueled organic SEO from the beginning. I’m personally committing to becoming more intimately familiar with Moz Pro so that I can better integrate into my skill set what software like this can do for local businesses, especially SABs.
  6. Expect change. HSA is still a test, and Google continues to experiment with how it’s displaying its paying customers in relationship to the traditional free packs and organic results. Who knows what’s next? If you’re marketing SABs, an empathetic and realistic approach to both historic and emerging limitations will help you create a strategy designed to ensure brand survival, independent of Google’s developments.

Why is Google doing this?


I need to get some window blinds replaced in my home this fall. When I turned to Google’s (non-HSA) results and began calling local window treatment shops, imagine my annoyance in discovering that fully ½ of the listings in the local finder were for companies not located anywhere near my town. These brands had set up spam listings for a ton of different cities to which they apparently can send a representative, but where they definitely don’t have physical locations. I wasted a great deal of time calling each of them, and only felt better after reporting the listings to Google and seeing them subsequently removed.

I’m sharing this daily-life anecdote because it encapsulates the very best reason for Google rolling out Home Service Ads. Google’s program is meant to ensure that when I use their platform to access service companies, I’m finding vetted, legitimate enterprises with accurate location data and money-back satisfaction guarantees, instead of finding the mess of spam listings Google’s shifting policies and inadequate moderation have created. The HSA ad units can improve results quality while also protecting consumers from spurious providers.

The other evident purpose of HSA is the less civic-minded but no less brilliant one: there’s money to be made and Google’s profit motives are no different than those of any other enterprise. For the same reason that Amazon has gotten into the SAB lead gen business, Google wants a piece of this action. So, okay, no surprise there, and if the Google leads wind up growing the revenue of my wonderful German handyman, more power to them both.

But I hope my plumber, and yours, and your clients in the service markets, will take a step back from the Monopoly board and see this as a moment to reevaluate a game in which Google and Amazon are setting up big red hotels on Boardwalk and Park Place. I do advocate getting qualified for HSA, but I don’t advise a stance of unquestioning loyalty to or dependence on Google, particularly if you haven’t felt especially well-served by their SAB policies over the years. If Google can drive lucrative leads your way, take them, but remember you have one advantage Google, Amazon and other lead generation agencies lack: you are still the one who meets the customer face-to-face.

Opportunity is knocking when a giant of visibility like Google is selling you customers, because those customers, if amazed by your service, have grandmothers, brothers, and co-workers who can be directly referred to your company, completely outside the lead-gen loop. In fact, you might even come up with an incentivization program of your own to be sure that every customer you shake hands with is convinced of your appreciation for every referral they may send your way.

Don’t leave it all up to Google to make your local SAB brand a household word. Strategize for maximum independence via the real-world relationships you build, in the home of every neighbor where the door of welcome is opened in anticipation of the very best service you know how to give.


The SEO Competitive Analysis Checklist

Posted by zeehj

The SEO case for competitive analyses

“We need more links!” “I read that user experience (UX) matters more than everything else in SEO, so we should focus solely on UX split tests.” “We just need more keywords on these pages.”

If you dropped a quarter on the sidewalk, but had no light to look for it, would you walk to the next block with a street light to retrieve it? The obvious answer is no, yet many marketers get tunnel vision when it comes to where their efforts should be focused.

1942 June 3, Florence Morning News, Mutt and Jeff Comic Strip, Page 7, Florence, South Carolina. (NewspaperArchive)

Which is why I’m sharing a checklist with you today that will allow you to compare your website to your search competitors, and identify your site’s strengths, weaknesses, and potential opportunities based on ranking factors we know are important.

If you’re unconvinced that good SEO is really just digital marketing, I’ll let AJ Kohn persuade you otherwise. As any good SEO (or even keyword research newbie) knows, it’s crucial to understand the effort involved in ranking for a specific term before you begin optimizing for it.

It’s easy to get frustrated when stakeholders ask how to rank for a specific term, and solely focus on content to create, or on-page optimizations they can make. Why? Because we’ve known for a while that there are myriad factors that play into search engine rank. Depending on the competitive search landscape, there may not be any amount of “optimizing” that you can do in order to rank for a specific term.

The story that I’ve been able to tell my clients is one of hidden opportunity, but the only way to expose these undiscovered gems is to broaden your SEO perspective beyond search engine results page (SERP) position and best practices. And the place to begin is with a competitive analysis.

Competitive analyses help you evaluate your competition’s strategies to determine their strengths and weaknesses relative to your brand. When it comes to digital marketing and SEO, however, there are so many ranking factors and best practices to consider that it can be hard to know where to begin. Which is why my colleague, Ben Estes, created a competitive analysis checklist (not dissimilar to his wildly popular technical audit checklist) that I’ve souped up for the Moz community.

This checklist is broken out into sections that reflect key elements from our Balanced Digital Scorecard. As previously mentioned, this checklist is to help you identify opportunities (and possibly areas not worth your time and budget). But this competitive analysis is not prescriptive in and of itself. It should be used as its name suggests: to analyze what your competition’s “edge” is.

Methodology

Choosing competitors

Before you begin, you’ll need to identify six brands to compare your website against. These should be your search competitors (who else is ranking for terms that you’re ranking for, or would like to rank for?) in addition to a business competitor (or two). Don’t know who your search competition is? You can use SEMRush and Searchmetrics to identify them, and if you want to be extra thorough you can use this Moz post as a guide.

Sample sets of pages

For each site, you’ll need to select five URLs to serve as your sample set. These are the pages you will review and evaluate against the competitive analysis items. When selecting a sample set, I always include:

  • The brand’s homepage,
  • Two “product” pages (or an equivalent),
  • One to two “browse” pages, and
  • A page that serves as a hub for news/informative content.

Make sure each site’s sample set includes equivalent pages, for a fair comparison.

Scoring

The scoring options for each checklist item range from zero to four, and are determined relative to each competitor’s performance. This means that a score of two serves as the average performance in that category.

For example, if each sample set has one unique H1 tag per page, then each competitor would receive a score of two for H1s appear technically optimized. However, if a site breaks one (or more) of the requirements below, it should receive a score of zero or one:

  1. One or more pages within sample set contains more than one H1 tag on it, and/or
  2. H1 tags are duplicated across a brand’s sample set of pages.
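As a sketch of this scoring logic, here is a minimal Python check for the two H1 failure conditions above (the `score_h1s` function and its input shape are my own illustration, not part of the checklist):

```python
from collections import Counter

def score_h1s(sample_h1s):
    """Score a sample set's H1s from 0 to 2 using the two checks above.

    sample_h1s: one list of H1 texts per sampled page.
    Returns 2 if both checks pass, losing a point per failed check.
    (Scores of "3" and "4" are relative judgments left to the analyst.)
    """
    # Check 1: no page should carry more than one H1 tag.
    multiple_h1s = any(len(h1s) > 1 for h1s in sample_h1s)
    # Check 2: H1 text should not repeat across the brand's sample set.
    first_h1s = [h1s[0].strip().lower() for h1s in sample_h1s if h1s]
    duplicated = any(n > 1 for n in Counter(first_h1s).values())
    return 2 - int(multiple_h1s) - int(duplicated)

# Example: one page has two H1s, and two pages share the same H1 text.
sample = [["Shop Widgets", "Widgets"], ["Shop Widgets"], ["About Us"]]
print(score_h1s(sample))  # 0
```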

Checklist

Platform (technical optimization)

Title tags appear technically optimized. This measurement should be as quantitative as possible, and refer only to technical SEO rather than its written quality. Evaluate the sampled pages based on:

  • Only one title tag per page,
  • The title tag being correctly placed within the head tags of the page, and
  • Few to no extraneous tags within the title (e.g. ideally no inline CSS, and few to no span tags).
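To make those title checks concrete, here is a rough stdlib-only sketch using Python’s `html.parser` (class and variable names are illustrative) that counts title tags, verifies they sit inside the head tags, and flags extraneous tags inside the title:

```python
from html.parser import HTMLParser

class TitleAudit(HTMLParser):
    """Collect the title-tag facts the checklist asks about."""
    def __init__(self):
        super().__init__()
        self.stack = []              # currently open tags
        self.title_count = 0         # should be exactly 1
        self.titles_in_head = 0      # should equal title_count
        self.tags_inside_title = 0   # extraneous tags, e.g. <span>

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.title_count += 1
            if "head" in self.stack:
                self.titles_in_head += 1
        elif "title" in self.stack:
            self.tags_inside_title += 1
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            while self.stack.pop() != tag:
                pass  # pop back to the matching open tag

page = "<html><head><title>Widgets <span>| Shop</span></title></head><body></body></html>"
audit = TitleAudit()
audit.feed(page)
print(audit.title_count, audit.titles_in_head, audit.tags_inside_title)  # 1 1 1
```

This example page would lose points for the `<span>` inside its title, but passes the placement and uniqueness checks.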

H1s appear technically optimized. Like with the title tags, this is another quantitative measure: make sure the H1 tags on your sample pages are sound by technical SEO standards (and not based on writing quality). You should look for:

  • Only one H1 tag per page, and
  • Few to no extraneous tags within the tag (e.g. ideally no inline CSS, and few to no span tags).

Internal linking allows indexation of content. Observe the internal outlinks on your sample pages, apart from the sites’ navigation and footer links. This line item serves to check that the domains are consolidating their crawl budgets by linking to discoverable, indexable content on their websites. Here is an easy-to-use Chrome plugin from fellow Distiller Dom Woodman to see whether the pages are indexable.

To get a score of “2” or more, your sample pages should link to pages that:

  • Produce 200 status codes (for all, or nearly all), and
  • Have no more than ~300 outlinks per page (including the navigation and footer links).
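Counting outlinks is easy to script; here is a minimal sketch with Python’s `html.parser` (all names are my own). Verifying the 200 status codes would additionally require HTTP requests against each href, which is omitted here:

```python
from html.parser import HTMLParser

class OutlinkCounter(HTMLParser):
    """Count <a href> outlinks on a page (navigation and footer included)."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def within_outlink_budget(page_html, limit=300):
    """True if the page stays within the ~300-outlink guideline above."""
    counter = OutlinkCounter()
    counter.feed(page_html)
    return len(counter.hrefs) <= limit

page = '<body><a href="/guide">Guide</a><a href="/shop">Shop</a></body>'
print(within_outlink_budget(page))  # True
```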

Schema markup present. This is an easy check. Using Google’s Structured Data Testing Tool, look to see whether these pages have any schema markup implemented, and if so, whether it is correct. In order to receive a score of “2” here, your sampled pages need:

  • To have schema markup present, and
  • Be error-free.

Quality of schema is definitely important, and can be the difference between a brand receiving a score of “3” or “4.” Elements to keep in mind are: Organization or Website markup on every sample page, customized markup like BlogPosting or Article on editorial content, and Product markup on product pages.
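If you want a quick programmatic pre-check before reaching for the Structured Data Testing Tool, this sketch (names are my own) detects JSON-LD blocks and whether they parse. It only checks presence and JSON validity, not schema.org correctness, so treat it as a first pass:

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Pull out <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self.in_jsonld = True

    def handle_data(self, data):
        if self.in_jsonld:
            self.blocks.append(data)

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

def schema_score(page_html):
    """0 = no schema present, 1 = present but malformed, 2 = present and parseable."""
    extractor = JsonLdExtractor()
    extractor.feed(page_html)
    if not extractor.blocks:
        return 0
    try:
        for block in extractor.blocks:
            json.loads(block)
    except ValueError:
        return 1
    return 2

page = '<head><script type="application/ld+json">{"@type": "Organization"}</script></head>'
print(schema_score(page))  # 2
```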

There is a “home” for newly published content. A hub for new content can be the site’s blog, or a news section. For instance, Distilled’s “home for newly published content” is the Resources section. While this line item may seem like a binary (score of “0” if you don’t have a dedicated section for new content, or score of “2” if you do), there are nuances that can bring each brand’s score up or down. For example:

  • Is the home for new content unclear, or difficult to find? Approach this exercise as though you are a new visitor to the site.
  • Does there appear to be more than one “home” of new content?
  • If there is a content hub, is it apparent that this is for newly published pieces?

We’re not obviously messing up technical SEO. This score draws partly on each brand’s performance on the preceding line items (mainly Title tags appear technically optimized through Schema markup present).

It would be unreasonable to run a full technical audit of each competitor, but take into account your own site’s technical SEO performance if you know there are outstanding technical issues to be addressed. In addition to the previous checklist items, I also like to use these Chrome extensions from Ayima: Page Insights and Redirect Path. These can provide quick checks for common technical SEO errors.

Content

Title tags appear optimized (editorially). Here is where we can add more context to the overall quality of the sample pages’ titles. Even if they are technically optimized, the titles may not be optimized for distinctiveness or written quality. Note that we are not evaluating keyword targeting, but rather a holistic (and broad) evaluation of how each competitor’s site approaches SEO factors. Evaluate each page’s title for uniqueness, distinctiveness from other on-page elements, and how well it represents the content on the page.

H1s appear optimized (editorially). The same rules that apply to titles for editorial quality also apply to H1 tags. Review each sampled page’s H1 for:

  • A unique H1 tag per page (language in H1 tags does not repeat),
  • H1 tags that are discrete from their page’s title, and
  • H1s that represent the content on the page.

Internal linking supports organic content. Here you must look for internal outlinks outside of each site’s header and footer links. This evaluation is not based on the number of unique internal links on each sampled page, but rather on the quality of the pages to which our brands are linking.

While “organic content” is a broad term (and invariably differs by business vertical), here are some guidelines:

  • Look for links to informative pages like tutorials, guides, research, or even think pieces.
    • The blog posts on Moz (including this very one) are good examples of organic content.
  • Internal links should naturally continue the user’s journey, so look for topical progression in each site’s internal links.
  • Links to service pages, products, RSVP, or email subscription forms are not examples of organic content.
  • Make sure the internal links vary. If sampled pages are repeatedly linking to the same resources, this will only benefit those few pages.
    • This doesn’t mean that you should penalize a brand for linking to the same resource two, three, or even four times over. Use your best judgment when observing the sampled pages’ linking strategies.

Appropriate informational content. You can use the found “organic content” from your sample sets (and the samples themselves) to review whether the site is producing appropriate informational content.

What does that mean, exactly?

  • The content produced obviously fits within the site’s business vertical, area of expertise, or cause.
    • Example: Moz’s SEO and Inbound Marketing Blog is an appropriate fit for an SEO company.
  • The content on the site isn’t overly self-promotional, to the point that an average user wouldn’t trust the domain to produce unbiased information.
    • Example: If Distilled produced a list of “Best Digital Marketing Agencies,” it’s highly unlikely that users would find it trustworthy given our inherent bias!

Quality of content. Highly subjective, yes, but remember: you’re comparing brands against each other. Here’s what you need to evaluate here:

  • Are “informative” pages discussing complex topics under 400 words?
  • Do you want to read the content?
  • Largely, do the pages seem well-written and full of valuable information?
    • Conversely, are the sites littered with “listicles,” or full of generic info you can find in millions of other places online?

Quality of images/video. Also highly subjective (but again, compare your site to your competitors, and be brutally honest). Judge each site’s media items based on:

  • Resolution (do the images or videos appear to be high quality? Grainy?),
  • Whether they are unique (do the images or videos appear to be from stock resources?), and
  • Whether the photos or videos are repeated on multiple sample pages.

Audience (engagement and sharing of content)

Number of linking root domains. This factor is exclusively based on the total number of dofollow linking root domains (LRDs) to each domain (not total backlinks).

You can pull this number from Moz’s Open Site Explorer (OSE) or from Ahrefs. Since this measurement is only for the total number of LRDs to each competitor, you don’t need to graph them. However, you will have an opportunity to display the sheer quantity of links by their domain authority in the next checklist item.

Quality of linking root domains. Here is where we get to the quality of each site’s LRDs. Using the same LRD data you exported from either Moz’s OSE or Ahrefs, you can bucket each brand’s LRDs by domain authority and count the total LRDs by DA. Log these into a spreadsheet, and you’ll have a graph that illustrates each brand’s overall LRD quality (and will help you grade each domain).
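The bucketing itself is easy to script. Here is an illustrative sketch (the `bucket_by_da` helper and its tuple input are my own, approximating rows from an OSE or Ahrefs export):

```python
from collections import Counter

def bucket_by_da(lrds, bucket_size=10):
    """Group linking root domains into DA bands (0-9, 10-19, ..., 90-99;
    a DA of 100 falls into the top band).

    lrds: list of (domain, domain_authority) tuples, e.g. rows from an
    OSE or Ahrefs export. Returns LRD totals keyed by band label.
    """
    buckets = Counter()
    for _domain, da in lrds:
        low = min(da // bucket_size * bucket_size, 90)
        buckets[f"{low}-{low + bucket_size - 1}"] += 1
    return buckets

export = [("example.com", 34), ("nytimes.com", 95), ("blogspot.com", 12), ("acme.io", 38)]
print(bucket_by_da(export))  # Counter({'30-39': 2, '90-99': 1, '10-19': 1})
```

Plotting these counts per competitor gives you the LRD-quality graph described above.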

Other people talk about our content. I like to use BuzzSumo for this checklist item. BuzzSumo allows you to see what sites have written about a particular topic or company. You can even refine your search to include or exclude certain terms as necessary.

You’ll need to set a timeframe to collect this information. Set this to the past year to account for seasonality.

Actively promoting content. Using BuzzSumo again, you can alter your search to find how many of each domain’s URLs have been shared on social networks. While this isn’t an explicit ranking factor, strong social media marketing is correlated with good SEO. Keep the timeframe to one year, same as above.

Creating content explicitly for organic acquisition. This line item may seem similar to Appropriate informational content, but its purpose is to examine whether the competitors create pages to target keywords users are searching for.

Plug the same URLs from your found “organic content” into SEMRush, and note whether they are ranking for non-branded keywords. You can grade the competitors on whether (and how many of) the sampled pages are ranking for any non-branded terms, and weight them based on their relative rank positions.

Conversion

You should treat this section as a UX exercise. Visit each competitor’s sampled URLs as though they are your landing page from search. Is it clear what the calls to action are? What is the next logical step in your user journey? Does it feel like you’re getting the right information, in the right order as you click through?

Clear CTAs on site. Of your sample pages, examine what the calls to action (CTAs) are. This is largely UX-based, so use your best judgment when evaluating whether they seem easy to understand. For inspiration, take a look at these examples of CTAs.

Conversions appropriate to several funnel steps. This checklist item asks you to determine whether the funnel steps towards conversion feel like the correct “next step” from the user’s standpoint.

Even if you are not a UX specialist, you can assess each site as though you are a first-time user. Document areas on the pages where you feel frustrated or confused, and where you don’t. User behavior is a ranking signal, so while this is a qualitative measurement, it can help you understand the UX for each site.

CTAs match user intent inferred from content. Here is where you’ll evaluate whether the CTAs match the user intent from the content as well as the CTA language. For instance, if a CTA prompts a user to click “for more information,” and takes them to a subscription page, the visitor will most likely be confused or irritated (and, in reality, will probably leave the site).


This analysis should help you holistically identify areas of opportunity available in your search landscape, without having to guess which “best practice” you should test next. Once you’ve started this competitive analysis, trends among the competition will emerge, and expose niches where your site can improve and potentially outpace your competition.

Kick off your own SEO competitive analysis and comment below on how it goes! If this process is your jam, or you’d like to argue with it, come see me speak about these competitive analyses and the campaigns they’ve inspired at SearchLove London. Bonus? If you use that link, you’ll get £50 off your tickets.
