How to Convince Your Boss to Send You to MozCon 2019

Posted by cheryldraper

From networking with your peers to hearing from industry leaders, there are benefits a-plenty to attending conferences. You know that. Your peers know that. But how do you persuade the powers-that-be (aka your boss) that sending you is beneficial for your business? 

To help convince your boss that you won’t just be lounging poolside, sipping cocktails on the company dime, we’ve gathered the goods to help you get your boss to greenlight your MozCon attendance.

How to make the case

Business competition is fiercer than ever. What used to make a splash now feels like it’s barely making ripples. Only those who are able to shift tactics with the changing tides of marketing will be able to come out on top.

And that’s exactly what MozCon is going to help you do.

Covering everything a growing marketer needs for a well-balanced marketing diet (SEO, content, strategy, growth), MozCon delivers top-notch talks from hand-selected speakers over three insightful days in July.

There’s so much in store for you this year. Here’s just a sampling of what you can expect at this year’s MozCon:

Speakers and content

Our speakers are real practitioners and industry leaders. We work with them to ensure they deliver the best content and insights to the stage to set you up for a year of success. No sales pitches or talking heads here!

Networking

You work hard taking notes, learning new insights, and digesting all of that knowledge — that’s why we think you deserve a little fun in the evenings. It’s your chance to decompress with fellow attendees and make new friends in the industry. We host exciting evening networking events that add to the value you’ll get from your day of education. Plus, our Birds of a Feather lunch tables allow you to connect with like-minded peers who share similar interests.

High-quality videos to share with your team

About a month or so after the conference, we’ll send you a link to professionally edited videos of every presentation at the conference. Your colleagues won’t get to partake in the morning Top Pot doughnuts or Starbucks coffee (the #FOMO is real), but they will get a chance to learn everything you did, for free.

An ongoing supportive group

Our MozCon Facebook group is incredibly active, and it’s grown to have a life of its own — marketers ask one another SEO questions, post jobs, look for and offer advice and empathy, and more. It’s a great place to find TAGFEE support and camaraderie long after the conference itself has ended.

Great food on site 

We know that conference food isn’t typically worth mentioning, but MozCon is notorious for its snacking. You can expect two hot meals a day and loads of snacks from local Seattle vendors — in the past we’ve featured a smorgasbord from the likes of Trophy Cupcakes, KuKuRuZa popcorn, and Starbucks’ Seattle Reserve cold brew.

Swag

No duds here: we do our homework when it comes to selecting swag worthy of keeping. One-of-a-kind Roger Mozbots, a super-soft t-shirt, and more cool stuff you’ll want to take home and show off.

Wear your heart on your sleeve

Each year, MozCon and our attendees give back by donating Moz dollars to a charitable organization.

Discounts for subscribers and groups 

Moz Pro subscribers get a whopping $500 off their ticket cost, and there are discounts for groups as well, so make sure to take advantage of savings where you can!

Ticket cost

At MozCon our goal is to break even, which means we invest all of your ticket price back into you. Check out the full breakdown of what your MozCon ticket gets you:

But of course, don’t take our word for it! There are some incredible resources available at your fingertips that tout the benefits of attending conferences:

I’m convinced, now grab my ticket!

Need a little more to get your boss on board? Check out some videos from years past to get a taste for the caliber of our speakers. We’ve also got a call for community speaker pitches (closes at 5 pm PDT on April 15, 2019) so if you’ve been thinking about breaking into the speaking circuit, it could be an amazing opportunity.

Buy a ticket, save money, get competitive marketing insights. Everyone wins!

MozCon is one unforgettable experience that lives and grows with you beyond just the three days you spend in Seattle. And there’s no time like the present to pitch MozCon to your boss. If they’re still stuck on the “why”, tell them about our subscriber and group pricing tiers — you’ll save hundreds of dollars when you do. Just think of all the Keurigs you could get for that communal kitchen!

Grab your ticket to MozCon!

How Bad Was Google’s Deindexing Bug?

Posted by Dr-Pete

On Friday, April 5, after many website owners and SEOs reported pages falling out of rankings, Google confirmed a bug that was causing pages to be deindexed.

MozCast showed a multi-day increase in temperatures, including a 105° spike on April 6. While deindexing would naturally cause ranking flux, as pages temporarily fell out of rankings and then reappeared, SERP-monitoring tools aren’t designed to separate the different causes of flux.

Can we isolate deindexing flux?

Google’s own tools can help us check whether a page is indexed, but doing this at scale is difficult, and once an event has passed, we no longer have good access to historical data. What if we could isolate a set of URLs, though, that we could reasonably expect to be stable over time? Could we use that set to detect unusual patterns?

Across the month of February, the MozCast 10K daily tracking set had 149,043 unique URLs ranking on page one. I reduced that to a subset of URLs with the following properties:

  1. They appeared on page one every day in February (28 total times)
  2. The query did not have sitelinks (i.e. no clear dominant intent)
  3. The URL ranked at position #5 or better

Since MozCast only tracks page one, I wanted to reduce noise from a URL “falling off” from, say, position #9 to #11. Using these qualifiers, I was left with a set of 23,237 “stable” URLs. So, how did those URLs perform over time?
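To make the selection criteria concrete, here’s a minimal Python sketch of that filtering logic. The data shape and field names are hypothetical (the MozCast pipeline isn’t public), and it assumes the position-five cutoff applies on every day counted:

```python
from collections import defaultdict

# Hypothetical rows from a page-one rank tracker; field names are illustrative.
# rows = [{"date": "2019-02-01", "query": "...", "url": "...",
#          "rank": 3, "has_sitelinks": False}, ...]

FEB_DAYS = 28  # February 2019 had 28 days

def stable_urls(rows):
    """URLs ranking #5 or better on a sitelink-free query every day in February."""
    days_seen = defaultdict(set)
    for r in rows:
        if r["date"].startswith("2019-02") and r["rank"] <= 5 and not r["has_sitelinks"]:
            days_seen[r["url"]].add(r["date"])
    return {url for url, days in days_seen.items() if len(days) == FEB_DAYS}

def daily_share(rows, stable, date):
    """Percent of the stable set that appears anywhere in tracked SERPs on a date."""
    present = {r["url"] for r in rows if r["date"] == date}
    return 100.0 * len(stable & present) / len(stable)
```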

Here’s the historical data from February 28, 2019 through April 10. This graph is the percentage of the 23,237 stable URLs that appeared in MozCast SERPs:

Since all of the URLs in the set were stable throughout February, we expect 100% of them to appear on February 28 (which the graph bears out). The change over time isn’t dramatic, but what we see is a steady drop-off of URLs (a natural occurrence of changing SERPs over time), with a distinct drop on Friday, April 5th, a recovery, and then a similar drop on Sunday, April 7th.

Could you zoom in for us old folks?

Having just switched to multifocal contacts, I feel your pain. Let’s zoom that Y-axis a bit (I wanted to show you the unvarnished truth first) and add a trendline. Here’s that zoomed-in graph:



The trendline is in purple. The departure from trend on April 5th and 7th is pretty easy to see in the zoomed-in version. The day-over-day drop on April 5th was 4.0%, followed by a recovery, and then a second, very similar 4.4% drop.

Note that this metric moved very little during March’s algorithm flux, including the March “core” update. We can’t prove definitively that the stable URL drop cleanly represents deindexing, but it appears to not be impacted much by typical Google algorithm updates.

What about dominant intent?

I purposely removed queries with expanded sitelinks from the analysis, since those are highly correlated with dominant intent. I hypothesized that dominant intent might mask some of the effects, as Google is highly invested in surfacing specific sites for those queries. Here’s the same analysis just for the queries with expanded sitelinks (this yielded a smaller set of 5,064 stable URLs):

Other than minor variations, the pattern for dominant-intent URLs appears to be very similar to the previous analysis. It appears that the impact of deindexing was widespread.

Was it random or systematic?

It’s difficult to determine whether this bug was random, affecting all sites somewhat equally, or was systematic in some way. It’s possible that restricting our analysis to “stable” URLs is skewing the results. On the other hand, trying to measure the instability of inherently-unstable URLs is a bit nonsensical. I should also note that the MozCast data set is skewed toward so-called “head” terms. It doesn’t contain many queries in the very-long tail, including natural-language questions.

One question we can answer is whether large sites were impacted by the bug. The graph below isolates our “Big 3” in MozCast: Wikipedia, Amazon, and Facebook. This reduced us to 2,454 stable URLs. Unfortunately, the deeper we dive, the smaller the data-set gets:

At the same 90–100% zoomed-in scale, you can see that the impact was smaller than across all stable URLs, but there’s still a clear pair of April 5th and April 7th dips. It doesn’t appear that these mega-sites were immune.

Looking at the day-over-day data from April 4th to 5th, it appears that the losses were widely distributed across many domains. Of domains that had 10-or-more stable URLs on April 4th, roughly half saw some loss of ranking URLs. The only domains that experienced 100% day-over-day loss were those that had 3-or-fewer stable URLs in our data set. It does not appear from our data that deindexing systematically targeted specific sites.
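As a sketch of that day-over-day domain check (again with hypothetical data structures, not the actual MozCast pipeline), the tally looks something like this:

```python
from collections import Counter
from urllib.parse import urlparse

def domain_counts(urls):
    """Count stable URLs per domain (naive netloc parse; no public-suffix handling)."""
    return Counter(urlparse(u).netloc for u in urls)

def day_over_day_loss(stable_apr4, stable_apr5, min_urls=10):
    """For domains with >= min_urls stable URLs on April 4th, return the
    fraction of those URLs that no longer appeared on April 5th."""
    before, after = domain_counts(stable_apr4), domain_counts(stable_apr5)
    return {dom: 1 - after.get(dom, 0) / n
            for dom, n in before.items() if n >= min_urls}
```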

Is this over, and what’s next?

As one of my favorite movie quotes says: “There are no happy endings because nothing ever ends.” For now, indexing rates appear to have returned to normal, and I suspect that the worst is over, but I can’t predict the future. If you suspect your URLs have been deindexed, it’s worth manually requesting reindexing in Google Search Console. Note that this is a fairly tedious process, and there are daily limits in place, so focus on critical pages.

The impact of the deindexing bug does appear to be measurable, although we can argue about how “big” 4% is. For something as consequential as sites falling out of Google rankings, 4% is quite a bit, but the long-term impact for most sites should be minimal. For now, there’s not much we can do to adapt — Google is telling us that this was a true bug and not a deliberate change.

Links as a Google Ranking Factor: A 2019 Study

Posted by EricEnge

Do Links Still Matter?

For the fourth year running, Stone Temple (now a part of Perficient Digital) conducted a study on how much links matter as a ranking factor. We did that using Moz’s Link Explorer, and in this year’s study we looked at the largest data set yet — 27,000 queries.

Our study used quadratic mean calculations on the Spearman correlations across all 27K tested queries. Not sure what that means? You can learn more about the study methodology here.
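For the curious, here’s a minimal Python sketch of that calculation: one Spearman correlation per query between ranking position and a link metric, then the quadratic mean (root mean square) across queries. The data shapes are assumptions for illustration; see the methodology link for the study’s actual procedure.

```python
import numpy as np
from scipy.stats import spearmanr

def quadratic_mean_correlation(per_query_data):
    """per_query_data: list of (positions, link_metric) pairs, one per query,
    e.g. positions [1..10] and the total links (or DA/PA) of each result."""
    correlations = []
    for positions, metric in per_query_data:
        rho, _pvalue = spearmanr(positions, metric)
        if not np.isnan(rho):  # skip queries where the metric never varies
            correlations.append(rho)
    # Quadratic mean (RMS); note that it aggregates magnitude, not sign
    return float(np.sqrt(np.mean(np.square(correlations))))
```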

The major study components included:

  • Total number of links to the ranking pages
  • Moz DA of the links to the ranking pages
  • Moz PA of the links to the ranking pages

We also sliced these calculations into several sub-categories:

  • Informational vs. commercial queries
  • Medical vs. Financial vs. Technology vs. All Other queries

We were also able to evaluate just how much the Moz link index had grown for a subset of the queries, because we have used the same 16K of the 27K queries for three years running (this year’s study looked at 9K more queries, but those 16K were in common). In fact, let’s start with that data:

That’s pretty significant growth! Congrats to Moz on that improvement.

Brief commentary on correlations

Correlation studies attempt to measure whether or not two factors are related to one another in any way. We use correlation studies to help us understand whether one factor potentially causes the other. It’s important to understand that correlation does not prove causation; it simply suggests that a relationship may exist.

The example I like to share is that there is a strong correlation between the consumption of ice cream and drowning. That does not mean that one causes the other. In fact, the causal factor here is intuitively obvious — hot weather. People eat more ice cream and people do more swimming when it’s hot outside.

But, in the case of links, we also have the fact that Google tells us that links still matter. If that’s not enough for you, Google still penalizes sites for questionable link-building practices. This is not an area they would invest in unless links matter.

So how do correlation scores work? 

A correlation score scale runs from -1 to 1. A score of 1 means a perfect correlation between two items. So if we have two variables (x and y), whenever x increases in value, so does y. A score of -1 means the exact opposite: whenever x increases in value, y decreases in value. A score of 0 means there is no perceivable relationship whatsoever. When x increases in value, y is equally likely to increase or decrease in value.
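You can see all three cases in a couple of lines of Python (a toy illustration, not study data):

```python
import numpy as np
from scipy.stats import spearmanr

x = np.arange(100)
rng = np.random.default_rng(seed=0)

print(spearmanr(x, x).correlation)   #  1.0: y always rises with x
print(spearmanr(x, -x).correlation)  # -1.0: y always falls as x rises
print(round(spearmanr(x, rng.random(100)).correlation, 2))  # ~0.0: no relationship
```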

Search is a complex environment to evaluate. Google claims to use over 200 ranking factors, so it’s quite unlikely that any one factor will be dominant. Very high scores aren’t likely to happen at all, and correlation scores of 0.2 or higher already start to suggest (but not prove) the existence of a relationship.

Core study results

Time to dive in! First, let’s take a look at the global view across all 27K queries:

This correlation comes in at a solid 0.293. Considering the complexity of the Google algorithm’s 200+ ranking factors, having one single factor show a correlation that high indicates a strong relationship.

Next, let’s take a look at the correlation to Moz DA and Moz PA:

Both DA and PA show strong correlations; in fact, more so than the total number of links to the ranking page.

This is interesting because it does suggest that at some level, the authority of the linking site and the linking page both matter. By the way, in the four years that we’ve conducted this study, this is the first time that the DA and PA scores have been a stronger indicator of ranking potential than the pure link count.

More broadly, from a link-building perspective, this supports the notion that you should focus your strategy on earning links from more authoritative sites and pages.

Finally, let’s take a look at how commercial and informational queries differ:

Now that’s interesting — informational queries show a materially higher level of correlation than commercial ones.

From an interpretative perspective, that does not necessarily mean that they matter less. It may just mean that commercial pages get fewer links, so Google has to depend more heavily on other signals. But should those commercial pages happen to draw links for some reason, the impact of the links may still be as high.

Summary

The data still shows a strong correlation between links and rankings. Google’s public statements and its actions (in implementing penalties) also tell the same story. In short, links still matter. But we also see a clear indication that the nature and the quality of those links matter too!

Want more information? You can see the full Stone Temple link study here.

Tell us what you think — do links matter as a ranking factor?

How to Find Your True Local Competitors

Posted by MiriamEllis

Who are your clients’ true competitors?

It’s a question that’s become harder to answer. What felt like a fairly simple triangulation between Google, brand, and searcher in the early days of the local web has multiplied into a geodesic dome of localization, personalization, intent matching, and other facets.

This evolution from a simple shape to a more complex shape has the local SEO industry starting to understand the need to talk about trends and patterns vs. empirical rankings.

For instance, you might notice that you just can’t deliver client reports that say, “Congratulations, you’re #1” anymore. And that’s because the new reality is that there is no #1 for all searchers. A user on the north side of town may see a completely different local pack of results if they go south, or if they modify their search language. An SEO may get a whole different SERP if they search on one rank checking tool vs. another — or even on the same tool, just five minutes later.

Despite all this, you still need to analyze and report — it remains a core task to audit a client’s competitive landscape.

Today, let’s talk about how we can distill this dynamic, complex environment down to the simplest shapes to understand who your client’s true competitors are. I’ll be sharing a spreadsheet to help you and your clients see the trends and patterns that can create the basis for competitive strategy.

Why are competitive audits necessary…and challenging?

Before we dive into a demo, let’s sync up on the basic point of auditing local competitors. Essentially, you’re seeking contrast — you stack up two brands side-by-side to discover the metrics that appear to be making one of them dominant in the local or localized organic SERPs.

From there, you can develop a strategy to emulate the successes of the current winner with the goal of meeting and then surpassing them with superior efforts.

But before you start comparing your brand A to their brand B, you’ve got to know who brand B actually is. What obstacles do you face?

1. SERPs are incredibly diversified

    A recent STAT whitepaper that looked at 1.2 million keywords says it all: every SERP is a local SERP. And since local packs and organic results are both subject to the whims of geo-location and geo-modification, incorporating them into your tracking strategy is a must.

    To explain, imagine two searchers are sitting on the same couch. One searches for “Mexican restaurant” and the other searches for “Mexican restaurant near me”. Then, they divvy up searching “Mexican restaurant near me” vs. “Mexican restaurant in San Jose”. And, so on. What they see are local packs that are only about 80 percent similar based on Google recognizing different intents. That’s significant variability.

    The scenario gets even more interesting when one of the searchers gets up and travels across town to a different zip code. At that point, the two people making identical queries can see local packs that range from only about 26–65 percent similar. In other words, quite different.

    Now, let’s say your client wants to rank for seven key phrases — like “Mexican restaurant,” “Mexican restaurant near me,” “Mexican restaurant San Jose,” “best Mexican restaurant,” “cheap Mexican restaurant,” etc. Your client doesn’t have just three businesses to compete against in the local pack; they now have multiple multiples of three!

    2. Even good rank tracking tools can be inconsistent

    There are many useful local rank tracking tools out there, and one of the most popular comes to us from BrightLocal. I really like the super easy interface of this tool, but there is a consistency issue with this and other tools I’ve tried, which I’ve captured in a screenshot, below.

    Here I’m performing the same search at 5-minute intervals, showing how the reported localized organic ranking of a single business varies widely across time.

    The business above appears to move from position 5 to position 12. This illustrates the difficulty of answering the question of who is actually the top competitor when using a tool. My understanding is that this type of variability may result from the use of proxies. If you know of a local rank checker that doesn’t do this, please let our community know in the comments.

    In the meantime, what I’ve discovered in my own work is that it’s really hard to find a strong and consistent substitute for manually checking which competitors rank where, on the ground. So, let’s try something out together.

    The simplest solution for finding true competitors

    Your client owns a Mexican restaurant and has seven main keyword phrases they want to compete for. Follow these five easy steps:

    Step 1: Give the client a local pack crash course

    If the client doesn’t already know, teach them how to perform a search on Google and recognize what a local pack is. Show them how businesses in the pack rank 1, 2, and 3. If they have more questions about local packs, how they show up in results, and how Google ranks content, they can check out our updated Beginner’s Guide to SEO.

    Step 2: Give the client a spreadsheet and a tiny bit of homework

    Give the client a copy of this free spreadsheet, filled out with their most desired keyword phrases. Have them conduct seven searches from a computer located at their place of business* and then fill out the spreadsheet with the names of the three competitors they see for each of the seven phrases. Tell them not to pay attention to any of the other fields of the spreadsheet.

    *Be sure the client does this task from their business’ physical location, as this is the best way to see what searchers in their area will see in the local results. Why are we doing this? Because Google weights the proximity of the searcher to the business so heavily, we have to pretend we’re a searcher at or near the business to emulate Google’s “thought process”.

    Step 3: Roll up your sleeves for your part of the work

    Now it’s your turn. Look up “directions Google” in Google.

    Enter your client’s business address and the address of their first competitor. Write down the distance in the spreadsheet. Repeat for every entry in each of the seven local packs. This will take you approximately 10–15 minutes to cover all 21 locations, so make sure you’re doing it on company time.
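    If you have latitude/longitude coordinates for each business, you can sanity-check those lookups with a quick straight-line (great-circle) calculation; note this is only a rough stand-in, since Google’s driving distance will usually run longer. A minimal Python sketch with made-up coordinates:

    ```python
    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Straight-line (great-circle) distance in miles between two points."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 3956 * asin(sqrt(a))  # 3956 = Earth's radius in miles

    # Made-up coordinates for a client and one competitor
    print(round(haversine_miles(37.3382, -121.8863, 37.2872, -121.9500), 1))
    ```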

    Step 4: Get measuring

    Now, in the 2nd column of the spreadsheet, note down the greatest distance Google appears to be going to fill out the results for each pack.

    Step 5: Identify competitors by strength

    Finally, rate the competitors by the number of times each one appears across all seven local packs. Your spreadsheet should now look something like this:

    Looking at the example sheet above, we’ve learned that:

    • Mi Casa and El Juan’s are the dominant competitors in your client’s market, ranking in 4/7 packs. Plaza Azul is also a strong competitor, with a place in 3/7 packs.
    • Don Pedro’s and Rubio’s are noteworthy with 2/7 pack appearances.
    • All the others make just one pack appearance, making them basic competitors.
    • The radius to which Google is willing to expand to find relevant businesses varies significantly, depending on the search term. While they’re having to go just a couple of miles to find competitors for “Mexican restaurant”, they’re forced to go more than 15 miles for a long tail term like “organic Mexican restaurant”.

    You now know who the client’s direct competitors are for their most desired searches, and how far Google is willing to go to make up a local pack for each term. You have discovered a pattern of most dominant competition across your client’s top phrases, signaling which players need to be audited to yield clues about which elements are making them so strong.
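    If you’d rather script Step 5’s tally than count by hand, a few lines of Python will do it. The pack data below is invented to mirror the example sheet:

    ```python
    from collections import Counter

    # Invented pack results: each phrase maps to the three businesses seen
    packs = {
        "mexican restaurant": ["Mi Casa", "El Juan's", "Plaza Azul"],
        "mexican restaurant near me": ["Mi Casa", "El Juan's", "Taco Bell"],
        "mexican restaurant san jose": ["El Juan's", "Plaza Azul", "Don Pedro's"],
        "best mexican restaurant": ["Mi Casa", "Don Pedro's", "Rubio's"],
        "cheap mexican restaurant": ["Rubio's", "La Bamba", "El Juan's"],
        "mexican food delivery": ["Mi Casa", "Plaza Azul", "El Toro"],
        "organic mexican restaurant": ["Casa Verde", "El Farolito", "La Calle"],
    }

    appearances = Counter(name for pack in packs.values() for name in pack)
    for name, count in appearances.most_common():
        print(f"{name}: {count}/7 packs")
    ```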

    The pros and cons of the simple search shape

    The old song says that it’s a gift to be simple, but there are some drawbacks to my methodology, namely:

    • You’ll have to depend on the client to help you out for a few minutes, and some clients are not very good at participation, so you’ll need to convince them of the value of their doing the initial searches for you.
    • Manual work is sometimes tedious.
    • Scaling this for a multi-location enterprise would be time-consuming.
    • Some of your clients are going to be located in large cities and will want to know what competitors are showing up for users across town and in different zip codes. Sometimes, it will be possible to compete with these differently-located competitors, but not always. At any rate, our approach doesn’t cover this scenario and you will be stuck with either using tools (with their known inconsistencies), or sending the client across town to search from that locale. This could quickly become a large chore.

    Negatives aside, the positives of this very basic exercise are:

    • Instead of tying yourself to the limited vision of a single local pack and a single set of competitors, you are seeing a trend, a pattern of dominant market-wide competitors.
    • You will have swiftly arrived at a base set of dominant, strong, and noteworthy competitors to audit, with the above-stated goal of figuring out what’s helping them to win so that you can create a client strategy for emulating and surpassing them.
    • Your agency will have created a useful view of your client’s market, understanding the difference between businesses that seem very embedded (like Mi Casa) across multiple packs, vs. those (like Taco Bell) that are only one-offs and could possibly be easier to outpace.
    • You may discover some extremely valuable competitive intel for your client. For example, if Google is having to cast a 15-mile net to find an organic Mexican restaurant, what if your client started offering more organic items on their menu, writing more about this and getting more reviews that mention it? This will give Google a new option, right in town, to consider for local pack inclusion.
    • It’s really quite fast to do for a single-location business.
    • Client buy-in should be a snap for any research they’ve personally helped on, and the spreadsheet should be something they can intuitively and immediately understand.

    My questions for you

    I’d like to close by asking you some questions about your work doing competitive audits for local businesses. I’d be truly interested in your replies as we all work together to navigate the complex shape of Google’s SERPs:

    1. What percentage of your clients “get” that Google’s results have become so dynamic, with different competitors being shown for different queries and different packs being based on searcher location? What percentage of your clients are “there yet” with this concept vs. the old idea of just being #1, period?
    2. I’ve offered you a manual process for getting at trustworthy data on competitors, but as I’ve said, it does take some work. If something could automate this process for you, especially for multi-location clients, would you be interested in hearing more about it?
    3. How often do you do competitive audits for clients? Monthly? Every six months? Annually?

    Thanks for responding, and allow me to wish you and your clients a happy and empowering audit!

    The One-Hour Guide to SEO: Keyword Targeting & On-Page Optimization – Whiteboard Friday

    Posted by randfish

    We’ve covered strategy, keyword research, and how to satisfy searcher intent — now it’s time to tackle optimizing the webpage itself! In the fourth part of the One-Hour Guide to SEO, Rand offers up an on-page SEO checklist to start you off on your way towards perfectly optimized and keyword-targeted pages.

    If you missed them, check out the other episodes in the series so far:

    A picture of the whiteboard. The content is all detailed within the transcript below.

    Video Transcription

    Howdy, Moz fans. Welcome to another edition of our special One-Hour Guide to SEO. We are now on Part IV – Keyword Targeting and On-Page Optimization. So hopefully, you’ve watched Part III, where we talked about searcher satisfaction, how to make sure searchers are happy with the page content that you create and the user experience that you build for them, as well as Part II, where we talked about keyword research and how to make sure that you are targeting the right words and phrases that searchers are actually looking for, that you think you can actually rank for, and that actually get real organic click-through rate, because Google’s zero-click searches are rising.

    A depiction of a site with important on-page SEO elements highlighted, drawn on the whiteboard.

    Now we’re into on-page SEO. So this is essentially taking the words and phrases that we know we want to rank for with the content that we know will help searchers accomplish their task. Now how do we make sure that the page is optimal for ranking in Google?

    On-page SEO has evolved

    Well, this is very different from the way it was years ago. A long time ago, and unfortunately many people still believe this to be true about SEO, it was: How do I stuff my keywords into all the right tags and places on the page? How do I take advantage of things like the meta keywords tag, which hasn’t been used in a decade, maybe two? How do I take advantage of putting all the words and phrases stuffed into my title, my URL, my description, my headline, my H2 through H6 tags, all these kinds of things?

    Most of that does not matter, but some of it still does. Some of it is still important, and we need to run through what those are so that you give yourself the best possible chance for ranking.

    The on-page SEO checklist

    So what I’ve done here is created a sort of brief on-page SEO checklist. It’s not comprehensive, especially on the technical portion, because we’re saving that for Part V of this Guide, the technical SEO section.

    ☑ Descriptive, compelling, keyword-rich title element

    Many of the most important things are on here, and those include things like a descriptive, compelling, keyword-rich but not stuffed title element, also called the page title or a title tag. So, for example, if I am a tool website, like toolsource.com — I made that domain name up, I assume it’s registered to somebody — and I want to rank for the “best online survey tools,” well, “The Best Online Survey Tools for 2019” is a great title tag, and it’s very different from best online survey tools, best online survey software, best online survey software 2019. You’ve seen title tags like that. You’ve seen pages that contain stuff like that. That is no longer good SEO practice.

    So we want that descriptive, compelling, makes me want to click. Remember that this title is also going to show up in the search results as the title of the snippet that your website appears in.

    ☑ Meta description designed to draw the click

    Second, a meta description. This is still used by search engines, not for rankings though. Sort of think of it like ad text. You are drawing a click, or you’re attempting to draw the click. So what you want to do is have a description that tells people what’s on the page and inspires them, incites them, makes them want to click on your result instead of somebody else’s. That’s your chance to say, “Here’s why we’re valuable and useful.”

    ☑ Easy-to-read, sensible, short URL

    An easy-to-read, sensible, short URL. For example, toolsource.com/reviews/best-online-surveys-2019. Perfect, very legible, very readable. I see that in the results, I think, “Okay, I know what that page is going to be.” I see that copied and pasted somewhere on the web, I think, “I know what’s going to be at that URL. That looks relevant to me.”

    Or reviews.best-online-tools.info. Okay, well, first off, that’s a freaking terrible domain name. /oldseqs?ide=17, a bunch of weird letters, and tab detail equals this, and UTM parameter equals that. I don’t know what this is. I don’t know what all this means. By the way, having more than one or two URL parameters is very poorly correlated with and not recommended for trying to rank in search results. So you want to try and rewrite these to be more friendly, shorter, more sensible, and readable by a human being. That will help Google as well.
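    If you want to spot-check these elements programmatically, here’s a small Python sketch using requests and BeautifulSoup. The character-count thresholds in the comments are rough rules of thumb rather than official Google limits, and toolsource.com is the made-up domain from the example above:

    ```python
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse, parse_qs

    def audit_page(url):
        """Print the title, meta description, and URL-parameter count for a page."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        desc_tag = soup.find("meta", attrs={"name": "description"})
        desc = desc_tag.get("content", "").strip() if desc_tag else ""
        params = parse_qs(urlparse(url).query)

        print(f"Title ({len(title)} chars): {title!r}")      # ~50-60 chars is typical
        print(f"Description ({len(desc)} chars): {desc!r}")  # ~150-160 chars is typical
        if len(params) > 2:
            print(f"{len(params)} URL parameters: consider rewriting this URL")

    # audit_page("https://toolsource.com/reviews/best-online-surveys-2019")
    ```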

    ☑ First paragraph optimized for appearing in featured snippets

    That first paragraph, the first paragraph of the content or the first few words of the page should be optimized for appearing in what Google calls featured snippets. Now, featured snippets is when I perform a search, for many queries, I don’t just see a list of pages. Sometimes I’ll see this box, often with an image and a bunch of descriptive text that’s drawn from the page, often from the first paragraph or two. So if you want to get that featured snippet, you have to be able to rank on page one, and you need to be optimized to answer the query right in your first paragraph. But this is an opportunity for you to be ranking in position three or four or five, but still have the featured snippet answer above all the other results. Awesome when you can do this in SEO, very, very powerful thing. Featured snippet optimization, there’s a bunch of resources on Moz’s website that we can point you to there too.

    ☑ Use the keyword target intelligently in…

    ☑ The headline

    So if I’m trying to rank for “best online survey tools,” I would try and use that in my headline. Generally speaking, I like to have the headline and the title of the piece nearly the same or exactly the same so that when someone clicks on that title, they get the same headline on the page and they don’t get this cognitive dissonance between the two.

    ☑ The first paragraph

    The first paragraph, we talked about. 

    ☑ The page content

    The page’s content, you don’t want to have a page that’s talking about best online survey tools and you never mention online surveys. That would be a little weird. 

    ☑ Internal link anchors

    An internal link anchor. So if other places on your website talk about online survey tools, you should be linking to this page. This is helpful for Google finding it, helpful for visitors finding it, and helpful to say this is the page that is about this on our website.

    A whiteboard drawing depicting how to target one page with multiple keywords vs multiple pages targeting single keywords.

    I do strongly recommend taking the following advice, which is we are no longer in a world where it makes sense to target one keyword per page. For example, best online survey tools, best online survey software, and best online survey tools 2019 are technically three unique keyword phrases. They have different search volumes. Slightly different results will show up for each of them. But it is no longer the case, whereas it was maybe a decade ago, that I would go create a page for each one of those separate things.

    Instead, because these all share the same searcher intent, I want to go with one page, just a single URL that targets all the keywords that share the exact same searcher intent. If searchers are looking to find exactly the same thing but with slightly modified or slight variations in how they phrase things, you should have a page that serves all of those keywords with that same searcher intent rather than multiple pages that try to break those up, for a bunch of reasons. One, it’s really hard to get links to all those different pages. Getting links just period is very challenging, and you need them to rank.

    Second off, the difference between those is going to be very, very subtle, and it will be awkward and seem to Google very awkward that you have these slight variations with almost the same thing. It might even look to them like duplicate or very similar or low-quality content, which can get you down-ranked. So stick to one page per set of shared intent keywords.

    ☑ Leverage appropriate rich snippet options

    Next, you want to leverage appropriate rich snippet options. So, for example, if you are in the recipes space, you can use a schema markup for recipes to show Google that you’ve got a picture of the recipe and a cooking time and all these different details. Google offers this in a wide variety of places. When you’re doing reviews, they offer you the star ratings. Schema.org has a full list of these, and Google’s rich snippets markup page offers a bunch more. So we’ll point you to both of those as well.
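    As a quick illustration of what that markup looks like, here’s a minimal schema.org Recipe object built in Python and serialized as JSON-LD. The recipe values are invented, and a real page would embed the output in a script tag of type application/ld+json:

    ```python
    import json

    # Illustrative values only; see schema.org/Recipe for the full property list.
    recipe = {
        "@context": "https://schema.org",
        "@type": "Recipe",
        "name": "Weeknight Enchiladas",
        "image": "https://example.com/photos/enchiladas.jpg",
        "cookTime": "PT30M",  # ISO 8601 duration: 30 minutes
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.7",
            "reviewCount": "212",
        },
    }

    print(json.dumps(recipe, indent=2))
    ```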

    ☑ Images on the page employ…

    Last, but certainly not least, because image search is such a huge portion of where Google’s search traffic comes from and goes to, it is very wise to optimize the images on the page. Image search can now send significant traffic to you, and optimizing for images can sometimes mean that other people will find your images through Google Images and then take them, put them on their own website and link back to you, which solves a huge problem. Getting links is very hard, and images are a great way to do it.

    ☑ Descriptive, keyword-rich filenames

    The images on your page should employ descriptive, keyword-rich filenames, meaning if I have one for Typeform, I don’t want it to be pic1, pic2, or pic3. I want it to be typeform-logo or typeform-survey-software as the name of the file.

    ☑ Descriptive alt attributes

    The alt attribute or alt tag is part of how you describe that for screen readers and other accessibility-focused devices, and Google also uses that text too. 

    ☑ Caption text (if appropriate)

    Caption text, if that’s appropriate, if you have like a photograph and a caption describing it, you want to be descriptive of what’s actually in the picture.

    ☑ Stored in same domain and subdomain

    These files, in order to perform well, they generally need to be hosted on the same domain and subdomain. If, for example, all your images are stored on an Amazon Web Services domain and you don’t bother rewriting or making sure that the domain looks like it’s on toolsource.com/photos or /images here, that can cause real ranking problems. Oftentimes you won’t perform at all in Google images because they don’t associate the image with the same domain. Same subdomain as well is preferable.
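    To check a page’s images against these points in bulk, a sketch like the following works. It assumes you’ve already scraped (src, alt) pairs from the page, and the URLs are placeholders:

    ```python
    from urllib.parse import urlparse

    def audit_images(page_url, images):
        """Flag images with missing alt text or hosted off the page's domain.
        images: list of (src, alt) pairs scraped from the page."""
        page_host = urlparse(page_url).netloc
        for src, alt in images:
            filename = src.rsplit("/", 1)[-1]
            if not alt:
                print(f"{filename}: missing alt text")
            image_host = urlparse(src).netloc
            if image_host and image_host != page_host:
                print(f"{filename}: hosted on {image_host}, not {page_host}")

    audit_images(
        "https://toolsource.com/reviews/best-online-surveys-2019",
        [
            ("https://cdn.example.net/img1.png", ""),
            ("https://toolsource.com/images/typeform-survey-software.png",
             "Typeform survey builder"),
        ],
    )
    ```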

    If you do all these things and you nail searcher intent and you’ve got your keyword research, you are ready to move on to technical SEO and link building and then start ranking. So we’ll see you for that next edition next week. Take care.

    Video transcription by Speechpad.com

    8 Content Distribution Ideas to Meet Your Brand’s Goals

    Posted by AlliBerry3

    There’s a lot to consider when creating a content strategy in 2019. Not only is there more competition than ever online, but there are so many types of content and ways to reach your target audience. Do you start a blog? Do you podcast? Should you focus on research studies or whitepapers? How do you really know what to do?

    But before you do anything else, you need to define what goals you want to accomplish with your content. 

    I’ve written before about the importance of having an audience-focused content strategy, and it’s still relevant. Every single piece of content you create needs to be mapped to a goal; otherwise, it’ll leave your audience wondering why they should care and what to do next, assuming it even reaches your target audience at all.

    But the work doesn’t stop there. Once you have your goals and your brand’s unique angle nailed down, you’ll also need to prioritize your means of content distribution. This is especially important if you’re just starting out — you should zero in on a few key distribution channels and master those before you expand into others, or you risk spreading yourself too thin and sabotaging your chances of success in any of them.

    This post will help you zero in on what distribution channels make the most sense for your goals, and how to create content that will perform well in them.

    Content goal: Brand awareness

    If you’re a new brand or a lesser-known brand in your vertical, it’s crucial to expose your audience to your brand and demonstrate how it can solve their problems. There are many distribution options for brand awareness, and they all involve using external platforms in some way to help you connect to a larger audience of people.

    1. Syndication

    If your brand publishes a large volume of daily content that covers broader, news-worthy topics, content syndication can be an effective way to get your brand in front of a new audience.

    I work for a new affiliate marketing venture called The Ascent by The Motley Fool, and our coverage of broad, personal finance topics makes us a natural fit for content syndication. From Flipboard to Google News, major news outlets are always looking for money and finance-related content. Even though the SEO value is limited for content syndication, as links are typically no-followed, this is still an effective way for us to fulfill our brand awareness goal of reaching a wider, qualified audience. Just be sure any syndication partners will provide a canonical tag back to your site to ensure you don’t end up with duplicate content issues. The Fractl team did an impressive piece about understanding the networks of news syndication if you want to learn more.
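    If you want to verify that canonical tag yourself, a quick Python check with requests and BeautifulSoup does the job (the URLs here are placeholders):

    ```python
    import requests
    from bs4 import BeautifulSoup

    def canonical_points_home(syndicated_url, original_url):
        """Return True if the syndicated copy declares a rel=canonical link
        pointing back at your original article."""
        html = requests.get(syndicated_url, timeout=10).text
        link = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
        return link is not None and link.get("href") == original_url

    # Placeholder URLs for illustration:
    # canonical_points_home("https://news.example.com/syndicated-copy",
    #                       "https://www.example.com/original-article")
    ```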

    Content created for syndication typically has a timely slant to it, as that’s what major news outlets are looking for from syndication partners. Whether it’s a finance topic related to an upcoming holiday (i.e. 7 Personal Finance Lessons Learned in 2018) or something happening in the news (i.e. How to Financially Prepare for the Government Shutdown), it needs to be a gripping headline with information valuable to a reader today. It also needs to be quality content, free of errors, and not miles long.

    Answer the headline entirely, but eliminate the fluff. And don’t forget to include relevant links back to your site, so you can get this larger audience to visit your website.

    Musts for Syndicated Content:

    • A catchy headline
    • A timely topic
    • 1,000 words or less
    • Links in the content back to relevant content on your site

    2. Sponsored content or guest posts

    If your own website doesn’t have a great following, engaging in sponsored content on a more prominent website can be valuable for building brand awareness. The type of sponsored content I’m referring to here is online advertorials: articles that look like normal articles, but are typically tagged as “sponsored content.”

    BuzzFeed is a prominent platform for brands. Here’s an example of one of their finest:

    At the bottom, there’s a pitch for Wendy’s with a link:

    Because visitors can see that this content is “sponsored,” they are naturally more skeptical of it — and rightfully so. To create a quality native advertising piece, you’ll want it to be genuinely helpful and not overly promotional. It’s already clear it’s a promotion for your brand, so the content doesn’t need to reinforce that further.

    The example above clearly does not take itself seriously. It provides a quiz that is on-brand with what a BuzzFeed visitor would expect and want to see. There’s no overt promotional play for Wendy’s in the quiz.

    If you don’t want to pay for a sponsored content spot on another website, you could also look for relevant sites that take guest posts. This post you are currently reading is an example of that: I’m not paying, nor am I getting paid to publish this post with Moz. But, I am getting more brand exposure for my team and myself. And Moz is getting unique content with a fresh perspective.

    It’s a win-win!

    If you do pitch a site for a guest post, make sure it’s compelling and in line with what their audience wants. Keep it helpful and not promotional. You will need to establish trust with this new audience.

    Musts for Sponsored Content or Guest Posts:

    • A budget (for sponsored content)
    • Content is not promotional, but helpful or entertaining
    • A pitch and link to your site at the end of the content

    3. Paid advertising

    One of the big advantages of utilizing paid advertising is that you can see results right away and get your content in front of a qualified audience, whereas organic takes longer to show growth.

    To get your content to perform well in paid search, it’ll need to be more niche and targeted to the keywords you’re bidding on; otherwise, your quality score will suffer. Google, Bing, and Yahoo all have their own form of a quality score that takes into account a number of factors, including your expected CTR, landing page quality and relevance to your ad, and ad text relevance. This might mean you’ll need to develop more landing pages to cover your topics than you would for a page created for organic search. That’s not an issue from an SEO perspective as long as you no-index your landing pages.

    For example, the query “podcast software” gave me a really relevant ad for Buzzsprout.com, not only using my keyword in the ad but also providing relevant extended links below.

    Once on the landing page, it also gives me exactly what I’m looking for. The language varies slightly to “podcast hosting,” but it clearly answers my intent.

    Similarly, both Facebook and Twitter have a ‘relevancy score’ that acts as the quality score. These social platforms are measuring your expected engagement rate with an ad, which indicates how well your content matches the needs and interests of the audience you’re targeting.

    What this means is that, like with paid search, your content needs to be more niche and customized to your audience for higher performance.

    So many different types of content can work for paid advertising. Visual content can be incredibly powerful for paid advertising — whether it’s through video or images. There’s no better way to know how something will perform in paid marketing than through testing, but it’s important your content has these primary components:

    • A catchy, keyword-aligned headline
    • Standout images or video
    • Content that supports your hyper-target audience and keywords

    Goal: Organic acquisition

    Organic traffic is often an appealing distribution method because prospects qualify themselves through their relevant search queries. Not only do you want to have targeted content for key search queries, but it is also important to build domain authority by acquiring relevant, authoritative external links.

    For this, I have included two important tactics to achieve better results organically for your brand.

    4. Blog posts

    Blog posts are among the most common ways to rank well in organic search and acquire featured snippets. My team has almost exclusively been focused on blog articles up until this point, as it’s relatively easy and efficient to produce at scale.

    There are many types of blog posts you can create, both for the discovery phase of a prospect and for the mid-level, narrowing-down phase of the customer journey. Some blog post ideas that tend to perform well include:

    • How-to articles
    • Question and answer articles
    • Comparison articles
    • Best of articles
    • First person stories (ideally from a customer perspective)

    The key to successful blog posts is to have a targeted topic informed by keyword research. The Moz Keyword Explorer or SEMrush Keyword Magic Tool are great places to find topics for your blog posts. I have found, both with The Ascent and in my previous role at Kaplan Professional Education, that blog posts targeting specific long-tail keywords tend to perform better and are more likely to pick up a featured snippet. However, the best way to know for your vertical is to test it yourself.

    In my experience, writing using the inverted pyramid technique works wonders for featured snippets. Answer the query broadly and concisely at the beginning of the article, and then dive into more details further into it. It’s a technique from journalism, so readers are used to it and search engines seem to really take to it.

    Musts for Blog Posts:

    • Have a target keyword/topic
    • Follow the inverted pyramid technique (cover the topic broadly and then narrow)
    • Contain a call-to-action

    5. Original research

    If acquiring external links is one of your SEO goals, conducting original research can be a powerful tactic for achieving success. What makes original research so powerful for link building is that you are the only source of your data. If you publish data that is unique to your organization or conduct your own survey or focus group and report the findings, it provides new data with unique insights to glean from it (assuming your methodology is solid, of course).

    Here is a great example of original research about how frequently brands produce original research (how meta!). It also provides great data on types of original research brands do if you want to learn more. This original data came from a survey of 700 marketers, and it worked. It got linked to by all kinds of prominent industry blogs like Search Engine Journal, Content Marketing Institute, Orbit Media, and now, this one too!

    If you don’t have any data that you can or want to publish from your organization directly and you don’t want to conduct your own surveys, there is also the option of mining official sources in your industry (government or census data work well in many cases) and finding a unique take and interpreting it for your audience to understand. Often, there is rich data buried in technical jargon that people don’t know about, and your original perspective can add a lot of value to your audience.

    For example, my team published this secondary research during the government shutdown in January. All of the government data in this piece is accessible to anyone, but it’s time-consuming to find and difficult to interpret. Our writer’s original take on it surfaced important insights that journalists incorporated in their shutdown coverage.

    Remember: Putting your own research out there won’t necessarily acquire links on its own. Even if you are a well-known resource, your efforts will be better served with outreach to relevant journalists or bloggers. If you’ve got the resources to dedicate to outreach, or the ability to hire an agency to help, this can be an extremely effective strategy that can help to build the authority of your entire site.

    Musts for original research:

    • An original take with supporting data
    • A solid research methodology (explained in the content)
    • An outreach strategy with custom pitches

    Goal: Lead generation

    If generating leads is your goal, your content will need to be compelling enough for a prospect to give you their contact information. They know what’s in store for them by giving you their email or phone number, so they won’t sign themselves up for marketing messaging for just average content.

    6. Whitepapers/E-books

    Although we just talked about original research for link acquisition, original research can also be an amazing way to generate leads if you want to put your research behind a sign-up wall. While the basic principles remain unchanged, find a topic you can create a unique study on, and execute it using a solid methodology. You should focus on the prospective leads you are trying to attain and create a research study or whitepaper that is irresistible to them.

    At Kaplan Financial Education, I developed e-books for each licensing prep product line. Using survey data that I gathered from previous Kaplan students, these e-books were intended to help better prepare future Kaplan students for their journey through licensing and starting their careers. The setup for creating this type of lead gen content was pretty simple: I pulled a list of previous customers and sent them a short survey via Survey Monkey. I asked:

    • What do you wish you had known when you were preparing for the licensing test?
    • What advice do you have for new professionals?

    After gathering over 100 responses, I extracted the data and grouped them into themes, pulling direct quotes for future insurance professionals. This is still successful lead gen content because it’s evergreen — it tells real stories from real people who have gone through the licensing process and started a relevant financial career. Prospective students can better understand what they are getting themselves into.

    At the time, this kind of advice from so many qualified professionals didn’t live anywhere else, making the e-book exclusive content. Qualified prospects were willing to download it for its exclusivity and for the time it saved them in conducting multiple informational interviews.

    Ideally, when you have lead gen content, you’ll want all of your free content to naturally lead into a call-to-action for your whitepaper or e-book. That way, any traffic that you attain through organic or paid advertising will naturally flow into the download. Creating a pitch at the end of your articles is a good habit to get into, as well as linking within your articles as appropriate.

    It’s also a good practice to only ask for the minimum amount of contact information that will allow you to market to these leads. If you plan to send them emails, only collect their email address, for example. The more information you require, the lower your conversion rate tends to be.

    Musts for whitepapers and e-books:

    • An original take with compelling data specifically targeting prospective leads
    • A solid methodology (explained in the content)
    • Enticing content that leads users to the lead gen download
    • Minimal contact information required to download

    7. Webinars

    Webinars that provide informative content for prospects can be an extremely effective medium for lead generation, particularly if you are using visuals to help explain concepts. The “in person” element also allows prospects to build a relationship (or the illusion of one) with the presenter(s) because they can hear and see the speaker live. You can also play up the exclusivity angle with webinars because the content is only available to those that choose to attend.

    Types of webinars that work particularly well for lead gen:

    • Demonstrations or how-to’s
    • Panel discussions about a relevant, timely topic in your industry
    • An interview with an industry expert
    • An in-depth presentation with a fresh take on a timely topic

    Similar to e-books and whitepapers, you’ll want to collect the minimum possible amount of contact information on your sign up form. If you only need an email address or a phone number, stick to that. The more you ask for a life story, the fewer sign-ups you’ll receive.

    Musts for webinar content:

    • Unique, relevant topic to prospects
    • Content that is designed for a real-time, audio and visual medium
    • Minimal contact information required for sign up

    Goal: Revenue

    Of course, any content program’s ultimate goal is to drive revenue. Content that leads directly to conversion, though, is often not given as much attention as other forms of content.

    8. Product pages

    Regardless of whether you sell your products online or not, your product pages on your website should be focused on driving action to purchase.

    To do this, you should keep your pages simple. Each product, no matter how similar, should have a unique product name and description to keep you clear of duplicate content issues. Focus on what the product is and how it will ultimately improve the life of a customer in a brief description. Bullet points in the description help the user scan and digest the important features of the product. Ian Lurie at Portent recently wrote about utilizing Amazon Q&A to inform what common questions people have about your product, and answering those in your product page bullet points. If you can do that, that’s a winning formula.

    Include images of the product, and if necessary, video too for a more holistic view of the product. And add a trust signal. Common trust signals include reviews, a customer quote, or a statistic about how the product helps customers.

    Most importantly, you need a prominent, clear call-to-action. It should stand out, be above the fold, and have clear language about what will happen in the next step.

    Must-haves for these pages:

    • Product Description
    • Visual of product (image, video)
    • Call to Action
    • Trust signal (e.g., a quote or review, a statistic, etc.)
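
    To make those must-haves concrete, here’s a bare-bones sketch of how they might lay out in HTML — the product, copy, and URLs are all invented for illustration:

        <h1>Everyday Canvas Tote</h1>
        <p>A lightweight tote that carries your laptop, lunch, and gym gear in one trip.</p>
        <ul>
          <li>Water-resistant waxed canvas</li>
          <li>Fits 15" laptops</li>
          <li>Machine washable</li>
        </ul>
        <!-- Visual of the product -->
        <img src="/images/canvas-tote.jpg" alt="Everyday Canvas Tote in olive green">
        <!-- Trust signal -->
        <blockquote>"I use mine every single day." (4.8/5 from 1,200 reviews)</blockquote>
        <!-- Prominent, clear call-to-action -->
        <a class="cta" href="/cart/add?sku=TOTE-01">Add to cart</a>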

    Of course, these are just some of the most common goals I’ve seen in content strategies — there are plenty more goals out there. The same goes for types of distribution for each of these goals — I’ve only scratched the surface. But if I listed out every possibility, you wouldn’t have made it this far through the post!

    Over to you!

    These are just some common goals that have proven effective to me with clients and brands I have worked for. I’d love to know what you think, now: 

    • Do you agree with my points? 
    • Do you have other tactics that work for any of these goals? 
    • What other content goals do you have that weren’t mentioned here?

    If you’ve got other suggestions or ideas, I’d love to hear them in the comments!

    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

    Step into the Spotlight as a Community Speaker at MozCon 2019

    Posted by Danielle_Launders

    With MozCon 2019 right around the corner, we’re excited to announce our annual open call for community speakers! Are you the person that everyone in your office goes to for digital marketing advice? Dreaming of breaking into the speaking circuit to share your innovative ideas? Now’s the chance to submit your pitch for an opportunity to join industry leaders on stage in front of 1,500 of your peers. (No pressure!)

    Not sure what a community speaker is?

    At MozCon, we have a speaker selection committee that identifies practitioners at the top of their professional field, with a mean speaking game. But these sessions are by invite only, and we know the community is bursting at the seams with groundbreaking research, hot tips, and SEO tests that drive results.

    Cue our community speaker program! We reserve six 15-minute community speaking slots throughout our three-day event. Now’s the time of the season when we encourage anyone in the SEO community to submit their best and most exciting presentation ideas for MozCon. Not only are these sessions incredibly well-received by our attendees, but they’re also a fantastic way to get your foot in the door when it comes to the SEO speaking circuit.

    Interested in pitching your own idea? Read on for everything you need to know:

    To submit a pitch:

    • Fill out our community speaker submission form to enter.
    • Only one submission per person — make sure to choose the one you’re most passionate about!
    • Your pitch must be related to online marketing and cover a topic that fits into 15 minutes.
    • Submissions close on Monday, April 15th at 5pm PDT — no exceptions!
    • All decisions are final.
    • All speakers must adhere to the MozCon Code of Conduct.
    • If chosen, you’ll be required to present your winning pitch July 15–17th at MozCon in Seattle, WA.

    I’m ready to submit my idea!

    If you submit a pitch, you’ll hear back from us regardless of your acceptance status, so please be patient until you hear from us — we’ll work hard to make our decisions as quickly as we can!

    As a community speaker you will receive:

    • 15 minutes on the MozCon stage for a keynote-style presentation
    • A free ticket to MozCon (we can issue a refund or transfer if you’ve already purchased yours)
    • Four nights of lodging covered by Moz at our partner hotel
    • Reimbursement for your travel — up to $500 for domestic and $750 for international travel
    • An invitation for you and your significant other to join us for the pre-event speakers’ dinner (warning: it’s always delicious.)

    How we select our speakers:

    We have an internal committee of Mozzers who review every pitch. We analyze each topic to make sure there’s no overlap with our current sessions and to confirm that it’s a good fit for our audience. Next, we look at the entirety of the pitch to get a comprehensive idea of what to expect from your talk on the MozCon stage. This is where links to previous decks, content, and videos of past presentations are helpful (but not required).

    Here’s how to make your pitch stand out:

    • Keep your pitch focused on online marketing. The more actionable the pitch, the better.
    • Be detailed! We want to know the actual tactics our audience will be learning about — not just a vague reference to them. Remember, we receive a ton of pitches, so the more clearly you can explain, the better you’ll stand out.
    • Review the topics already being presented — we’re looking for sessions that are new and that round out our agenda.
    • Brush up on how to prepare for speaking.
    • No pitches will be evaluated in advance, so please don’t ask 🙂
    • Using social media to lobby your pitch won’t help. Instead, put your time and energy into the actual pitch itself!
    • Linking to a previous example of a slide deck or presentation isn’t required, but it does help the committee a ton.

    Leading up to MozCon:

    If your pitch is selected, the MozCon team is here to support you along the way. It’s our goal to make sure this is your best talk to date, whether it’s your first time under those bright stage lights or you’re a seasoned speaker who feels perfectly at home in front of a big crowd. We’ll answer any questions you may have and work with you to deliver a talk you’ll be proud of. Here are just a handful of ways that we’re here to help:

    • Topic refinement
    • Helping with your session title and description
    • Reviewing any session outlines and drafts
    • Providing plenty of tips around best practices — specifically with the MozCon stage and audience in mind
    • Comprehensive show guide
    • Being available to listen to you practice your talk
    • Reviewing your final deck
    • A full stage tour on the Sunday before MozCon to meet our A/V crew, see your presentation on the big screen, and get a feel for the show
    • An amazing 15-person A/V team to support your presentation every second it’s on the big screen and beyond

    We’ve got our fingers crossed for you. Good luck!

    Submit my pitch!


    How to Diagnose and Solve JavaScript SEO Issues in 6 Steps

    Posted by TomekRu

    It’s rather common for companies to build their websites using modern JavaScript frameworks and libraries like React, Angular, or Vue. It’s obvious by now that the web has moved away from plain HTML and has entered the era of JS.

    While there’s nothing unusual about a business wanting to take advantage of the latest technologies, we need to address the stark reality of this trend: most migrations to JavaScript frameworks aren’t planned with users or organic traffic in mind.

    Let’s call it the JavaScript Paradox:

    1. The big brands jump on the JavaScript hype train after hearing all the buzz about JavaScript frameworks creating amazing UXs.
    2. Reality reveals that JavaScript frameworks are really complex.
    3. The big brands completely butcher the migrations to JavaScript. They lose organic traffic and often have to cut corners rather than creating this amazing UX journey for their users (I will mention some examples in this article).

    Since there’s no turning back, SEOs need to learn how to deal with JavaScript websites.

    But that’s easier said than done because making JavaScript websites successful in search engines is a real challenge both for developers and SEOs.

    This article is meant to be a follow-up to my comprehensive Ultimate Guide to JavaScript SEO, and it’s intended to be as easy to follow as possible. So, grab yourself a cup of coffee and let’s have some fun — here are six steps to help you diagnose and solve JavaScript SEO issues.

    Step 1: Use the URL inspection tool to see if Google can render your content

    The URL inspection tool (formerly Google Fetch and Render) is a great free tool that allows you to check if Google can properly render your pages.

    The URL inspection tool requires you to have your website connected to Google Search Console. If you don’t have an account yet, check Google’s Help pages.

    Open Google Search Console, then click on the URL inspection button.


    In the URL form field, type the full URL of a page you want to audit.

    Then click on TEST LIVE URL.

    Once the test is done, click on VIEW TESTED PAGE.

    And finally, click on the Screenshot tab to view the rendered page.

    Scroll down the screenshot to make sure your web page is rendered properly. Ask yourself the following questions:

    • Is the main content visible?
    • Can Google see the user-generated comments?
    • Can Google access areas like similar articles and products?
    • Can Google see other crucial elements of your page?

    Why might the screenshot look different from what you see in your browser? The most common reasons, from blocked resources to JavaScript errors, are covered in the next steps.

    Step 2: Make sure you didn’t block JavaScript files by mistake

    If Google cannot render your page properly, make sure you didn’t block important JavaScript files for Googlebot in robots.txt.

    TL;DR: What is robots.txt?

    It’s a plain text file that tells Googlebot (or any other search engine bot) whether it is allowed to request a given page or resource.
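
    As a quick, hypothetical illustration (the paths here are made up), a single overly broad rule can keep Googlebot from fetching the very scripts that render your content. Compare the two versions:

        # Too broad: this also blocks rendering-critical files like content.js
        User-agent: *
        Disallow: /assets/js/

        # Safer: block only what truly shouldn't be requested
        User-agent: *
        Disallow: /admin/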

    Fortunately, the URL Inspection tool points out all the resources of a rendered page that are blocked by robots.txt.

    But how can you tell if a blocked resource is important from the rendering point of view?

    You have two options: Basic and Advanced.

    Basic

    In most cases, it may be a good idea to simply ask your developers about it. They created your website, so they should know it well.

    Obviously, if a script is named content.js or productListing.js, it’s probably relevant and shouldn’t be blocked.

    Unfortunately, as of now, the URL Inspection tool doesn’t tell you how severe a blocked JS file is. The previous Google Fetch and Render had such an option.

    Advanced

    Now, we can use Chrome Developer Tools for that.

    For educational purposes, we will be checking the following URL: http://botbenchmarking.com/youshallnotpass.html

    Open the page in the most recent version of Chrome and go to Chrome Developer Tools. Then move to the Network tab and refresh the page.

    Finally, select the desired resource (in our case it’s YouShallNotPass.js), right-click, and choose Block request URL.

    Refresh the page and see if any important content disappeared. If so, then you should think about deleting the corresponding rule from your robots.txt file.

    Step 3: Use the URL Inspection tool for fixing JavaScript errors

    If the URL Inspection tool shows that your page isn’t rendering properly, it may be due to JavaScript errors that occurred while rendering.

    To diagnose it, in the URL Inspection tool click on the More info tab.

    Then, show these errors to your developers so they can fix them.

    Just ONE error in the JavaScript code can stop rendering for Google, which in turn makes your website not indexable.

    Your website may work fine in most recent browsers, but if it crashes in older browsers (Google Web Rendering Service is based on Chrome 41), your Google rankings may drop.

    Need some examples?

    • A single error in the official Angular documentation caused Google to be unable to render our test Angular website.
    • Once upon a time, Google deindexed some pages of Angular.io, an official website of Angular 2+.

    If you want to know why it happened, read my Ultimate Guide to JavaScript SEO.

    Side note: If for some reason you don’t want to use the URL Inspection tool for debugging JavaScript errors, you can use Chrome 41 instead.

    Personally, I prefer using Chrome 41 for debugging purposes, because it’s more universal and offers more flexibility. However, the URL Inspection tool is more accurate in simulating the Google Web Rendering Service, which is why I recommend that for people who are new to JavaScript SEO.

    Step 4: Check if your content has been indexed in Google

    It’s not enough to just see if Google can render your website properly. You have to make sure Google has properly indexed your content. The best option for this is to use the site: command.

    It’s a very simple and very powerful tool. Its syntax is pretty straightforward: site:[URL of a website] “[fragment to be searched]”. Just take care not to put a space between site: and the URL.

    Let’s assume you want to check if Google indexed the following text “Develop across all platforms” which is featured on the homepage of Angular.io.

    Type the following command in Google: site:angular.io “DEVELOP ACROSS ALL PLATFORMS”

    As you can see, Google indexed that content, which is what you want, but that’s not always the case.

    Takeaway:

    • Use the site: command whenever possible.
    • Check different page templates to make sure your entire website works fine. Don’t stop at one page!

    If everything checks out, go to the next step. If not, there may be a couple of reasons why this is happening:

    • Google hasn’t rendered your content yet. Rendering can take anywhere from a few days to a few weeks after Google visits the URL. If the characteristics of your website require your content to be indexed as fast as possible, implement server-side rendering (SSR).
    • Google encountered timeouts while rendering a page. Are your scripts fast? Do they remain responsive when the server load is high?
    • Google is still requesting old JS files. Google tries to cache a lot to save its computing power, so CSS and JS files may be cached aggressively. If you’ve fixed all the JavaScript errors and Google still can’t render your website properly, it may be because Google is using old, cached JS and CSS files. To work around it, you can embed a version number in the filename, for example, bundle3424323.js (see the sketch after this list). You can read more in Google’s guides on HTTP caching.
    • While indexing, Google may not fetch some resources if it decides that they don’t contribute to the essential page content.
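
    One common way to embed that version number automatically — a sketch assuming a webpack build, so adapt it to whatever bundler you use — is to fingerprint filenames with a content hash:

        // webpack.config.js: a minimal sketch, not a drop-in config
        module.exports = {
          entry: './src/index.js',
          output: {
            // [contenthash] changes whenever the bundle's contents change,
            // so neither Google nor browsers can keep serving a stale copy
            filename: 'bundle.[contenthash].js',
          },
        };

    Every deploy that changes the code produces a new filename, which forces a fresh fetch instead of a stale cached bundle.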

    Step 5: Make sure Google can discover your internal links

    There are a few simple rules you should follow:

    1. Google needs proper <a href> links to discover the URLs on your website.
    2. If your links are added to the DOM only when somebody clicks on a button, Google won’t see them.

    As simple as that is, plenty of big companies make these mistakes.

    Proper link structure

    Googlebot needs traditional “href” links in order to crawl a website. If they’re not provided, many of your webpages will simply be unreachable for Googlebot!

    I think it was explained well by Tom Greenway (a Google representative) during the Google I/O conference:

    Please note: if you have proper <a href> links with some additional attributes, like onClick, data-url, or ng-href, that’s still fine for Google.
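
    To make the distinction concrete, here’s a simplified sketch (the URL and handler name are invented) contrasting a link Googlebot can follow with one it can’t:

        <!-- Crawlable: a real href, even with an extra click handler -->
        <a href="/mobile-phones/page-2" onclick="loadPage(2)">Next page</a>

        <!-- NOT crawlable: no href, the URL lives only inside JavaScript -->
        <span class="next-button" onclick="loadPage(2)">Next page</span>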

    A common mistake made by developers: Googlebot can’t access the second and subsequent pages of pagination

    Blocking Googlebot from everything beyond the first page of pagination is one of the most common mistakes developers make.

    When you open the mobile versions of Gearbest, AliExpress, and IKEA, you’ll quickly notice that they don’t let Googlebot see the pagination links, which is really weird. When Google enables mobile-first indexing for these websites, they will suffer.

    How do you check it on your own?

    If you haven’t already downloaded Chrome 41, get it from Ele.ph/chrome41.

    Then navigate to any page. For the sake of the tutorial, I’m using the mobile version of AliExpress.com. For educational purposes, it’s good if you follow the same example.

    Open the mobile version of the Mobile Phones category of Aliexpress.

    Then, right-click on View More and select Inspect to see how it’s implemented.

    As you can see, there are no <a href> or <link rel> links pointing to the second page of pagination.

    There are over 2,000 products in the mobile phone category on Aliexpress.com. Since mobile Googlebot is able to access only 20 of them, that’s just 1 percent!

    That means 99 percent of the products from that category are invisible for mobile Googlebot! That’s crazy!

    These errors are caused by the wrong implementation of lazy loading. There are many other websites that make similar mistakes. You can read more in my article “Popular Websites that May Fail in Mobile First Indexing”.

    TL;DR: Using link rel="next" alone is too weak a signal for Google

    Note: it’s common to use link rel="next" to indicate a pagination series. However, the discoveries from Kyle Blanchette seem to show that link rel="next" alone is too weak a signal for Google and should be strengthened by traditional <a href> links.

    John Mueller discussed this more:

    “We can understand which pages belong together with rel="next" and rel="previous", but if there are no links on the page at all, then it’s really hard for us to crawl from page to page. (…) So using rel="next" and rel="previous" in the head of a page is a great idea to tell us how these pages are connected, but you really need to have on-page, normal HTML links.”

    Don’t get me wrong — there is nothing wrong with using <link rel="next">. On the contrary, it’s beneficial, but it’s good to combine these tags with traditional <a href> links.
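
    A minimal sketch of that combination on page one of a paginated category (the URLs are hypothetical):

        <head>
          <!-- A hint about the pagination series -->
          <link rel="next" href="https://example.com/mobile-phones?page=2">
        </head>
        <body>
          <!-- The link Googlebot actually needs in order to crawl onward -->
          <a href="https://example.com/mobile-phones?page=2">View more</a>
        </body>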

    Checking if Google can see menu links

    Another important step in auditing a JavaScript website is to make sure Google can see your menu links. To check this, use Chrome 41.

    For the purpose of the tutorial, we will use the case of Target.com:

    To start, open any browser and pick some links from the menu:

    Next, open Chrome 41. In Chrome Developer Tools (press Ctrl + Shift + J), navigate to the Elements tab.

    The results? Fortunately enough, Google can pick up the menu links of Target.com.

    Now, check if Google can pick up the menu links on your website and see if you’re on target too.

    Step 6: Check if Google can discover content hidden under tabs

    I’ve often observed that, in the case of many e-commerce stores, Google cannot discover and index content hidden under tabs (product descriptions, reviews, related products, etc.). I know it’s weird, but it’s so common.

    It’s a crucial part of every SEO audit to make sure Google can see content hidden under tabs.

    Open Chrome 41 and navigate to any product on Boohoo.com; for instance, Muscle Fit Vest.

    Click on Details & Care to see the product description:

    “DETAILS & CARE

    94% Cotton 6% Elastane. Muscle Fit Vest. Model is 6’1″ and Wears UK Size M.”

    Now, it’s time to check if it’s in the DOM. To do so, go to Chrome Developer Tools (Ctrl + Shift + J) and click on the Network tab.

    Make sure the Disable cache option is checked.

    Press F5 to refresh the page. Once refreshed, navigate to the Elements tab and search for the product description:

    As you can see, in the case of boohoo.com, Google is able to see the product description.

    Perfect! Now take the time and check if your website is fine.
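
    If your description doesn’t turn up in the Elements tab, a frequent culprit is content that’s only fetched when the tab is clicked. Here’s a simplified sketch of the two patterns (the IDs and API endpoint are invented for illustration):

        <!-- Pattern A (indexable): the tab content ships in the initial HTML,
             merely hidden with CSS until the user clicks the tab -->
        <div id="details" style="display: none;">
          94% Cotton 6% Elastane. Muscle Fit Vest.
        </div>

        <!-- Pattern B (risky): the content enters the DOM only after a click,
             and Googlebot doesn't click -->
        <div id="details-lazy"></div>
        <script>
          document.getElementById('details-tab').addEventListener('click', function () {
            fetch('/api/product-details')
              .then(function (res) { return res.text(); })
              .then(function (html) {
                document.getElementById('details-lazy').innerHTML = html;
              });
          });
        </script>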

    Wrapping up

    Obviously, JavaScript SEO is a pretty complex subject, but I hope this tutorial was helpful.

    If you are still struggling with Google ranking, you might want to think about implementing dynamic rendering or hybrid rendering. And, of course, feel free to reach out to me on Twitter about this or other SEO needs.


    Now Live for Your SEO Learning Pleasure: The NEW Beginner’s Guide to SEO!

    Posted by FeliciaCrawford

    It feels like it’s been a king’s age since we first began our long journey to rewrite and revamp the Beginner’s Guide to SEO. For all the long months of writing and rewriting, of agonizing over details and deleting/replacing sections every so often as Google threw us for a loop, it’s hard to believe it’s finally ready to share:

    The new Beginner’s Guide to SEO is here!

    What makes this new version so darn special and sparkly, anyway?

    I’m glad you asked! Our design team would breathe a sigh of relief and tell you it’s because this baby is on-brand and ready to rock your eyeballs to next Tuesday with its use of fancy, scalable SVGs and accessible fonts and images complete with embedded text and alt text descriptions. Our team of SEO experts would blot the sweat from their collective brow and tell you it’s because we’ve retooled and completely updated all our recommendations to ensure we’re giving fledgling learners the most accurate push out of the digital marketing nest that we can. Our developers would tell you it’s because it lives on a brand-spankin’-new CMS and they no longer have to glare silently at my thirteenth Slack message of the day asking them to fix the misplaced period on the fourth paragraph from the top in Chapter 7.

    All joking aside, every bit of the above is true, and each perspective pulls together a holistic answer: this version of the Beginner’s Guide represents a new era for the number-one resource for learning SEO, one where we can update it at the drop of a Google algorithm-shaped hat, where it’s easier than ever to access and learn for a greater variety of people, where you can rely on the fact that the information is solid, up-to-date, and molded to best fit the learning journey unique to SEO.

    I notice the structure is a little different, what gives?

    We can’t escape your eagle eyes! We structured the new guide quite differently from the original. Everything is explained in our introduction, but here’s the gist: taking inspiration from Maslow’s hierarchy of needs, we built each chapter based on the core foundation of how one ought to go about doing SEO, covering the most integral needs first before leveling up to the next.

    A pyramid of SEO needs mimicking Maslow's Hierarchy of Needs theory of psychology.

    We affectionately call this “Mozlow’s Hierarchy of Needs.” Please forgive us.

    A small but mighty team

    While it may have taken us a full year and a half to get to this point, there was but a small team behind the effort. We owe a huge amount of gratitude to the following folks for balancing their other priorities with the needs of the new Beginner’s Guide and putting their all into making this thing shine:

    Britney Muller, our brilliant SEO scientist and the brains behind all the new content. Words cannot do justice to the hours she spent alone and after hours before a whiteboard, Post-Its and dry-erase notes making up the bones and muscles and soul of what would someday become this fully-fleshed-out guide. For all the many, many blog comments answered and incorporated, for all the emails and Twitter messages fielded, for all the love and hard work and extra time she spent pouring into the new content, we have to give a heartfelt and extremely loud and boisterous THANK YOU. This guide wouldn’t exist without her expertise, attention to detail, and commitment to excellence.

    Kameron Jenkins, our SEO wordsmith and all-around content superheroine. Her exquisite grasp of the written word and extensive experience as an agency SEO were paramount in pulling together disparate feedback, finessing complicated concepts into simple and understandable terms, and organizing the information in ways most conducive to aiding new learners. Again, this guide wouldn’t be here without her positive attitude and incredible, expert help.

    Trevor Klein, editor extraordinaire. His original vision of organizing it according to the SEO hierarchy of needs provided the insight and architecture necessary to structure the guide in a completely new and utterly helpful way. Many of the words, voice, and tone therein belong to him, and we deeply appreciate the extra polish and shine he lent to this monumental effort.

    Skye Stewart, talented designer and UX aficionado. All the delightful images you’ll find within the chapters are compliments of her careful handiwork, from the robo-librarian of Chapter 2 to the meat-grinder-turned-code-renderer of Chapter 5. The new Beginner’s Guide would be an infinitely less whimsical experience without her creativity and vision.

    Casey Coates, software engineer and mystical CMS-wizard-turned-miracle-maker. I can safely say that there is no way you would be exploring the brand-new Beginner’s Guide in any coherent manner without his help. For all the last-minute additions to CMS deploys, for calmly fielding all the extra questions and asks, for being infinitely responsive and helpful (case in point: adding alt text to the image block less than two minutes after I asked for it) and for much, much more, we are grateful.

    There are a great many other folks who helped get this effort underway: Shelly Matsudaira, Aaron Kitney, Jeff Crump, and Cyrus Shepard for their integral assistance moving this thing past the finish line; Rand Fishkin, of course, for creating the original and longest-enduring version of this guide; and to all of you, our dear community, for all the hours you spent reading our first drafts and sharing your honest thoughts, extremely constructive criticisms, and ever-humbling praise. This couldn’t exist without you!

    Y’all ready for this?

    With tender pride and only a hint of the sort of naturally occurring anxiety that accompanies any big content debut, we’re delighted and excited for you to dive into the brand-new Beginner’s Guide to SEO. The original has been read over ten million times, a mind-boggling and truly humbling number. We can only hope that our newest incarnation is met by a similar number of bright minds eager to dive into the exhilarating, challenging, complex, and lucrative world of SEO.

    Whether you’re just starting out, want to jog your memory on the fundamentals, need to clue in colleagues to the complexity of your work, or are just plain curious about what’s changed, we hope from the bottom of our hearts that you get what you need from the new Beginner’s Guide.

    Dive in and let us know what you think!


    Topical SEO: 7 Advanced Concepts of Link Relevance & Google Rankings

    Posted by Cyrus-Shepard

    Links matter for SEO. A lot.

    Most marketers understand that links to websites count as “votes” on the web. Google — and other search engines — use these votes to rank web pages in search results. The more votes a page accumulates, the better that page’s chances of ranking in search results.

    This is the popularity part of Google’s algorithm, described in the original PageRank patent. But Google doesn’t stop at using links for popularity. They’ve invented a number of clever ways to use links to determine relevance and authority — i.e. what is this page about and is it a trusted answer for the user’s search query?

    To rank in Google, it’s not simply the number of votes you receive from popular pages, but the relevance and authority of those links as well.

    The principles Google may use grow complex quickly, but we’ve included a number of simple ways to leverage these strategies for more relevant rankings at the bottom of the post.

    1. Anchor text 

    In the beginning, there was the original PageRank patent, which changed the way search engines worked. It talked about anchor text a lot:

    “Thus, even though the text of the document itself may not match the search terms, if the document is cited by documents whose titles or backlink anchor text match the search terms, the document will be considered a match.”

    In a nutshell, if a page links to you using the anchor text “hipster pizza,” there’s a good chance your page is about pizza — and maybe hipsters.

    If many pages link to you using variations of “pizza” — e.g., pizza restaurant, pizza delivery, Seattle pizza — then Google can see this as a strong ranking signal.

    (In fact, so powerful is this effect, that if you search Google for “hipster pizza” here in Seattle, you’ll see our target for the link above ranking on the first page.)


    How to leverage Anchor Text for SEO:

    Volumes could be written on this topic. Google’s own SEO Starter Guide recommends a number of anchor text best practices, among them:

    1. Use (and seek) descriptive anchor text that describes what your page is about (see the example after this list)
    2. Avoid generic and off-topic anchor text
    3. Keep anchor text concise – no more than a few words
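
    For instance (the URL is invented), here’s the same link with descriptive versus generic anchor text:

        <!-- Descriptive: tells users and Google what the target page is about -->
        <a href="https://example.com/hipster-pizza">hipster pizza in Seattle</a>

        <!-- Generic: the relevance signal is wasted -->
        <a href="https://example.com/hipster-pizza">click here</a>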

    While some Google patents discuss ignoring links with irrelevant anchor text, other Google patents propose looking at the text surrounding the anchor text for additional context, so keep that in mind.

    A word of caution: While optimizing your anchor text is good, many SEOs over the years have observed that too much of a good thing can hurt you. Natural anchor text on the web is naturally varied.

    Check out the variety of anchor text pointing to Moz’s page on Domain Authority, as surfaced by Link Explorer.


    Over-optimization can signal manipulation to Google, and many SEOs recommend a strategy of anchor text variety for better rankings.


    2. Hub and authority pages

    In the early days of Google, not long after Larry Page figured out how to rank pages based on popularity, the Hilltop algorithm worked out how to rank pages on authority. It accomplished this by looking for “expert” pages linking to them.

    An expert page is a document that links to many other topically relevant pages. If a page is linked to from several expert pages, then it is considered an authority on that topic and may rank higher.

    Authority Pages for SEO

    A similar concept using “hub” and “authority” pages was put forth by Jon Kleinberg, a Cornell professor with grants from Google and other search engines. Kleinberg explains:

    “…a good hub is a page that points to many good authorities; a good authority is a page that is pointed to by many good hubs.”
    Authoritative Sources in a Hyperlinked Environment (PDF)
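
    For the mathematically inclined, Kleinberg’s formulation defines the two scores in terms of each other and refines them iteratively — roughly, in LaTeX notation:

        \text{auth}(p) = \sum_{q \to p} \text{hub}(q), \qquad \text{hub}(p) = \sum_{p \to q} \text{auth}(q)

    In plain English: a page’s authority is the sum of the hub scores of the pages linking to it, and its hub score is the sum of the authority scores of the pages it links to.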

    While we can’t know the degree to which these concepts are used today, Google acquired the Hilltop algorithm in 2003.

    How to leverage Authority Pages for SEO:

    A common practice of link builders today is to seek links from “Resource Pages.” These are basically Hub/Expert pages that link out to helpful sites around a topic. Scoring links on these pages can often help you a ton.

    Additional Resources: Resource Page Link Building

    3. Reasonable Surfer

    All links are not created equal.

    The idea behind Google’s Reasonable Surfer patent is that certain links on a page are more important than others and are thus assigned increased weight. Examples of more important links include:

    • Prominent links, higher up in the HTML
    • Topically relevant links, related to both the source document and the target document.

    Conversely, less important links include:

    • “Terms of Service” and footer links
    • Banner ads
    • Links unrelated to the document

    Because the important links are more likely to be clicked by a “reasonable surfer,” a topically relevant link can carry more weight than an off-topic one.

    “…when a topical cluster associated with the source document is related to a topical cluster associated with the target document, the link has a higher probability of being selected than when the topical cluster associated with the source document is unrelated to the topical cluster associated with the target document.”
    United States Patent: 7716225


    How to leverage Reasonable Surfer for SEO:

    The key to leveraging Reasonable Surfer for SEO is simple: work to obtain links that are more likely to get clicked.

    This means that you not only benefit from getting links from prominent areas of high-traffic pages, but the more relevant the link is to the topic of the hosting page, the more benefit it may provide.

    Neither the page topic nor the anchor text has to be an exact match, but it helps if they’re in the same general area. For example, if you were writing about “baseball,” links with relevant anchor text from pages about sports, equipment, athletes, training, exercise, tourism, and more could all boost rankings more than less relevant links.

    4. Topic-sensitive PageRank

    Despite rumors to the contrary, PageRank is very much alive and well at Google.

    PageRank technology can be used to distribute all kinds of different ranking signals throughout a search index. While the most common examples are popularity and trust, another signal is topical relevance, as laid out in this paper by Taher Haveliwala, who went on to become a Google software engineer.

    The original concept works by grouping “seed pages” by topic (for example, the Politics section of the New York Times). Every link out from these pages passes on a small amount of Topic-Sensitive PageRank, which is passed on through the next set of links, and so on.

    

    Imagine two identical pages targeting “football.” Both have the same number of links, but the first receives more relevant Topic-Sensitive PageRank from a linking sports page. Hence, it ranks higher.
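
    Formally — following Haveliwala’s paper, with the notation simplified here — topic-sensitive PageRank is ordinary PageRank with the random-jump vector biased toward a topic’s seed pages:

        \vec{r}_t = (1 - \alpha)\,\vec{v}_t + \alpha\, M\, \vec{r}_t

    where M is the link matrix, \alpha is the usual damping factor (the probability of following a link), and \vec{v}_t is nonzero only on the seed pages for topic t. Pages linked from sports pages therefore accumulate more sports-flavored rank.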

    How to leverage topic-sensitive PageRank for SEO:

    The concept is simple. When obtaining links, try to get links from pages that are about the same topic you want to rank for. Also, get links from pages that are themselves linked to by authoritative pages on the same topic.

    5. Phrase-based indexing

    Phrase-based indexing can be a tough concept for SEOs to wrap their heads around.

    What’s important to understand is that phrase-based indexing allows search engines to score the relevancy of any link by looking for related phrases in both the source and target pages. The more related phrases, the higher the score.

    Consider a page that links with the anchor text “US President”: that link may carry more weight when the linking page also contains several other phrases related to “US President” and “John Adams.”


    In addition to ranking documents based on the most relevant links, phrase-based indexing allows search engines to consider less relevant links as well, including:

    1. Discounting spam and off-topic links: For example, an injected spam link to a gambling site from a page about cookie recipes will earn a very low outlink score based on relevancy and would carry less weight.
    2. Fighting “Google Bombing”: For those who remember, Google bombing is the art of ranking a page highly for funny or politically motivated phrases by “bombing” it with anchor text links, often unrelated to the page itself. Phrase-based indexing can stop Google bombing by scoring the links for relevance against the actual text on the page. This way, irrelevant links can be discounted.

    How to leverage phrase-based indexing for SEO:

    Beyond anchor text and the general topic/authority of a page, it’s helpful to seek links from pages with related phrases.

    This is especially helpful for on-page SEO and internal linking — when you optimize your own pages and link to yourself. Some people use LSI keywords for on-page optimization, though evidence that this helps SEO is disputed.

    Solid keyword research typically provides a starting point for identifying related keyword phrases: for instance, the phrases closely related to “best SEO tools” that Keyword Explorer surfaces.

    6. Local inter-connectivity

    Local inter-connectivity refers to a reranking concept that reorders search results based on measuring how often each page is linked to by all the other pages.

    To put it simply, when a page is linked to from a number of high-ranking results, it is likely more relevant than a page with fewer links from the same set of results.

    This also provides a strong hint as to the types of links you should be seeking: pages that already rank highly for your target term.


    How to leverage local inter-connectivity for SEO:

    Quite simply, one of the easiest ways to rank is to obtain topically relevant links from sites that already rank for the term you are targeting.

    Oftentimes, links from page-one results can be quite difficult to obtain, so it’s helpful to look for links from pages that:

    • Rank for variations of your target terms
    • Are further down in Google’s results pages
    • Rank well for different, but still topically-related terms

    7. The golden question

    If the above concepts seem complex, the good news is that you don’t have to fully understand them to build links to your site.

    To understand if a link is topically relevant to your site, simply ask yourself the golden question of link building: Will this link bring engaged, highly qualified visitors to my website?

    The answer to the golden question is exactly what Google’s engineers are trying to determine when evaluating links, so you can arrive at a good end result without understanding the actual algorithms.


    How to leverage the golden question for SEO:

    Above all else, try to build links that bring engaged, high-value visitors to your site.

    If you don’t care about the visitors a link may bring, why should Google care about the link?

    SEO tips for topically relevant links

    Consider this advice when thinking about links for SEO:

    1. DO use good, descriptive anchor text for your links. This applies to internal links, outlinks to other sites, and links you seek from non-biased external sites.
    2. DO seek relationships from authoritative, topically relevant sites. These include sites that rank well for your target keyword and “expert” pages that link to many authority sites. (For those interested, Majestic has done some interesting work around Topical Trust Flow.)
    3. DO seek links from relevant pages. This includes examining the title, body, related phrases, and intent of the page to ensure its relevance to your target topic.
    4. DO seek links that people are likely to click. The ideal link is often both topically relevant and placed in a prominent position.
    5. AVOID generic or non-descriptive anchor text.
    6. AVOID over-optimizing your links. This includes repetitive use of exact match anchor text and keyword stuffing.
    7. AVOID manipulative link building. Marie Haynes has written an excellent explanation of the kinds of unnatural links that you likely want to avoid at all costs.

    Finally, DO try to earn and attract links to your site with high quality, topically relevant content.

    Big thanks to Bill Slawski and his blog SEO by the Sea, which acted as a starting point of research for many of these concepts.

    What are your best tips around topically relevant links? Let us know in the comments below!


    Note: A version of this post was published previously and has since been substantially updated.
