I decided to write this post when I saw, in late February, this poll on Twitter:
A Twitter poll may not be the most scientific way to research the state of the industry, but it did remind me how common this is – I routinely receive outreach emails and see RFPs & case studies that discuss link-building in terms of Moz’s DA (Domain Authority) metric – typically, how many links above a certain DA were or could be built.
Before we go any further: I am not objecting to DA as a useful SEO metric. There are plenty of applications it’s perfectly suited to. I’m also not writing here about the recent updates to DA, although I sadly probably do need to clarify that DA is only a third-party metric designed to reflect Google – it does not itself impact rankings.
Rather, it’s the specific use-case of using DA to measure the success of link-building that I’m ranting about in this post.
I think I get why DA has become popular in this space. In my opinion, it has a couple of really big advantages:
Timeliness – if a journalist writes an article about your brand or your piece, it’ll take a while for page-level metrics to exist. DA will be available instantly.
Critical mass – we’re all really familiar with this metric. A couple of years ago, my colleague Rob Ousbey and I were talking to a prospective client about their site. We’d only been looking at it for about 5 minutes. Rob challenged me to guess the DA, and he did the same. We both got within 5 of the real value. That’s how internalised DA is in the SEO industry – few other metrics come close. Importantly, this has now spread even beyond the SEO industry, too – non-SEO-savvy stakeholders can often be expected to be familiar with DA.
So, if in the first week of coverage, you want to report on DA – fine, I guess I’ll forgive you. Similarly, further down the line, if you’re worried your clients will expect to see DA, maybe you can do what we do at Distilled – report it alongside some more useful metrics, and, over time, move expectations in a different direction.
If you’re building links for SEO reasons, then you’re doing it because of PageRank – Google’s original groundbreaking algorithm, which used links as a proxy for the popularity & trustworthiness of a page on the web. (If this is news to you, I recommend this excellent explainer from Majestic’s Dixon Jones.)
Crucially, PageRank works at a page level. Google probably does use some domain-level (or “site”-level) metrics as shortcuts when assessing how a page should rank in its results, but when it comes to passing on value to other pages, via links, Google cares about the strength of a page.
Different pages on a domain can have very different strengths, impacting their ability to rank, and their ability to pass value onwards. This is the exact problem that much of technical SEO is built around. It is not at all simple, and it has significant implications for linkbuilding.
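To make the page-level point concrete, here’s a toy PageRank power iteration – the link graph, page names, and damping factor are all invented for illustration, and real PageRank is far more elaborate – showing that two pages on the same “site” can end up with very different scores:

```python
# Toy PageRank power iteration (damping factor 0.85).
# Scores are computed per PAGE, not per domain: a well-linked homepage
# and a backwater page on the same site score very differently.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Hypothetical graph: "site/home" is well linked externally;
# "site/backwater" is only reachable from the homepage.
graph = {
    "site/home":      ["site/backwater", "site/article"],
    "site/article":   ["site/home"],
    "site/backwater": [],
    "other/a":        ["site/home"],
    "other/b":        ["site/home"],
}
scores = pagerank(graph)
```

Run this and the backwater page scores far below the homepage, despite living on the same domain – which is exactly why a link from that page passes on so little value.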
Many editorial sites now include backwater sections, where pages may be 5 or more clicks from the site’s homepage, or even unreachable within the internal navigation. This is admittedly an extreme case, but the fact that the page is on a DA 90+ site is now irrelevant – little strength is being conveyed to the linking page, and the link itself is practically worthless.
The cynic in me says this sort of scenario, where it exists, is intentional – editorial sites are taking SEOs for a ride, knowing we will provide them with content (and, in some cases, cash…) in return for something that is cheap for them to give, and does us no good anyway.
Either way, it makes DA look like a pretty daft metric for evaluating your shiny new links. In the words of Russ Jones, who, as the person who leads the ongoing development of DA, probably knows what he’s talking about:
Here are a few potential candidates you could move towards:
URL-level Majestic Citation Flow – this is the closest thing left to a third party approximation of PageRank itself.
Moz Page Authority (PA) – if you’re used to DA, this might be the easiest transition for you. However, Russ (mentioned above) warns that PA is designed to estimate the ability of a URL to rank, not the equity it will pass on.
Linking Root Domains to linking page – arguably, the most valuable links we could build are links from pages that themselves are well linked to (for example, multiple sites are referencing a noteworthy news article, which links to our client). Using this metric would push you towards building that kind of link. It’s also the metric in the Moz suite that Russ recommended for this purpose.
Referral traffic through the link – I’ve written before about how the entire purpose of PageRank was to guess who was getting clicks, so why not optimise for what Google is optimising for? Chances are that a genuine endorsement – and thus a trustworthy link – is one that actually sends you traffic. If you use Google Analytics, you can use the Full Referrer secondary dimension in your “Referrals” or “Landing Pages” reports to get deeper insight on this.
For example, here’s the recent referral traffic to my blog post How to Rank for Head Terms:
I still might want to check that these links are followed & remain live, of course!
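If you export that report to CSV, tallying sessions per referring page is a few lines of code. This is a sketch only – the column names (`fullReferrer`, `landingPagePath`, `sessions`) are assumptions, so adjust them to match whatever headers your actual export uses:

```python
# Sketch: tally sessions per full referrer for one landing page,
# from a CSV export of a Google Analytics report.
# Column names below are ASSUMED -- rename to match your export.
import csv
from collections import Counter

def referral_sessions(csv_path, landing_page):
    """Return a Counter of sessions keyed by full referrer."""
    totals = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["landingPagePath"] == landing_page:
                totals[row["fullReferrer"]] += int(row["sessions"])
    return totals
```

Sorting the resulting Counter (e.g. `totals.most_common()`) gives you a quick league table of which links are genuinely sending visitors.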
I’m sure plenty of people are using their own calculated metrics to take the best of all worlds. I think there’s merit to this approach, but it’s not one we use at Distilled. This is for two reasons:
“Worst of both” – There’s a potential for a calculated metric to use both domain and page-level metrics as part of its formula. The trouble with this is that you get the downsides of both – the measurement-lag of a page-level metric, with the inaccuracy of a domain-level metric.
Transparency – Our prospective clients should hopefully trust us, but this is still going to be harder for them to explain up their chain of command than a metric from a recognised third party. Given the inherent difficulties and causation fallacies in actually measuring the usefulness of a link, any formula we produce will internalise our own biases and suspicions.
Great! Let me know in the comments, or on Twitter 😉