Google has made a large number of changes to its search algorithm over the past year or so, most notably the Penguin update. As a result, there’s confusion over what counts as clean link building – and what is dodgy.
Video blogs from Matt Cutts, the head of Webspam at Google, as well as his Q&As at conferences, provide us with lots of useful knowledge.
In 2013 many of Cutts’ videos addressed the changing landscape of link building and what’s deemed acceptable or grounds for penalisation. This blog summarises the most useful link building advice Cutts has given us over the past 11 (and a bit) months.
I’ve also included my own “Barrie says” segment after each video, so you have a real link builder’s perspective on Cutts’ advice. I’ll discuss how recent changes have affected digital marketing, and what doesn’t “Cutt” it anymore (did you see what I did there?).
Also, each image is a link to the original YouTube video.
Here we go!
1. Don’t abuse guest blogging links
In December, ‘Google baba’ from Mumbai, India, was worried about being penalised by Google for creating spammy guest posts. He asked Matt Cutts:
“I predict that in the future Google will penalize guest blogging sites. Any insights on guest blogging as spam?”
Matt Cutts says don’t abuse guest blogging links
“I’m working on the assumption that you should publish high-quality guest bloggers, people whose words you really, really trust. It’s clear from the way that people are talking about it that there are a lot of low-quality guest blogger sites. And there’s a lot of low-quality guest blogging going on. Any time people are automating that or really trying to make a bunch of links without doing the sort of hard work that would really earn links on the basis of merit or because they’re editorial, then it’s safe to assume that Google will take a closer look at that.
So I wouldn’t recommend that you make it your only way of gathering links. I wouldn’t recommend that you send out thousands of blast emails offering to guest blog. I wouldn’t recommend that you guest blog with the same article on two different blogs. I wouldn’t recommend that you take one article and spin it lots of times. There’s definitely a lot of abuse and growing spam that we see in the guest blogging space. Regardless of the spam technique people are using, we’re always looking at what sort of things are starting to be more and more abused. We’re always willing to respond to that and take the appropriate action to make sure that users get the best set of search results”.
If you’ve got nothing to hide, you’ve got nothing to fear. The low-quality sites – the ones that charge a fee to publish content, host low-quality content and publish lots of unrelated material – are the kinds of sites that deserve to be punished.
Getting links from high-quality sites on a subject that you are an authority on isn’t going to be a problem for Google. If you’re producing content on the internet that users want to read, and it’s being published on well-known sites in front of the right audience, then you have nothing to worry about. If, however, you’re publishing content solely for the purpose of getting a link, and maybe the site accepts blog posts from anyone and everyone – then you may very well have problems in the future.
2. You can quickly recover from a penalty
After asking for help in locating links to his site back in July, Adeel from Manchester, England, returned in December to ask Matt for help in recovering his site from the spam links built to it. Here is Adeel’s question in full:
How did Interflora turn their ban around in 11 days? Can you explain what kind of penalty they had and how they fixed it, as some of us have spent months trying to clean things up after an unclear GWT notification?
Matt Cutts says you can recover quickly from a penalty
Video transcription: “Rather than talking about a specific company, because we typically don’t call out specific companies, we prefer to talk about things in more general terms.
Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines. And if we see that happening multiple times, repeated times, then the actions that we take get more and more severe.
We’re more willing to take stronger action whenever we see repeat violations. If a company were to be caught buying links, it would be interesting, if, for example, you knew that it started in the middle of 2012 and ended in March 2013. If a company were to go back and disavow every single link that they had gotten in 2012, that’s a pretty monumentally epic, large action. That’s the sort of thing where a company is willing to say “We might have had good links for a number of years, and then we just had really bad advice and someone did everything wrong for a few months. So just to be safe, let’s disavow everything in that time frame.”
That’s a pretty radical action. That’s the sort of thing where, if we heard back in a Reconsideration Request that someone had taken that kind of a strong action, then we could look and say this is something that people are taking seriously. It’s not something that I would typically recommend for everybody to disavow every link that you’ve gotten for a period of years. But certainly when people start over with completely new websites that they bought, we have seen a few cases where people will disavow every single link, because they truly want to get a fresh start. It’s a nice-looking domain, but the previous owners had just burned it to a crisp in terms of the amount of web spam that they’ve done.
Typically what we see from a Reconsideration Request is people starting out and just trying to prune a few links. A good Reconsideration Request often uses the domain query – domain colon – taking out large numbers of domains which have bad links. I wouldn’t necessarily recommend going and removing everything for the last year, or everything for the last year and a half. That sort of large-scale action, if taken, can have an impact whenever we’re assessing a domain without a Reconsideration Request”.
The time between Adeel’s two questions suggests to me that he spent four months trying to get his site or his clients’ sites de-penalised from Google.
I can’t say I have had any success in recovering a site in as little as 11 days, but I have certainly followed advice similar to Matt’s above of removing spammy links, adding domains to a disavow file before uploading it and putting together a reconsideration request to get the site back into Google’s rankings. The quantity of inbound links has a big say in how quickly you can succeed – but an experienced SEO consultant should be able to remove a penalty in a few weeks, not several months.
3. Substantial links (on a single page) require substantial content
Seda from London, asked whether there was a limit to the number of links a page can have before it risks penalisation. Seda’s question in full:
How many links on a page we should have? Is there a limit?
Matt Cutts says substantial links on a single page require substantial content
Video transcription: “It used to be the case that Googlebot and our indexing system would truncate each page at 100 kilobytes (KB) or 101 KB. Anything beyond that wouldn’t even get indexed. We figured, if the page is 101 KB, then it’s reasonable to expect roughly one link per kilobyte and, therefore, something like 100 links on a page. That was in our technical guidelines. We said “this is what we recommend”. And a lot of people assumed that, if they had 102 links, we would view it as spam and take action. But that was just a rough guideline. Nonetheless, the web changes. It evolves. And in particular, web pages have gotten a lot bigger. There’s more rich media. And so it’s common to have aggregators or various things that might have a lot more links. So we removed that guideline. We basically now say, keep it to a reasonable number, which I think is pretty good guidance.
There may be a limit on the file size that we have now, but it’s much larger. At the same time, the number of links that we can process on a page is much larger. A couple of factors to bear in mind are: when you have PageRank, the amount of PageRank that flows through the outbound links is divided by the total number of outbound links. So if you have 100 links, you’ll divide your PageRank by 100. If you have 1,000 links, you’ll divide your PageRank by 1,000. So if you have a huge number of links, the amount of PageRank that’s flowing out on each individual link can become very, very small.
The other thing is it can start to annoy users, or it can start to look spammy, if you have tons and tons and tons of links. So we are willing to take action on the web spam side, if we see so many links that it looks really, really spammy. If you compare our old guideline of 100 links and you look at what the web looks like now, it’s quite common to have 200, 300 or 400 links on a page, as long as the page is long, has value, and there’s a substantial amount of substance and real stuff on that page.
The short answer is, really not to worry about it, or not limit yourself to 100 links anymore. At the same time, it is quite useful to pull in a regular user and just do a very simple user test, and just make sure they don’t view it as strange or spammy or think you’re stuffing a ton of links on the page. As long as you meet those kinds of criteria, then it’s not the sort of thing that I would stress out a lot about”.
I had to double check to make sure this question really was asked and answered in November 2013. At first I thought I might have been listening to a video from 2008 or earlier. In all my experience of the web, the businesses I have worked with and the blogs I have written, I cannot recall a time when I have needed to create a page that was simply a list of links. In fact, I have never got anywhere near 100 links in a single blog post. But I guess there are exceptional circumstances when this is the case. Take Matt’s advice: make sure you have substance around these links to avoid trouble.
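Matt’s point about PageRank being divided among a page’s outbound links boils down to simple arithmetic. A minimal sketch of just that division step (a simplification – real PageRank also applies a damping factor and iterates over the whole link graph):

```python
# Sketch of the division step Matt describes: the PageRank flowing
# out of a page is split evenly across its outbound links, so more
# links means less equity passed per link.

def equity_per_link(page_rank: float, outbound_links: int) -> float:
    """PageRank passed along each individual outbound link."""
    return page_rank / outbound_links

# A page with 100 links passes ten times more equity per link
# than the same page with 1,000 links.
print(equity_per_link(1.0, 100))   # 0.01
print(equity_per_link(1.0, 1000))  # 0.001
```

This is why a page stuffed with a thousand links passes almost nothing through any one of them.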
4. Feel free to disavow links, whether you’ve been penalised or not
In November, James from Bristol, England, had his question regarding the Disavow Tool answered in a video blog from Matt. James’ question in full:
Should webmasters use the disavow tool, even if it is believed that no penalty has been applied? For example, if we believe ‘Negative SEO’ has been attempted, or spammy sites we have contacted have removed links.
Matt’s answer (video transcription): “The primary purpose of the Disavow Tool is, you’ve done some bad SEO, you hired a bad SEO, they messed things up, now you need to clean up. You’ve done as much work as you can to get those links off of the web, but some people don’t respond or for whatever reason you can’t get every single link taken off the web. That’s the perfect use case to do a disavow and say, “OK, these are the links that I’ve tried to get taken down, and I can’t manage to get them taken down.” But at the same time, if you’re at all worried about someone trying to do a negative SEO, or it looks like there’s some weird bot that’s building up a bunch of links to your site, and you have no idea where it came from, that’s a perfect time to use disavow as well.
I wouldn’t worry about going ahead and disavowing links even if you don’t have a message in your Webmaster console. So if you have done the work to keep an active eye on your backlinks, and you see something strange going on, you don’t have to wait around. Feel free to just go ahead and pre-emptively say “this is a weird domain, I have nothing to do with it. I don’t know what this particular bot is doing in terms of making links.” So feel free to go ahead and do disavows even on a domain level. The main purpose is if you’ve made some bad links yourself and you need to clean it up. But if you’re at all stressed, if you’re worried, if you’re not able to sleep at night because you think Google might see it, or we [Google] might get a spam report about you, or there might be some misunderstanding, or an algorithm might rank your site lower, I would feel free to just go ahead and disavow those links as well”.
If you, or a link builder you hired in the past, have built bad links to your website, then your first step should be to try to get them removed as quickly as you can. I certainly wouldn’t recommend paying a webmaster to have them removed, so don’t buy into their pleas for money “because it takes time” etc. But removing as many links as you can, instead of simply disavowing them, is really important.
Firstly, you’re disassociating yourself from those bad sites – leaving the links live would allow people to see that you’re associated with them. Secondly, it shows Google that you’re acknowledging the mistakes you’ve made in the past and that you’re doing your best to remove them. There’s no 30-second fix for your bad work in the past – uploading a Disavow file without any attempt to remove your bad links won’t get you back into Google if you’ve been kicked out.
If you are under negative SEO attack and inheriting bad links, it’s good to update your Disavow file on a regular basis. So if you have some free time each month, update and re-upload a fresh Disavow file; doing so regularly will reduce your chances of being penalised by Google.
5. Make your links from blog comments genuine
In November 2013, Chase from Denver, Colorado, wanted to know what the value of forum signature and comment links is. Chase’s full question was:
Google’s Webmaster Guidelines discourage forum signature links but what about links from comments? Is link building by commenting against Google Webmaster Guidelines? What if it’s a topically relevant site and the comment is meaningful?
Matt Cutts says to make your links from blog comments genuine.
Video transcription: “I leave topically relevant comments on topically relevant sites all the time. So if somebody posts an SEO conspiracy theory, and I’m like “no, that’s not right”, I’ll show up. And I’ll leave a comment that says, “Here’s a pointer that shows that that’s not correct.” Or “here’s the official word” or something like that. And I’ll just leave a comment with my name. I’ll often point to my blog rather than to Google Webmaster blog or something like that because I’m representing myself. Lots of people do that all the time. And that’s completely fine.
The sort of thing that I would start to worry about is this: it’s often better to leave your name, so someone knows who they’re dealing with, rather than “cheap study tutorials” or “fake driver’s license” or whatever the name of your business is. Often that will get a chillier reception than if you show up with your name.
The other thing that I would say is, if your primary link building strategy is to leave comments all over the web, to the degree that you’ve got a huge fraction of your link portfolio in comments and no real people linking to you, then, at some point, that can be considered a link scheme. At a very high level, we reserve the right to take action on any sort of deceptive or manipulative link schemes that we consider to be distorting our rankings. But if you’re just doing regular organic comments, and you’re not doing it as a “OK, I have to leave this many comments a day, every single day, because that’s what I’m doing to build links to my site”, you should be completely fine. It’s not the sort of thing that I would worry about at all”.
This question made me chuckle. It almost goes without saying – there are a lot of junk comments, often produced by bots, left on blog posts (I couldn’t survive without Akismet on my blog) with the sole intention of getting a link from your website.
I would not recommend comment spamming as a link building method – it won’t get you anywhere. Certainly not to the top of Google. There are plenty of spam-fighting tools that block such trash from being left in the comments section and, instead, allow useful conversations to take place. As Matt points out, if you’ve got something useful to add in the comments section, then sticking a link in can be a good thing. Just make sure you’re being genuine and leave a comment to support your link!
6. Guest blog in moderation
In October, Ben Holland from Phoenix, Arizona, asked a simple question on guest blogging and whether there was a risk of his work appearing as though he had paid for it to be published. Ben’s question in full:
How can I guest blog without it looking like I pay for links?
[ youtube https://www.youtube.com/watch?v=OGieiNe6RL4 ]
Video transcription: “Whenever we get a spam report, and we dig into it in the manual webspam team, usually, there’s a pretty clear distinction between an occasional guest blog versus someone who is doing large-scale pay-for-links. So what are the different criteria on that spectrum? If you’re paying for links, it’s more likely that it’s an off-topic or an irrelevant blog post that doesn’t really match the subject of the blog itself. It’s more likely you’ll see keyword-rich anchor text. Whereas with a guest blog, it’s more likely to be, hopefully, someone who’s an expert. There will usually be a paragraph that talks about who this person is and why you invited them to be on your blog. Hopefully the guest blogger isn’t dropping keywords into their anchors nearly as much as these other sorts of methods of generating links.
Guest blogging seems like it’s the fad of the month a little bit, because we do hear a lot of people who are complaining about tons of people just spraying and praying, sending out invitations; “I’m going to guest blog on all these different things.” And sometimes they’re spinning their guest blogs. They’re not even writing unique content for each blog. I don’t think that’s the best way to build links to your site. And so I wouldn’t recommend that as a tactic. Guest blogging is probably the sort of thing that you should be thinking about doing in moderation”.
If you’re paying for posts and links to be published, it’s only a matter of time before you’ll be penalised. So, it’s worth seeking out new tactics for promoting your site.
If you’re acting honestly in getting good links, you have nothing to worry about. If you’re an expert guest posting on authoritative sites, you have nothing to worry about.
7. Don’t make rankings in Google the centre of your universe
Also in October, Matt answered Shubhamstunter from India’s question regarding a narrow focus on website promotion. Shubhamstunter’s question in full:
Since Google has been actively updating its search results, it is hard for people to trust Google anymore. Should one start focusing on getting leads from social media other than search engine results?
Matt Cutts says don’t make rankings in Google the centre of your universe
Video transcription: “Google has always been actively updating our search results. We always will be working on improving our algorithms and improving our search results. That’s what you signed up for if you’re trying to rank highly in Google. SEO is all about change. There’s going to be more change coming. We think that by the end of summer 2013 it’s going to be even harder to spam Google and to rank on Google if you’re using black hat tools and techniques.
So we are always going to be updating, always trying to change things to make things better, always trying to innovate the way that we rank our search algorithms. That’s just the nature of the beast. The goal is always the same; to return the best, highest quality set of search results. As long as you’re trying to make a fantastic site that people love, that’s really compelling, that they’re always coming to, that’s the sort of thing that puts you on the same side as Google.
I am all for having eggs in lots of different baskets. Because if your website goes down, then you could always have a brick-and-mortar business. If your ranking on Google is not as good, then you could have other channels that you can use, from print media advertising to billboards to Twitter to Facebook. You should always have a well-rounded portfolio of ways to get leads, whether it be people walking through your door or Yellow Pages or whatever it is. Because you can’t count on any one channel always working out perfectly.
As long as you have great content, you should do well in Google. But if people are spamming or you hired a bad SEO, that can lead to unpredictable results. The one thing that I can guarantee is Google is going to keep trying to work on making better and better search results. And if you’re an SEO and you’re really interested, or even if you’re a black hat or something like that, there’s always going to be some level of having to adapt or evolve as part of that process. And that’s just the natural way of things”.
Unless you’re running an affiliate site, your business really should not be solely reliant on ranking well in Google’s results to be profitable. There are many other methods you can use to generate leads from your site.
8. Google doesn’t have much time for updating Toolbar PageRank
Matt Cutts revealed in his speech at PubCon, Las Vegas, in October 2013, that Google will no longer be updating the Toolbar PageRank, at least not frequently.
Matt Cutts says Google doesn’t have much time for updating Toolbar PageRank
Video transcription: “A lot of people ask “when are we going to get the next PageRank update?” We have our own internal version of PageRank, it’s always updating; it’s continual. And every single day we have new PageRanks. There’s also an export. And normally it runs once every three months or so. Earlier this year that pipeline broke and we were kinda like “y’know people get a little too obsessed about PageRank anyway, maybe it’s ok to leave that for a little while.” And so we don’t have anybody staffed on trying to revive that pipeline and we don’t want everybody to get too obsessed about PageRank, so we’re probably not going to update PageRank throughout the rest of the year and then we’ll see whether anything happens in 2014″.
It came as no surprise when Matt Cutts announced that Google would stop updating their Toolbar PageRank. PageRank is an old-school metric for “judging the quality of a website” and most of our industry has long since moved on from it.
In the past, PageRank had been used when money was exchanged for links. Webmasters used to build sites with the aim of getting links and building up PageRank, solely to sell links to buyers.
In December 2013 Google did update the Toolbar PageRank. But that doesn’t mean PageRank has regained its previous status.
9. There is nothing wrong with NoFollow links
After generating NoFollow links, Tubby Timmy from the UK asked whether these NoFollow links could potentially hurt his site. Timmy’s question in full:
I’m building links, not for SEO but to try and generate direct traffic. If these links are no-follow am I safe from getting any Google penalties? Asked another way, can no-follow links hurt my site?
Matt Cutts says there is nothing wrong with NoFollow links
Video transcription: “Typically NoFollow links cannot hurt your site – so upfront, very quick answer on that point.
That said, let me just mention one weird corner case, which is if you are leaving comments on every blog in the world, even if those links might be NoFollow, if you are doing it so much that people know you and they’re really annoyed by you and people spam report about you, we might take some manual spam action, for example. I remember for a long time on TechCrunch, any time an article went up, this guy, ‘Anon. TC’, would show up and make some nonsensical comment. It was clear that he was just trying to piggyback on the traffic and drive the traffic from people reading the article directly to whatever he was promoting.
Even if those links were NoFollow, if we see enough mass-scale action that we consider deceptive or manipulative, we do reserve the right to take action. So we carve out a little bit of an exception if we see truly high-scale abuse. But for the most part, NoFollow links are dropped out of our link graph as we’re crawling the web, and so those links that are NoFollow should not affect you from an algorithmic point of view.
I always give myself just the smallest out in case we find somebody who’s doing a really creative attack or mass abuse or something like that. But in general, no. As long as you’re doing regular, direct-traffic building and you’re not annoying the entire web or something like that, you should be in good shape”.
Matt Cutts can only recall one case of a site being hurt by NoFollow links – and that was more a case of comment spam than NoFollow links. NoFollow links are used when you pay for a link (sponsored text, banner ads etc.) – places where you’re willing to advertise to get referral traffic and conversions, not rankings in Google.
As long as you’re making the links that you purchase NoFollowed and you’re not commenting on every blog on the web, I cannot see how your site would be penalised for a NoFollowed link.
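For anyone unsure what “making a link NoFollowed” looks like in practice, it’s just the rel attribute on the anchor tag. A minimal illustration – the URLs are placeholders, not real sites:

```html
<!-- A paid or sponsored link marked so it passes no PageRank. -->
<a href="https://advertiser.example.com/" rel="nofollow">Our sponsor</a>

<!-- A normal editorial link, which does pass PageRank. -->
<a href="https://recommended-reading.example.org/">Further reading</a>
```

The NoFollowed link still sends referral traffic when clicked; it just tells Google not to count it as an editorial vote.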
10. Try to remove links before resorting to the Disavow file
In August, Matt answered Jay from Spain’s question regarding bad backlinks to his website. Jay’s question in full:
Recently I found two porn websites linking to my site. I disavowed those links and wrote to admins asking them to remove those links but… what can I do if someone (my competition), is trying to harm me with bad backlinks?
Matt Cutts says to try to remove links before resorting to the Disavow file
Video transcription: “You’ve done exactly the right thing [Jay]; you’ve got in touch with the site owners and said, “Please don’t link to me. I don’t want to have anything to do with your site.” If those folks aren’t receptive, then just go ahead and disavow those links. As long as you’ve taken those steps, you should be in good shape. But if there’s any site you don’t want to be associated with that’s linking to you, and you want to say, “Hey, I got nothing to do with this site”, you can just do a disavow. You can even do it at a domain level. At that point you should be in good shape and I wouldn’t worry about it after that”.
In an ideal world we wouldn’t have competitors building harmful links to our site. But we don’t live in that utopia and there may be people trying to penalise your website as an attempt to beat you in Google’s rankings. It’s always worth keeping an eye on the backlinks to your site, whether that be via Webmaster Tools, Majestic SEO or whatever tool(s) you use.
11. Google Webmaster Tools will give you examples of bad links if your site has been penalised
Ben Holland from Phoenix, Arizona was busy sending in questions to Matt over the course of the year. Here is a second question from Ben that Matt answered:
Will Webmaster Tools ever tell us what links caused a penalty?
Matt Cutts says Google Webmaster Tools messages will give you a couple of examples of bad links if your site has been penalised.
Video transcription: “If you log into the Webmaster Tools console and see a message, that means that there has been some direct manual action by the web spam team that is somehow directly affecting the ranking of your website. In those cases, right now some of those messages have example links or example URLs that are causing issues for us. We wouldn’t necessarily say that those are the only things, because if you have a million URLs that are offending things, we couldn’t send all one million URLs in an email or even a message, because that’s just going to take too much storage.
Over time, we are going to give more and more information in those messages. I wouldn’t be surprised if you see one, two, three, some number of example URLs or links that give you an idea of where to look in order to find the sorts of things that are causing that particular action. We’re going to keep looking at how we can expand the number of example URLs that we include in messages. Then you’ll have a really good idea about where to go and look in order to help diagnose what the issue is”.
Webmaster Tools doesn’t currently give us a good idea of which links are causing penalties. So, unless you have a wealth of link building knowledge, you may struggle to detect which links need removing. Hopefully, in future, Google will make it easier to know which links to remove.
12. Google are working on showing more examples in Reconsideration Request messages
Instead of relaxing in the sun in July 2013, Matt took the time to answer Adeel from Manchester, UK’s question regarding examples of bad links pointing to his site. Here is Adeel’s question in full:
Client got unnatural links warning (sic) in September ’12 without any example links; 90% links removed; asked for examples in every RR (reconsideration request) but no reply. Shouldn’t it be better to have live/cached “list” of bad links or penalties in GWT? Think about genuine businesses.
Matt Cutts says Google are working on showing more examples in Reconsideration Request messages.
Video transcription: “We’re working on becoming more transparent and giving more examples with messages. I wouldn’t say “Hey, give me examples in a Reconsideration Request” because a Reconsideration Request – we’ll read what you say, but we can really only give a small number of replies. Basically, yes the Reconsideration Request has been granted, or no, you still have work to do. There’s a very thin middle ground which is your request has been processed. That usually only applies if you have multiple web spam actions. And then maybe one has been cleared, but you might have other ones left. But typically you’ll get a “yes” or a “no” back.
There is no field in that request to say, “OK here are some more examples”, but we will work on trying to put more examples in the messages as they go out, or some way where, for example, it would be great if you could log into Webmaster Tools and see some examples there. What I would say is if you have gotten that message, feel free to stop by the Webmaster forum and see if you could ask for any examples. And if there’s any Googlers hanging out on the forum, then maybe we could check the specific spam incident and see whether we might be able to post or provide a few examples of links within that thread”.
Glad to hear Google are keeping the Reconsideration Request process as rigorous as it currently is. Google wants to see websites with clean backlink profiles. They plan to be more transparent with Reconsideration Request messages, which should make it easier for webmasters to work out what needs tidying up on their own websites.
13. Common mistakes people make when uploading a Disavow file
In a short video blog in June, Matt discussed the common mistakes he and his team see people make when they are using the Disavow Tool.
Common mistakes people make when uploading Disavow File.
Here are the common mistakes that Matt discussed:
[list_item]The file that you upload is supposed to be just a regular text file. What we see is people sometimes uploading Word files (.doc) & Excel spreadsheets – that’s the sort of thing that our parser is not built to handle.[/list_item]
[list_item]A lot of the time, on a first pass at a Reconsideration Request, you see people really trying to take a scalpel and pick out individual bad links in a very granular way. For better or worse, sometimes when you’ve got a really bad link profile, rather than a scalpel, you might be thinking more of a machete – you need to go a little bit deeper in terms of getting rid of the really bad links.[/list_item]
[list_item]The domain colon needs to have the right syntax: domain colon and then a domain name. Don’t do domain colon and then http:// or www.[/list_item]
[list_item]A bunch of people, we sometimes see them putting context, or the story, or the documentation for the Reconsideration Request in the Disavow Links text file that they try to upload. That’s really not the right place for it. The right place to give us the context or to describe what’s going on is in the Reconsideration Request.[/list_item]
[list_item]Sometimes people think that Disavow is the be all and end all, the panacea that’s going to cure all their ills. We do want, if you’ve been doing some bad SEO and you’re trying to cure it, in an ideal world you would actually clean up as many links as you can off the actual web. That’s just a really helpful way for us to see, when you’re doing a Reconsideration Request that you’re putting in the effort to try to make sure that all things have been corrected and cleaned up and are not going to happen again.[/list_item]
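Pulling Matt's points together, a valid disavow file is just a plain text file, one entry per line, with `domain:` followed by the bare domain (no http:// or www). The domains below are hypothetical examples, not real link sellers:

```text
# Lines starting with "#" are comments and are ignored -
# this is NOT the place for your Reconsideration Request story.

# Domain-level entries (the "machete" approach): bare domain only,
# no http:// or www prefix after "domain:"
domain:spammy-linkfarm.example
domain:paid-directory.example

# Individual URLs can still be disavowed one at a time if needed
http://low-quality-blog.example/guest-post-123.html
```

Save it as a regular .txt file (not a Word document or spreadsheet) before uploading it via the Disavow Tool.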
I’ve obviously never made a mistake when uploading a Disavow file and cannot relate to what Matt has to say on this subject 😉
14. NoFollow any paid links
In May, Matt discussed in great depth how Google treats advertorials, emphasising the importance of making paid advertorials clear to both the user and search engines alike:
Matt Cutts advises to NoFollow any paid links
Video transcription: “Let’s start with the easiest stuff; editorial content. That’s the meat and potatoes of whatever you’re writing. If you’re a blogger, it’s the primary stuff you’re writing about. If you’re a newspaper, it’s the news articles that you publish, online or in your newspaper. I think people have a pretty good sense about what editorial content is.
So how about advertorial content or native advertising? Well, it’s advertising. But it’s often the sort of advertising that looks a little closer to editorial. But it basically means that someone gave you some money, rather than you writing about this naturally, because you thought it was interesting or because you wanted to.
So why do I care about this? Why are we making a video about this at all? Well, the reason is, certainly within the webspam team, we’ve seen a little bit of problems, where there’s been advertorial or native advertising content or paid content that hasn’t really been disclosed adequately, so that people realize that what they were looking at was paid. So that’s a problem.
We have had longstanding guidance, since at least 2005, that says if you pay for links, those links should not pass PageRank. The reason is that Google, for a very long time, and, in fact, everywhere on the web, people have mostly treated links as editorial votes. They link to something because it inspires passion in them; it’s something that’s interesting; they want to share it with friends. There’s some reason why they want to highlight that particular link. Now, if someone were to come to a newspaper reporter and say, “I’m going to give you some money, can you link within your editorial story that you’re writing, your news article?” That would be deceptive. People would not realize that there was payment involved. And it would really not be fair. So paid links that pass PageRank, change the landscape. It makes it uneven, so that people can’t compete on a level playing field. And that’s what we want to ensure that we have on the web and certainly within Google’s web index.
So what are the guidelines for advertorials or for native advertising? Well there’s two-fold things that you should think about. The first is the on search engine side of things. And search engine wise, you should make sure that, if the links are paid, that is if money changes hands in order for a link to be placed on a website, that it should not flow PageRank. In essence, it shouldn’t affect search engine’s rankings. That’s no different than the guidance we’ve had for years and years and years.
Likewise, if you are doing disclosure, you need to make sure that it’s clear to people. So a good rule of thumb is there should be a clear and conspicuous disclosure. It shouldn’t be the case that people have to dig around, buried in small print, or have to click and look around a long time to find out, “oh this content that I’m reading was actually paid.”
So why are we talking about this now? This isn’t a change in our search engine policies, certainly not in the webspam team. Well, the reason is that we’ve seen some people who have not been doing it correctly. So we’ve seen, for example, in the United Kingdom, a few sites that have been taking money and writing articles that were paid, including keyword rich anchor text in those articles, that flowed PageRank, and then not telling anybody that those were paid articles. That’s the sort of thing where, if a regular user happened to be reading your website and didn’t know that it was paid, they’d really be pretty frustrated and pretty angry when they found out that it was paid. So we’ve taken action on this sort of thing for years and years. And we’re going to keep taking strong action.
We do think it’s important to be able to figure out whether something is paid or not on the web. And it’s not just the webspam team. It’s not just search quality and the websearch results. The Google News team recently published on their blog and said that, if you don’t provide adequate disclosure of paid content, whether it be native advertising, advertorials, whatever, whenever there’s money changing hands, if users don’t realize that sufficiently, because there’s not adequate disclosure, the Google News team mentioned that they might not only remove the paid content, but we’re willing to go up to and including removing the publication from Google News. So I think if you look at Google and you look at our policy on advertorials, it’s been constant for the last several years. But we want to reiterate and make sure that people realize that this can be an issue.
If you are taking money and posting content that people don’t realize is paid or it’s not adequately disclosed, both to people and to search engines, we are willing to take action on that, not just in Google Search results, not just in the webspam team, but also in Google News. And so that’s why it would behoove people to have an abundance of caution whenever they’re considering these things, to just make sure that they do provide adequate disclosure and then it’s abundantly clear to users what’s paid and what’s not paid”.
Matt and Google want to make one thing clear: NoFollow anything you pay for, and this video does a good job of stressing why. Failure to comply can lead to your website(s) being penalised and, as described in the video, even being removed from Google News if you’ve been included. If you are paying for advertorials, you’re likely doing it for referral traffic and/or brand exposure anyway.
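In markup terms, complying is a one-attribute job. A sketch of a hypothetical advertorial link with PageRank flow blocked:

```html
<!-- Paid/advertorial link: rel="nofollow" tells search engines
     not to pass PageRank through it -->
<p><strong>Sponsored post</strong></p>
<a href="https://advertiser.example/product" rel="nofollow">Read our sponsor's buying guide</a>
```

The `rel="nofollow"` attribute satisfies the search engine side; the visible "Sponsored post" label handles the clear and conspicuous disclosure to readers that Matt describes.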
15. Internal links don’t attract penalties
Back in April Matt answered Rob from Los Angeles, California’s question regarding multiple internal links with the same anchor text. Rob’s question in full was:
Do internal website links with exact match keyword anchor text hurt a website? These links help our users navigate our website properly. Are too many internal links with the same anchor text likely to result in a ranking downgrade because of Penguin?
Matt Cutts says there’s no harm in number of internal links as long as it’s natural.
Video transcription: “Typically, internal website links will not cause you any sort of trouble. If you have a normal site – a catalogue site or whatever; you’ve got bread crumbs, you’ve got a normal template there, and that’s just the way people find their way around the site and navigate. You should be totally fine. It’s kind of expected that you’ll have a lot of links that all have the same anchor text that point to a given page. I wouldn’t worry about that”.
Your site’s architecture is crucial to your SEO success. Whilst internal links may not “typically” hurt you, piling them on won’t provide much benefit either: each additional link dilutes the link juice passed through every other link on the page. I have seen great uplifts in key rankings simply by reducing the number of links on the homepage and in the navigation menus.
If you’re concerned about your site’s internal linking structure, it’s worth conducting an SEO audit.
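As an illustration of the kind of natural internal linking Matt describes, a simple breadcrumb trail (hypothetical URLs) repeats identical anchor text across every page in a category without any risk:

```html
<!-- Breadcrumb navigation: every product page in this category links
     to /mens-shoes/ with the same anchor text, which is expected -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &raquo;
  <a href="/mens-shoes/">Men's Shoes</a> &raquo;
  <span>Trail Running Shoe X200</span>
</nav>
```

This is exactly the “normal template” navigation pattern Matt says you shouldn’t worry about.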
16. Fresh content gets a boost in the rankings (before dropping off)
Back in April Sandeep from India noticed that his new pages consistently appeared on pages 4-6 in the first week before seeing them fall away and never be seen again in Google. He wanted to know why this happens to his website. Sandeep’s question in full:
When we create a new landing page with quality content, Google ranks that page on the top 30-50 for targeted keywords. Then why does the rank get decreased for the next 2 to 3 weeks? If pages didn’t have required quality, then why did it get ranked in the first week?
Matt Cutts says fresh content often sees an initial boost in the rankings before dropping off.
Video transcription: “That’s a fun question because it opens up how writing a search engine is kind of a complex task. You’re basically trying to make sure that you return the best quality result, but you also have to do that with limited information. For example, in the first minute after an earthquake, you might have different people saying different things. 10 minutes after an earthquake, you have more information. An hour after an earthquake, you have a lot more. With any event that has breaking news, it’s the sort of thing where it can be hard to know – even if multiple people are all saying the same thing, and one person might be the original author, one might be using that RSS. It can be difficult to suss out who is – where was this content appearing originally? And over time, over the course of hours or days or weeks, that gets easier. But it can be harder over the course of just minutes or hours. So a lot of the times, whenever you see something ranking for a while, we’re taking our best guess. Then as more information becomes available, we incorporate that. Then eventually, typically, things settle down into a steady state. Then when there’s a steady state, we’re typically able to better guess about how relevant something is.
It is definitely the case that there are some queries that deserve freshness (QDF). There are some queries that are better served by evergreen content that’s been around for a long time. And when there’s a new blog post or when there’s a breaking event or when somebody has just published something on that topic, it can be difficult to assess how relevant something is. So a lot of people think, “Oh, there should be one – instead of rankings, it should be completely uniform. Everybody in the world should see the exact same thing.”
The fact is we have different results for people in different countries, even in different cities. And the results can change over time. Not just because links change or because the content on the page changes, but we basically are able to better assess which pages are more relevant. It’s not just the case that you write a static algorithm. Your algorithm also has to deal with limited information and then how does it deal when it has better information and better information. So it is expected that, over time, the rankings will fluctuate. They will change as we’re trying to get a better idea based on the information we have about what the most relevant results are”.
It’s common to see blog posts and news stories jump high into Google’s results and then drop off a few days later, perhaps when they’re no longer as relevant or when bigger websites have covered the same story. I can’t say I search as low as pages 4-6 to find my results – it’s page 1 or nothing as far as I’m concerned when getting traffic from Google!
17. Links to your site from link sellers have no value
In April, Matt answered Sergey L of New York’s question regarding links from penalised websites. Sergey’s question in full:
If some site that is linking to my site gets penalized for purchasing links, will my site get affected by that penalty?
Matt Cutts says links to your site from link sellers have no value.
Video transcription: “Normally what happens is when we find a site that’s selling links, its PageRank goes down by 30%, 40%, 50%, as a visible indicator that we’ve lost trust in that domain. And it typically also loses its ability to send PageRank going forward. For example, suppose we have a selling site that is selling links to a buying site. And the selling site also happens to link to you. The sort of situation that might happen is we find out that that’s a link seller, and as a result we don’t trust the outgoing links from that site. The most likely scenario is if there is a link selling site and they get caught for selling links, and they just happen to be linking to you, the value of that link that the site was providing, it just goes away. Maybe you were benefitting getting a little bit of PageRank from that site. Now, since we don’t trust that site, you wouldn’t be getting that benefit. Typically, it’s not the sort of thing where you get affected by that penalty in the sense that you get demoted or anything harsh like that. It’s just you no longer get the benefit of the link from that site, because we don’t trust it anymore”.
It doesn’t come as a surprise that you won’t get any benefit from a link from a site that has been penalised. Whether you paid for that link or not, you need to be careful about where your links come from. Avoid sites with obvious paid links that break Google’s guidelines, and aim to build relationships with websites that have a good following and don’t need to charge to publish content (sites that make money in other ways and don’t rely on charging for guest posts to turn a profit).
18. Dodgy link building still works (occasionally)
Matt Cutts decided to answer Jadoray from Idaho’s question on April Fool’s day last year. The question was about seeing sites at number one despite a bad backlink profile and why they were not being moved down the rankings or even penalised. Jadoray’s question in full:
I’ve seen multiple websites that appear in the #1 spot for various keywords, whose backlink profiles are pretty low quality (i.e. lower quality blog pages). Why wouldn’t Penguin have moved these sites further down the rankings?
Matt Cutts admits dodgy link building is still working occasionally
Video transcription: “There’s a lot of possible reasons. One is Penguin is geared for certain types of spam, but it’s not geared for every type of spam. For example, it doesn’t help with hacked sites. So if a site is being propelled up the rankings on the basis of illegal hacking of sites, that’s not something that Penguin attempts to tackle. There are other algorithms that try to tackle that. The simplest explanation might just be that we don’t want that to rank, but the algorithms haven’t gotten good enough yet. We need to make sure that the things get better. If that’s the case, we’re happy to get spam reports. Or if you want to show up on our Webmaster forum and say, “Hey, here’s a site that doesn’t look like it should be ranking”, we’re happy to hear feedback like that.
The other thing could be that unless you were the site owner, Google doesn’t give you all the links that we know of. So you might be getting the links from some other source. You might not get a complete or exhaustive list of links in that case. You might be seeing some of the lower PageRank links or links that look a little spammy. But it could be the case that there are some good links. Maybe the site has a link from CNN, a link from the New York Times, a link from the New Yorker, a link from the Chicago Tribune. If that link gets missed in the report that you see, then you might look like, “Oh, well, a lot of these links are not as high quality.” But you didn’t realize there were a few high quality links that you weren’t seeing. It can be a lot of different reasons. Again, sometimes you’re doing queries that are based on the content of the page more so than links. So it can vary a lot based whether you’re in the head, whether you’re in the tail – all sorts of different factors.
If you see something that you think might be link spamming or have a really low quality profile and it’s ranking higher than you think it should, especially if it looks pretty egregious or pretty unhappy, feel free to show up in the Webmaster forum or even do a blog post about that. We read the SEO blogs. And when people say “Hey, here’s a site that doesn’t look like it should be ranking where it is”, there are some SEOs who don’t like that, because they don’t want to have people blow the whistle. That’s the sort of stuff where we really enjoy getting those data points and thinking about “did the algorithm miss something? Do we need to improve anything in the next iteration?” Hopefully, over time we’ll continue to find more and more of these sorts of sites and return higher quality sites instead”.
Even in late 2013 I am still finding numerous searches in Google returning low quality sites at the top – sites with terrible backlinks (some quite literally have nothing but article directory and/or general directory links) ranking in positions 1, 2 and 3. I can accept Google isn’t perfect and that people are still trying to manipulate its search results in an underhand manner. Sometimes this seems to happen because there are simply no, or very few, competing sites with better-than-junk backlink profiles. That’s not always the case, though, and I’m still trying to figure out why some of my clients’ sites with good link profiles rank below ones with poor links. If Matt suggests reporting those sites, then I guess I should consider it.
19. No link juice is lost in a 301 redirect
In February 2013, Matt put to bed any worries we had regarding lost benefit from 301 redirect pages when he answered Sam Harries from Exeter, England’s question. Sam’s question in full:
Roughly what percentage of PageRank is lost through a 301 redirect?
Matt Cutts says no link juice is lost in a 301 redirect.
Video transcription: “The amount of PageRank that dissipates through a 301 is almost exactly, is currently identical to the amount of PageRank that dissipates through a link. So they are utterly the same in terms of the amount of PageRank that dissipates going through a 301 versus a link. That doesn’t mean “use a 301.” That doesn’t mean “use a link.” It means use whatever is best for your purpose, because you don’t get to hoard or conserve any more PageRank if you use a 301, and likewise, it doesn’t hurt you if you use a 301″.
It is reassuring to hear from Matt that a 301 redirect loses no more link benefit than a normal link would.
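For reference, a 301 is set server-side. A minimal sketch for Apache via .htaccess, using hypothetical paths (nginx and other servers have their own equivalents):

```apache
# Permanently (301) redirect a moved page to its new URL.
# Per Matt's answer, this passes PageRank just as an ordinary link would.
Redirect 301 /old-page.html https://www.example.com/new-page.html
```

The key point is to use the permanent (301) status code rather than a temporary (302) one when a page has genuinely moved for good.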
We look forward to seeing more videos from Matt and the Google Webmaster Help team in 2014.
Thanks to the team behind www.theshortcutts.com for making it quick and easy to locate all of Matt’s videos 🙂
If you’d like to build better links
If you want to start building high quality, ethical links to your website, contact us today, and we’ll advise you on your next steps.