Google has just launched a helpful new Manual Actions viewer feature in Google Webmaster Tools which does pretty much what it says on the tin: it lets you proactively check whether your website currently has any manual actions applied to it for spamming search results, without having to wait for an email to drop into your inbox telling you so.
When Are Manual Actions Taken On Your Website?
Google have confirmed that they are willing to take manual action to remove spam (for example malicious content) or, on some occasions, to demote content (for example duplicate content) when there are violations of their quality guidelines. They also confirm that they will remove content for spam, legal, security or safety reasons, and will judge the most appropriate manual action for each individual situation.
On top of this, Google have listed, and provided videos about, all the manual actions you can receive, what they think of these, and the steps you can take to remedy them. Look through them all and one idea sits at the core: making sure users get the best and most relevant experience when they use Google to find pages matching their needs. It may sound counter-intuitive, but I always say the secret to successful SEO is to make it appear to search engine bots that you didn't do any SEO at all: you have simply built natural, quality content of relevance.
After remedying any of these manual actions, you will likely want to submit a reconsideration request to Google. Whenever you do this, think of Google like the taxman: they like detail and documentation! To regain their trust and have any penalties removed, give them as much detail as possible about why the issue happened, what you have done to correct it, and why it is unlikely to happen again.
The Manual Actions List:
Cloaking – in a nutshell, this is all about not treating Googlebot any differently from a regular user. Some people may have no 'black hat' or deceptive reason for cloaking, but regardless of your intention, Google does not allow it in most circumstances. If you are concerned that you are accidentally cloaking, the 'Fetch as Google' feature in Webmaster Tools can help you check what Googlebot actually sees. If you have your pages translated into several languages for different geographic locations, then Googlebot (currently crawling from the USA) will get the English page while a user in Germany gets the German page. Google is clear that this is not considered cloaking, because you are indeed treating Googlebot the same as any other user: which translated page you serve depends only on the visitor's IP address location, whether that visitor is Googlebot or not.
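To make the distinction concrete, here is a minimal sketch of the geo-based language serving Google describe as acceptable, contrasted (in comments) with the user-agent branching that would be cloaking. The `lookup_country` helper, the page filenames and the IP addresses are all hypothetical stand-ins for illustration, not a real geolocation service.

```python
# A minimal sketch (not Google's implementation) of geo-based language
# serving: the page served depends only on the visitor's IP location, and
# Googlebot is treated exactly like any other visitor from the same place.

TRANSLATIONS = {"DE": "page_de.html", "US": "page_en.html"}

def lookup_country(ip_address: str) -> str:
    # Hypothetical stand-in for a real GeoIP database lookup.
    demo_table = {"203.0.113.5": "DE", "198.51.100.7": "US"}
    return demo_table.get(ip_address, "US")

def choose_page(ip_address: str) -> str:
    # Same logic for every visitor - bot or human - so this is not cloaking.
    return TRANSLATIONS.get(lookup_country(ip_address), "page_en.html")

# The cloaking anti-pattern, by contrast, branches on WHO is asking:
#
#   if "Googlebot" in user_agent:
#       return "keyword_stuffed_page.html"   # different content for the bot

print(choose_page("203.0.113.5"))   # page_de.html
print(choose_page("198.51.100.7"))  # page_en.html
```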
Hidden Text And Keyword Stuffing – hidden text is text visible to search engines but not to users. The obvious example Google use is white text on a white background, which our eyes wouldn't see but a bot would. Keyword stuffing is a catch-all term covering many different techniques, such as repetition of the exact same word, piling up similar words, and so on. The overarching test for keyword stuffing is whether it would look or read unnaturally as a piece of prose to a human. The remedy is simple: locate the offending text and edit it.
Read more on hidden text and keyword stuffing here.
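Since the test is whether the copy reads naturally, a crude self-check is to measure whether any single word dominates your prose. The sketch below is only an illustrative heuristic of my own; the 5% threshold is an arbitrary assumption, not anything Google have published.

```python
# Rough keyword-stuffing self-check: flag copy where one word dominates.
# A triage heuristic for your own writing only - not Google's actual test.
import re
from collections import Counter

def top_word_density(text: str, min_length: int = 4):
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if len(w) >= min_length]
    if not words:
        return "", 0.0
    word, count = Counter(words).most_common(1)[0]
    return word, count / len(words)

sample = ("Cheap widgets! Buy cheap widgets here. Our cheap widgets are the "
          "cheapest widgets. Cheap widgets shipped fast.")
word, density = top_word_density(sample)
if density > 0.05:  # arbitrary threshold - tune it, or just read the copy aloud
    print(f"'{word}' is {density:.0%} of the copy - would a human write this?")
```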
Pure Spam – anything a regular person would instantly identify as spam cluttering their search results, such as black hat techniques and auto-generated gibberish. This type of action is usually site-wide and the penalty is removal from search results. Because of its nature, most website owners don't submit reconsideration requests, as the benchmark for convincing Google that you and your content can be trusted is high. The most likely scenario for a successful request is if you take over a website the previous owner spammed and you wish to clean it up and run it.
Thin Content With Little Or No Added Value – this is somewhat self-explanatory and is about your site lacking original, value-added content. It may take the form of:
- doorways – sets of pages that differ from one another by only a word or a very small detail, e.g. the name of a location (see the sketch below for one rough way to spot these).
- thin affiliates – affiliate content is where you refer visitors to another website for a service or product and take a cut, which is fine, but only if you add substance to it; otherwise it's considered 'thin'. To judge this, ask yourself: does your page add any information the user wouldn't get by landing on the product page directly? For example, are you reviewing the product or giving a recommendation?
- thin syndication – syndication is pulling content in from other sites, which again is fine if it's quality and relevant to your page; but if it's just there to mass-populate a site, it is considered 'thin' and, again, not popular with Google.
Google suggest removing such weak content or bolstering your site by creating your own original content – this might include some content from other sources, but with your own angles and views added to it.
Read more on thin content with no value here.
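For the doorway-page flavour of thin content, near-identical pages are often easy to spot mechanically. Here is a minimal sketch using only the Python standard library; the URLs, page texts and the 85% similarity threshold are made-up assumptions for illustration.

```python
# Spot doorway-style pages: near-identical copies that differ only by a
# swapped-in word such as a location name. Standard library only.
from difflib import SequenceMatcher
from itertools import combinations

pages = {
    "/plumber-london": "Best plumber in London. Call our London team today.",
    "/plumber-leeds":  "Best plumber in Leeds. Call our Leeds team today.",
    "/about":          "We are a family firm founded in 1990 with 12 engineers.",
}

for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
    similarity = SequenceMatcher(None, text_a, text_b).ratio()
    if similarity > 0.85:  # arbitrary threshold for "suspiciously similar"
        print(f"{url_a} and {url_b} are {similarity:.0%} alike - doorway pages?")
```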
Unnatural Links From Your Site – this is about Google losing trust in your site because of where you are creating links to. A link given on merit or by editorial choice is natural; unnatural links often come from creating links en masse through link selling, forum spamming and the like. A good test of whether a link is natural: would you create it if you weren't trying to increase someone's search engine rankings? First identify the unnatural links, then decide which to remove and which to retain. You may have reasons to retain some, such as sponsorship; in that case, make sure the retained links do not pass PageRank, for example by adding rel="nofollow" to them or redirecting them through an intermediate page blocked by robots.txt (see the sketch below).
Read more on unnatural links from your site here.
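Below is a minimal sketch of the nofollow approach, assuming BeautifulSoup (`pip install beautifulsoup4`) and a made-up snippet of HTML. How you identify which links are sponsored is up to you; matching on "sponsor" in the URL here is purely illustrative.

```python
# One documented way to stop a retained (e.g. sponsored) link from passing
# PageRank: add rel="nofollow" to the anchor tag.
from bs4 import BeautifulSoup

html = '<p>Thanks to <a href="https://sponsor.example.com">our sponsor</a>.</p>'
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a"):
    if "sponsor" in link["href"]:  # illustrative test for picking paid links
        link["rel"] = "nofollow"   # tells search engines not to pass PageRank

print(soup)
# <p>Thanks to <a href="https://sponsor.example.com" rel="nofollow">our sponsor</a>.</p>
```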
Unnatural Links To Your Site – this is the opposite of the above: Google lose trust in your site because of where you are receiving links from, such as blog spamming. To fix this, they suggest you reach out to the sites in question and ask them either to remove as many of the links as possible or to stop them passing PageRank (with rel="nofollow", for instance); links you cannot get changed can be disavowed, as sketched below. You want to replace such links with links of merit. This is why it's important to hire SEO services which build tailored, quality links rather than link farming en masse – which is worryingly more common than you might think (or than others admit!).
Read more on unnatural links to your site here.
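When site owners don't respond, Google's disavow tool accepts a plain-text file listing the links to ignore. The sketch below writes one in the documented format (UTF-8 text, one URL or `domain:` entry per line, `#` for comments); the domains and URLs are made-up examples, not real offenders.

```python
# Generate a file in the format Google's disavow tool expects.
spammy_domains = ["spam-directory.example.com", "link-farm.example.net"]
spammy_urls = ["http://blog.example.org/comments?page=12"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Sites we contacted but could not get links removed from\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")  # disavows every link from the domain
    for url in spammy_urls:
        f.write(url + "\n")            # disavows links from this page only
```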
User-Generated Spam – this is most common on forums, where users may leave spam comments or create spam profiles, but it can also happen with blog comments and the like. To avoid it, regularly monitor the parts of your site where users can interact. If you do receive a manual action, it will usually apply only to the affected part of your site, provided the rest of your site is of quality. This spam is fairly easy to clean up, and I would also suggest preventative maintenance to stop such users coming back; the video has some good ideas on how to achieve this.
Read more on user-generated spam here.
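As one small example of preventative maintenance, a first-pass filter can hold link-heavy comments for human review before they go live. The sketch below is a deliberately crude heuristic of my own, not anything Google prescribe; the two-link threshold is an assumption to tune for your site.

```python
# Crude first-pass filter for user-generated spam: hold link-heavy comments
# for moderation. A triage aid only - not a replacement for monitoring.
import re

def needs_review(comment: str, max_links: int = 2) -> bool:
    return len(re.findall(r"https?://", comment)) > max_links

comments = [
    "Great post, thanks for the breakdown!",
    "cheap pills http://a.example http://b.example http://c.example",
]
for comment in comments:
    status = "HOLD FOR REVIEW" if needs_review(comment) else "publish"
    print(f"{status}: {comment[:50]}")
```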
Google also mentioned that manual actions can be taken against spammy free hosts! Now I know free is good, but it is not always good: if you are considering a free host, please do a thorough due-diligence check before signing up, as there are many sites online where you can check a host's reputation. Finally, there is the 'Unnatural links to your site - impacts links' message, which is slightly different from the other unnatural links actions mentioned above; below is a video of Matt Cutts explaining it.
Share this with as many webmasters and webmaster resources as you know; there is no excuse not to read and digest this information. If you have any questions, please drop me a comment.
Regards