Latest SEO & IM News
Google issues unusual warning to repeat offenders of their guidelines concerning manual actions
Google has generally avoided communicating directly with webmasters who deliberately break their guidelines, but on the 18th of September they did just that. The post on The Official Google Webmaster Central Blog specifically addresses the scenario of a website receiving a manual action for a bad link: once the webmaster has nofollowed the link and the penalty has been lifted, they break the guidelines again by removing the nofollow tag from the link.
Quoted directly from The Official Google Webmaster Central Blog
“However, some sites violate the Webmaster Guidelines repeatedly after successfully going through the reconsideration process. For example, a webmaster who received a Manual Action notification based on an unnatural link to another site may nofollow the link, submit a reconsideration request, then, after successfully being reconsidered, delete the nofollow for the link. Such repeated violations may make a successful reconsideration process more difficult to achieve. Especially when the repeated violation is done with a clear intention to spam, further action may be taken on the site.
In order to avoid such situations, we recommend that webmasters avoid violating our webmaster Guidelines, let alone repeating it. We, the Search Quality Team, will continue to protect users by removing spam from our search results.”
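For readers unfamiliar with the mechanics the quote describes, "nofollowing" a link is just a matter of adding a rel attribute to the anchor tag. A hypothetical before-and-after (the URL and anchor text are illustrative, not from Google's post):

```html
<!-- Before remediation: a normal link that passes ranking credit -->
<a href="https://example.com/partner">Partner site</a>

<!-- After remediation: rel="nofollow" tells Google not to pass
     ranking credit through this link -->
<a href="https://example.com/partner" rel="nofollow">Partner site</a>
```

The repeat violation Google describes is simply reverting the second form back to the first once the manual action has been lifted.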
When it comes to fighting webmasters who break Google's ever-changing guidelines, Google has always been sketchy with its actions and responses. In my opinion and experience, it almost feels like a personal vendetta from the spam team. The spam team is quite secretive and, as far as I know, kept separate from Google's other departments. They spend a lot of time trying to defeat spammers, and I suspect that when a webmaster circumvents their carefully prepared defences, they take it personally. This warning could be Google reaching out to blackhats in good faith, or it could simply be that many webmasters who don't quite understand what they're doing in SEO are falling victim to the ever-changing guidelines and their harsh, potentially long-lasting penalties. Either way, I don't see much point in issuing this warning: blackhats will simply concentrate on what works, not on what Google says, and those without advanced SEO skills won't understand what it means or what to do about it.
Google to communicate more on hacked sites in Search Console (Webmaster Tools)
In 2015 Google recorded a 180% increase in the number of hacked sites and a 300% increase in hacked-site reconsideration requests. It is clear that Google is putting extra focus on handling hacked sites in a better way for the webmasters who fall victim. Google condensed the changes into three points:
1. Improving communications with webmasters of hacked sites.
2. Better tools including auto-removal of some hacked manual actions.
3. Soliciting your feedback and taking action.
However, the most important snippet I took from this article is this:
“we’re beta testing the automated removal of some hacked manual actions. In Search Console if Google sees a “Hacked site” manual action under “Partial matches”, and our systems detect that the hacked content is no longer present, in some cases we will automatically remove that manual action. We still recommend that you submit a reconsideration request if you see any manual actions, but don’t be surprised if a “Hacked site” manual action disappears and saves you the trouble!”
I think this is a fantastic step for Google which will help take some of the pain away from cleaning up the mess after a recent hack. If you've ever been hacked you'll know just how tiresome it can be to clean up the site and get it re-included in Google's search results. Google has always been slow to react to a webmaster's changes (if at all!), but this gives webmasters a little more light at the end of the tunnel should something go wrong.
Source: Google Webmaster Central
Google makes changes to the layout of search pages for local search queries
Google is shaking things up in local search with changes to the layout of its pages for local search queries. If you're invested in local search then this will have an impact on your listing, and you should read the changes carefully.
Google makes changes to Facebook rich snippets in their search results
David Markovich posted on Twitter that he could no longer see rich snippets for Facebook pages. For Facebook page owners whose pages rank, this may have an impact if they are targeting specific keywords with the page; if a Facebook page is targeted purely at a brand, the impact should be minimal. We are not entirely sure why Google made this change, but it may have something to do with artificially inflated Facebook likes and the gaming of rich-snippet ratings.
In the past week I realized fb star ratings no longer show up in the serps. Anyone know why? pic.twitter.com/uZK2mULXH8
— David Markovich (@DavidMarkovich_) September 21, 2015
Facebook Ads Manager changes
Facebook made several changes to Ads Manager which you may need to know about if you run any kind of extensive campaign setup.
Latest Top Threads in SEO & IM Forums
A simple method to drive traffic (low quality)
The purpose of the method is to generate traffic that does not come from Google. It uses WP Robot to scrape and repost content to the target blog, then automatically links to it through Twitter. This will generate some traffic, as some Twitter users may not have seen the scraped content before. I think the method will drive some traffic, but for the money invested I don't see it being useful. Making the traffic useful is where the real magic is, and that would depend entirely on what you need the traffic for. In my opinion, sending this traffic to an affiliate site wouldn't provide a return on investment. It may pick up the odd subscriber for a blog, but the user would have to be fairly gullible to click the Twitter spam links in the first place; personally, I unfollow accounts that spam valueless links all day.
I think this technique could be altered to help promote some pieces of content from blogs, but it would need extra time invested in reviewing the content and tweets to make them convincing, and in building up the accounts. Automation is key to having an edge over the competition, and there is room for improvement in this technique.
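The one genuinely reusable piece of the method above is the automation step: turning each reposted article into a tweet. A minimal sketch of that step in Python, assuming nothing about WP Robot's internals — the function name and the 23-character t.co URL allowance are illustrative, not the thread's actual code:

```python
# Hypothetical sketch of the tweet-composition step in an
# auto-posting pipeline like the one described above.

TWEET_LIMIT = 280
URL_LENGTH = 23  # Twitter wraps every URL in t.co and counts it as 23 chars


def compose_tweet(title: str, url: str) -> str:
    """Build "title url", truncating the title with an ellipsis
    so the whole tweet fits within the character limit."""
    budget = TWEET_LIMIT - URL_LENGTH - 1  # minus the separating space
    if len(title) > budget:
        title = title[: budget - 1].rstrip() + "…"
    return f"{title} {url}"


if __name__ == "__main__":
    print(compose_tweet("New post on link audits", "https://example.com/p/1"))
```

Posting the resulting string would still require the Twitter API; this only shows the formatting logic that keeps automated tweets within the limit.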