Google Change of Address with Questionable Backlink Profile
-
We have a .com domain where we are 301-ing the .co.uk site into it before shutting it down - the client no longer has an office in the UK and wants to focus on the .com.
The .com is a nice domain with good trust indicators. I've just redesigned the site, added a wad of healthy structured markup, had the duplicate content mostly rewritten - still finishing off this job but I think we got most of it with Copyscape.
The site doesn't have many backlinks yet, but we're working on this too, and the ones it does have are natural, varied and from trustworthy sites. We also have a little feature on the redesign coming up in .Net magazine early next year, so that will help.
The .co.uk on the other hand has a fair few backlinks - 1489 showing in Open Site Explorer - and I spent a good amount of time matching the .co.uk pages to similar content on the .com so that the redirects would hopefully pass some pagerank.
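For reference, the page-level matching described above can be expressed as mod_rewrite rules like the following - a sketch only, assuming the .co.uk sits on an Apache host, with made-up paths standing in for the real URLs:

```apache
# Hypothetical .htaccess on the .co.uk site (mod_rewrite assumed enabled).
# Map each old page to its closest equivalent on the .com with a 301,
# so any link equity has the best chance of being passed.
RewriteEngine On
RewriteRule ^services/web-design/?$ http://www.example.com/services/web-design/ [R=301,L]
RewriteRule ^about/?$ http://www.example.com/about-us/ [R=301,L]
# Catch-all for anything without a matching page: send to the homepage
RewriteRule ^(.*)$ http://www.example.com/ [R=301,L]
```

The paths and example.com are placeholders, not the client's actual URLs - the point is one rule per matched page, with a catch-all last.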
However, approximately a year later, we are struggling to grow organic traffic to the .com site. It feels like we are driving with the handbrake on.
I went and did some research into the backlink profile of the .co.uk, and it is mostly made up of article submissions, a few on 'quality' (not in my opinion) article sites such as ezine, and the majority on godawful and broken spammy article sites and old blogs bought for seo purposes.
So my question is: in light of the fact that the SEO company that 'built' these shoddy links will not reply to my questions as to whether they received a penalty notification or noticed a Penguin penalty - and the fact that they have also deleted the Google Analytics profiles for the site - how should I proceed?
**To my mind I have 3 options.**
1. Ignore the bad majority in the .co.uk backlink profile, keep up the change of address and 301s, and hope that we can drown out the shoddy links by building new quality ones to the .com. Hopefully the crufty links will fade into insignificance over time. I'm not too keen on this course of action.
2. Use the disavow tool for every suspect link pointing to the .co.uk site (there's no way I will be able to get the links removed manually). The advice I've seen also suggests submitting a reinclusion request afterwards - but this seems pointless considering we are just 301-ing to the new (.com) site.
3. Disassociate ourselves completely from the .co.uk site - forget about the few quality links to it and cut our losses. Remove the change of address request in GWT and possibly take the site down altogether, returning 410 headers for it just to force the issue. Clean slate in the post.
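If we go the clean-slate route, a minimal sketch of forcing 410s site-wide (again assuming an Apache host with mod_rewrite - not tested against the client's actual setup) would be:

```apache
# Hypothetical Apache config for option 3: retire the .co.uk entirely.
# The [G] flag makes every request return "410 Gone" instead of redirecting,
# signalling to crawlers that the content is permanently removed.
RewriteEngine On
RewriteRule ^ - [G,L]
```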
What say you mozzers? Please help, working myself blue in the face to fix the organic traffic issues for this client and not getting very far as yet.
-
Thanks Ryan, that's definitely where our focus is, especially given that resources are limited for the time being.
-
IMO there are only 2 reasons anyone should use the disavow tool:
1. You have been given a manual warning of unnatural links in WMT.
(If you're not sure if you've got a manual warning then submit a reconsideration request. You'll either get a message back saying there is no warning or they'll tell you what the warning is.)
2. You are certain that you have been affected by Penguin. This usually means correlating a large drop in rankings and traffic with the date of a known Penguin update (Apr 24, May 25, Oct 5).
It's going to be hard in your situation though to know if you've got a Penguin issue if you don't have access to the analytics from before. BUT, you mentioned you did the 301s over a year ago so I'm thinking this means you have access to a year's worth of analytics. All of Penguin happened in the last year so you should be able to figure out if Penguin affected you.
If you go disavowing a bunch of links you could do more harm than good. Plus, if you HAVE been affected by Penguin you can make a pile of changes and you're not going to see an improvement until there has been another Penguin refresh.
The best option would be to have someone experienced with Penguin have a look at your backlink profile and analytics and advise you from there.
Another option would be to remove the 301 and see what happens. There's a good chance that you'll see a drop in rankings if there are any good links in the mix. But it's possible that you could see no drop and then when Penguin refreshes see an improvement.
Personally, if the old backlink profile seems to be entirely spam I would remove the 301s because Penguin is going to continue to get more and more effective at devaluing spam tactics. So, even if you weren't being affected by Penguin now, it could happen in the future. I would only keep the 301s if there were some really good links that I knew were natural. In this case, I might consider 301ing and using the disavow on the bad links. But if you're going to do this you really need to know what you're doing.
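For anyone weighing that last option: a disavow file is just a plain-text list, one URL or `domain:` entry per line, with `#` comment lines, uploaded against the .co.uk property. A hypothetical sketch (the domains here are placeholders, not real offenders):

```text
# Whole article directories and bought blogs - disavow at the domain level
domain:spammy-article-site.example
domain:old-seo-blog.example
# Individual bad URLs where the rest of the domain is fine
http://article-directory.example/article/12345-best-widgets
```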
And one final thought - it may be worthwhile to have someone evaluate the links that you are getting to the new site. I have seen a lot of people think they are building good quality links when really they're using tactics that no longer work. It could be that your methods of growing the traffic just aren't effective and maybe there is no penalty after all!
-
If you have clear indications of which backlinks are spammy, I would suggest going through the disavow tool. The obvious ones you mention that stick out are the article submission links and broken links on shoddy websites.
You always have to be very careful with the disavow tool, though. But if you are seeing clearly spammy links, get rid of them.
In terms of a penalty, what do your keyword rankings look like? A penalty will more than likely come in parallel with a rankings drop. Did you see big dips in rankings? Are you link building properly for the new site?
You may be finding that PageRank just isn't being passed over as well as you would have expected from the .co.uk. Doing some good quality content marketing and link building on the .com site will more than likely help.
-
Sounds like a decent plan, but the main focus shouldn't be on disavowing, but rather on earning new, good quality links back to your site. At least in my opinion. Best of luck
-
Thanks Ryan, I was gravitating towards disavowing the real garbage pointing to the .co.uk site - there are a few links I think I'll keep.
I'm wary of the risk of using the tool in light of how it has been presented - with big warning labels - but at the same time the .co.uk is not our focus, just a secondary site we hoped to get some residual PR and redirect referrals from - so if things go wrong with the disavow tool it's probably not a total train smash.
-
If you see your rankings dropping, chances are you have some backlinks working against you. I wouldn't go hog wild and disavow everything, as the link profile could have some spots that help you. I can't be specific without looking at it.
The best thing you can do, in my opinion, is spread the good news about the new site and redesign, and not "build" links from it, but "earn" links. Share your awesome new content and get people to like it so much that you earn some new traffic and links. You are on the right track by learning how to do it right (versus the old, shoddy SEO company that didn't do much good), but now you have to put the work in and do it right.
There are plenty of good methods that work for getting you some good, organic, quality traffic that will help your SERPs. If you feel you are in over your head, there are plenty of people on here who would be more than willing to help you out.