Google still listing old domain
-
Hi
We moved to a new domain back in March 2014, redirected most pages with a 301, and submitted a change of address request through Google Webmaster Tools. A couple of pages were left as 302 redirects because they had rubbish links pointing to them and we had previously had a penalty.
Google was still indexing the old domain and our rankings hadn't recovered. Last month we took away the 302 redirects and switched to a blanket 301 approach from the old domain to the new, the thinking being that, as the penalty had been lifted from the old domain, there was no harm in sending everything to the new domain.
Again, we submitted the change of address in Webmaster Tools as the option was available to us, but it's been a couple of weeks now and the old domain is still indexed.
Am I missing something? I realise that the rankings may not have recovered partly due to the disavowing/disregarding of several links, but I'm concerned this may be contributing.
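One quick sanity check in a setup like this is a header-only request against an old URL, which shows whether it really returns a 301 and where it points; the path and destination domain below are placeholders, not the actual site structure:

```bash
# Check the status code and redirect target of an old-domain URL
curl -I http://www.fhr-net.co.uk/example-page

# The hoped-for response is a permanent redirect straight to the new domain:
# HTTP/1.1 301 Moved Permanently
# Location: http://www.new-domain.example/example-page
```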
-
Hi
I now have a robots.txt for the old site, and I created a sitemap by replacing the current domain with the old one and uploaded it.
Weirdly, when I search for the non-www version of the old domain, the number of pages indexed has increased!
According to WMT, the crawl was postponed because robots.txt was inaccessible. However, I've checked and it returns status 200, and the Robots.txt Tester says it's successful, even though it never updates the timestamp.
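For reference, a minimal robots.txt on the old domain that keeps everything crawlable (so Googlebot can actually reach the 301s) and declares the uploaded sitemap might look like this; the sitemap filename is an assumption:

```
# Allow all crawling so Googlebot can follow the old domain's 301s
User-agent: *
Disallow:

# Assumed location of the sitemap uploaded for the old domain
Sitemap: http://www.fhr-net.co.uk/sitemap.xml
```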
-
Hi Marie
Many thanks for your response,
I've just looked in Webmaster Tools at the old domain and the option to change domains is there again, but I also noticed, when looking at the crawl errors, a message along the lines of "crawl postponed as robots.txt was inaccessible".
At the moment it's just a blanket redirect at IIS level, so following your advice I'll re-establish the old site's robots.txt and a sitemap and see if Google crawls the 301s to the new domain.
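For context, a blanket 301 at IIS level is commonly done with httpRedirect in web.config; the sketch below is only a guess at that kind of setup, with a placeholder destination domain, not the actual config in use here:

```xml
<!-- Hypothetical web.config on the old domain's IIS site: a site-wide
     permanent (301) redirect that preserves each request's path.
     Note: to serve robots.txt and a sitemap on the old domain, those
     files would need exempting, e.g. with URL Rewrite rules instead. -->
<configuration>
  <system.webServer>
    <httpRedirect enabled="true"
                  destination="http://www.new-domain.example"
                  httpResponseStatus="Permanent"
                  exactDestination="false"
                  childOnly="false" />
  </system.webServer>
</configuration>
```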
In some ways I'm glad I haven't missed anything, but it would be nice if just the new domain were indexed after all this time!
Thanks again
-
This is odd. The pages all seem to redirect from the old site to the new, so why is Google still indexing those old pages?
I can't see the robots.txt on the old site as it redirects, but is it possible that the robots.txt on fhr-net.co.uk is blocking Google? If this is the case, then Google probably wouldn't be able to see the old site and recognize the redirects.
It may also help to add a sitemap for the old site, and to ask Google to fetch and render the old site's pages and then submit them to the index. This should cause the 301s to be seen and processed by Google.
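A sitemap for the old site would simply list the old URLs so Google revisits them and processes the redirects; a minimal sketch with illustrative paths:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap of OLD-domain URLs (paths are illustrative), submitted in the
     old domain's Webmaster Tools profile so Google recrawls them and
     sees the 301s to the new domain -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.fhr-net.co.uk/</loc></url>
  <url><loc>http://www.fhr-net.co.uk/example-page/</loc></url>
</urlset>
```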
-
Even after all this time, there are still over 700 pages indexed on our old domain, even though we have submitted the change of address twice in Webmaster Tools, the second time being about six months ago, if not longer.
The old domain is www.fhr-net.co.uk.
Any advice would be appreciated.
-
No worries,
I appreciate you taking the time to answer my question.
-
I think that I'm so used to answering questions about penalized sites that I assumed that you had moved domains because of a penalty. My apologies!
Sounds like you've got the right idea.
-
Thanks for the responses,
One week on from submitting the second change of address in GWT, we've seen the number of pages indexed for the old domain drop from over 1,300 to around 700 this week, which is something.
Regarding the redirect debate, it's an interesting read, thanks for sending that. Isn't the situation the same as for a site that didn't have a penalty, in that you should be monitoring your backlink profile and reconfiguring or disavowing links outside the guidelines, whilst carrying out activities that will naturally build decent links and therefore redress the balance?
-
This doesn't answer your question, but I just wanted to point out that the 301 or 302 redirects are not a good idea. Even if you got the penalty lifted, there can still be unnatural links there that can harm you in the eyes of the Penguin algorithm. A 301 will redirect those bad links to the new site. A 302, if left in place long enough, will do the same.
Here's an article I wrote today that goes into greater detail:
-
Oh, it may be that it's the other way around with canonical URLs, at least according to Google (here: https://support.google.com/webmasters/answer/6033086?hl=en):
"Each destination URL should have a self-referencing rel="canonical" meta tag."
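In markup terms, that self-reference is just a link element in each destination page's head, pointing at that page's own URL; the address below is a placeholder:

```html
<!-- In the <head> of each page on the destination (new) domain;
     href is the page's own URL (placeholder shown) -->
<link rel="canonical" href="http://www.new-domain.example/this-page/" />
```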
-
Hmm, certainly someone with more experience than myself would have a more elegant solution, but I would still try to do this by establishing the canonical URL, because you don't want to delist: https://support.google.com/webmasters/answer/139066#6
"If you can configure your server, you can use rel="canonical" HTTP headers to indicate the canonical URL for HTML documents and other files such as PDFs. Say your site makes the same PDF available via different URLs (for example, for tracking purposes), like this:

http://www.example.com/downloads/white-paper.pdf
http://www.example.com/downloads/partner-1/white-paper.pdf
http://www.example.com/downloads/partner-2/white-paper.pdf
http://www.example.com/downloads/partner-3/white-paper.pdf

In this case, you can use a rel="canonical" HTTP header to specify to Google the canonical URL for the PDF file, as follows:

Link: <http://www.example.com/downloads/white-paper.pdf>; rel="canonical""
-
Hi there
The old pages don't exist any more to add the canonical to; they're 301s from the old domain to the new, but over 1,000 pages still show up for site:www.fhr-net.co.uk.
-
Got it, you must have tried adding the canonical URL meta tags already, right? If not, check out: http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions
"...in late 2009, Google announced support for cross-domain use of rel=canonical. This is typically for syndicated content, when you’re concerned about duplication and only want one version of the content to be eligible for ranking...
...First off, Google may choose to ignore cross-domain use of rel=canonical if the pages seem too different or it appears manipulative. The ideal use of cross-domain rel=canonical would be a situation where multiple sites owned by the same entity share content, and that content is useful to the users of each individual site. In that case, you probably wouldn’t want to use 301-redirects (it could confuse users and harm the individual brands), but you may want to avoid duplicate content issues and control which property Google displays in search results. I would not typically use rel=canonical cross-domain just to consolidate PageRank..."
-
Thanks for your reply,
It's not that I want to de-list the old domain; I would rather people reach the site via that domain than not at all. My concern is that, for whatever reason, the transfer hasn't completed after such a long time, and that, for instance, the full benefit of sites linking to the old domain isn't being passed to the new one.
-
If your goal is to delist the old domain, I am going to copy the answer I just gave at http://moz.com/community/q/how-to-exclude-all-pages-on-a-subdomain-for-search, simply because it's clear and works quickly (48h) in my experience.
This is the authoritative way that Google recommends at https://support.google.com/webmasters/answer/1663419?hl=en&rd=1:
- Add a robots.txt file for your domain, usually via FTP. Add the "noindex" meta tag to every page as well (see the sketch after these steps).
- Add your subdomain as a separate site in Google Webmaster Tools
- On the Webmaster Tools home page, click the site you want.
- On the Dashboard, click Google Index on the left-hand menu.
- Click Remove URLs.
- Click New removal request.
- Type the URL of the page you want removed from search results (not the Google search results URL or cached page URL), and then click Continue. Note that the URL is case-sensitive: use exactly the same characters and capitalization that the site uses.
- Click Yes, remove this page.
- Click Submit Request.
To exclude the entire domain, simply enter the domain URL (e.g. http://domain.com) at the 7th step.
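As referenced in the first step above, the noindex meta tag goes in the head of every page to be dropped; a minimal sketch (the crawlability caveat is my addition, not part of the quoted steps):

```html
<!-- Place in the <head> of every page that should drop out of the index.
     Caveat (not from the quoted steps): Googlebot must still be able to
     crawl the page to see this tag, so a robots.txt block would hide it. -->
<meta name="robots" content="noindex" />
```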