Google still listing old domain
-
Hi
We moved to a new domain back in March 2014, redirected most pages with a 301, and submitted a change of address request through Google Webmaster Tools. A couple of pages were left as 302 redirects because they had rubbish links pointing to them and we had previously had a penalty.
Google was still indexing the old domain and our rankings hadn't recovered. Last month we removed the 302 redirects and went with a blanket 301 approach from the old domain to the new, the thinking being that, as the penalty had been lifted from the old domain, there was no harm in sending everything to the new domain.
Again, we submitted the change of address in Webmaster Tools as the option was available to us, but it's been a couple of weeks now and the old domain is still indexed.
Am I missing something? I realise that the rankings may not have recovered partly due to the disavowing/disregarding of several links, but I am concerned this may be contributing.
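(In case it helps anyone reading along: with the blanket 301 in place, every old URL should now answer with something like the below. The new domain shown here is just a placeholder, not our actual one.)
HTTP/1.1 301 Moved Permanently
Location: http://www.newdomain.co.uk/equivalent-page/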
-
Hi
I now have a robots.txt for the old site, and I created a sitemap by replacing the current domain with the old one and uploaded it.
Weirdly, when I search for the non-www version of the old domain, the number of pages indexed has increased!
According to WMT the crawl was postponed because robots.txt was inaccessible; however, I've checked and it returns status 200, and the robots.txt Tester says it's successful even though it never updates the timestamp.
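(For reference, this is roughly the exchange I'd expect for the old domain's robots.txt now that it's served locally rather than redirected; the allow-everything directive and the Sitemap line are just how I've set it up, shown for illustration.)
GET /robots.txt HTTP/1.1
Host: www.fhr-net.co.uk

HTTP/1.1 200 OK
Content-Type: text/plain

User-agent: *
Disallow:
Sitemap: http://www.fhr-net.co.uk/sitemap.xml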
-
Hi Marie
Many thanks for your response,
I've just looked in Webmaster Tools at the old domain and the option to change domains is there again, but I also noticed when looking at the crawl errors that there was a message along the lines of "crawl postponed as robots.txt was inaccessible".
At the moment it's just a blanket redirect at IIS level, so following your advice I'll re-establish the old site's robots.txt and a sitemap and see if Google crawls the 301s to the new domain.
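(For anyone else doing this at IIS level, here's a minimal web.config sketch assuming the URL Rewrite module; the new domain and rule names are placeholders and your setup will differ. The first rule keeps robots.txt and sitemap.xml being served from the old host instead of being swept up in the blanket redirect, so Google can still fetch them there.)
<system.webServer>
  <rewrite>
    <rules>
      <!-- Serve robots.txt and sitemap.xml from the old host rather than redirecting them -->
      <rule name="KeepRobotsAndSitemap" stopProcessing="true">
        <match url="^(robots\.txt|sitemap\.xml)$" />
        <action type="None" />
      </rule>
      <!-- Blanket 301 of everything else on the old domain to the same path on the new domain -->
      <rule name="OldDomainBlanket301" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^(www\.)?fhr-net\.co\.uk$" />
        </conditions>
        <action type="Redirect" url="http://www.newdomain.co.uk/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>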
In some ways I'm glad I haven't missed anything, but it would be nice if just the new domain were indexed after all this time!
Thanks again
-
This is odd. The pages all seem to redirect from the old site to the new, so why is Google still indexing those old pages?
I can't see the robots.txt on the old site as it redirects, but is it possible that the robots.txt on fhr-net.co.uk is blocking Google? If this is the case, then Google probably wouldn't be able to see the old site and recognize the redirects.
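(Just to illustrate what I mean: a robots.txt on the old domain that looks like the below would block Google from crawling the old URLs at all, so the 301s would never be seen. This is only an example, not what's necessarily on your site.)
User-agent: *
Disallow: /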
It may also help to add a sitemap for the old site, and to ask Google to fetch and render the old site's pages and then submit them to the index. This should cause the 301s to be seen and processed by Google.
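(A minimal sitemap for the old site just needs to list the old URLs so that Google re-requests them and follows the redirects; the example paths below are made up.)
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.fhr-net.co.uk/</loc></url>
  <url><loc>http://www.fhr-net.co.uk/example-page.html</loc></url>
  <url><loc>http://www.fhr-net.co.uk/another-example-page.html</loc></url>
</urlset>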
-
Even after all this time, there are still over 700 pages indexed on our old domain, even though we have submitted the change of address twice in Webmaster Tools, the second time being about 6 months ago if not longer.
The old domain is www.fhr-net.co.uk
Any advice would be appreciated
-
No worries,
I appreciate you taking the time to answer my question
-
I think that I'm so used to answering questions about penalized sites that I assumed that you had moved domains because of a penalty. My apologies!
Sounds like you've got the right idea.
-
Thanks for the responses,
One week on from submitting the second change of address in GWT, we've seen the number of pages indexed for the old domain drop from over 1,300 to around 700 this week, which is something.
Regarding the redirect debate, it's an interesting read, thanks for sending that. Isn't the situation the same as for a site that didn't have a penalty, in that you should be monitoring your backlink profile and reconfiguring or disavowing links outside the guidelines, whilst carrying out activities that will naturally build decent links and therefore redress the balance?
-
This doesn't answer your question, but I just wanted to point out that the 301 or 302 redirects are not a good idea. Even if you got the penalty lifted, there can still be unnatural links there that can harm you in the eyes of the Penguin algorithm. A 301 will redirect those bad links to the new site. A 302, if left in place long enough, will do the same.
Here's an article I wrote today that goes into greater detail:
-
Oh, it may be that it's the other way around with canonical URLs. At least according to Google (here: https://support.google.com/webmasters/answer/6033086?hl=en):
"Each destination URL should have a self-referencing rel="canonical" meta tag."
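In practice that just means a tag like this in the head of each page on the new domain, pointing at itself (the URL here is only a placeholder):
<link rel="canonical" href="http://www.newdomain.co.uk/example-page/" />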
-
Hmm, certainly someone with more experience than me would have a more elegant solution, but I would still try to do this by establishing the canonical URL, because you don't want to delist: https://support.google.com/webmasters/answer/139066#6
If you can configure your server, you can use rel="canonical" HTTP headers to indicate the canonical URL for HTML documents and other files such as PDFs. Say your site makes the same PDF available via different URLs (for example, for tracking purposes), like this:
http://www.example.com/downloads/white-paper.pdf
http://www.example.com/downloads/partner-1/white-paper.pdf
http://www.example.com/downloads/partner-2/white-paper.pdf
http://www.example.com/downloads/partner-3/white-paper.pdf
In this case, you can use a rel="canonical" HTTP header to specify to Google the canonical URL for the PDF file, as follows:
Link: <http://www.example.com/downloads/white-paper.pdf>; rel="canonical"
-
Hi there
The old pages don't exist any more to add the canonical to; they're 301s from the old domain to the new, but over 1,000 pages show up for site:www.fhr-net.co.uk
-
Got it, you must have tried adding the canonical URL meta tags already, right? If not, check out: http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions
"...in late 2009, Google announced support for cross-domain use of rel=canonical. This is typically for syndicated content, when you’re concerned about duplication and only want one version of the content to be eligible for ranking...
..First off, Google may choose to ignore cross-domain use of rel=canonical if the pages seem too different or it appears manipulative. The ideal use of cross-domain rel=canonical would be a situation where multiple sites owned by the same entity share content, and that content is useful to the users of each individual site. In that case, you probably wouldn’t want to use 301-redirects (it could confuse users and harm the individual brands), but you may want to avoid duplicate content issues and control which property Google displays in search results. I would not typically use rel=canonical cross-domain just to consolidate PageRank..."
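(For illustration, the cross-domain version described above is just the same tag placed in the head of the page on one domain, pointing at the preferred copy on the other; the domain below is a placeholder.)
<link rel="canonical" href="http://www.newdomain.co.uk/same-article/" />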
-
Thanks for your reply,
It's not that I want to de-list the old domain, as I would rather people get to the site using that domain than not at all. My concern is that, for whatever reason, the transfer hasn't completed even though it's been such a long time, and that we're not, for instance, getting the full benefit of sites linking to the old domain passed on to the new one.
-
If your goal is to delist the old domain, I am going to copy the answer I just gave at http://moz.com/community/q/how-to-exclude-all-pages-on-a-subdomain-for-search, simply because it's clear and works quickly (within 48 hours) in my experience.
This is the authoritative way that Google recommends at https://support.google.com/webmasters/answer/1663419?hl=en&rd=1:
- Add a robots.txt file for your domain (usually via FTP), and add the "noindex" meta tag to every page as well (see the example below this list).
- Add your subdomain as a separate site in Google Webmaster Tools
- On the Webmaster Tools home page, click the site you want.
- On the Dashboard, click Google Index on the left-hand menu.
- Click Remove URLs.
- Click New removal request.
- Type the URL of the page you want removed from search results (not the Google search results URL or cached page URL), and then click Continue. The URL is case-sensitive; use exactly the same characters and capitalization that the site uses.
- Click Yes, remove this page.
- Click Submit Request.
To exclude the entire domain, simply enter the domain URL (e.g. http://domain.com) at the 7th step.
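As a sketch of what the first step in that list refers to (this is just the standard syntax, nothing site-specific): a robots.txt at the root of the domain containing

User-agent: *
Disallow: /

and a meta tag in the head of every page:

<meta name="robots" content="noindex">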
Related Questions
-
Hi guys, is it okay if I add keywords to my Google My Business listing?
I have seen that some of my competitors are doing it, so I was wondering if I can do that as well?
Intermediate & Advanced SEO | EugeneMot
-
Homepage is deindexed in Google
Please help: for some reason my website's home page has disappeared. We have been working on the site, but there's nothing I can think of that would block it, and there are no warnings in Google Search Console. Can anyone lend a hand in understanding what has gone wrong? I would really appreciate it. The site is: http://www.discountstickerprinting.co.uk/ It seems to be working again, but I had to fetch the home page in Google Search Console. Any idea why this has happened? I can't afford a heart op at this age, lol.
Intermediate & Advanced SEO | BobAnderson
-
Google News - How to be featured in different countries with same domain
Hello, I am sure I am not the first to go through this, but I did not find the answers in the previous Q&As, so sorry if this question is a bit redundant for some folks. We have a site in English with readers in multiple English-speaking countries. How do we make www.mysite.com accepted in Google News for the US, UK, India, Australia and Canada at the same time? It is currently accepted in the US but, as I stated above, we do have a strong audience in other countries. I have read about sub-domains, but wouldn't it be considered duplicated content if I had the exact same article on different sub-domains? We are talking about creating 4 copies on mysite.com just to be added to Google News in those specific countries:
www.mysite.com/same-article/
uk.mysite.com/same-article/
au.mysite.com/same-article/
in.mysite.com/same-article/
ca.mysite.com/same-article/
Isn't there a better way of having mysite.com included in Google News for all English-speaking countries?
Intermediate & Advanced SEO | Koki.Mourao
-
New website: what is the best way to recover the authority of the old domain name?
How do I recover the authority of an old domain name? I got some advice on this in another post here on Moz; based on this I need a few answers. To summarize: my client got some really bad advice when they got their new website. So they ended up changing the domain name and just redirecting everything from the old domain and old website to the front page of the new domain and new website. As the new domain is not optimized for SEO, they of course are now not ranking on anything in Google anymore.
Question 1: According to my client, they used to rank well on keywords for the old domain and got a lot of organic traffic. They don't have access to their old Google Analytics account and don't have any reports on their rankings. Can anyone suggest how I can find out what keywords they were ranking on?
Question 2: I will change the domain name back to the old domain name (the client actually prefers the old domain name), but how do I get back as much page authority as possible? For information: titles, descriptions and content have all been rewritten.
A - Redirect: I will try to match the old URLs with the new ones.
B - Recreate site structure: make the URL structure of the new website look like the old URL structure, e.g. the old structure used to be like olddomain.com/our-destinations/cambadia.html (old) vs. newdomain.com/destinations/Cambodia (new), or olddomain.com/private-tours.html (old) vs. newdomain.com/tailor-made (new). Does the .html in the old URLs need any attention when recreating the permalinks on the new website?
Look forward to hearing your thoughts on this, thanks!
Intermediate & Advanced SEO | nm1977
-
Google and PDF indexing
It was recently brought to my attention that one of the PDFs on our site wasn't showing up when looking for a particular phrase within the document. The user was trying to search only within our site. Once I removed the site restriction, I noticed that there was another site using the exact same PDF. It appears Google is indexing that PDF but not ours. The name, title, and content are the same. Is there any way to get around this? I find it interesting, as we use GSA and within GSA it shows up for the phrase. I have to imagine Google is saying that it already has the PDF and is therefore ignoring ours. Any tricks to get around this? BTW, both sites rightfully should have the PDF. One is a client site and they are allowed to host the PDFs created for them. However, I'd like Mathematica to also be listed.
Query with no site restriction (notice: Teach For America comes up #1 and Mathematica is not listed): https://www.google.com/search?as_q=&as_epq=HSAC_final_rpt_9_2013.pdf&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&tbs=&as_filetype=pdf&as_rights=&gws_rd=ssl#q=HSAC_final_rpt_9_2013.pdf+"Teach+charlotte"+filetype:pdf&as_qdr=all&filter=0
Query with site restriction (notice that it doesn't find the phrase and redirects to any of the words): https://www.google.com/search?as_q=&as_epq=HSAC_final_rpt_9_2013.pdf&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&tbs=&as_filetype=pdf&as_rights=&gws_rd=ssl#as_qdr=all&q="Teach+charlotte"+site:www.mathematica-mpr.com+filetype:pdf
Intermediate & Advanced SEO | jpfleiderer
-
Should I serve images from the same Top level domain as the current domain?
We run a multi-domain e-commerce website that targets each country respectively:
.be -> Belgium
.co.uk -> United Kingdom
etc.
.com for all other countries
We also serve our product images via a media subdomain, e.g. "media.ourdomain.be/image.jpg". This means that all TLDs contain the images of the .be media subdomain, which is actually seen as an outbound link. We are considering changing this setup so that it serves the images from the same domain as the current TLD, which would make more sense:
.be will serve images from media.ourdomain.be
.co.uk -> media.ourdomain.co.uk
etc.
My question is: does Google image search take the extension of the TLD into consideration, so that for example German users will be more likely to see an image that is served on a .de domain?
Intermediate & Advanced SEO | jef2220
-
Merging Domains
Up until last week, we had separate domains for each of our 3 products. We've now merged two products to sit under one URL. The merge coincided with a CMS upgrade which effectively killed all of our old URLs save for the homepage. Is it best for me to 301 the old homepage to its new place, as well as the rest of the old site's top pages to the corresponding pages on the new site? Or is there a better solution?
Intermediate & Advanced SEO | taylor.craig
-
Why do some domains outrank stronger authority domains?
Hi, if we take the Moz stats into account here, how come domains with weak Moz stats sometimes outrank domains with strong Moz stats? For example: an inner page with DA 56 / PA 40 is outranking a Wikipedia inner page with DA 100 / PA 82. That's a massive difference, with the Wikipedia page basically twice as strong, yet it is being outranked. In this case I assume on-page SEO is playing a big part, but can on-page optimisation be that powerful? I see this all the time; what SEO factors cause this? Thanks.
Intermediate & Advanced SEO | Bondara