Indexed non-existent pages: problem appeared after we 301'd the URL/index to the URL.
-
I recently read that if a site has two pages that are live, such as:
http://www.url.com/index and http://www.url.com/, they will come up as duplicate content.
I read that it's best to 301 redirect http://www.url.com/index to http://www.url.com/, as this helps avoid duplicate content and keeps all the link juice on one page.
We did the 301 for one of our clients and got about 20,000 errors for pages that do not exist. The errors are for pages that are indexed but do not exist on the server.
We are assuming that these indexed (non-existent) pages are somehow linked to http://www.url.com/index.
The links are returning 200 OK.
We took the 301 redirect off the http://www.url.com/index page; however, we now still have two identical pages, http://www.url.com/index and http://www.url.com/.
What is the best way to solve this issue?
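One quick way to see what those phantom URLs actually return is to request a handful of them and log the status codes; a 200 on a page that should not exist is effectively a soft 404. A rough sketch (assumes the third-party requests library; www.url.com and the sample paths are placeholders):

```python
# Check what a sample of the "non-existent" indexed URLs actually return.
# A 200 response for a page with no real content behind it is a soft 404.
import requests

sample_urls = [
    "http://www.url.com/index",              # placeholder, as in the question
    "http://www.url.com/some-phantom-page",  # placeholder for one of the 20,000 URLs
]

for url in sample_urls:
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    print(r.status_code, url, ("-> " + location) if location else "")
```

If everything comes back 200, the server is most likely catch-all routing unknown paths to a real template instead of returning a 404/410, which would explain why Google keeps those URLs indexed.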
-
What are some examples of the "non-existent" URLs that are getting indexed, Bryan?
It's going to be pretty hard to diagnose this without actually seeing the site.
Paul
-
Hi, I'm afraid this is not the issue. It is not an endless loop; usually an endless loop won't let the site load at all, as it keeps redirecting and you can never land on a page. This is not the case here. But thank you for your efforts, you get a +1.
-
Bryan, you might have created an infinite redirect loop, which could be causing the issue you are describing. More on the issue HERE
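If anyone wants to rule a loop in or out quickly, here is a small sketch that follows the redirect chain by hand and stops if any URL repeats (assumes the requests library; the starting URL is a placeholder):

```python
# Follow a redirect chain manually and flag a loop if any URL repeats.
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    """Print each hop in a redirect chain and flag a loop if a URL repeats."""
    seen = set()
    for _ in range(max_hops):
        if url in seen:
            print("Loop detected at", url)
            return
        seen.add(url)
        r = requests.get(url, allow_redirects=False, timeout=10)
        print(r.status_code, url)
        if r.status_code not in (301, 302, 303, 307, 308):
            return  # final destination reached
        url = urljoin(url, r.headers.get("Location", ""))
    print("Gave up after", max_hops, "hops")

trace_redirects("http://www.url.com/index")  # placeholder URL
```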
Related Questions
-
Robots.txt Disallowed Pages and Still Indexed
Alright, I am pretty sure I know the answer is "nothing more I can do here," but I just wanted to double-check. It relates to the robots.txt file and that pesky "A description for this result is not available because of this site's robots.txt" message. Typically people want the URL indexed and the normal meta description displayed, but I don't want the link there at all; I am purposefully trying to robots.txt that stuff out of there.
Intermediate & Advanced SEO | DRSearchEngOpt
My question is: has anybody tried to get a page taken out of the index and had this happen, with the URL still there but that pesky robots.txt message for the meta description? Were you able to get the URL to no longer show up, or did you just live with it? Thanks folks, you are always great!
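Worth noting for anyone in the same spot: robots.txt only blocks crawling, not indexing, so a disallowed URL can stay in the index with that placeholder description. A small sketch to confirm a URL really is blocked by the live robots.txt (standard library only; the domain and path are placeholders):

```python
# Check whether a given URL is disallowed for crawlers by the site's robots.txt.
# Note: a disallowed URL can still sit in the index; to drop it, the page usually
# needs to be crawlable with a noindex directive, or removed via Search Console.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.url.com/robots.txt")   # placeholder domain
rp.read()

url = "http://www.url.com/private-page"       # placeholder URL
print("Blocked for Googlebot:", not rp.can_fetch("Googlebot", url))
```
-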
Wrong URLs indexed, Failing To Rank Anywhere
I’m struggling with a client website that's massively failing to rank. It was published in Nov/Dec last year, not optimised or ranking for anything; it's about 20 pages. I came onboard recently, and 5-6 weeks ago we added new content, did the on-page and finally changed from the non-www to the www version in htaccess and WP settings (while setting www as preferred in Search Console). We then did a press release and since then have acquired about 4 partial-match contextual links on good websites (before this, it had virtually none, save for social profiles etc.).
I should note that just before we added the (about 50%) new content and optimised, my developer accidentally published the dev site of the old version of the site and it got indexed. He immediately added it correctly to robots.txt, and I assumed it would therefore drop out of the index fairly quickly and we need not be concerned.
Now it's about 6 weeks later, and we’re still not ranking anywhere for our chosen keywords. The keywords are around “egg freezing,” so only moderate competition. We’re not even ranking for our brand name, which is 4 words long and pretty unique. We were ranking in the top 30 for this until yesterday, but it was the press release page on the old (non-www) URL!
I was convinced we must have a duplicate content issue after realising the dev site was still indexed, so last week we went into Search Console to remove all of the dev URLs manually from the index. The next day, they were all removed, and we suddenly began ranking (~83) for “freezing your eggs,” one of our keywords! This seemed unlikely to be a coincidence, but once again the positive sign was dampened by the fact that it was the non-www page that was ranking, which made me wonder why the non-www pages were still even indexed. When I do site:oursite.com, for example, both non-www and www URLs are still showing up.
Can someone with more experience than me tell me whether I need to give up on this site, or what I could do to find out if I do? I feel like I may be wasting the client’s money here by building links to a site that could be under a very weird penalty 😕
Intermediate & Advanced SEO | Ullamalm
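For the www vs non-www confusion described above, a rough check is to fetch both hostname variants and compare the status code, any redirect target, and the rel=canonical each one serves. A sketch (assumes the requests library; oursite.com is the placeholder used in the question, and the regex is only a crude scrape that assumes rel appears before href):

```python
# Compare how the non-www and www versions of a page respond, and what
# canonical URL each declares. Rough check only; oursite.com is a placeholder.
import re
import requests

CANONICAL = re.compile(r'rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', re.I)

for url in ("http://oursite.com/", "http://www.oursite.com/"):
    r = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", r.status_code, r.headers.get("Location", ""))
    if r.status_code == 200:
        match = CANONICAL.search(r.text)
        print("  canonical:", match.group(1) if match else "none found")
```

If the non-www version answers 200 instead of redirecting, that alone would go some way to explaining why both hostnames stay indexed.
-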
Is there a problem with putting encoding into the subdomain of a URL?
We are looking at changing our URL structure for tracking various affiliates from: https://sub.domain.com/quote/?affiliate_id=xxx to https://aff_xxx_affname.domain.com/quote/ Both would allow us to track affiliates, but the second would allow us to use cookies to track. Does anyone know if this could possibly cause SEO concerns? Also, for the site we want to rank for, we will use a reverse proxy to change the URL from https://aff_xxx.maindomain.com/quote/ to https://www.maindomain.com/quote/; would that cause any SEO issues? Thank you.
Intermediate & Advanced SEO | RoxBrock
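Just to make the two tracking schemes being compared concrete, here is a small sketch that pulls the affiliate ID out of either URL form (the "aff_<id>_<name>" label layout and the affiliate_id parameter name are assumptions taken from the examples above):

```python
# Extract an affiliate ID either from a ?affiliate_id=xxx query parameter
# or from an aff_xxx_affname.domain.com style subdomain label.
from urllib.parse import urlsplit, parse_qs

def affiliate_id(url):
    parts = urlsplit(url)
    qs = parse_qs(parts.query)
    if "affiliate_id" in qs:
        return qs["affiliate_id"][0]
    label = parts.hostname.split(".")[0]   # e.g. "aff_xxx_affname"
    if label.startswith("aff_"):
        return label.split("_")[1]         # assumed "aff_<id>_<name>" layout
    return None

print(affiliate_id("https://sub.domain.com/quote/?affiliate_id=xxx"))  # -> xxx
print(affiliate_id("https://aff_xxx_affname.domain.com/quote/"))       # -> xxx
```

As an aside, underscores are technically not valid in hostnames, so it may also be worth confirming that the aff_xxx_ subdomains resolve and get SSL certificates without trouble.
-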
HTTPS pages - To meta no-index or not to meta no-index?
I am working on a client's site at the moment and I noticed that both HTTP and HTTPS versions of certain pages are indexed by Google, and both show in the SERPs when you search for the content of these pages. I just wanted to get various opinions on whether HTTPS pages should have a meta no-index tag through an htaccess rule or whether they should be left as-is.
Intermediate & Advanced SEO | Jamie.Stevens
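Before deciding on a noindex, it may help to confirm how the two protocol versions currently answer. A minimal sketch (assumes the requests library; the domain and path are placeholders):

```python
# See whether the HTTP and HTTPS versions of a page both answer 200,
# or whether one already redirects to the other.
import requests

path = "/some-page/"                              # placeholder path from the site
for scheme in ("http", "https"):
    url = f"{scheme}://www.example.com{path}"     # placeholder domain
    r = requests.get(url, allow_redirects=False, timeout=10)
    print(r.status_code, url, r.headers.get("Location", ""))
```

If both return 200, the usual alternatives to a noindex are a redirect to the preferred protocol or a rel=canonical pointing at it.
-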
Ending URLs in .html versus /
Hi there! Currently all the URLs on my website, even the home page, end in .html, such as http://www.consumerbase.com/index.html. Is this bad?
Intermediate & Advanced SEO | Travis-W
Is there any benefit to this? Should I remove it and just have them end with a forward slash?
If I 301 redirect the old .html URLs to the forward-slash URLs, will I lose PA? Thanks!
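If the .html URLs do get redirected, a quick sketch like this can confirm each old URL answers with a single 301 straight to its slash version (assumes the requests library; the consumerbase.com URL from the question is used as the example):

```python
# Confirm that an old .html URL returns a 301 pointing at its trailing-slash version.
import requests

old_to_new = {
    "http://www.consumerbase.com/index.html": "http://www.consumerbase.com/",
    # add the rest of the old .html URLs here
}

for old, expected in old_to_new.items():
    r = requests.get(old, allow_redirects=False, timeout=10)
    target = r.headers.get("Location", "")
    ok = r.status_code == 301 and target == expected
    print("OK   " if ok else "CHECK", old, "->", r.status_code, target)
```
-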
Page not appearing in SERPs
I have a regional site that does fairly well for most towns in the area (top 10-20). However, one place that has always done OK and has great content is not anywhere within the first 200. Everything looks OK: the canonical link is correct, I can find the page if I search for exact text, and there aren't any higher-ranking duplicate pages. Any ideas what may have happened, and how I can confirm a penalty, for example? TIA,
Intermediate & Advanced SEO | Cornwall
Chris
-
How to avoid SEO loss after URL restructuring/change?
We are doing an on-page SEO overhaul of our website. Our old URL used to be mydomain.com/send/FlowersInCity-1-CityName.html and we are changing it to mydomain.com/send/Flowers-to-CityName
Firstly, will it be advisable to do so, since we are in the top 10 for most of the keywords (but losing ranking each month)? The website is a very content-rich site. Until the beginning of 2012 we used to be in the top three spots, mostly due to on-page work and good content, thus getting the inbound links automatically. But now things have changed: the industry has a lot of competition, and a few players have already done heavy SEO for their websites, both on-page and off-page, thus overtaking us in rankings. We are also doing the other requisite on-page and off-page work, but I am stuck on the URL decision part.
Secondly, and MOST IMPORTANTLY: if I should change the URL, how do I minimize the risk of losing the present SEO in this kind of URL restructuring? Thanks, Suman
Intermediate & Advanced SEO | sumanpatra
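To illustrate the mechanical part of a change like this, here is a hedged sketch of building the old-to-new redirect map from the two patterns given above, so every old URL can get a one-to-one 301 to its new equivalent (the exact pattern is an assumption based on the sample URLs in the question):

```python
# Map old URLs of the form /send/FlowersInCity-1-CityName.html
# to the new form /send/Flowers-to-CityName, e.g. to generate 301 redirect rules.
import re

OLD_PATTERN = re.compile(r"^/send/FlowersIn[A-Za-z]+-\d+-(?P<city>[A-Za-z]+)\.html$")

def new_path(old_path):
    m = OLD_PATTERN.match(old_path)
    return f"/send/Flowers-to-{m.group('city')}" if m else None

old_paths = ["/send/FlowersInCity-1-CityName.html"]   # placeholder list of old URLs
for old in old_paths:
    print(old, "->", new_path(old))
```
-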
Should you stop indexing of short-lived pages?
On my site there will be a lot of pages with a short life span of about a week, as they are items on sale. Should I nofollow the links, meaning the site has a few hundred pages, or allow indexing and have thousands, but then have lots of links to pages that do not exist? If allowing indexing, I would of course make sure the page links do not error and instead send visitors to a similarly relevant page, but which is best for me with the search engines? I would like to have the option of loads of links with pages of loads of content, but not if it is detrimental. Thanks
Intermediate & Advanced SEO | barney30120
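On the "links to pages that do not exist" worry, here is a hedged sketch of one way to handle expired sale items: keep a small map from each expired item to a similar live category, 301 there, and fall back to a 410 for anything with no good substitute (the URL paths and the mapping are made-up examples for illustration):

```python
# Decide what to serve for an expired sale item: a 301 to a similar live page
# where one exists, otherwise a 410 so search engines know the URL is gone for good.
EXPIRED_REDIRECTS = {
    "/sale/blue-widget-week12": "/category/widgets/",   # placeholder mapping
    "/sale/red-gadget-week12": "/category/gadgets/",
}

def response_for(path):
    target = EXPIRED_REDIRECTS.get(path)
    if target:
        return 301, target   # send visitors and link equity to a relevant page
    return 410, None         # gone: no close substitute exists

print(response_for("/sale/blue-widget-week12"))   # -> (301, '/category/widgets/')
print(response_for("/sale/unknown-item"))         # -> (410, None)
```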