Duplicate Pages on GWT when redesigning website
-
Hi, we recently redesigned our online shop. We set up 301 redirects from all old product pages to their new URLs (and went live about 1.5 weeks ago), but GWT is reporting the old product URL and the new product URL as two different pages with the same meta title tag (duplication) - even though the old URL 301-redirects to the new URL when visited.
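As a quick sanity check outside GWT, you can confirm that an old URL really answers with a single 301 pointing at its new home. This is a rough sketch, not anything GWT-specific, and all the URLs in it are made-up examples:

```python
# Verify that an old product URL answers with one 301 pointing at the
# expected new URL (all URLs here are made-up examples).
import urllib.error
import urllib.request

def classify_redirect(status, location, expected_target):
    """Summarise one (status code, Location header) response pair."""
    if status == 301 and location == expected_target:
        return "ok"            # permanent redirect to the right page
    if status in (302, 303, 307):
        return "temporary"     # Google may keep the old URL indexed
    if status == 301:
        return "wrong-target"  # permanent, but not where we expected
    return "no-redirect"

def check_live(old_url, expected_target):
    """Fetch old_url once, with redirect-following disabled."""
    class NoFollow(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # refuse to follow the hop
    opener = urllib.request.build_opener(NoFollow)
    try:
        resp = opener.open(old_url, timeout=10)
        status, location = resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:  # a 3xx surfaces here when not followed
        status, location = err.code, err.headers.get("Location")
    return classify_redirect(status, location, expected_target)

# Offline demo of the classifier:
print(classify_redirect(301, "https://shop.example/new-widget",
                        "https://shop.example/new-widget"))  # ok
```

If this prints "temporary" or "wrong-target" for a real old URL, the redirect itself - not Google's crawl lag - is the problem.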
I found this article on the Google forum: https://productforums.google.com/forum/#!topic/webmasters/CvCjeNOxOUw
It says we can either just wait for Google to re-crawl, or use the fetch URL function on the OLD URLs. My question is: after I fetch an OLD URL to tell Google that it's being redirected, should I click the 'Submit to index' button or not? (See screengrab - please note that it was the OLD URL being fetched, not the NEW URL.) In other words, if I click this button, is it telling Google:
a. 'This old URL has been redirected, so please index the new URL'? or
b. 'Please keep this old URL in your index'?
What's your view on this? Thanks
-
Hi,
I recently migrated a load of product category pages on one of my websites to cleaner URLs, and to force a crawl I submitted the new URLs (and their children) to the index via WMT. This was to pick them up quickly - and it worked (within seconds). The old URLs appearing alongside them were never a problem. However, there are limits to how many times you can do this, which might be a sticking point for you, as I'm guessing you have lots of products. Try it with one page (a low-traffic, low-selling product!) and see what happens - and let us know.
It's possible Google is holding onto your old URLs because they have a number of inbound links and the crawl will eventually catch up to only display the new URLs if you give it time.
Aside from agreeing with the sitemap submission suggestion, I'd also triple check that your 301s / canonicals are set up properly on your website's old URLs by firing Screaming Frog or another crawler at it.
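Alongside firing a crawler at the site, a tiny offline audit of the redirect map itself can catch chains and loops before Google does. This is a minimal sketch with invented paths, not a Screaming Frog replacement:

```python
# Offline audit of a 301 redirect map: flag old URLs whose target is
# itself redirected (a chain) or that loop back (a cycle).
def find_redirect_issues(redirect_map):
    issues = {}
    for old, first_hop in redirect_map.items():
        path, seen = [old], {old}
        cur = first_hop
        while cur in redirect_map:       # the target is itself redirected
            if cur in seen:
                issues[old] = ("cycle", path + [cur])
                break
            path.append(cur)
            seen.add(cur)
            cur = redirect_map[cur]
        else:
            if len(path) > 1:            # at least one intermediate hop
                issues[old] = ("chain", path + [cur])
    return issues

demo = {
    "/old/widget": "/interim/widget",    # chain: two hops to the final URL
    "/interim/widget": "/new/widget",
    "/old/gadget": "/new/gadget",        # clean single 301 - no issue
}
print(find_redirect_issues(demo))
```

Chains and cycles are exactly the sort of thing that keeps old URLs lingering in the index, so flattening any chain to a single direct 301 is worth doing before waiting on the crawl.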
George
-
Have you resubmitted your sitemap? That is a slightly simpler step. Personally, I would wait for the pages to be indexed; this should really only take about two weeks. The SERPs might reflect the old site until then, but if your rankings hold up in the meantime, that's good for your SEO.
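On the sitemap point, regenerating a sitemap that lists only the new URLs and resubmitting it in WMT is straightforward to script. A rough sketch, with invented URLs:

```python
# Build a minimal sitemap.xml containing only the NEW product URLs,
# ready to resubmit in Webmaster Tools. The URLs are invented examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://shop.example/p/new-widget",
                     "https://shop.example/p/new-gadget"]))
```

Keeping the old URLs out of the sitemap entirely avoids sending Google mixed signals while the 301s do their work.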
I don't think that fetching in this case will correctly reindex your site. The wait and see game is going to be your best chance at getting the natural response you want from Google without sacrificing your existing rankings.
-
Honestly speaking, I am sick of Google Webmaster Tools' delay in updating. Most of the time it shows me data that is days or months old; even after the website has been completely changed, it will still be reporting the old problems.
My first suggestion is to wait; I believe that after a few crawls Google will understand that you have moved on from the problem you had before.
The image you attached will only tell you whether the redirection is working properly: if the visitor is sent from the old page to the new one, that means it is working.
Another thing you could do is give your new pages a social bump and, at the same time, request that Google de-index the old pages. GWT has this option somewhere.
Hope this helps!
-
Sorry, forgot to attach.