Duplicate Pages on GWT when redesigning website
-
Hi, we recently redesigned our online shop. We set up 301 redirects from all old product pages to the new URLs (and went live about 1.5 weeks ago), but GWT is reporting the old product URL and the new product URL as two different pages with the same meta title tag (duplication), even though the old URL 301 redirects to the new URL when visited.
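For what it's worth, this is how I've been spot-checking the redirects myself (a quick Python sketch using the requests library; the URL below is just a placeholder, not one of our real product pages):

```python
import requests

# Placeholder for one of the old product URLs
old_url = "https://www.example-shop.com/old-product-page"

# Don't follow the redirect, so we see what the old URL itself returns
response = requests.head(old_url, allow_redirects=False)

print(response.status_code)               # should be 301, not 302 or 200
print(response.headers.get("Location"))   # should be the new product URL
```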
I found this thread on the Google product forum: https://productforums.google.com/forum/#!topic/webmasters/CvCjeNOxOUw
It says we can either just wait for Google to re-crawl, or use the fetch URL function (Fetch as Google) on the OLD URLs. Question is, after I fetch the OLD URL to tell Google that it's being redirected, should I click the 'Submit to index' button or not? (See screengrab; please note that it was the OLD URL being fetched, not the NEW URL.) I mean, if I click this button, is it telling Google that:
a. 'This old URL has been redirected, therefore please index the new URL'? or
b. 'Please keep this old URL in your index'?
What's your view on this? Thanks
-
Hi,
I recently migrated a load of product category pages on one of my websites to cleaner URLs, and to force the crawl I submitted the new URLs (and their linked pages) to the index via WMT. This was to get them picked up quickly, and it worked (within seconds); the old URLs still appearing was never a problem. However, there are limits on the number of times you can do this, which might be a sticking point for your solution, as I'm guessing you have lots of products. Try it with one page (a low-traffic, low-selling product!) and see what happens, and let us know.
It's possible Google is holding onto your old URLs because they have a number of inbound links; the crawl will eventually catch up and display only the new URLs if you give it time.
Aside from agreeing with the sitemap submission suggestion, I'd also triple-check that the 301s / canonicals on your website's old URLs are set up properly by firing Screaming Frog or another crawler at them.
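If Screaming Frog isn't to hand, something rough like this will do a first pass over the old URLs (a Python sketch only; the URL list is a placeholder and you'd load your real product URLs from a crawl export or CSV):

```python
import requests

# Placeholder list - load the real old product URLs from a crawl export or CSV
old_urls = [
    "https://www.example-shop.com/old-product-1",
    "https://www.example-shop.com/old-product-2",
]

for url in old_urls:
    r = requests.get(url, allow_redirects=True)
    # Status code of the first hop (or of the page itself if there was no redirect)
    first_hop = r.history[0].status_code if r.history else r.status_code
    print(f"{url} -> {first_hop} -> {r.url}")
    # Flag anything that isn't a single clean 301 to the new URL
    if first_hop != 301 or len(r.history) != 1:
        print("  worth checking: chained redirect or non-301 response")
```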
George
-
Have you resubmitted your sitemap? That is a slightly simpler step. Personally, I would wait for the pages to be re-indexed; this should really only take about two weeks. The SERPs might reflect the old site until then, but if your rankings are holding up, that is no bad thing for your SEO.
I don't think that fetching will correctly re-index your site in this case. The wait-and-see game is going to be your best chance of getting the natural response you want from Google without sacrificing your existing rankings.
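For what it's worth, if you do resubmit the sitemap, it should only list the new URLs and leave the redirected old ones out; a minimal sketch (placeholder URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the new product URLs; leave the 301-redirected old URLs out -->
  <url>
    <loc>https://www.example-shop.com/new-product-page</loc>
  </url>
</urlset>
```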
-
Honestly speaking, I am sick of the delay in Google Webmaster Tools updates... most of the time it shows me data that is weeks or months old, so even when the website has been completely changed it is still reporting the old problems.
My first suggestion is to wait; I believe that after a few crawls Google will understand that you have moved on from the problem you had before.
The image you attached will only tell you whether the redirection is working properly or not; if the user is sent from the old page to the new one, that means it is working.
Another thing you can do is give your new pages a social bump and, at the same time, request that Google de-index the old pages. GWT has this option somewhere (the URL removal tool).
Hope this helps!
-
Sorry, forgot to attach.
-
Related Questions
-
Entire website is duplicated on 2 domains - what to do?
My client's website has 1000+ pages and a Domain Authority of 23. I have just discovered that the entire site is duplicated on a second domain (main URL = companyname.com, duplicate site URL = company-name.com). The home page of the duplicate domain has a 301 redirect going to the main domain. However, none of the 1000+ other pages have any redirect set up, so Google is indexing the entire duplicate site. I'm assuming this is a bad thing for SEO. The duplicate site has a Domain Authority of 4, so I'd like to transfer whatever link juice it has towards the main site. What's the best thing to do? Ultimately I think it would be best to delete the duplicate site. So would it be a case of adding a redirect to the htaccess file along the lines of: redirect company-name.com/?slug? to https://companyname.com/?slug? (I realise this isn't the correct syntax, but is the concept correct?) Has anyone ever dealt with this successfully?
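(For reference, the concept described above is usually implemented with a rewrite rule along these lines; a sketch only, assuming Apache with mod_rewrite and that both domains are served from the same document root:)

```apache
RewriteEngine On
# Sketch: send every path on the duplicate domain to the same path on the main domain
RewriteCond %{HTTP_HOST} ^(www\.)?company-name\.com$ [NC]
RewriteRule ^(.*)$ https://companyname.com/$1 [R=301,L]
```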
Technical SEO | BottleGreenWebsites
-
Duplicate content due to numerous sub category level pages
We have a healthcare website which lists doctors based on their medical speciality, using a paginated series to list hundreds of doctors. Algorithm: a search for Dentist in the Newark locality of New York gives a result filled with dentists from Newark, followed by a list of dentists in locations near Newark. So all localities under a city have the same set of doctors, jumbled and distributed across multiple pages based on nearness to the locality. When we don't have any dentists in Newark, we populate results for nearby localities and create a page. The issue: when the number of dentists in New York is <11, every Locality x Dentist page has jumbled-up results all pointing to the same 10 doctors. It is even more severe when we have only 1-3 dentists in the city; every locality page is then exactly the same as the city-level page. We have about 2.5 million pages with the above scenario. City-level page: https://www.example.com/new-york/dentist (5 dentists). Locality-level pages: https://www.example.com/new-york/dentist/clifton, https://www.example.com/new-york/dentist/newark (each contains the same 5 dentists as the New York city-level page, in the same or a jumbled order). What do you think we must do in such a case? We have discussed putting a noindex on the locality-level pages, or applying a canonical from locality level pointing to city level, but we are still not 100% sure.
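(For reference, the two options mentioned at the end of the question would look roughly like this on a locality page, using the URLs from the example above; a sketch only:)

```html
<!-- Option 1: canonical from the locality page to the city-level page -->
<!-- On https://www.example.com/new-york/dentist/newark -->
<link rel="canonical" href="https://www.example.com/new-york/dentist" />

<!-- Option 2: keep locality pages out of the index but let their links be followed -->
<meta name="robots" content="noindex, follow" />
```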
Technical SEO | ozil
-
Log files vs. GWT: major discrepancy in number of pages crawled
Following up on this post, I did a pretty deep dive on our log files using Web Log Explorer. Several things have come to light, but one of the issues I've spotted is the vast difference between the number of pages crawled by Googlebot according to our log files versus the number of pages indexed in GWT. Consider:
- Number of pages crawled per the log files: 2,993
- Crawl frequency (i.e. number of times those pages were crawled): 61,438
- Number of pages indexed by GWT: 17,182,818 (yes, that's right, more than 17 million pages)
We have a bunch of XML sitemaps (around 350) that are linked on the main sitemap.xml page; these pages have been crawled fairly frequently, and I think this is where a lot of links have been indexed. Even so, would that explain why we have relatively few pages crawled according to the logs but so many more indexed by Google?
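(For reference, a rough way to cross-check the unique-URL count outside Web Log Explorer; a Python sketch only, assuming a standard combined-format access log, with the file name as a placeholder:)

```python
import re

googlebot_urls = set()

# Assumes a combined-format Apache/nginx access log; adjust the path and pattern to suit
with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = re.search(r'"(?:GET|HEAD) (\S+) HTTP', line)
        if match:
            googlebot_urls.add(match.group(1))

print(f"Unique URLs requested by Googlebot: {len(googlebot_urls)}")
```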
Technical SEO | ufmedia
-
Duplicate title error in GWT over spelling in URL
Hi, how do I resolve a duplicate title error in GWT caused by capitalisation in the URL? Title of post: Minneapolis Median Home Sales Price Up 16 Percent. Not sure how this happened, but I have two URL versions showing up, and even with a 301 redirect they both remain flagged as an error in GWT: /real-estate-blog/Minneapolis-median-home-sales-price-up-16-percent and /real-estate-blog/minneapolis-median-home-sales-price-up-16-percent
Technical SEO | jessential
-
Translating Page Titles & Page Descriptions
I am working on a site that will be published in the original English, with localized versions in French, Spanish, Japanese and Chinese. All the versions will use the English information architecture. As part of the process, we will be translating the page titles and page descriptions. Translation quality will be outstanding. The client is a translation company, and each version will get at least four pairs of eyes, including expert translators, editors, QA experts and proofreaders. My question is what special SEO instructions should be issued to the translators regarding the page titles and page descriptions. (We have to presume the translators know nothing about SEO.) I was thinking of:
- stick to the character counts for titles and descriptions
- make sure the title and description work together
- avoid over-repetition of keywords in page titles (over-optimization peril)
- think of the descriptions as marketing copy
- try to repeat some title phrases in the description (to get the bolding and promote click-through)
That's the micro stuff. The macro stuff: we haven't done extensive keyword research for the other languages. Most of the clients are in the US, and the other language versions are more a demo of translation ability than a way of looking for clients elsewhere. Are we missing something big here?
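(For reference, the character-count point in the list above is easy to sanity-check once the translations come back; a Python sketch only, with hypothetical file and column names and rough working limits rather than official cut-offs:)

```python
import csv

# Rough working limits - assumptions, not official Google cut-offs
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 155

# Hypothetical CSV with "url", "title" and "description" columns
with open("translated_metadata.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if len(row["title"]) > TITLE_LIMIT:
            print(f"{row['url']}: title is {len(row['title'])} characters")
        if len(row["description"]) > DESCRIPTION_LIMIT:
            print(f"{row['url']}: description is {len(row['description'])} characters")
```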
Technical SEO | DanielFreedman
-
Duplicates on the page
Hello SEOmoz, I have one big question about a project. We have a page, http://eb5info.com/eb5-attorneys, and a lot of other similar pages, and we got a big list of errors and warnings saying that we have duplicate pages. But in reality not all of them are the same; they have small differences. For example, you select "State" in the left sidebar and you see a list on the right. The list in the right panel changes depending on what you select on the left, but in the report the pages are marked as duplicates. Maybe you can give some advice on how to improve the quality of the pages and make the SEO better? Thanks, Igor
Technical SEO | usadvisors
-
Once duplicate content found, worth changing page or forget it?
Hi, the SEOmoz crawler has found over 1000 duplicate pages on my site. The majority are based on location, and unfortunately I didn't have time to add much location-based info. My question is, if Google has already discovered these, determined they are duplicates, and chosen the main ones to show on the SERPs, is it worth me updating all of them with localized information so Google accepts the changes and maybe considers them different pages? Or do you think they'll always be considered duplicates now?
Technical SEO | SpecialCase
-
50+ duplicate content pages - Do we remove them all or 301?
We are working on a site that has 50+ pages that all have duplicate content (one for each state, pretty much). Should we 301 all 50 of the URLs to one URL, or should we just get rid of all the pages completely? Are there any steps to take when removing pages completely (submit a sitemap to Google Webmaster Tools, etc.)? Thanks!
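(For reference, if the state pages share a URL pattern, the 301 option can be handled in one rule rather than 50 separate redirects; a sketch only, assuming Apache and a hypothetical /service/state-name URL structure:)

```apache
# Sketch: collapse all state-level variants onto the single consolidated page
RedirectMatch 301 ^/service/[a-z-]+/?$ /service/
```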
Technical SEO | Motava