Locating Duplicate Pages
-
Hi,
Our website consists of approximately 15,000 pages; however, according to our Google Webmaster Tools account, Google has around 26,000 pages for us in its index.
I have run through half a dozen sitemap generators and they all discover only the 15,000 pages that we know about. I have also gone through the site thoroughly to find any sections where we might be inadvertently generating duplicate pages, without success.
It has been over six months since we made any structural changes (at which point we put 301s in place to the new locations), so I'd like to think that the majority of these old pages have been removed from the Google index. Additionally, the number of pages in the index doesn't appear to be going down by any discernible amount week on week.
I'm certain it's nothing to worry about, but for my own peace of mind I'd like to confirm that the additional 11,000 pages are just old results that will eventually disappear from the index, and that we're not generating any duplicate content.
Unfortunately there doesn't appear to be a way to download a list of the 26,000 pages that Google has indexed so that I can compare it against our sitemap. Obviously I know about site:domain.com, however this only returns the first 1,000 results, all of which check out fine.
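For what it's worth, once I can pull out any list of indexed URLs (e.g. the 1,000 from site:), diffing it against our sitemap should be simple to script. A rough Python sketch (no error handling, file fetching omitted):

```python
from urllib.parse import urlsplit
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def normalise(url):
    """Lower-case the host and drop any trailing slash so the two
    lists compare like for like."""
    parts = urlsplit(url.strip())
    path = parts.path.rstrip("/") or "/"
    return f"{parts.scheme}://{parts.netloc.lower()}{path}"

def sitemap_urls(xml_text):
    """Extract every <loc> entry from a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return {normalise(loc.text) for loc in root.iter(f"{SITEMAP_NS}loc")}

def indexed_but_not_in_sitemap(indexed, sitemap):
    """URLs Google has that we don't know about -- the suspects."""
    return sorted(set(map(normalise, indexed)) - sitemap)
```

Running each batch of indexed URLs through this would at least tell us which ones are outside the sitemap.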
I was wondering if anybody knew of any methods or tools we could use to identify these 11,000 extra pages in the Google index, so we can confirm that they're just old pages which haven't yet fallen out of the index and aren't going to cause us a problem?
Thanks guys!
-
It's cool. Sorry, the point I was making is that irrespective of what you search for, the page returned is http://www.refreshcartridges.co.uk/advanced_search_result.php (with nothing after the .php), and as such the search results page couldn't spawn multiple pages which could be indexed by Google.
-
Hmm, I'm not too knowledgeable about PHP pages. Sorry!
-
Sorry, I'm not sure what happened to that bit.ly address - The actual address of the website is www.refreshcartridges.co.uk.
Ah, I see what you mean about the search results now. However, this hopefully shouldn't be an issue, as for security reasons (our web guy said something about injection attacks) the URL returned, irrespective of what is searched for, is http://www.refreshcartridges.co.uk/advanced_search_result.php
Thanks again!
-
I can't get that link to work.
What I said before still applies to physical input (that's what I assumed when I said it).
For example, a user inputs the words "snakes and dogs" and clicks search. The new URL is "www.yoursite.com/search?q=snakes+and+dogs". All of these search-result URLs need noindex meta tags, or Google will flag them as duplicate content because, for example, this page and the result for "dogs and snakes" generate almost the same page.
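To illustrate the idea (a hypothetical Python sketch, not your actual template code; "q" and friends are assumed parameter names, substitute whatever your search engine uses): the fix is to emit a robots noindex tag whenever the page being rendered is an internal search result.

```python
from urllib.parse import urlsplit, parse_qs

# Query parameters that mark a page as an internal search result.
# These names are assumptions -- use your engine's real parameter.
SEARCH_PARAMS = {"q", "query", "keywords"}

def robots_meta_for(url):
    """Return the robots meta tag a search-results URL should carry,
    or an empty string for a normal page."""
    query = parse_qs(urlsplit(url).query)
    if SEARCH_PARAMS & query.keys():
        # noindex keeps the page out of the index; follow still lets
        # Google crawl through to the product pages it links to.
        return '<meta name="robots" content="noindex, follow">'
    return ""
```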
Does that make sense?
It is in Google's Webmaster Guidelines that you should noindex these pages.
-
Many thanks for your input on this. I have actually looked at this through the HTML Improvements section of GWMT; however, it shows only a few dozen duplicate titles/descriptions, and this is simply due to some product categories being almost identical (for example, HP Deskjet 500 and HP Deskjet 500+).
-
Many thanks for your response. Our site is an eCommerce site that doesn't employ tags as such and our categories are all accounted for in the 15,000 page figure.
-
We did have this at the beginning of the year, when we used ?dispmode=grid and ?dispmode=list to change the way our results were displayed. This has since been rectified: we removed the option completely, and any URL with dispmode present now forces a 301 to the correct master page. There are still a few hundred instances of dispmode in the Google index, but the other 99% have fallen out now.
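Our implementation is in PHP, but for anyone doing the same, the redirect logic amounts to stripping the parameter and 301ing to what's left. A rough Python sketch of the idea:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_without_dispmode(url):
    """If the URL carries a dispmode parameter, return the 301 target
    (the same URL minus dispmode); otherwise return None."""
    parts = urlsplit(url)
    params = parse_qsl(parts.query, keep_blank_values=True)
    kept = [(k, v) for k, v in params if k != "dispmode"]
    if len(kept) == len(params):
        return None  # no dispmode present, no redirect needed
    return urlunsplit(parts._replace(query=urlencode(kept)))
```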
I have checked and double checked and we don't seem to have any issues like this at present.
-
I'm not certain this is the case, as our search engine requires physical input in order to yield a result. I don't know if it helps, but the URL is http://bit.ly/4Cogchww if you fancy taking a look.
-
Thanks for your reply. Indeed, our website does force www. if someone attempts to navigate to us without the prefix.
-
Hi Chris,
Google Webmaster Tools has a feature that helps identify duplicate HTML, and maybe you can use that to see whether the 11,000 pages are duplicates. If they are, I am assuming they should have duplicate title tags etc., which the tool may pick up.
-
Have you checked for instances where a page parameter is being seen as another version of the same page? One of the sites I work on had an issue a few months back where every instance of a product page was being flagged as duplicate content because of an oversight. We had one of our coders add a clause so that every time a page loaded with a parameter such as ?color=72, it would canonicalize to the page minus the parameter. This decreased our duplicate content warnings quickly and effectively.
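Our site isn't built in Python, but the clause boils down to something like this sketch (using the ?color=72 example; note it drops the whole query string, so a real version would need to keep any parameters that genuinely change the content):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link_tag(url):
    """Emit a rel=canonical tag pointing at the page minus its query
    string, e.g. .../product?color=72 canonicalizes to .../product.
    Simplified: strips ALL parameters, not just cosmetic ones."""
    parts = urlsplit(url)
    bare = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{bare}">'
```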
-
It could be that your tags and categories are considered individual pages and are therefore creating their own permalinks, e.g. http://www.example.com/keyword, http://www.example.com/tag/keyword and http://www.example.com/category/keyword. Another approach would be to check the sitemaps you have in Webmaster Tools and compare them against each other. Just a suggestion.
-
Does your website force 'www.'?
yourdomain.com and www.yourdomain.com are treated as separate sites and can have different pages spidered.
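If a site doesn't already force it, the usual fix is a host-level 301 that adds the prefix; roughly this, as a Python sketch (hypothetical domain):

```python
from urllib.parse import urlsplit, urlunsplit

def force_www(url):
    """Return the www. version of a bare-domain URL as a 301 target,
    or None if the host already carries the prefix."""
    parts = urlsplit(url)
    if parts.netloc.startswith("www."):
        return None  # already canonical, serve the page normally
    return urlunsplit(parts._replace(netloc="www." + parts.netloc))
```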
-
Be sure to try different combinations of 'site:www.domain.com' and 'site:domain.com'; they can yield different results.
Sounds to me like you probably have an internal search engine that is generating search results pages based off the search term, and each different results page is a piece of duplicate content.