Googlebot is indexing URLs with ? query strings in them. Is this Panda duplicate content?
-
I feel like I'm being damaged by Panda because of duplicate content, as I have seen Googlebot on my site indexing hundreds of URLs with ?fsdgsgs strings after the .html. They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later. I'm at a loss what to do. Since Panda, I have lost a couple of dozen #1 rankings that I'd held for months on end, and one page dropped over 100 positions.
-
Thanks for all that. Really valuable information. I went to Parameter handling and there were 54 parameters listed, generating over 20 million unnecessary URLs in total. I nearly died when I saw it. We have 6,000 genuine pages and 20 million junk ones that don't need to be indexed. Thankfully, I'm upgrading next week; I have turned the feature off on the current site, and the new one won't have it. Phew.
I have changed the settings for these parameters that were already listed in Webmaster tools, and now I wait for the biggest re-index in history LOL!
I have submitted a sitemap now and as I rewrite page titles & meta descriptions, I'm using the Fetch as Google tool to ask for resubmission. It's been a really valuable lesson, and I'm just thankful that I wasn't hit worse than I was. Now, it's a waiting game.
Of the 6,000 URLs on the sitemap I submitted a couple of days ago, around a third have been indexed. When I first uploaded it, only 126 of them were.
-
The guys here are all correct - you can handle these in WMT with parameter handling, but as every piece of text about parameter handling states, handle with care. You can end up messing things up big-time if you block areas of the site you do want crawled.
You'll also have to wait days, or longer, for Google to acknowledge the changes and reflect them in its index and in WMT.
If it's an option, look at using the canonical tag to self-reference: this means that if the CMS creates multiple pages with the same file on different URLs, they'll all point back to the original URL.
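As a sketch, a self-referencing canonical looks like this (the URL below is a placeholder; it should be the clean address of each page, without any parameters):

```html
<!-- Placed in the <head> of every version of the page, with and without parameters.
     Both example.com/shoes.html and example.com/shoes.html?fsdgsgs would carry
     the same tag, pointing at the clean URL. -->
<link rel="canonical" href="http://www.example.com/shoes.html" />
```

That way, even if the CMS keeps spitting out parameterized variants, Google is told which URL is the original.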
-
"They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later."
Google will continue to index them until you tell it specifically not to. Go to GWT and resubmit a sitemap containing only the URLs you want indexed. Additionally, do a "Fetch as Google" on the same pages as your sitemap. This can help speed up the reindexing process.
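A minimal sitemap listing only the clean URLs might look like this (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical .html pages; leave out any URL with a ? parameter -->
  <url>
    <loc>http://www.example.com/store/category.html</loc>
  </url>
  <url>
    <loc>http://www.example.com/store/product-page.html</loc>
  </url>
</urlset>
```

Submitting this in GWT won't remove the junk URLs by itself, but it tells Google which 6,000 pages actually matter.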
Also, hours? LMAO, it will take longer than that. Unless you are a huge site that gets crawled hourly, it can take days, if not weeks, for those URLs to disappear. I'd expect longer, since it doesn't sound like you have redirected those links, just turned off the plugin that created them. Depending on how your store is set up and how many pages you have, it may be wise to 301 all the offending pages to their proper destination URLs.
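If the store runs on Apache, one way to 301 the parameterized URLs in bulk is to strip the query string in .htaccess. This is only a sketch, assuming the clean page is always the same .html path minus the parameters, and that the site has no legitimate query strings it needs to keep; test it on a staging copy first:

```apache
# Sketch: 301 any request for a .html page that carries a query string
# back to the same path with the query string removed.
RewriteEngine On
RewriteCond %{QUERY_STRING} .
RewriteRule ^(.+\.html)$ /$1? [R=301,L]
```

The trailing `?` in the rewrite target is what tells mod_rewrite to drop the query string from the redirect.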
-
Check out the parameter exclusion options in Webmaster Tools. You can tell Google to ignore these appended parameters (Bing has an equivalent setting in its own Webmaster Tools).
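Parameter handling only works at Google's end; a blunter, crawler-level alternative is blocking parameterized URLs in robots.txt. A sketch, assuming every junk URL carries a query string and no wanted URL does — and note the usual caveat that blocked URLs can't pass through 301s or show canonicals, so don't combine this with a redirect-based cleanup:

```
User-agent: *
Disallow: /*?
```

Handle with the same care as parameter handling: if any page you want crawled uses a query string, this will block it too.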
-
Use a spidering tool, such as Screaming Frog, to check all of the links on your site.
Also check that your XML and HTML sitemaps don't contain old links.
Hope this helps