GWT URL Removal Tool Risky to Use for Duplicate Pages?
-
I was planning to remove lots of URLs via GWT that are largely duplicate pages (similar pages exist on other websites across the web). However, this Google article had me a bit concerned: https://support.google.com/webmasters/answer/1269119?hl=en
I already have "noindex, follow" on the pages I want removed from the index, but Google seems to take ages to drop pages from its index, and the duplicates appear to drag down the unique-content pages on my site.
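While waiting for Google to recrawl, it can be worth double-checking that the directive is actually being served. A minimal sketch using only Python's standard library (the parser class and function names here are illustrative, and this assumes the directive is delivered as an HTML meta tag rather than an X-Robots-Tag response header):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            self.directives.update(
                d.strip().lower() for d in content.split(",") if d.strip()
            )

def robots_directives(html):
    """Return the set of robots directives declared in the given HTML."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
print(robots_directives(page))  # the directives crawlers will see, here noindex and follow
```

Running this against the live HTML of each page confirms the tag survived any template or CDN changes; if the set comes back empty, Google has nothing to act on no matter how long you wait.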
-
Hi
I have used the URL removal tool in the past to remove URLs with success; as we know, it helps speed things up. What you have done is right, and if you are patient, Google will remove each page as it crawls it again. You might find this confirmation from Google reassuring in your situation: https://support.google.com/webmasters/answer/93710?hl=en
Reading the article you posted about when not to use the tool, I can't see that your pages fall into any of those categories. Either way, I personally can't see using it causing an issue, to be honest, but it's your call.
-
Does adding "nofollow" as well make it even easier to get out of the index?
-
The last time I used URL removal in GWT was a long time ago, and at that time the URL did not stay out of the index forever, only for 90 days; after that it would reappear.
A better idea in any case is to use a "noindex, nofollow" tag on the pages that you want out of Google's index!
Hope this helps!
-
The issue with having pages that are similar to pages on other websites is that the ratio of unique to duplicate content is low, and that can drag down the rankings of other, more unique pages. The pages I have without much unique content are what users want: http://www.honoluluhi5.com/oahu/honolulu/hawaii-kai-homes/ but since the content isn't unique I - unfortunately - need to noindex those pages and instead rank for this type of page: http://www.honoluluhi5.com/hawaii-kai-homes-real-estate/
When a user is looking for a "…for sale" type keyword, they want that first URL, not the second URL with pictures, video, and writing. The "noindex, follow" is on the first URL, but it is still indexed after a month. I want it de-indexed, and I am trying to establish the risk of using that GWT tool, based on the article where Google seems to indicate one shouldn't use it so casually. The conclusion is probably that I have to be patient and wait for Google to drop those pages of mine. I look forward to the day Google's algorithm can see the layout of a page and understand its value for users, even though it lacks unique content…
-
There's really no problem with having pages on your site whose content can be seen on other sites. Since you have noindexed them already, it shouldn't be an issue.
If they aren't really getting any traffic for you or aren't really bringing in anything that helps the site overall, then just take them off.
Focus on your new and existing content.