Why is Google Webmaster Tools ignoring my URL parameter settings?
-
I have set up several URL parameters in Webmaster Tools that do things like select a specific product's colour or size. I have told Google that each parameter "Narrows" the page and selected "crawl no URLs", but in the duplicate content section each of these is still shown as two pages with the same content. Is this just normal, i.e. Google simply showing me that the pages are the same anyway, or is it deliberately ignoring my settings (which I assume it does when it is sure it knows better or thinks I have made a mistake)?
-
It's been about a month, but I'll give it a bit longer. Ta.
-
Allow a couple of months to see the changes. If they were made recently, Google will take a while to remove the duplicate content errors.
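For what it's worth, here is a minimal sketch of what a "Narrows, crawl no URLs" parameter setting is effectively telling Google: stripping those parameters should yield the same page. The parameter names and URLs below are illustrative, not taken from the actual site.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Parameters declared in Webmaster Tools as "Narrows" (colour/size selectors).
# The names below are illustrative; substitute the site's real parameter names.
NARROWING_PARAMS = {"colour", "size"}

def canonical_form(url: str) -> str:
    """Drop the narrowing parameters so every variant collapses to one page."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NARROWING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_form("https://example.com/product?id=42&colour=red&size=m"))
# -> https://example.com/product?id=42
```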
Related Questions
-
URL Inspector, Rich Results Tool, GSC unable to detect Logo inside Embedded schema
I work on a news site and we updated our schema set-up last week. Since then, valid Logo items have been dropping like flies in Search Console. Neither URL Inspector nor the Rich Results test seems able to detect the Logo on articles. Is this a bug, or can Googlebot really not see schema nested within other schema?

Previously, we had both Organization and Article schema, separately, on all article pages (with Organization repeated inside the publisher attribute). We removed the separate Organization, and now just have Article with Organization inside the publisher attribute. The code is valid in the Structured Data Testing Tool, but URL inspection etc. cannot detect it. Example: https://bit.ly/2TY9Bct

By comparison, we also have Organization schema (un-nested) on our homepage, and interestingly enough the tools detect that with no problem. That leads me to believe that either nested schema is unreadable by Googlebot, or this is not an accurate representation of Googlebot and the schema is only unreadable by the testing tools.

Our old set-up had separate Article and Organization scripts; the new set-up keeps the same Article schema but removes the separate script for Organization. We made the change to embed our schema for a couple of reasons: first, Google's best practices say that if multiple schemas are used, Google will choose the best one, so it is better to have just one script; second, Google's codelabs tutorial for schema uses a nested structure to indicate the hierarchy of relevance to the page.

My question is: does nesting schemas like this make it impossible for Googlebot to detect a schema type that is two or more levels deep? Or is this just a bug with the testing tools?
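For illustration only, a rough sketch of the nested structure described above (an Article with Organization embedded in the publisher attribute), using placeholder names, URLs, and dates rather than the site's real markup:

```python
import json

# Illustrative only: placeholder headline, publisher name, logo URL, and date.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "datePublished": "2019-06-01",
    "publisher": {
        "@type": "Organization",
        "name": "Example Publisher",
        "logo": {
            "@type": "ImageObject",
            "url": "https://example.com/logo.png",
        },
    },
}

# This string would be the payload of a single <script type="application/ld+json"> block.
print(json.dumps(article, indent=2))
```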
Technical SEO | | ValnetInc0 -
Numbers in URL
Hey guys! Need your many awesome brains. 🙂 This may be a very basic question, but I am hoping you can help me out with some insights beyond "because Google says it's better". 🙂 I only recently started working with SEO, and I work for a SaaS website-builder company that has millions of open/active user sites. All of our user sites' URLs use numbers instead of something like www.mydomainname.com/gallery or myusername.simplesite.com/about, so they look like www.mysite.com/453112 or myusername.simplesite.com/426521. The sales manager has asked me to figure out whether it will pay off for us in terms of traffic (or other benefits) to change from the number system to the "proper" way of setting up these URLs. He's looking for rather concrete answers, as he usually sits with paid search and is therefore used to the mindset of "if we do x it will yield us y in z months". I'm finding it quite difficult to find case studies or other concrete examples beyond the generic, vague implication that it will simply be "better" (for example, in SEO checklists and search engine guidelines). Will it make a difference? How so? I have to convince our developers of the importance and priority of this adjustment, or it will just drown in the many projects they already have. So truly, any insights would be very welcome. Thank you!
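Purely as an illustration of the alternative being weighed up (the paths and titles below are hypothetical), descriptive URLs are usually generated by slugifying the page title and permanently redirecting the old numeric path to the new one:

```python
import re

def slugify(title: str) -> str:
    """Lowercase the title and reduce it to hyphen-separated words."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# Hypothetical mapping from the current numeric paths to descriptive slugs.
old_to_new = {
    "/453112": "/" + slugify("Photo Gallery"),
    "/426521": "/" + slugify("About Us"),
}

for old_path, new_path in old_to_new.items():
    # Each old path would need a permanent (301) redirect to its new slug.
    print(f"301: {old_path} -> {new_path}")
```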
Technical SEO | | michelledemaree2 -
Sitelink demotion not working after submitting in Google Webmaster Tools
Hello friends, I have a question regarding demotion of sitelinks in Google Webmaster Tools. Scenario: I demoted one of the sitelinks for my website two months ago, but the demoted sitelink has still not been removed from the Google search results. Is there any reason why this page is not being removed even after demoting it in GWT? If we resubmit the same link in the demotion tool one more time, will it work? Can anybody help me out with this? Note: since the validity of a demotion only lasts for 3 months (90 days), I am concerned about the timing.
Technical SEO | | zco_seo0 -
Webmaster Tools Manual Actions - Should I Disavow Spammy Links??
My website has a manual action against it in Webmaster Tools stating: "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole." I have checked the link profile of my site and there are over 4,000 spammy links from one particular website, which I am guessing this manual action refers to. There is no way that I will be able to get these links removed, so should I be using Google's Disavow Tool, or is there no need? Any ideas would be appreciated!!
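If you do end up disavowing, the file Google expects is plain text with one URL or domain: entry per line and # for comments. Here is a small sketch (hypothetical domains, not the real backlink profile) that rolls a backlink export up to domain level:

```python
from urllib.parse import urlparse

# Hypothetical backlink export (e.g. from a link research tool); replace with
# the real list of spammy URLs pointing at the site.
spammy_links = [
    "http://spam-directory.example/page-1",
    "http://spam-directory.example/page-2",
    "http://link-farm.example/widgets",
]

# Disavowing at domain level covers every URL on the offending site.
domains = sorted({urlparse(link).netloc for link in spammy_links})

with open("disavow.txt", "w") as handle:
    handle.write("# Links we were unable to get removed manually\n")
    for domain in domains:
        handle.write(f"domain:{domain}\n")
```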
Technical SEO | | Pete40 -
Google Webmaster Tools: MESSAGE
Dear site owner or webmaster of http://www.enakliyat.com.tr/,
Some of your site's pages may be using techniques that do not comply with Google's Webmaster Guidelines.
In particular, your site appears to contain low-quality or thin pages that do not provide users with substantially unique or valuable content. Examples of this kind of page include thin affiliate pages, doorway pages, and automatically generated or copied content. For more information about unique and compelling content, visit http://www.google.com/support/webmasters/bin/answer.py?answer=66361.
We recommend that you make the necessary changes so that your site meets the quality guidelines. After making these changes, please submit your site for reconsideration in Google's search results.
If you have questions about how to resolve this problem, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
**After this message we found our low-quality pages and added those URLs to robots.txt. Other than that, what can we do?** **Our site is a home-to-home moving listing portal: consumers who want to move their home fill in a form so that moving companies can quote prices. We were generating the listing page URLs from the title submitted by the customer.**
Technical SEO | | iskq0 -
Showing duplicate content when I have a canonical URL set, why?
Just inspecting my site's report and I see that I have a lot of duplicate content issues. I'm not sure why these two pages http://www.thecheapplace.com/wholesale-products/Are-you-into-casual-sex-patch and http://www.thecheapplace.com/wholesale-products/small-wholesale-patches-1/Are-you-into-casual-sex-patch are showing as duplicate content when both pages have a clearly defined canonical URL of http://www.thecheapplace.com/Are-you-into-casual-sex-patch Any answer would be appreciated, thank you
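One thing worth ruling out is whether the canonical tag is actually present in the HTML served for both variant URLs. Here is a rough standard-library sketch for spot-checking that; it only inspects the served markup, so it says nothing about how any particular crawler interprets it:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag encountered."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

urls = [
    "http://www.thecheapplace.com/wholesale-products/Are-you-into-casual-sex-patch",
    "http://www.thecheapplace.com/wholesale-products/small-wholesale-patches-1/Are-you-into-casual-sex-patch",
]

for url in urls:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = CanonicalFinder()
    finder.feed(html)
    print(url, "->", finder.canonical)
```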
Technical SEO | | erhansimavi0 -
Old URL redirect to New URL
Alright, I did something dumb a year ago and I'm still paying for it. I changed my hyphenated URL to the non-hyphenated version when I redesigned my website. I say it was dumb because I lost most of my link juice even though I did 301 redirects (via the htaccess file) for almost all of the pages I could find in Google's index. Here's my problem: my new site took a huge hit in traffic (down 60%) when I made the change, and even though I've done thousands of redirects, my old site is still showing up in the SERPs and sends much, if not most, of my traffic. I don't want to take the old site down for fear it will kill all of my traffic. What should I do? Is there a better method I should explore than 301 redirects? Could the other site be affecting my current rank since it's still there? (FYI, both sites are built on the WP platform.) Any help or ideas are greatly appreciated. Thank you! Joe
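A quick way to sanity-check the migration is to confirm that the old hyphenated URLs return a single 301 hop to the new domain. Here is a rough sketch using only the standard library; the domains and paths are placeholders, not the actual sites:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Return None so urllib surfaces the 3xx response instead of following it."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

# Placeholder old (hyphenated) URLs to spot-check.
old_urls = [
    "http://www.my-old-site.com/",
    "http://www.my-old-site.com/some-page/",
]

for url in old_urls:
    try:
        response = opener.open(url)
        print(f"{url}: {response.status} (no redirect served)")
    except urllib.error.HTTPError as err:
        # A healthy migration shows 301 plus a Location header on the new domain.
        print(f"{url}: {err.code} -> {err.headers.get('Location')}")
```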
Technical SEO | | kaje0 -
Issue with 'Crawl Errors' in Webmaster Tools
I have an issue with a large number of 'Not Found' web pages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and they don't, as I expected. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month? Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors, and that if a URL keeps returning a 404 it will automatically be removed. Well, I don't know how many times they need to see that 404 in order to get rid of a URL and link that haven't existed for 18-24 months! Thanks.
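For spot-checking the report, here is a rough sketch (with hypothetical URLs) that confirms a flagged URL really returns a 404 and that the page listed under 'Linked From' no longer references it:

```python
import urllib.error
import urllib.request

# Hypothetical pairs from the crawl errors report:
# (URL reported as 'Not Found', page listed under 'Linked From').
reported = [
    ("http://www.example.com/old-article", "http://www.example.com/archive-2009"),
]

for missing_url, source_url in reported:
    try:
        status = urllib.request.urlopen(missing_url).status
    except urllib.error.HTTPError as err:
        status = err.code  # expecting 404 (or 410) here

    source_html = urllib.request.urlopen(source_url).read().decode("utf-8", "replace")
    print(f"{missing_url}: returns {status}, "
          f"still linked from source page: {missing_url in source_html}")
```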
Technical SEO | | RiceMedia0