Sitelinks: Does Google Recognize Your Requests for Removal?
-
I've recently been trying to influence our branded SERPs by demoting certain pages from the Sitelinks feature, using the demotion tool provided in Google's Webmaster Tools.
However, despite demoting various URLs, they continue to appear in the branded SERPs nearly a week after they should have been suppressed.
What is your experience with Sitelinks? Do links you request to demote ever disappear or change positions in the SERPs for you?
-
Yes, Google DOES recognize your request for branded sitelinks demotion.
The only downside is that, unfortunately, the changes do not take effect within a week. It took us two months of weekly tracking and demotion to finally get the branded sitelinks we wanted displayed.
In a nutshell, give it a few more weeks and you'll eventually see the requested URL demotions take effect.
-
They did not honour my requests; the tool does say Google MAY honour them.
-
Related Questions
-
Google not Indexing images on CDN.
My URL is: http://bit.ly/1H2TArH We have set up a CDN on our own domain: http://bit.ly/292GkZC We have an image sitemap: http://bit.ly/29ca5s3 The image sitemap uses the CDN URLs, and we verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: http://bit.ly/29eNSXv. We used to have a disallow for /thumb/, which had a 301 redirect to our CDN, but we removed both the disallow in the robots.txt and the 301. Yet GWT still reports that none of our images on the CDN are indexed. The screenshot referred to is from the GWT of our main domain; the GWT of the CDN subdomain just shows 0. We did not submit a sitemap to the verified subdomain property because we already have a sitemap submitted to the property on the main domain name. When searching for images indexed from our CDN, nothing comes up: http://bit.ly/293ZbC1 When checking the GWT of the CDN subdomain, I have been getting crawl errors, mainly 500-level errors, though not that many in comparison to the number of images and traffic we get on our website. Google is crawling, but it seems like it just doesn't index the pictures. Can anyone help? I have followed all the information I was able to find on the web, yet our images on the CDN still can't seem to get indexed. (A sketch of an image sitemap entry that references a CDN is included below.)
Intermediate & Advanced SEO | alphonseha
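The question above hinges on an image sitemap whose image URLs live on a CDN subdomain. For reference, here is a minimal sketch of how such an entry can be generated with Python's standard library; the page and image URLs are hypothetical placeholders, not the poster's actual domains.

```python
# Minimal sketch: build an image sitemap entry whose <image:loc> points at a
# CDN subdomain while the page <loc> stays on the main domain.
# The URLs below are hypothetical placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")

# The page itself lives on the main domain...
ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = "https://www.example.com/products/widget"

# ...while the image it embeds is served from the CDN subdomain.
image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = "https://cdn.example.com/images/widget.jpg"

print(ET.tostring(urlset, encoding="unicode"))
```

Cross-hostname image URLs like this are generally acceptable as long as both hostnames are verified and crawlable, but none of it guarantees that Google will choose to index the images.
-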
Google Indexing our site
We have 700 city pages on our site. We submitted them to Google via a sitemap at https://www.samhillbands.com/sitemaps/locations.xml, but only 15 have been indexed so far. Yes, the content is similar on all of the pages... any thoughts on getting the remaining pages indexed? (A small diagnostic sketch follows below.)
Intermediate & Advanced SEO | brianvest
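When only a small fraction of submitted URLs gets indexed, it helps to rule out mechanical blockers before concluding the pages are being skipped for thin or duplicate content. The sketch below is a rough spot-check, assuming the sitemap is a flat <urlset> of plain <loc> entries: it pulls the URL list and samples a few pages for HTTP errors and noindex tags.

```python
# Sketch: spot-check a few sitemap URLs for obvious indexing blockers
# (non-200 responses, possible noindex robots meta tags).
# Assumes a flat <urlset> sitemap with plain <loc> entries.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.samhillbands.com/sitemaps/locations.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

locs = [el.text.strip() for el in root.findall("sm:url/sm:loc", NS)]
print(f"{len(locs)} URLs listed in the sitemap")

for page_url in locs[:10]:  # sample the first few city pages
    try:
        with urllib.request.urlopen(page_url) as page:
            html = page.read().decode("utf-8", errors="replace").lower()
            flag = "possible noindex" if "noindex" in html else "ok"
            print(page_url, page.status, flag)
    except urllib.error.HTTPError as err:
        print(page_url, "HTTP error", err.code)
```

If the sampled pages all return 200 and carry no noindex directive, the admitted similarity across the 700 city pages is the more likely reason Google is declining to index most of them, and differentiating those pages will matter more than resubmitting the sitemap.
-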
Problem with Google finding our website
We have an issue with Google finding our website: (URL removed) When we google "(keyword removed)" in google.com.au, our website doesn't come up anywhere, despite the page having a suitable title tag and on-site copy for SEO. We found this strange and decided to investigate further. We googled the website URL itself in google.com.au to see whether it was being found properly. Our site appeared at the top, but with this description: "A description for this result is not available because of this site's robots.txt – learn more." We can also see that the incorrect title tag is appearing. From this, we assumed there must be an issue with the robots.txt file, so we put up a new robots.txt file: (URL removed) This hasn't solved the problem, though, and we still have the same issue. We suspect there may be another robots.txt file we can't find that is causing issues, or something else we're not sure of. If someone could get to the bottom of this for us, we would be most appreciative - we want the site to be found properly. Any help here would be much appreciated! (A quick robots.txt check is sketched below.)
Intermediate & Advanced SEO | Gavo
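That "description is not available because of this site's robots.txt" snippet normally means Googlebot is blocked from fetching the page, often by a leftover Disallow rule. One quick check is to ask Python's standard-library robots.txt parser what the live file actually permits; the domain below is a hypothetical placeholder for the removed URL.

```python
# Sketch: check what the live robots.txt allows for Googlebot.
# The domain is a hypothetical placeholder for the site in the question.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com.au/robots.txt")
parser.read()  # fetches and parses the live file

for path in ("/", "/about/", "/services/"):
    allowed = parser.can_fetch("Googlebot", f"https://www.example.com.au{path}")
    print(path, "allowed" if allowed else "BLOCKED for Googlebot")
```

If a path unexpectedly comes back blocked, it is also worth checking whether the www and non-www hostnames (or http vs https) serve different robots.txt files, since a stray Disallow: / on whichever variant Google has indexed is enough to suppress the snippet.
-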
Removing UpperCase URLs from Indexing
This search - site:www.qjamba.com/online-savings/automotix - gives me this result from Google:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
Google also tells me there is another result which is 'very similar'. When I click to see it I get:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/Automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
This is because I recently changed my program to redirect all URLs containing uppercase letters to lowercase, as all-lowercase URLs appear to be strongly recommended. I assume that having two indexed URLs for the same content dilutes link juice. Can I safely remove all of my uppercase indexed pages from Google without it affecting the indexing of the lowercase URLs? And if so, what is the best way - there are thousands. (A sketch for verifying the lowercase redirects follows below.)
Intermediate & Advanced SEO | friendoffood
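Whichever removal route is taken, the duplicates only consolidate if each uppercase variant really does 301 to its lowercase twin, so it is worth spot-checking that first. A small sketch using the URL pair from the question (the third-party requests package is assumed to be installed):

```python
# Sketch: confirm that a mixed-case URL 301-redirects to its lowercase twin.
# Requires the third-party "requests" package (pip install requests).
import requests

upper = "https://www.qjamba.com/online-savings/Automotix"

resp = requests.get(upper, allow_redirects=False, timeout=10)
location = resp.headers.get("Location", "")  # may be a relative path on some servers

print("status:", resp.status_code)   # expect 301 (permanent redirect)
print("redirects to:", location)     # expect the all-lowercase URL
print("lowercased correctly:", location == upper.lower())
```

If the redirects check out as permanent 301s, the uppercase URLs should drop out of the index on their own as Google recrawls them; the URL removal tool is generally only needed when content has to disappear urgently, so there is usually no need to remove thousands of URLs by hand.
-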
Google Penalty - Has It Been Lifted?
Hi, we have been trying to remove a 'partial' Google penalty for a new client by removing unnatural backlinks over a period of time, then submitting a reconsideration request, uploading a disavow file, etc. Previously Google listed the partial penalty in the 'manual actions' section of Webmaster Tools, making it possible for us to submit a reconsideration request. Having just logged in, however, we get the message 'no manual webspam actions found', so there isn't any way we can submit a reconsideration request. Does this mean that the penalty has been lifted? Or could it still exist? If the latter, is there any other way to submit a reconsideration request? (The disavow file format is sketched below for reference.) Many thanks in advance, Lee.
Intermediate & Advanced SEO | Webpresence
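For reference, the disavow file mentioned in the question is just a UTF-8 text file with one URL or domain: entry per line and # for comments. A minimal sketch of generating one (the domains and URL listed are hypothetical examples, not the client's actual links):

```python
# Sketch: write a disavow file in the plain-text format Google expects
# (one URL or "domain:" entry per line, lines starting with "#" are comments).
# The domains and URL below are hypothetical examples.
bad_domains = ["spammy-directory.example", "link-farm.example"]
bad_urls = ["http://blog.example.org/paid-links-page.html"]

lines = ["# Disavow file prepared alongside the reconsideration request"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w", encoding="utf-8") as fh:
    fh.write("\n".join(lines) + "\n")
```

A disavow file can be uploaded whether or not a manual action is currently showing, so it remains usable even when the reconsideration form is no longer available.
-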
Spam Links? - 115 Domains Sharing the Same IP Address: to Remove or Not Remove Links
Out of 250 domains that link to my site, about 115 are low-quality directories that are published by the same company and hosted on the same IP address. Examples of these directories are www.keydirectory.net, www.linkwind.com, www.sitepassage.com, www.ubdaily.com and www.linkyard.org. A recent site audit from a reputable SEO firm identified 125 toxic links; I assume these are those links. They also identified about another 80 suspicious domains linking to my site. The audit concluded that my site is suffering a partial Penguin penalty due to low-quality links. My question is whether it is safe to remove these 125 links from the low-quality directories. I am concerned that removing this quantity of links all at once will cause a drop in rankings, because the link profile will be thin, with only about 125 domains remaining that point to the site. Granted, those remaining domains should be of somewhat better quality, but I worry that I am playing with fire by having these removed. (A sketch for grouping linking domains by shared IP address follows below.) I URGENTLY NEED ADVICE AS THE WEBMASTER HAS INITIATED STEPS TO REMOVE THE 125 LINKS. Thanks everyone!!! Alan
Intermediate & Advanced SEO | Kingalan1
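The "115 domains sharing one IP address" figure in the title is straightforward to reproduce from a backlink export: resolve each linking domain and group by address. A rough sketch, using the directory domains named in the question as the sample list:

```python
# Sketch: group linking domains by the IP address they resolve to, to spot
# networks of directories sitting on the same shared hosting.
import socket
from collections import defaultdict

# Sample list taken from the question; in practice this would be the
# full domain column of a backlink export.
linking_domains = [
    "www.keydirectory.net",
    "www.linkwind.com",
    "www.sitepassage.com",
    "www.ubdaily.com",
    "www.linkyard.org",
]

by_ip = defaultdict(list)
for domain in linking_domains:
    try:
        by_ip[socket.gethostbyname(domain)].append(domain)
    except socket.gaierror:
        by_ip["unresolvable"].append(domain)

for ip, domains in sorted(by_ip.items(), key=lambda kv: -len(kv[1])):
    print(ip, f"({len(domains)} domains):", ", ".join(domains))
```

Grouping by IP is only a heuristic - plenty of unrelated sites share hosting - but a large cluster of near-identical directories on a single address is exactly the pattern a links audit tends to flag.
-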
Site wide links removal
A website of mine has about 4,000 backlinks, of which roughly 2,500 come from one website and point to the homepage and about 6 internal pages. These have been built up over about 5 years, mainly via article posts. The site was recently hit by Penguin 2.0 but has only had natural links built, so I'm wondering if the sitewide links are in fact the issue. The website linking to mine is an authority source within its niche, but the concern is the number of backlinks coming from this one site and whether it may now be having a negative impact. When I've reviewed the links from this one site via a backlink removal tool, about 80% seem fine and the suggestion is to remove about 20% of them. Would you keep all the sitewide backlinks or remove them? Have you come across a similar situation, and how did it affect ranking/traffic?
Intermediate & Advanced SEO | jazavide
-
Google bot vs google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕 Situation: a client of mine has a webshop located on a hosted server. The shop is built in a closed CMS, meaning I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has two "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Google's mobile bot. In Default.asp (classic ASP) I test the user agent and redirect the user to either the main domain or the mobile subdomain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel-canonical, but since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome. (A sketch of the user-agent branching described here is included below.)
Intermediate & Advanced SEO | ReneReinholdt
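To make the setup concrete, the branching the question describes looks roughly like the sketch below: a single entry point inspects the User-Agent and picks a store view. It is written in Python purely for illustration (the real logic lives in the classic ASP Default.asp), and the shop URLs are hypothetical placeholders.

```python
# Sketch of the user-agent branching described above: desktop browsers and
# Googlebot go to the main store view, mobile browsers and Google's mobile
# crawler go to the mobile subdomain. Illustration only - the real code is
# classic ASP, and the URLs are hypothetical placeholders.
MOBILE_MARKERS = ("mobile", "android", "iphone", "ipod")

def shop_url_for(user_agent: str) -> str:
    ua = user_agent.lower()
    if any(marker in ua for marker in MOBILE_MARKERS):
        return "https://m.example-shop.com/shop/"    # mobile store view
    return "https://www.example-shop.com/shop/"      # desktop store view

print(shop_url_for("Mozilla/5.0 (iPhone; CPU iPhone OS 9_1 like Mac OS X)"))
print(shop_url_for("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```

The weakness, as the question notes, is that crawlers entering on already-indexed shop URLs never pass through the entry point at all. Since HTML in the page head is available, the separate-mobile-URLs annotations (rel="alternate" pointing from each desktop page to its mobile twin, and rel="canonical" pointing back) are worth exploring, as they declare the pairing to Google without needing a server-side 301.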