Why specify 'robots' instead of 'googlebot' for a Panda-affected site?
-
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database
Why don't more Panda-affected sites specify 'googlebot' rather than 'robots', so they can still capture traffic from Bing & Yahoo?
-
Hi there,
I'm not sure what percentage of sites specify all bots as opposed to just Googlebot, but I have to assume the percentage depends on the site's territory. For instance, few sites in the UK would think of Bing or Yahoo, since those search engines have a tiny market share there. In the US, Japan and several other places where non-Google search engines have more share, the consideration will be much higher.
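To make the distinction concrete, here is a minimal sketch of the two de-indexing approaches in question (the "noindex, follow" value matches the common tag-page strategy; whether Daniweb used exactly this markup is an assumption). A meta tag named "robots" applies to all compliant crawlers, while one named "googlebot" targets Google alone, leaving the page eligible to remain indexed in Bing and Yahoo:

    <!-- De-indexes the page in every search engine that honors meta robots -->
    <meta name="robots" content="noindex, follow">

    <!-- De-indexes the page for Google only; Bing and Yahoo may keep it -->
    <meta name="googlebot" content="noindex, follow">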
Related Questions
-
My site is being deindexed for an unknown reason
A few days ago I noticed that my site gusty.se was not showing up in Google, only the subpages. There is no message in Google Search Console. I requested the site to be reindexed, and about a day later it was showing up in Google again. Now another day has passed and the site is once again not indexed. The question is: why is the site being deindexed? I have worked a bit on getting backlinks to the site, and I recently gained 3 backlinks within a few days (about a week has passed since I gained them). Still, I can't believe Google would count this as unnatural link building, especially since I assume it takes some time for Google to detect new incoming links. Another thing I've noticed, though, is that about two weeks ago my site got a high number of incoming links from different spam sites with .gq TLDs. The majority of these sites have not linked to my main page but to a subpage, which is still indexed by Google. Can all these spam links be the reason why Google has deindexed the main page of my site? I've read that Google generally ignores links from spam sites; still, I have taken action against them by submitting a disavow text file containing all these spam domains (see the sketch below). I submitted this file about 2 days ago. I have now again requested the site to be reindexed, so perhaps it will soon be listed again. Still, I can't keep having my site deindexed and reindexing it every second day. I would really appreciate it if someone could give me some insight into this problem.
Intermediate & Advanced SEO | Grodan2
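For reference, the disavow file mentioned above is just a plain-text list uploaded through Google's disavow tool; "domain:" lines cover whole domains, and bare URLs cover single pages. A minimal sketch with made-up .gq domains standing in for the real ones:

    # Spam .gq domains found in the link report
    domain:spam-example-one.gq
    domain:spam-example-two.gq
    # A single unwanted linking page can also be listed directly
    http://spam-example-three.gq/links.html

-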
How much SEO damage would it do to have a subdomain site rather than a directory site?
Hi all! A colleague and I were arguing about which is better: having a subdomain or a directory. Let me explain the cases in question: a multi-language site, where en.domain.com or es.domain.com competes with domain.com/en/ or domain.com/es/; a mobile and desktop version, where m.domain.com competes with domain.com/m (or just domain.com); and multiple-location websites, as you might figure. The discussion started with me saying: it's better to have a directory site. My colleague said: it's better to have a subdomain site. Among his reasons: big companies (such as WordPress) are doing it, and it's better for the business. My reasons are fully based on this post from Rand Fishkin: Subdomains vs. Subfolders, Rel Canonical vs. 301, and How to Structure Links for SEO - Whiteboard Friday. So, what does the community have to say about this? Who should win this argument? (For the multi-language case, see the hreflang sketch below.) GR.
Intermediate & Advanced SEO | Gaston Riera
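Whichever structure wins, the language variants should point at each other with hreflang annotations so each engine serves the right version; this holds for both the subdomain and the subfolder layout. A minimal sketch for the hypothetical domain.com example above, shown in its subfolder form:

    <link rel="alternate" hreflang="en" href="https://domain.com/en/" />
    <link rel="alternate" hreflang="es" href="https://domain.com/es/" />
    <link rel="alternate" hreflang="x-default" href="https://domain.com/" />

-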
Have a Robots.txt Issue
I have a robots.txt error that is causing me loads of headaches and is making my website fall off the search engine grid. On Moz and other sites, the reports say that I have blocked all search engines from finding it. Could it be as simple as this: I created a new website and forgot to re-create a robots.txt file for the new site, so crawlers were still trying to find the old one? I have just created a new one. Google Search Console still shows severe health issues found in the property and says that robots.txt is blocking important pages. Does this take time to refresh? Is there something I'm missing that someone here in the Moz community could help me with? (See the sketch below for what a blanket block looks like.)
Intermediate & Advanced SEO | primemediaconsultants
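For anyone hitting the same wall: in robots.txt, a single character separates a blanket block from an open site. The first sketch blocks every crawler from everything, which matches the symptoms described above; the second allows full crawling:

    # Blocks every crawler from the entire site
    User-agent: *
    Disallow: /

    # Allows every crawler to fetch everything
    User-agent: *
    Disallow:

-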
Is my site being penalized?
I've gone through all the points on https://moz.com/blog/technical-site-audit-for-2015, but after months the site only ranks for its brand name. The website is not ranking in the top 100 for any main keywords (2-, 3-, or 4-word phrases), only for a handful of very long phrases (4+ words). All of the content is unique, all pages are indexed, the website is fast and doesn't have any crawl errors, and there are a couple of links pointing to it. There is a sitewide followed link in the footer pointing to another domain, its parent company, and vice versa (see the nofollow sketch below). This is not done for SEO reasons; the companies are related and their products complement each other. Could this be an issue? Or is my site being penalized by something else?
Intermediate & Advanced SEO | Robbern
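If the sitewide footer link is the suspect, one common precaution, not confirmed as the fix here, is to nofollow it so that no link equity or manipulation signal passes between the related domains. A sketch with a hypothetical parent-company URL:

    <!-- Sitewide footer link to the related company, nofollowed as a precaution -->
    <a href="https://parent-company.example/" rel="nofollow">Parent Company</a>

-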
Site not progressing at all...
We relaunched our site almost a year ago after our old site dropped out of the rankings due to what we think was overused anchor text. We transferred the content over to the new site but started fresh in terms of links, etc., and did not redirect the old site (see the redirect sketch below). Since the launch we have focused on producing good content and social activity, but the site has made no progress at all. The only factor I can think of is that one site linked to us from all of their pages; we asked them to remove the links, which they did over 3 months ago, but they are still showing in Webmaster Tools. Any help would be appreciated. Thanks
Intermediate & Advanced SEO | jj3434
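For context, carrying equity from an old domain to a relaunch is usually done with a domain-wide 301 redirect rather than letting the old site go dark. A minimal sketch in Apache .htaccess form, with hypothetical domains (whether this setup fits the poster's server is an assumption):

    # On the old domain: 301-redirect every URL to its counterpart on the new site
    RewriteEngine On
    RewriteRule ^(.*)$ https://new-domain.example/$1 [R=301,L]

-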
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL (see the sketch below), and we allow both sites to be fully indexed. For business reasons, we want both sites to appear in results, and we need both to accumulate PageRank and other SEO/social metrics. How can I manage the threat of duplicate content and still make sure the business needs are met?
Intermediate & Advanced SEO | BostonWright
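The cross-domain canonical mentioned above is a single tag in the head of the duplicate page pointing at the preferred copy. Note the trade-off: the canonicalized copy will usually drop out of the results, which conflicts with the goal of keeping both sites visible. A sketch with hypothetical domains:

    <!-- In the <head> of the duplicate article on site B, pointing at the original on site A -->
    <link rel="canonical" href="https://site-a.example/articles/shared-story/" />

-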
Separate site, or should we incorporate it into our main site?
Hello, we have a website that sells personal development training. The owners want to start 2 blogs, one for each owner, to promote their personal coaching practices. What are the SEO advantages of embedding both blogs in the current site versus starting 2 brand-new blogs with their names as the domain names?
Intermediate & Advanced SEO | BobGW
-
Site #2 beats site #1 in every aspect?
Hey guys, loving SEOmoz so far and will definitely continue my subscription after the free trial. I have a question, however, which I am really confused about. When researching my primary keyword, I found that the second-ranked site beats the top site in every single aspect apart from domain age, which is almost 6 years for the top one and 6 months for the second. When I say every single aspect, I mean everything: more authority for the page and domain, more links, more anchor-text links, more authoritative links, more social signals, more relevant links, a better domain (although the second-ranked site is a .net), better MozRank, better MozTrust, etc. I have noticed, though, that in the UK SERPs those sites are switched, so #2 is actually #1. Could it be that the US SERPs just haven't updated yet, or am I missing something completely different?
Intermediate & Advanced SEO | darrenspeed