Can I disallow my subdomain for Penguin recovery?
-
Hi,
I have a site like BannerBuzz.com, before last penguin my site's all keywords were in good position in google, but after penguin hit on my website, my all keywords are going down and down day by day, i have done some changes in my website for improvement, but in 1 change i have some confusion.
I have one sub-domain (http://reviews.bannerbuzz.com/) that displays user reviews for all of my website's keywords. Fifteen reviews from each category are also displayed on my main website, http://www.bannerbuzz.com. So, will those user reviews be considered duplicate content between the sub-domain and the main website?
Can I disallow the sub-domain for all search engines? Currently the sub-domain is open to all search engines. Would it be helpful to block it?
Thanks
-
Hello Rafi,
I am going to make the necessary changes. I have also started work on gathering backlinks to the home page for the "Vinyl Banners" keyword from various sources. It may help me recover my old rankings!
-
No problem my friend. You are most welcome.
So if you are using third-party services to fill in the reviews content on the sub-domain, you can do the following:
1. Stop using the sub-domain for the reviews content from now on, and have the reviews content filled into the new reviews sub-folder instead.
2. Redirect the old reviews content on the sub-domain to the new reviews sub-folder via 301.
This will make sure that you don't lose the SEO equity the sub-domain has acquired to date, and almost all of it will be passed on to the new sub-folder.
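As a rough illustration, the 301 redirect in step 2 could be implemented with a rewrite rule on the sub-domain's web root. This is only a sketch, assuming an Apache server with mod_rewrite enabled and a hypothetical /reviews/ sub-folder on the main domain:

```apache
# Served from the document root of reviews.bannerbuzz.com
RewriteEngine On
# Permanently (301) redirect every URL on the sub-domain to the
# matching path under the new /reviews/ sub-folder on the main site
RewriteCond %{HTTP_HOST} ^reviews\.bannerbuzz\.com$ [NC]
RewriteRule ^(.*)$ http://www.bannerbuzz.com/reviews/$1 [R=301,L]
```

The `R=301` flag is what signals a permanent move to search engines, so the sub-domain's accumulated equity is consolidated onto the sub-folder URLs.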
Please feel free to post any queries you have in this regard.
Best regards,
Devanur Rafi.
-
Thanks for the information, Devanur Rafi.
You gave us really great information, but I have one question. I am currently using a third-party reviews service (powerreviews.com) for customer reviews, so is it possible to create a sub-folder and redirect the sub-domain to the sub-folder?
-
Hi there,
Here are my two cents in this regard. Instead of showing 10 or 15 reviews on the root domain, show no more than two, and for more reviews send visitors to the reviews sub-domain (using a "view more reviews" button, as you currently have). This will mitigate duplicate content issues to a great extent, if there are any at all. I do not recommend blocking the sub-domain from the search engines. However, you can move the content of the sub-domain to something like a reviews sub-folder, as explained below.
From an SEO standpoint, a sub-folder is a safe bet compared to a sub-domain. Here is what Rand Fishkin has to say in this regard (http://www.seomoz.org/q/subdomains-vs-subfolders):
“All the testing, research and examples I've seen in the past few years (and even the past few months) strongly suggest that the same principles still hold true.
Subdomains SOMETIMES inherit and pass link/trust/quality/ranking metrics between one another.
Subfolders ALWAYS inherit and pass link/trust/quality/ranking metrics across the same subdomain.
Thus, having a single subdomain (even just domainname.tld with no subdomain extension) with all of your content is absolutely ideal from an SEO perspective. It's also more usable and brandable, too IMO.”
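The sub-domain-to-sub-folder move described above boils down to a simple URL mapping, sketched below with hypothetical hostnames just to illustrate what each 301 target should look like:

```python
from urllib.parse import urlsplit, urlunsplit

def subdomain_to_subfolder(url: str) -> str:
    """Map a reviews sub-domain URL to its sub-folder equivalent,
    e.g. http://reviews.example.com/vinyl-banners
      -> http://www.example.com/reviews/vinyl-banners
    """
    parts = urlsplit(url)
    # Prefix the original path with the new sub-folder
    path = "/reviews" + (parts.path if parts.path.startswith("/") else "/" + parts.path)
    # Swap the host to the main domain, keeping query and fragment intact
    return urlunsplit((parts.scheme, "www.example.com", path, parts.query, parts.fragment))

print(subdomain_to_subfolder("http://reviews.example.com/vinyl-banners?page=2"))
# → http://www.example.com/reviews/vinyl-banners?page=2
```

Keeping the path structure identical on the new sub-folder makes the redirect rule a one-liner and avoids a lookup table of old-to-new URLs.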
Here is an interesting discussion about the same on Moz.com:
http://www.seomoz.org/q/multiple-subdomains-my-worst-seo-mistake-now-what-should-i-do
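For completeness, blocking the sub-domain outright (the option the question asked about, and which is not recommended here) would amount to serving a robots.txt like the following from the sub-domain's root. Note that this only stops crawling; it does not consolidate any of the sub-domain's equity the way a 301 does:

```
# robots.txt at http://reviews.bannerbuzz.com/robots.txt
User-agent: *
Disallow: /
```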
Hope these help.
Best regards,
Devanur Rafi.