Can I disallow my subdomain for Penguin recovery?
-
Hi,
I have a site like BannerBuzz.com, before last penguin my site's all keywords were in good position in google, but after penguin hit on my website, my all keywords are going down and down day by day, i have done some changes in my website for improvement, but in 1 change i have some confusion.
i have one sub domain (http://reviews.bannerbuzz.com/), which display my websites all keywords user reviews, in which every category's 15 reviews are display in my website http://www.bannerbuzz.com so are those user reviews consider as duplicate content between sub domain and main website.
can i disallow sub domain from all search engine? currently sub domain is open for all search engine, is that helpful to block it?
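For reference, blocking an entire sub-domain from all crawlers comes down to a two-line robots.txt served at the sub-domain root. This is only a sketch of what that would look like; the answers in this thread advise against actually doing it:

```text
# Sketch: robots.txt served at http://reviews.bannerbuzz.com/robots.txt
# This would ask ALL compliant crawlers to stop crawling the entire sub-domain.
User-agent: *
Disallow: /
```

Note that Disallow only stops crawling: URLs that are already indexed can remain in the index, so a noindex directive or a removal request is needed to actually get them out.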
Thanks
-
Hello Rafi,
I am going to make the necessary changes. I have also started building backlinks to the home page for the "Vinyl Banners" keyword from various sources. That may help me recover my old rankings!
-
No problem my friend. You are most welcome.
So, if you are using a 3rd-party service to fill in the reviews content on the sub-domain, you can do the following:
1. Stop using the sub-domain for the reviews content henceforth, and have the reviews content filled into the new reviews sub-folder instead.
2. 301-redirect the old reviews content on the sub-domain to the new reviews sub-folder.
This will make sure that you don't lose the SEO equity that the sub-domain has acquired to date, and almost all of it will be passed on to the new sub-folder.
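A minimal sketch of step 2, assuming the sub-domain is served by Apache with mod_rewrite enabled (the exact rules depend on your server setup):

```apache
# Sketch: .htaccess on reviews.bannerbuzz.com
# 301-redirect every URL on the reviews sub-domain to the matching
# path under a /reviews/ sub-folder on the main domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^reviews\.bannerbuzz\.com$ [NC]
RewriteRule ^(.*)$ http://www.bannerbuzz.com/reviews/$1 [R=301,L]
```

A request for http://reviews.bannerbuzz.com/vinyl-banners would then permanently redirect to http://www.bannerbuzz.com/reviews/vinyl-banners, passing along most of the link equity the old URL had earned.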
Please feel free to post any queries you have in this regard.
Best regards,
Devanur Rafi.
-
Thanks for the information, Devanur Rafi.
You gave us really great information, but I have one question: I am currently using a 3rd-party review service (powerreviews.com) for customer reviews, so is it possible to create a sub-folder and redirect the sub-domain to it?
-
Hi there,
Here are my two cents in this regard. Instead of showing 10 or 15 reviews on the root domain, show no more than 2, and send visitors to the reviews sub-domain for more (using a "view more reviews" button, as you currently have). This will mitigate duplicate content issues, if there are any, to a great extent. I do not recommend blocking the sub-domain from the search engines. However, you can move the content of the sub-domain to something like a reviews sub-folder.
From an SEO standpoint, a sub-folder is a safer bet than a sub-domain. Here is what Rand Fishkin has to say in this regard (http://www.seomoz.org/q/subdomains-vs-subfolders):
“All the testing, research and examples I've seen in the past few years (and even the past few months) strongly suggest that the same principles still hold true.
Subdomains SOMETIMES inherit and pass link/trust/quality/ranking metrics between one another.
Subfolders ALWAYS inherit and pass link/trust/quality/ranking metrics across the same subdomain.
Thus, having a single subdomain (even just domainname.tld with no subdomain extension) with all of your content is absolutely ideal from an SEO perspective. It's also more usable and brandable, too IMO.”
Here is an interesting discussion on the same topic on Moz.com:
http://www.seomoz.org/q/multiple-subdomains-my-worst-seo-mistake-now-what-should-i-do
Hope these help.
Best regards,
Devanur Rafi.