What has this subdomain done to recover from Panda?
-
I found that doctor.webmd.com was affected by Google Panda, and then recovered (if you look at traffic on compete.com). What do you think they did to recover?
-
My "opinion" is that the website was slightly affected by the Panda update, and than a week or two later it started to gain back it's rankings or traffic. This has been the behavior of many large websites after the first Panda update and it's continous updates, you would see a drop in traffic than couple weeks later it will level back off to it's usual traffic numbers before the Panda update that caused a slight drop.
I most likley might be wrong here, but I am just making an assumption from observations I have seen on other websites. Also from my understanding, subdomains are not as powerful as they use to be and are not really favored as much from the new "Panda" update. I believe the preferred site architecture now is using sub-folders.
I apologize if any of my statements above are false, and if they are please correct me.
Related Questions
-
Are TLD and numbers in subdomain ranking factors?
Several years ago my firm migrated our domain from the very lengthy 3point7designs.com to 3.7designs.co (we couldn't get 7designs.com at the time), thinking this would be a clever way to brand the name 3.7 Designs. Ever since that change we've had a dramatic reduction in search rankings, which has lasted years. https://monosnap.com/file/adJUdkX9YCXQaODcXype4qza70pMCE You can see the drop in early 2011; we made the switch in February. I've read some discussion about Google changing weights when there are numbers in the subdomain because it appears spammy. I've also heard speculation about .co vs .com. Further evidence: we are being outranked by a competitor for a term we previously dominated, despite having higher domain authority, more inbound links, and the exact-match keyword in our title and content. We now own 37designs.com and 7designs.com and are contemplating a switch. Any insight into whether these are ranking factors, or is the site being penalized for other reasons?
Intermediate & Advanced SEO | | 3PointRoss0 -
What is better for web ranking? A domain or subdomain?
I realise that it is often better to put content in a subfolder rather than a subdomain, but I have another question that I cannot seem to find the answer to. Is there any ranking benefit to having a site on its own .co.uk or .com domain rather than on a subdomain? I'm guessing that the subdomain might benefit from other content on the domain it's hosted on, but are subdomains weighted down in any way in the search results?
Intermediate & Advanced SEO | | RG_SEO0 -
.ac.uk subdomain vs .co.uk domain
I'd be grateful if I could check my thinking... I've agreed to give some quick advice to a non-profit organisation who are in the process of moving their website from an .ac.uk subdomain to a .co.uk domain. They believe that their SEO can be improved considerably by making this migration. From my experience, I don't see how this could be the case. Does the unique domain in itself offer enough ranking benefit to justify this approach? The subdomain sits on a very high-authority domain with many pre-existing links, which makes me even more nervous about the move. Does anyone have any opinions on this that they could share, please? I'm guessing that it is possible to migrate safely and that there might be branding advantages, but that from an actual SEO point of view there is not much benefit? It looks like most of their current traffic is branded traffic.
Intermediate & Advanced SEO | | RG_SEO0 -
Product Pages & Panda 4.0
Greetings, Moz Community: I operate a real estate web site in New York City (www.nyc-officespace-leader.com). Of the 600 pages, about 350 of the URLs are product pages written about specific listings. The content on these pages is quite short, sometimes only 20 words. My rankings have dropped sharply since mid-May, around the time of the new Panda update, and I suspect it has something to do with those 350 very short listing pages. What is the best way to deal with these pages so as to recover ranking? I am considering these options:
1. Setting them to "noindex". But I am concerned that removing product pages sends the wrong message to Google.
2. Enhancing the content and making certain that each page has at least 150-200 words. Rewriting 350 listings would be a real project, but if it is necessary to recover I will bite the bullet.
What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content. Information about a potential office rental that lists location, size, and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty. Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have any suggestions as to how I can recover from this latest Panda penalty? Thanks, Alan
Intermediate & Advanced SEO | | Kingalan10 -
Recovering from robots.txt error
Hello, a client of mine is going through a bit of a crisis. A developer (at their end) added Disallow: / to the robots.txt file. Luckily the SEOmoz crawl ran a couple of days after this happened and alerted me to the error. The robots.txt file was quickly updated, but the client has found that the vast majority of their rankings have gone. It took a further 5 days for GWMT to register that the robots.txt file had been updated, and since then we have "Fetched as Google" and "Submitted URL and linked pages" in GWMT. GWMT is still showing that the vast majority of pages are blocked in the "Blocked URLs" section, although the robots.txt file shown below it is now fine. I guess what I want to ask is: What else can we do to recover these rankings quickly? What timescales can we expect for recovery? More importantly, has anyone had any experience with this sort of situation, and is full recovery normal? Thanks in advance!
Intermediate & Advanced SEO | | RikkiD220 -
Ever had a case where publishing products & descriptions on eBay or Amazon caused a Panda penalty?
One of our shops got a Panda penalty back in September. We sell all our items with the same product names and the same product descriptions on amazon.com, amazon.co.uk, ebay.com, and ebay.co.uk. Did you ever have a case where such multichannel sales caused a Panda penalty?
Intermediate & Advanced SEO | | lcourse0 -
Penguin & Panda: Geographic Penalities?
Has anyone ever come across information about a website appearing strongly in SERP's in one region, but poorly in another? (ie: great in Europe, not so great in N. America) If so, perhaps it is a Panda or Penguin issue?
Intermediate & Advanced SEO | | Prospector-Plastics0 -
How can this site rank post-Panda/Penguin?
I am doing link building for an adult dating comparison website. However, having checked the backlink profile of one of the main competitors, their anchor text is not varied at all; many, many links are exactly the same. How can they possibly rank in the post-Panda/Penguin era? In fact, they're at number 2! The site is an adult site and it is www.f hyphen buddy.co.uk if anyone wants to run a backlink check on OSE. Any help greatly appreciated!
Intermediate & Advanced SEO | | SamCUK0