Please help me with your advice
-
Hi all,
A couple of years ago I started building my business on an EMD (exact-match domain). The intention was to create a resource with rich, unique content.
After a year of hard work, the site reached the top 10 in Google and started to generate a good amount of leads.
Then Google announced the EMD update and the site lost 90% of its traffic (our rankings had stayed steady through the Panda updates).
“a new filter that tries to ensure that low-quality sites don’t rise high in Google’s search results simply because they have search terms in their domain names.”
But I don’t consider my site a low-quality site; every page and every post is 100% unique and was created only to share knowledge with others…
The site has EXCELLENT content from an industry point of view.
Since the EMD update I have read hundreds and hundreds of articles and opinions about it, and now I am confused and lost.
What should I do…
• Kill the site and start a new one
• Get more links (but what type of links, and how should I get them?)
• Keep hoping and praying...
• Or do something else
Please help me with your advice.
-
Thank you... I would appreciate it if you could give me 10 minutes of your time... I sent the site to you via PM.
-
I have a post on the subject here - it's very long, because it's a complex subject:
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
We're not saying this is definitely the problem - just that you should be aware of how complicated it can get. Unfortunately, it's hard to tell without really looking at the site. A lot can happen to hurt a site's rankings, and the EMD update was just one piece of the puzzle.
-
But Panda... as far as I know, it looks for duplicate content... and we are very careful about that for all our sites... The content is 100% unique, written by humans, and has never been spun...
So I am even more confused...
This is the article by Matt Cutts; I remember reading it.
It is dated October 3, 2012...
After that day our site started to lose traffic... not overnight, but slowly, slowly... then
by mid-November 2012 it had lost about 65%. This is why I concluded from the beginning that it was EMD.
-
Ah! I was thinking it was the end of September because you said you were a casualty of EMD. If this happened in mid-November, then it's definitely not EMD.
There were Panda updates on November 5 and November 21. If the drop doesn't coincide with those dates, then it is not due to a major algorithm change. (By "major" I mean Panda/Penguin, as Google is constantly tweaking the algorithm.)
-
All of the top-10 positions dropped in mid-November 2012.
I said 90%... I think that statement was exaggerated... but it was 65% for sure.
-
The thing is that if you lost 90% of your traffic at the end of September (i.e. Sept 27/28), then the issue is very likely either EMD or Panda. If you have a good site with 300 well-written, unique pages, then in my mind EMD is almost impossible. So, I would go investigating Panda issues. Duplicate and thin content are the top culprits, but there can be other factors.
There are other possibilities, though, including a change in URLs, DNS problems, hosting problems, malware issues, robots.txt problems, accidental noindexing, a competitor ramping up their SEO, etc.
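If you want to quickly rule out the robots.txt and accidental-noindex possibilities, view the source of a few key pages and fetch your robots.txt file (example.com below is just a placeholder domain). An accidental noindex usually looks like a stray meta tag left in the page template:

<meta name="robots" content="noindex" />

and an accidental crawl block usually looks like this in http://example.com/robots.txt:

User-agent: *
Disallow: /

Either one can wipe out rankings as thoroughly as any algorithm update, and both are easy to introduce by mistake during a redesign or migration.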
If the traffic drop was a little later, like October 5, then Penguin is a possibility. Penguin is related to over-optimized anchor text in your backlinks, among other things.
Sometimes, when a site is affected on a Panda date but doesn't seem to have Panda issues, it is possible that sites linking to your site were affected by Panda, and as such you have lost some of your link juice. But it is unlikely that 90% of your traffic would disappear because of this.
-
The site has only unique URLs; by pages I meant URLs too.
-
I would appreciate it if you could explain that in a bit more detail...
-
Keep in mind, too, that a lot of duplicate content is accidental. Google doesn't care about pages, per se - they care about unique URLs. So, if you have 300 unique pages, but something about your CMS translates that into 5,000 crawlable URLs, then you could definitely have problems.
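For example (these URLs are hypothetical, just to illustrate the pattern), a CMS will often serve the same page at several addresses:

http://example.com/widgets
http://example.com/widgets?sort=price
http://example.com/widgets?sessionid=12345
http://example.com/index.php?page=widgets

To Google, those can look like four separate (duplicate) pages. One common fix, assuming you can edit your page templates, is a canonical tag in the head of every variant that points at the preferred URL:

<link rel="canonical" href="http://example.com/widgets" />

That tells Google to treat the parameter variations as copies of one page rather than as thousands of distinct URLs.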
-
I really appreciate your knowledge, but the site has nothing to do with Panda, as it has 300 pages of unique and well-written content.
-
The number of unique pages doesn't really matter when it comes to Panda. You could have 300 unique pages, but if there are also 50 pages of copied content, then this can trigger Panda.
But the other question I had was about thin content. An example of thin content would be a page that has, say, a product photo, a bunch of template text that is the same from page to page, a few ads, and then only one or two lines of text.
Another example of a thin page would be a section of definitions where each definition has its own page; those could possibly be considered thin.
-
If this is really related to the EMD update, then I'd agree with Charles - starting over is a bad idea. My best guess is that the EMD update wasn't a penalty, per se - it was more like Google lowered the volume on EMDs. In other words, having one isn't bad now - it's just that it's not as good as it used to be. There's no way to fix that really (you can't turn the volume back up), but the risks of switching domains would probably far outweigh the benefits.
To back up Marie, though, a lot happened right around the EMD update, and it's really tough to diagnose. I'd definitely look at Panda factors, like "thin" content. Try to look at the site from Google's POV - what you view as unique doesn't matter, frankly. You could be spinning out URL-based duplicates, for example, and not realize it - that's more of a technical SEO issue (you're not doing anything devious, but the site may still be giving Google problems).
The other issue to consider is whether your EMD has caused you to really pile on exact-match anchor text, especially keyword-loaded anchor text. This could trigger Penguin or similar problems. This is often correlated with EMDs, even though it wouldn't necessarily be a result of the EMD update.
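For instance (a made-up illustration, not taken from your site), a natural backlink profile mixes branded and generic anchors:

<a href="http://bluewidgets.example.com">Blue Widgets Co.</a>
<a href="http://bluewidgets.example.com">this article</a>

while an over-optimized profile repeats the money keyword in almost every link:

<a href="http://bluewidgets.example.com">cheap blue widgets</a>
<a href="http://bluewidgets.example.com">cheap blue widgets</a>

Because an EMD is itself the keyword, even people linking with the bare domain name end up creating exact-match anchors without meaning to.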
If they've really just turned down the "volume", then you have to get other ranking factors in play - build more relevant, authoritative links, increase your social signals, etc. In other words, focus on aspects of SEO beyond simple on-page ranking factors.
-
Hi Marie, thanks for your answer. The site has over 300 unique pages...
-
When EMD hit on September 28, I asked people to send me domains that had been affected so that I could see if there were any patterns. I had over 100 domains sent to me, and the vast majority of them actually had Panda issues. Then, a few days after EMD, Google announced that they had also done a Panda refresh on September 27.
Of all the domains that I analyzed, I would say only one was likely a true EMD candidate. It was a one-page site with very little content and several affiliate links. It had previously ranked well in a competitive niche, and the only reason it ranked well was its domain name. EMD was designed to take the ranking benefit away from sites that ONLY ranked because they had keywords in their domain name. It doesn't punish a site simply because there are keywords in the domain name.
You've mentioned that your pages are 100% unique, but do you have thin pages? If you have a section of your site with pages that have very little content, then this can cause Panda to affect you. But there are other possible reasons as well.
-
Thanks, Charles, for your time...