Google said that low-quality pages on your site may affect rankings on other parts of your site
-
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic and the future does not look bright considering that Google plans a worldwide roll-out.
Problem is, my site is a six-year-old, heavily linked, popular WordPress blog. I do not know why Google considers it low quality.
The only explanation I have come up with is the statement that low-quality pages on a site may affect other pages (I think it was in the Wired article).
If that is so, would you recommend blocking and de-indexing WordPress tag, archive, and category pages from the Google index? Or would you suggest waiting a bit longer before doing something that drastic?
Or do you have another idea of what I could do?
I invite you to take a look at the site www.ghacks.net
-
Depending on the nature and quality of your content, you may also want to consider removing article pages that might be hurting your domain.
-
Are you syndicating the content for link building? Or are scrapers just pulling in your content?
On the note of de-indexing tags and archives: in most cases that is best practice anyway. However, it seems you may have been hit by the content update, and as Shailendra suggests, you may want to ask the webmaster support people to look into your site.
Without a full analysis it is difficult to say what else is affecting your site.
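For reference, "blocking" and "de-indexing" are different mechanisms: a robots.txt Disallow only stops crawling and can leave already-known URLs in the index, while a robots meta tag actually removes pages from it. A minimal sketch of the tag that tag, archive, and category pages would carry (most WordPress SEO plugins can emit this for you; the exact mechanism varies by plugin):

```html
<!-- Output in the <head> of tag, category, and date-archive pages only -->
<meta name="robots" content="noindex, follow">
```

The `follow` directive lets link equity keep flowing through those pages even after they drop out of the index.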
-
I believe too many ads and too much linking (internal as well as external) may be the probable cause. I suggest submitting your site at http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en and waiting for some time before taking any action.
Related Questions
-
About porn sites and ranking
Hello, I'm thinking of extending my website into porn. At the moment there is no pornography on it, although we do talk about sex-related topics and products (from dating, to tutorials, to toys, etc.). Would it be dangerous to keep the porn section on the same domain as the rest? Would this negatively affect my non-porn content, with Googlebot "flagging" my website as pornographic (although only a few pages would be)? Or would Googlebot simply leave the current non-porn pages ranking just fine, as they are now, and additionally rank the porn pages if they "deserve" to? I hope my question is clear. I don't want to create a subdomain.
Algorithm Updates | fabx0
Internal pages ranking over the homepage: How to optimise to rank better at Google?
Hi, For more than a year we have experienced a shift in the SERPs: internal pages ranking over the website homepage. Previously, website homepages used to rank for their primary keyword, like moz.com for "SEO". Now we can see that internal pages like moz.com/learn/seo/what-is-seo have been ranking for the primary keyword "SEO". Google is picking these "what is ABC" pages over the homepage. All our competitors' sites are ranking with these internal "what is (primary keyword)" pages. We do have the same kind of internal "what is..." pages, but they are not ranking; only our homepage is. Moreover, we dropped more than 15 positions after this shift in the SERPs. How do I diagnose this? Thanks
Algorithm Updates | vtmoz0
Blog-post pages are dominating Search Console's "Internal Links" report. Only the home page is above them!
Hi all, Ours is a WordPress website and we have a blog at website.com/blog/. All the important pages on the website are well linked from the top and footer menus. But in our Webmasters internal-links section, only the homepage is at the top; all the others, after the homepage, are blog posts. I wonder why blog pages are dominating our website pages. Please give your suggestions on this. Do you think Google will give more priority to the blog posts than to the website pages, as they are technically more linked? Thanks
Algorithm Updates | vtmoz1
Specific Page Penalty?
Having trouble figuring out why one of our pages is not ranking in the SERPs; on-page optimisation looks decent to me. I checked by using the gInfinity extension and searching for the page URL. Can one page be penalised by the Google engines (.ie / .com) while the rest of the website is not? The (possibly) penalised page is showing in Google Places in the SERPs; I assume it would not show there if it were penalised. Would appreciate any advice. Thanks
Algorithm Updates | notnem0
Google and Wikipedia
Ok, I love Wikipedia as much as the next guy, but the amount of weight that Google puts on this site is getting crazy. The search terms I am going after are "speakers" and "loudspeakers". Can somebody tell me why Wikipedia needs the top 8-10 spots for those terms? Is that really a good search result for users of Google? More of a rant than a question, I know. I just needed to get that off my chest!
Algorithm Updates | kevin48030
Ranking well for main key terms but site traffic has dropped sharply?
Hello All, Just a quick question. Since the Penguin update our site www.caravanguard.co.uk has seen some pretty big fluctuations in Google: many of our key terms dropped overnight, but over the last few weeks they have slowly started to move back up the rankings. The bizarre thing is that despite the recovery in rankings, our unique traffic has taken a fairly large whack. Seasonality? Weather? (It's been nice in the UK for a change.) I can only assume the longer-tail terms are taking more time to recover. I have tried to look into our backlink profile and have noticed a little too much concise keyword targeting. How do you go about changing these terms and removing the really bad links (I'm struggling to identify the worst cases) on totally irrelevant sites or poor directories? These were put in place before I started here 🐵 Any help truly appreciated. Regards Tim
Algorithm Updates | TimHolmes0
When Google crawls and indexes a new page, does it show up immediately in Google search via "site:"?
We made changes to a site, including the addition of a new page and corresponding link/text changes to existing pages. The changes are not yet showing up in the Google index ("site:"/cache), but approximately 24 hours after making the changes, the SERPs for this site jumped up. We obtained a new backlink a couple of weeks ago, but it is not yet showing up in OSE, Webmaster Tools, or other tools. Just wondering if you think the Google SERP changes run ahead of what they actually show us in site: or cache updates. Has Google made a significant SERP "adjustment" recently? Thanks.
Algorithm Updates | richpalpine0
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum, so I hope that is not completely against the rules here or anything; I'm just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question: "Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations I would love to hear them. The site is very large and utilizes faceted navigation to help visitors sift through results. For many months now I have implemented rel=canonical so that each URL created by the faceted-navigation filters points back to its main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern is wasting crawler time on all these pages when I am already doing what they ask and telling them to ignore the variants and find the content on page X. So at this point I am thinking about using the robots.txt file to handle these, but I wanted to see what others around here thought before I dive into this arduous task. Plus I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | PeteGregory0
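The canonicalization idea in the faceted-navigation question above can be sketched in code: strip the filter parameters from a faceted URL so every filtered variant points its rel=canonical at the base category page. A minimal illustration, assuming a hypothetical list of facet parameters (the real set depends on the site's navigation):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical facet parameters; the real list depends on the site's filters.
FACET_PARAMS = {"color", "size", "brand", "sort", "page"}

def canonical_url(url: str) -> str:
    """Drop faceted-navigation parameters so every filtered
    variant canonicalizes to the base category URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://example.com/shoes?color=red&size=9"))
# -> https://example.com/shoes
```

The returned URL is what would go in the page's `<link rel="canonical" href="...">` tag; parameters not in the facet list (e.g. a real category identifier) are kept.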