Specific Page Penalty?
-
I'm having trouble figuring out why one of our pages is not ranking in the SERPs; the on-page optimisation looks decent to me.
I checked by using the gInfinity extension and searching for the page URL.
Can one page be penalised in Google's engines (.ie / .com) while the rest of the website is not?
The (possibly) penalised page is showing in Google Places in the SERPs; I assume it would not show there if it were penalised.
Would appreciate any advice.
Thanks
-
You may get no traffic from ranking #4 these days, especially on queries with a competitive paid portion of the SERP.
What I would do is stop assessing the "what if" scenarios and start focusing all your energy on acquiring the editorial-type links Grasshopper was talking about, right now. You'll get that ranking and secure it for long-term traffic.
-
There's no way to give an accurate time-scale answer to that question. If you're able to get editorial links from authoritative, trusted sites, you can see substantial movement within a week or two of the links being crawled. However, if your links are from lower-quality sites, or are weighted heavily toward devalued methods of link building (directories, reciprocal links, three-way links, etc.), engines may not give those links much weight, if any, no matter how long you wait.
-
It has ranked well previously, according to Rank Tracker on SEOmoz: it was ranking 4th last week, though I don't think that is correct.
Is that a reliable tool?
Organic traffic for the page's keywords shows no drop in 2012, nor do the page's page views. If it were over-optimised, that would be noticeable in Google Analytics.
-
To me, the true clue is whether or not the URL ranked well previously.
If it has not, you need more links; it is probably a page-authority issue.
If it has, you may have over-optimized your anchor text (sitewide links will do this). You may rank well for a while, then find yourself on page five shaking your fist at Google.
-
Hi Grasshopper,
I know the keywords I am trying to rank for are competitive.
I will take that into consideration and start working on these. How long does this take to show an effect in Google's engines?
Thanks
-
Hi Ronan,
Since it passes tests 1, 2, and 4, I would say that #3 is the culprit. Having solid on-page optimization is great, but link authority is the name of the game for achieving rankings, especially if the keywords you're targeting are competitive.
Run the Keyword Difficulty Tool against the keywords you're trying to rank for. I would expect that the URLs on page 1 of the SERPs all have significantly stronger, more trustworthy link profiles than your URL does.
If that's the case, all the standard advice applies: create a truly differentiated page that offers content, resources, or tools above and beyond what your competitors offer, and market the hell out of it.
-
The page is indexed
-
Hi Grasshopper,
Thanks for your input. I have checked each one, and everything appears to be fine:
-
Yes
-
The text is the true content on the page.
-
The page itself does have few inbound links. Could it be this?
-
Appearing first
Despite the low number of inbound links, I wouldn't say this alone would cause the ranking issue, as the page is well optimised and similar to competitors'.
-
-
Hi Ronan,
First, to your general question: yes, it is possible for one page of a domain to be penalized or filtered while the rest of the domain is not. However, it seems extremely unlikely that the URL in question would rank in Google Places if it were penalized. There are a few things you want to check:
-
The first thing you want to check is whether or not the page is indexed and cached, which is a simple query [cache:mypage.com/this-page]. Does it return a result?
-
If so, in the gray banner across the top of the cached page, click on "Text-only version". Does the machine-readable text match the true content on the page? If you have large amounts of machine-readable text that are only visible to an engine, and not a user, that can trip an algorithmic spam filter. Also, look for off-topic words - sometimes sites get hacked and hackers inject all kinds of spammy garbage and links, which can also trip the filter.
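As a rough way to compare the machine-readable text with what a user sees, here's a minimal Python sketch. It only handles `script`/`style`/`noscript` blocks and inline `display:none` (a common cloaking pattern); a real engine also evaluates external CSS and JavaScript, so treat this as a first pass, not a definitive check:

```python
from html.parser import HTMLParser

class VisibleTextParser(HTMLParser):
    """Approximates the text a user actually sees: skips <script>, <style>,
    <noscript>, and subtrees hidden with an inline display:none style."""
    SKIP_TAGS = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._hidden_tag = None   # tag that opened the hidden subtree
        self._hidden_depth = 0    # nesting depth of that tag while hidden

    def handle_starttag(self, tag, attrs):
        if self._hidden_depth:
            if tag == self._hidden_tag:
                self._hidden_depth += 1
            return
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        if tag in self.SKIP_TAGS or "display:none" in style:
            self._hidden_tag, self._hidden_depth = tag, 1

    def handle_endtag(self, tag):
        if self._hidden_depth and tag == self._hidden_tag:
            self._hidden_depth -= 1

    def handle_data(self, data):
        if not self._hidden_depth and data.strip():
            self.parts.append(data.strip())

def visible_text(html):
    parser = VisibleTextParser()
    parser.feed(html)
    return " ".join(parser.parts)

page = ('<body><h1>Red shoes</h1>'
        '<div style="display: none">cheap shoes buy shoes best shoes</div>'
        '<p>Our catalogue.</p></body>')
print(visible_text(page))  # Red shoes Our catalogue.
```

If this output differs substantially from the text-only cached version, you've found text that only one side (the user or the engine) can see.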
-
If the page is cached, and rendering the intended content, does it have sufficient link authority to rank for the terms you intend? It's quite possible that your page is in a competitive keyword space, and doesn't have enough juice to push past the competition.
-
If you want to see whether it has enough juice to rank for anything at all, pick a sentence from the first paragraph of text and search for it enclosed in quotes: ["Some random sentence from my first paragraph here."] Is your URL the #1 result? It should be. If other sites that you've syndicated your content to, or that have scraped your content, are more authoritative than your site, it's possible that your URL isn't ranking because it's being (incorrectly) filtered out as duplicate content.
Hope that helps.
-
-
Is the URL no longer in Google's index at all?
Related Questions
-
Sizable decrease in number of pages indexed, but no drop in clicks, impressions, or rankings
Hi everyone, I've run into a worrying phenomenon in GSC and I'm wondering if anyone has come across something similar. Since August, I have seen a steady decline in the number of pages indexed from my site, from 1.3 million down to about 800,000 in two months. Interestingly, my clicks and impressions continue to increase gradually (at the same pace they have for months), and I see no other negative side effects resulting from this drop in coverage. In total I have 1.2 million URLs that fall into one of three categories: "Crawled - currently not indexed", "Crawl anomaly", and "Discovered - currently not indexed". Some other notes: all of my valid, error, and excluded pages are https://www. , so I don't believe there is an issue with different versions of the same site being submitted. Also, my rankings have not changed, so I tentatively believe this is unrelated to the Medic update. If anyone else has experienced this or has any insight into the problem, I would love to know. Thanks!
Algorithm Updates | Jason-Reid
-
Is it still a rule that Google will only index pages up to three tiers deep? Or has this changed?
I haven't looked into this in a while; it used to be that you didn't want to bury pages more than three clicks from the main page. What is the rule now for getting deep pages indexed?
Algorithm Updates | seoessentials
-
Any benefit to splitting up links from one company to diff pages?
We are the presenting sponsor for this big event in our area (Chasco Fiesta). As part of being their sponsor, their website has linked to us in five different places on their site. But it's all to our homepage. Would there be any benefit to having them link to other pages on our site instead of just our homepage (assuming the other pages are a reasonable expectation for the user, of course)? Thanks, Ruben
Algorithm Updates | KempRugeLawGroup
-
Pages fluctuating +/- 70 positions in Google SERPs?
I've got some pages that appear somewhere around #25 in Google. Every now and then, a page just disappears from the top 100 results for a few days (even up to a week) and then comes back. I've got other pages that rank around #8, fall to about #75 for a while, and then come back. But while a page may be gone from the top 100 results in the US, it still ranks in about the same place everywhere else in the world (+/- 10 positions). I've seen this happen in the past, but never this often. What gives?
Algorithm Updates | sbrault74
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi,
A client site has a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done, but after months and months of waiting, nothing has happened. If anything, results were further affected after the recent Penguin update.
A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, with little success. I have read up on this, and not many people appear to agree on whether it will work.
My new plan is therefore to start afresh on a new domain, switching from the .com to the .co.uk version, to remove all legacy and all association with the spam-ridden .com. My main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link-related and NOT content-related, as people may first assume. This could then cause a duplicate-content issue, with Google knowing that this content pre-existed on another domain.
I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Once it has been deindexed, the new .co.uk site will go live with exactly the same content.
So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate-content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful!
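As a side note on the noindex part of that plan, a page-level directive looks like the sketch below (hypothetical markup). One caveat worth knowing: if robots.txt blocks a URL, the crawler never fetches the page and so never sees the noindex tag, so the two directives work against each other on the same URLs.

```html
<!-- In the <head> of each page on the old .com that should drop out of the index -->
<meta name="robots" content="noindex, nofollow">
```

For non-HTML resources, the same directive can be sent as an `X-Robots-Tag: noindex, nofollow` HTTP response header instead.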
Thank you, Denver
Algorithm Updates | ProdoDigital
-
Quickest way to deindex a large number of pages
Our site was recently hacked by spammers posting fake content and bringing down our servers, etc. After a few months, we finally figured out what was going on and fixed the issue. However, it turns out that Google has indexed 26K+ spammy pages and we've lost page rank and search engine rankings as a result. What is the best and fastest way to get these pages out of Google's index?
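One approach worth sketching here, assuming the injected spam URLs share a recognizable path (the /spam-posts/ prefix below is hypothetical): serve 410 Gone for the whole pattern, which signals permanent removal more strongly than 404, and pair it with a removal request in Search Console. For example, in nginx:

```nginx
# Hypothetical rule: assumes all hacked pages live under /spam-posts/.
# 410 Gone tells crawlers the pages are permanently removed.
location ~ ^/spam-posts/ {
    return 410;
}
```

If the spammy URLs don't share a pattern, an `X-Robots-Tag: noindex` response header on each one achieves a similar result once the pages are recrawled.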
Algorithm Updates | | powpowteam0 -
Penalty for Mixing Microdata with Metadata
The folks who built our website have insisted on including both microdata and metadata on our pages. What we end up with is something that looks like this in the header: <meta itemprop="description" content="Come buy your shoes from us, we've got great shoes."> It seems to me that this could be a bad thing, but I can't find any info leaning one way or the other. Can anyone provide insight on this?
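For contrast, here is a sketch of how schema.org microdata is more commonly scoped to visible body content, kept separate from the head metadata (the Product markup below is hypothetical, for illustration only):

```html
<!-- Standard metadata stays in the head -->
<meta name="description" content="Come buy your shoes from us, we've got great shoes.">

<!-- Microdata scoped to visible content in the body -->
<div itemscope itemtype="https://schema.org/Product">
  <h1 itemprop="name">Great Shoes</h1>
  <p itemprop="description">Come buy your shoes from us, we've got great shoes.</p>
</div>
```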
Algorithm Updates | | markcely0 -
Google has indexed a lot of test pages/junk from the development days.
With hindsight, I understand that this could have been avoided if robots.txt had been configured properly. My website is www.clearvisas.com, and it is indexed both with the www subdomain and without. When I run site:clearvisas.com in Google I get 1,330 results - all junk from the development days. But when I run site:www.clearvisas.com in Google I get 66 - these results are all post-development and more in line with what I wanted indexed. Will 1,330 junk pages hurt my SEO? Is it possible to de-index them, and should I? If the answer to any of these questions is yes, how should I proceed? Kind regards, Fuad
Algorithm Updates | Fuad_YK