Specific Page Penalty?
-
Having trouble figuring out why one of our pages is not ranking in the SERPs; the on-page optimisation looks decent to me.
I checked by using the gInfinity extension and searching for the page URL.
Can one page be penalised by Google (.ie / .com) while the rest of the website is not?
The (possibly) penalised page is showing in Google Places in the SERPs. I assume it would not show if it were penalised.
Would appreciate any advice.
Thanks
-
You may get no traffic from ranking #4 these days, especially on queries with a competitive paid portion of the SERP.
What I would do is stop assessing the "what if" scenarios and start focusing all your energy on acquiring those editorial-type links grasshopper was talking about, right now! You'll get that ranking and secure it for long-term traffic.
-
There's no way to give an accurate time-scale answer to that question. If you're able to get editorial links from authoritative, trusted sites, you can see substantial movement within a week or two of the links being crawled. However, if your links are from lower-quality sites, or are weighted heavily toward devalued methods of link building (directories, reciprocal links, three-way links, etc.), engines may not give those links much weight, if any, no matter how long you wait.
-
It has ranked well previously, according to Rank Tracker on SEOmoz: it was ranking 4th last week, though I don't think that is correct.
Is it a reliable tool?
Organic traffic for the page's keywords shows no drop in 2012, nor do page views for the page. If it were over-optimised, these drops would be noticeable in Google Analytics.
-
To me, the true clue would be whether or not the URL ranked well previously.
If it has not, you need more links. It is probably a page authority issue.
If it has, you may have over-optimized the anchor text; sitewide links will do this. You may rank well for a while, then you'll find yourself on page five shaking your fist at Google.
-
Hi Grasshopper,
I know the keywords I am trying to rank for are competitive.
I will take that into consideration and start working on these. How long does this take to take effect in Google?
Thanks
-
Hi Ronan,
Since it passes tests 1, 2, and 4, I would say that #3 is the culprit. Having solid on-page optimization is great, but link authority is the name of the game for achieving rankings, especially if the keywords you're trying to rank for are competitive.
Run the Keyword Difficulty Tool against the keywords you're trying to rank for. I would expect that the URLs on page 1 of the SERPs all have significantly stronger, more trustworthy link profiles than your URL does.
If that's the case, all the standard advice applies - create a truly differentiated page that offers content / resources / tools above and beyond what your competitors offer, and market the hell out of it.
-
The page is indexed
-
Hi Grasshopper,
Thanks for your input. I have checked each one and each appears to be fine:
-
Yes
-
Text is true content on the page
-
The page itself does have few inbound links. Could it be this?
-
Appearing first
Despite the low number of inbound links, I wouldn't say this alone would cause the ranking issue, as the page is well optimised and similar to competitors'.
-
-
Hi Ronan,
First, to your general question - yes, it is possible for one page of a domain to be penalized / filtered while the rest of the domain is not. However, it seems extremely unlikely that the URL in question would rank in Google Places if it were penalized. There are a few things you want to check:
-
The first thing you want to check is whether or not the page is indexed and cached, which is a simple query [cache:mypage.com/this-page]. Does it return a result?
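If you want to script building that check, here's a minimal sketch. To be clear, the `google.com/search` URL format is an assumption on my part and could change, and fetching results programmatically is against Google's terms; this just constructs the query URL you'd paste into a browser:

```python
from urllib.parse import quote_plus

def cache_check_url(page_url: str) -> str:
    # Build the browser URL for a [cache:...] query.
    # URL-encode the operator plus page URL so ":" and "/" survive.
    return "https://www.google.com/search?q=" + quote_plus("cache:" + page_url)

print(cache_check_url("mypage.com/this-page"))
# https://www.google.com/search?q=cache%3Amypage.com%2Fthis-page
```

If the query returns no result, the page isn't cached, and you can stop here and investigate indexation first.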
-
If so, in the gray banner across the top of the cached page, click on "Text-only version". Does the machine-readable text match the true content on the page? If you have large amounts of machine-readable text that are only visible to an engine, and not a user, that can trip an algorithmic spam filter. Also, look for off-topic words - sometimes sites get hacked and hackers inject all kinds of spammy garbage and links, which can also trip the filter.
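A rough way to automate the "does the machine-readable text match what users see" check is to separate visible text from text inside inline `display:none` blocks. This is only a sketch: real hidden-text tricks also use CSS classes, off-screen positioning, and so on, which this won't catch, and the sample HTML is made up for illustration:

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    # Crude stand-in for comparing the text-only cache against
    # what a user sees: buckets text by whether any ancestor
    # carries an inline display:none style.
    def __init__(self):
        super().__init__()
        self._hidden_depth = 0
        self.visible, self.hidden = [], []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        if self._hidden_depth or "display:none" in style.replace(" ", ""):
            self._hidden_depth += 1  # inside (or opening) a hidden subtree

    def handle_endtag(self, tag):
        if self._hidden_depth:
            self._hidden_depth -= 1

    def handle_data(self, data):
        text = data.strip()
        if text:
            (self.hidden if self._hidden_depth else self.visible).append(text)

sample = '<p>Real content.</p><div style="display: none">keyword keyword keyword</div>'
finder = HiddenTextFinder()
finder.feed(sample)
print(finder.visible)  # ['Real content.']
print(finder.hidden)   # ['keyword keyword keyword']
```

If the `hidden` bucket comes back full of keyword-stuffed or off-topic text you didn't write, that's the hacked/injected-spam scenario described above.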
-
If the page is cached, and rendering the intended content, does it have sufficient link authority to rank for the terms you intend? It's quite possible that your page is in a competitive keyword space, and doesn't have enough juice to push past the competition.
-
If you want to see whether it has enough juice to rank for anything at all, pick a sentence from the first paragraph of text and search for it enclosed in quotes, ["Some random sentence from my first paragraph here."] Is your URL the #1 result? It should be. If other sites that you've syndicated your content to, or that have scraped your content, are more authoritative than your site, it's possible that your URL isn't ranking because it's being (incorrectly) filtered out as duplicate content.
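As a sketch, the exact-phrase check could be scripted like this. The search URL format is an assumption, and the result list would have to come from eyeballing the SERP or an API you're licensed to use; the domains below are hypothetical:

```python
from urllib.parse import quote_plus, urlparse

def phrase_query_url(sentence: str) -> str:
    # Browser URL for an exact-phrase ("...") Google query.
    return "https://www.google.com/search?q=" + quote_plus('"' + sentence + '"')

def first_result_is_mine(result_urls, my_domain):
    # Given SERP result URLs (however you obtained them), is your
    # domain in the #1 spot? If not, a syndicated or scraped copy
    # may be outranking the original.
    return bool(result_urls) and urlparse(result_urls[0]).netloc.endswith(my_domain)

print(phrase_query_url("Some random sentence from my first paragraph here."))
print(first_result_is_mine(
    ["https://scraper-site.example/copy", "https://mysite.example/original"],
    "mysite.example"))  # False - a scraper outranks the original
```

A `False` here for your own unique sentence is the duplicate-content red flag described above.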
Hope that helps.
-
-
Is the URL no longer in Google's index at all?