Specific Page Penalty?
-
Having trouble figuring out why one of our pages is not ranking in the SERPs; the on-page optimisation looks decent to me.
I checked by using the gInfinity extension and searching for the page URL.
Can one page be penalised by Google (.ie / .com) while the rest of the website is not?
The (possibly) penalised page is showing in Google Places in the SERPs. I assume it would not show if it were penalised.
Would appreciate any advice.
Thanks
-
You may get no traffic from ranking #4 these days, especially on queries with a competitive paid portion of the SERP.
What I would do is stop assessing the "what if" scenarios and start focusing all your energy on acquiring those editorial-type links grasshopper was talking about... right now! You'll get that ranking and secure it for long-term traffic.
-
There's no way to give an accurate time-scale answer to that question. If you're able to get editorial links from authoritative, trusted sites, you can see substantial movement within a week or two of the links being crawled. However, if your links are from lower-quality sites, or are weighted heavily toward devalued methods of link building (directories, reciprocal links, three-way links, etc.), engines may not give those links much weight, if any, no matter how long you wait.
-
It has ranked well previously, according to the Rank Tracker on SEOmoz: it was ranking 4th last week, though I don't think that is correct.
Is it a reliable tool?
Organic traffic for the page's keywords shows no drop in 2012, nor do page views for the page. If it were over-optimised, that would be noticeable in Google Analytics.
-
To me, the true clue would be whether or not the URL ranked well previously.
If it has not... you need more links. It is probably a page authority issue.
If it has... you may have over-optimized on the anchor text; sitewide links will do this. You may rank well for a while, then you'll find yourself on page five shaking your fist at Google.
-
Hi Grasshopper,
I know the keywords I am trying to rank for are competitive.
I will take that into consideration and start working on these. How long does this take to take effect in Google?
Thanks
-
Hi Ronan,
Since it passes tests 1, 2, and 4, I would say that #3 is the culprit. Having solid on-page optimization is great, but link authority is the name of the game for achieving rankings, especially if the keywords you're trying to rank for are competitive.
Run the Keyword Difficulty Tool against the keywords you're trying to rank for. I would expect that the URLs on page 1 of the SERPs all have significantly stronger, more trustworthy link profiles than your URL does.
If that's the case, all the standard advice applies - create a truly differentiated page that offers content / resources / tools above and beyond what your competitors offer, and market the hell out of it.
-
The page is indexed
-
Hi Grasshopper,
Thanks for your input. I have checked each one and everything appears to be fine:
-
Yes
-
The text-only version matches the true content on the page
-
The page itself does have few inbound links. Could it be this?
-
Appearing first
Despite the low number of inbound links, I wouldn't say this alone would cause the ranking issue, as the page is well optimised and similar to its competitors.
-
Hi Ronan,
First, to your general question - yes, it is possible for one page of a domain to be penalized / filtered, while the rest of the domain is not. However, it seems extremely unlikely that the URL in question would rank in Google Places if it was penalized. There are a few things you want to check:
-
1. The first thing you want to check is whether or not the page is indexed and cached, which is a simple query [cache:mypage.com/this-page]. Does it return a result?
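If you'd like to script a related sanity check, the sketch below (Python, assuming the requests and beautifulsoup4 packages are installed, with a placeholder URL) looks for anything on the page itself - HTTP status, X-Robots-Tag header, meta robots tag - that would keep it out of the index. It's a complement to the cache: query, not a replacement; the cache check itself still has to be run in a browser.

```python
# Sketch: look for on-page signals that would keep a URL out of Google's index.
# Assumes the requests and beautifulsoup4 packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "http://mypage.com/this-page"  # hypothetical URL

resp = requests.get(url, timeout=10)
print("HTTP status:", resp.status_code)  # anything other than 200 needs investigating

# A noindex directive can be sent as an HTTP header...
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "not set"))

# ...or as a meta robots tag in the HTML
soup = BeautifulSoup(resp.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
print("Meta robots tag:", meta.get("content", "not set") if meta else "not set")
```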
-
2. If so, in the gray banner across the top of the cached page, click on "Text-only version". Does the machine-readable text match the true content on the page? If you have large amounts of machine-readable text that are only visible to an engine, and not a user, that can trip an algorithmic spam filter. Also, look for off-topic words - sometimes sites get hacked and hackers inject all kinds of spammy garbage and links, which can also trip the filter.
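A rough way to automate part of this check: the sketch below (same assumptions - requests, beautifulsoup4, placeholder URL) flags text that's in the HTML but hidden from users via inline styles. It deliberately only catches the crudest cases; hiding done through external CSS or JavaScript won't show up, so the text-only cache is still worth eyeballing.

```python
# Sketch: surface text that is present in the HTML but hidden from users via
# inline styles. Crude on purpose - it won't catch hiding done in external
# CSS or JavaScript. The URL is a placeholder.
import re
import requests
from bs4 import BeautifulSoup

url = "http://mypage.com/this-page"  # hypothetical URL
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

hidden_style = re.compile(r"display\s*:\s*none|visibility\s*:\s*hidden", re.I)
hidden_snippets = [
    tag.get_text(" ", strip=True)
    for tag in soup.find_all(style=hidden_style)
    if tag.get_text(strip=True)
]

all_text = soup.get_text(" ", strip=True)
print(f"Total text in HTML: {len(all_text)} characters")
print(f"Text inside inline-hidden elements: {sum(len(s) for s in hidden_snippets)} characters")
for snippet in hidden_snippets[:10]:
    print("HIDDEN:", snippet[:80])
```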
-
3. If the page is cached, and rendering the intended content, does it have sufficient link authority to rank for the terms you intend? It's quite possible that your page is in a competitive keyword space, and doesn't have enough juice to push past the competition.
-
4. If you want to see if it has enough juice to rank for anything at all, pick a sentence in the first paragraph of text, and search for it enclosed in quotes, ["Some random sentence from my first paragraph here."] Is your URL the #1 result? It should be. If there are other sites that you've syndicated your content to, or that have scraped your content and are more authoritative than your site, it's possible that your URL isn't ranking because it's being (incorrectly) filtered out as duplicate content.
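This last check can also be scripted if you have access to the Google Custom Search JSON API - the API key, search engine ID, and URL below are all placeholders, and a custom engine configured to search the whole web won't mirror google.com results exactly, but it's usually close enough to spot scrapers outranking you.

```python
# Sketch: search for an exact sentence from the page and see which URL comes
# back first. Uses the Google Custom Search JSON API; API_KEY, CX and MY_URL
# are placeholders you would create/replace yourself.
import requests

API_KEY = "YOUR_API_KEY"                 # hypothetical credential
CX = "YOUR_SEARCH_ENGINE_ID"             # hypothetical custom search engine ID
MY_URL = "http://mypage.com/this-page"   # hypothetical URL
sentence = "Some random sentence from my first paragraph here."

resp = requests.get(
    "https://www.googleapis.com/customsearch/v1",
    params={"key": API_KEY, "cx": CX, "q": f'"{sentence}"'},
    timeout=10,
)
items = resp.json().get("items", [])

for rank, item in enumerate(items, start=1):
    marker = "  <-- my page" if item["link"].startswith(MY_URL) else ""
    print(f"{rank}. {item['link']}{marker}")

if not items or not items[0]["link"].startswith(MY_URL):
    print("Warning: this URL is not #1 - check for syndicated or scraped copies outranking it.")
```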
Hope that helps.
-
Is the URL no longer in Google's index at all?