Is Google suppressing a page from results - if so, why?
-
UPDATE: It seems the issue was that pages were accessible via multiple URLs (i.e. with and without trailing slash, with and without .aspx extension). Once this issue was resolved, pages started ranking again.
Our website used to rank well for a keyword (top 5), though this was over a year ago now. Since then the page no longer ranks at all, but sub pages of that page rank around 40th-60th.
- I searched for our site and the term on Google (i.e. 'Keyword site:MySite.com') and increased the number of results to 100, again the page isn't in the results.
- However when I just search for our site (site:MySite.com) then the page is there, appearing higher up the results than the sub pages.
I thought this may be down to keyword stuffing; there were around 20-30 instances of the keyword on the page. However, roughly the same quantity of keywords was on each sub page as well.
I've now removed some of the excess keywords from all sections as it was getting in the way of usability as well, but I just wanted some thoughts on whether this is a likely cause or if there is something else I should be worried about.
-
Technically the disavow acts like a nofollow, so unless you think they might turn into "followed" at some point, you do not need to disavow them.
It can take 6+ months for a disavow to take effect too. So if it was submitted only recently, it might need some more time.
-
Unfortunately I've already been through the Webmaster Tools links and disavowed hundreds of domains (blog comment spam primarily).
I did overlook any press releases though, whether hosted on PRWeb or picked up on other sites, so the question remains: should these be disavowed despite the fact they are nofollow links?
-
Hi - I would recommend using webmaster tools in addition to Moz to check for backlinks. There are likely more links in there that OSE does not have.
What I usually do is pull the links from there, and crawl them with Screaming Frog (as some may be old and are now gone). There's a really good process for going through links here: http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/ - although it's for disavowing in the article, you can use the process to find bad links for any situation.
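If you don't have Screaming Frog to hand, the "some may be old and are now gone" check can also be scripted. This is just a rough stand-in using Python's standard library, not a substitute for a proper crawl; the User-Agent string is a made-up placeholder:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_links(urls, timeout=10):
    """Return {url: HTTP status} so dead links can be dropped
    from the audit before you bother disavowing them."""
    results = {}
    for url in urls:
        try:
            req = Request(url, headers={"User-Agent": "link-audit/0.1"})
            with urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status
        except HTTPError as e:
            results[url] = e.code   # page responds but with an error (404, 410, ...)
        except (URLError, OSError):
            results[url] = None     # domain gone / unreachable
    return results
```

Anything that comes back `None` or 404/410 can usually be ignored: a link that no longer exists can't be hurting you.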
-
There are only 7 external, equity passing links to this page - none of which use exact match anchor text.
There are also internal links; 284 banner links pointed to the page until last week (the same banner appeared on each page, hence the number) with the img alt text "sage erp x3 energise your business". In addition there are links throughout the site that feature the exact match anchor text - it is in the nav on every page, for example. I'm not sure if Google would take this into account; in my opinion it shouldn't, as it is natural for on-site links to be exact match, unlike off-site links.
That leaves the PRWeb articles, hosted on the PRWeb site and on sites that picked up the article, which are all nofollow with exact match anchor text.
The only other thing I can think of, which I mentioned previously, is that there are multiple valid URLs for each page (with and without www, with and without .aspx, etc.) - this bumps up the number of internal links, increases the number of pages that can be indexed, could trigger duplicate content issues and 'water down' SEO juice.
-
It's possible, although I would definitely look into any followed links that are of low quality or over optimized. The site may have just been over some sort of threshold and you'd want to reel back that percentage.
-
I've had a look and it seems all PRWeb links are nofollow these days - could Google still be filtering due to anchor text despite this?
We have around 30 articles on PRWeb with around 300-400 pickups on really low quality sites, all with the exact match, low quality anchor text links - but they are all nofollow.
-
Hi - yes you'd want to clean up the links to that page, and ideally in this order of preference:
- Try to get exact anchors changed on pages where the link quality is ok, but the anchors are over optimized
- Try to get links removed entirely from low quality pages
- If #1 and #2 are not possible, then disavow the links.
- Ideally of course, you'd want to acquire some new trusted links to the page.
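For reference, if it does come to disavowing, the file Google accepts is just a plain text list of `domain:` entries and individual URLs, with `#` comments. A minimal sketch (the domains and URLs here are made up):

```text
# Blog-comment spam - site owners unresponsive, disavowing whole domains
domain:spammy-comments.example
domain:link-farm.example

# Individual over-optimized pages where the rest of the domain is fine
http://okay-site.example/old-press-release.html
```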
At minimum you'd want to see the page show up again for site: searches with the term in question. That to me would be a sign the filter being applied has been removed. I'm not sure how long this would take Google to do; it may depend on how successful you are at the steps above.
-
No manual actions are listed within Webmaster Tools; I'd checked previously, as the fact that pages are listed in site: searches but not in site: searches containing the term in question (e.g. "sage erp x3 site:datel.info") made me think search engines were removing results in some way.
Links crossed my mind - there is another page on the site that has a large number of very spammy links (hundreds of blog comments with exact match anchors), which I disavowed around 2-3 weeks ago. This page suffers the same issue of appearing in the site: search but not if the term is mentioned, and it no longer ranks for the term, though it used to.
As I mentioned in one of the other comments, the number of links that competitors have for these sorts of terms is very low, and PRWeb seems to be something we've done that competitors haven't. How would I go about finding out if it is the culprit? Disavowing the PRWeb links, or seeing if the articles can be amended to remove the exact match anchor text?
-
This doesn't feel like an on-page thing to me. Perhaps it's the exact match anchor links from Press Releases? See the open site explorer report. For example, here is one page with such a link: http://www.prweb.com/releases/2013/8/prweb10974324.htm
Google's algo could have taken targeted action on that one page due to the exact match anchor backlinks, and because many are from press releases.
Have you checked webmaster tools for a partial manual penalty?
The suppression of this page when using site: searches could further indicate a link based issue.
-
It's an area with very low volumes of links. Comparing to two competitors, looking at only external link equity passing links:
- We have links on two of our microsites (Domain Authority 11) and two press releases (DA 52 and 44)
- One competitor only has links from another site in its group (Domain Authority 12) and 301 redirects from another group site (Domain Authority 22) - no other links.
- Another competitor has guest blog links (Domain Authority 44) and links on one 3rd party site (DA 31)
The only significant difference in backlink profile is that Moz reports we have a vast number of internal links. I believe this is due to the main navigation having 2 sublayers via dropdowns - available on every page of the site.
In addition, URLs aren't rewritten in any way, so the same page can be accessed:
- with and without www
- With and without .aspx
- With and without the trailing slash
This creates a vast number of combinations, which results in our 400-500 page site having 100,000 internal links.
Though the site is accessible via these different URL forms, all links to the site and all links on the site use www and .aspx with no trailing slash, and Google has only indexed those pages; there don't appear to be any duplicates with different URL combinations.
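For illustration, the way those variants multiply per page can be sketched like this (using the page discussed elsewhere in the thread; each independent variation doubles the count):

```python
from itertools import product

# Each independent variation (host, extension, slash) doubles
# the number of valid URLs for a single page.
hosts   = ["http://datel.info", "http://www.datel.info"]
paths   = ["/our-solutions/sage-erp-x3", "/our-solutions/sage-erp-x3.aspx"]
slashes = ["", "/"]

variants = [h + p + s for h, p, s in product(hosts, paths, slashes)]
print(len(variants))   # 2 * 2 * 2 = 8 URLs for one page
```

At 8 variants per page, a 400-500 page site's internal link counts balloon quickly in a crawler's report.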
Other than not being an ideal setup - and it is something I want to change (IT are looking at installing the IIS rewrite module) - could this be causing any harm I'm not aware of?
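For what it's worth, the rules the IIS URL Rewrite module would need are fairly short. A sketch along these lines (this assumes www with no trailing slash as the canonical form, matching how the site is currently linked; the .aspx handling would need a similar rule):

```xml
<!-- web.config fragment - requires the IIS URL Rewrite module -->
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301 the bare domain to www -->
      <rule name="Enforce www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^datel\.info$" />
        </conditions>
        <action type="Redirect" url="http://www.datel.info/{R:1}"
                redirectType="Permanent" />
      </rule>
      <!-- 301 away the trailing slash (the site root is excluded by the +) -->
      <rule name="Strip trailing slash" stopProcessing="true">
        <match url="(.+)/$" />
        <action type="Redirect" url="{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Permanent (301) redirects here also consolidate any link equity the non-canonical variants have picked up.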
-
Thanks so far everyone - the responses so far have been helpful and have given me some areas to improve.
Unless they are related and I just don't know, the responses so far have addressed how to rank better and why the page may not be ranking for a term. However, as far as I can see, they don't address why the specific page isn't listed in the SERPs when you search for 'sage erp x3 site:datel.info', but does appear for 'site:datel.info'.
The page uses the keyword a fair amount, but instead every single sub-page is listed - as are pages with only one use of the keyword in a paragraph of text. Until last week the keyword was used excessively on this page (over 35 uses on a relatively short page) - which is why I wondered if Google may have suppressed it for being too spammy for that keyword. I've changed it now, so if that were the case it should hopefully change soon, but I just wanted to know if that was a possible cause, or if there was something else that could be causing it.
-
Thanks, I'm now in the process of changing this - for some reason keywords have been overabundant in every link and subpage; as you say, it may be hard for Google to select the correct page.
-
I checked but nothing that I can find unfortunately (or fortunately depending how you look at it).
-
Have you inspected the backlink profile for your page vs. the top ranking competitors? What are you seeing as far as relevance, quality and quantity of inbound links?
-
Grab a unique string of about 20 words from the page and search for it between quotes. Your page content may have been scraped and republished on other websites.
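A quick way to pull such a string, sketched in Python (the offsets are arbitrary - any run of words deep enough into the copy to be unique will do):

```python
import re

def search_phrase(page_text, start_word=50, length=20):
    """Pull a run of words from the page copy and wrap it in quotes,
    ready to paste into Google to hunt for scraped copies."""
    words = re.findall(r"[\w']+", page_text)
    phrase = " ".join(words[start_word:start_word + length])
    return f'"{phrase}"'

# e.g. search_phrase(body_copy) -> a quoted 20-word phrase to search for
```

If other domains come back for the quoted phrase, the content has been republished elsewhere.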
-
Thanks for the info. The page in discussion is as good as, or even better than, many other pages trying to rank for the term 'sage erp x3' from your website. I can clearly see an issue with too many pages from your website trying to rank for the term. My honest suggestion would be to have the page titles and descriptions written optimally. Here is a general rule: one page (internal page) should be optimized for one keyword/phrase.
Moreover, if you look at the page titles of your website in Google results, Google has actually modified the titles. This is a clear hint that it's time to optimize your titles.
Here you go for more: http://searchenginewatch.com/article/2342232/Why-Google-Changes-Your-Titles-in-Search-Results
It's actually very confusing for a search engine like Google as to which page from your website should rank high for the term 'sage erp x3', as there are many pages competing for the same term, leading to keyword cannibalization.
Best regards,
Devanur Rafi
-
Looking at traffic, it looks like there was a drop in March 2013, but it's hard to pinpoint for certain as January 2013 was lower than January 2012, and there was a general downward trend until May 2013, at which point things leveled out.
Unfortunately I wasn't working at the company at the time, so I'm not sure if any of this would correspond with levels of marketing activity.
The site in question is http://www.datel.info and the specific page I'm looking into at the moment is http://www.datel.info/our-solutions/sage-erp-x3.aspx though it isn't the only page affected. I just find it odd that it appears for the site search, but not the keyword & site search.
The site also has a fair amount of low quality links from comments on Chinese blogs and forums due to the activity of an 'SEO Agency' 3 years ago. As I'm unable to get these removed, I've disavowed them in both Google and Bing webmaster tools. They were all anchor text rich, but none of them for this term or page.
-
Hi, have you seen a drop in overall organic traffic from Google during the past year? Also, if you are using Google Analytics for your site, you can try the following tool to check if there has been a hit due to the Panda updates:
http://www.barracuda-digital.co.uk/panguin-tool/
Without knowing the exact domain or the URLs, it's hard to assess the exact issues. If you can share those details, we will be able to comment better.
Best regards,
Devanur Rafi
-
Hi Benita,
If you've been watching this keyword for a while and noticed the trend change, have you also noticed any of the other sites that were there with you (competitor sites within the top 10 when you were top 5) also change in rankings? How does their page copy look? Have they updated their content?
Have you looked at your copy on-page and seen that it appropriately addresses the theme of the keyword you're trying to be found for?
What is the ratio of links / images / navigation versus the text copy on the page? Does this seem natural to you when you look at the current top 5 sites / pages that rank?
Since the emergence of Panda & Penguin, the grey area previously allowed by Google to post repetitive content and questionable backlinks has significantly shrunk so if you now find you're badly off, chances are you might have moved from the grey area to the black...
my thoughts...