Is Google suppressing a page from results - if so why?
-
UPDATE: It seems the issue was that pages were accessible via multiple URLs (i.e. with and without trailing slash, with and without .aspx extension). Once this issue was resolved, pages started ranking again.
Our website used to rank well for a keyword (top 5), though this was over a year ago now. Since then the page no longer ranks at all, but sub pages of that page rank around 40th-60th.
- I searched for our site and the term on Google (i.e. 'Keyword site:MySite.com') and increased the number of results to 100; again, the page isn't in the results.
- However, when I just search for our site (site:MySite.com), the page is there, appearing higher up the results than the sub pages.
I thought this may be down to keyword stuffing; there were around 20-30 instances of the keyword on the page. However, roughly the same quantity of keywords was on each of the sub pages as well.
I've now removed some of the excess keywords from all sections as it was getting in the way of usability as well, but I just wanted some thoughts on whether this is a likely cause or if there is something else I should be worried about.
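For anyone wanting to put a number on this rather than counting by hand, here is a rough sketch (assuming Python with the requests and BeautifulSoup libraries installed; the URL and phrase are hypothetical) that counts occurrences of a term in a page's visible text:

```python
import re

import requests
from bs4 import BeautifulSoup

def keyword_count(url, keyword):
    """Fetch a page and count case-insensitive occurrences of a keyword in its visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop script/style so we only count text a visitor would actually read
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = soup.get_text(" ")
    return len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))

# Hypothetical example - substitute your own page and phrase
print(keyword_count("http://www.example.com/our-solutions/page.aspx", "sage erp x3"))
```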
-
Technically the disavow acts like a nofollow, so unless you think they might turn into "followed" at some point, you do not need to disavow them.
It can take 6+ months for a disavow to take effect too. So if it was submitted only recently, it might need some more time.
-
Unfortunately I've already been through the Webmaster Tools links and disavowed hundreds of domains (blog comment spam primarily).
I did overlook any press releases though, whether hosted on PRWeb or picked up on other sites, so the question remains: should these be disavowed despite the fact that they are nofollow links?
-
Hi - I would recommend using webmaster tools in addition to Moz to check for backlinks. There are likely more links in there that OSE does not have.
What I usually do is pull the links from there and crawl them with Screaming Frog (as some may be old and now gone). There's a really good process for going through links here: http://www.greenlaneseo.com/blog/2014/01/step-by-step-disavow-process/ - although the article focuses on disavowing, you can use the same process to find bad links in any situation.
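If Screaming Frog isn't to hand, a short script can do the same rough "is this linking page still alive?" check over the export - a minimal sketch, assuming Python with requests and a hypothetical links.txt export containing one backlink URL per line:

```python
import requests

# Assumes links.txt is a Webmaster Tools export with one backlink URL per line
with open("links.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # A HEAD request is usually enough to tell whether the linking page still exists
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "error"
    print(status, url)
```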
-
There are only 7 external, equity passing links to this page - none of which use exact match anchor text.
There are also internal links; 284 banner links pointed to the page until last week (the same banner appeared on each page, hence the number) with the img alt text "sage erp x3 energise your business". In addition, there are links throughout the site that feature the exact match anchor text - it is in the nav on every page, for example. I'm not sure if Google would take this into account; in my opinion it shouldn't, as it is natural for on-site links to be exact match, unlike off-site links.
That leaves the PRWeb articles, hosted on the PRWeb site and on sites that picked up the article, which are all nofollow with exact match anchor text.
The only other thing I can think of, which I mentioned previously, is that there are multiple valid URLs for each page (with and without www, with and without .aspx, etc.) - this bumps up the number of internal links, increases the number of pages that can be indexed, could trigger duplicate content issues, and 'waters down' SEO juice.
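A quick way to see this in action is to request each variant of one page and note which ones return 200 rather than redirecting - a rough sketch, assuming Python with requests and hypothetical URL variants:

```python
import requests

# Hypothetical variants of one page - swap in a real path from your site
variants = [
    "http://www.example.com/our-solutions/page.aspx",
    "http://www.example.com/our-solutions/page.aspx/",
    "http://www.example.com/our-solutions/page",
    "http://example.com/our-solutions/page.aspx",
]

for url in variants:
    r = requests.get(url, allow_redirects=False, timeout=10)
    # A 200 on every variant means each one is a separately crawlable URL;
    # ideally all but one canonical form should 301 to it
    print(r.status_code, r.headers.get("Location", ""), url)
```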
-
It's possible, although I would definitely look into any followed links that are of low quality or over-optimized. The site may have just been over some sort of threshold, and you'd want to reel back that percentage.
-
I've had a look and it seems all PRWeb links are nofollow these days - could Google still be filtering due to anchor text despite this?
We have around 30 articles on PRWeb with around 300-400 pickups on really low quality sites, all with exact match anchor text links, but they are all nofollow.
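To double-check that at scale, something like the following rough sketch (Python with requests and BeautifulSoup; the domain and pickup URL are hypothetical) reports whether each link pointing at your domain on a given page actually carries rel="nofollow":

```python
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"  # hypothetical - use your own domain

def links_to_my_site(url):
    """Yield (href, is_nofollow) for every link on a page that points at MY_DOMAIN."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if MY_DOMAIN in a["href"]:
            yield a["href"], "nofollow" in (a.get("rel") or [])

# Hypothetical pickup page
for href, nofollow in links_to_my_site("http://www.prweb.com/releases/some-release.htm"):
    print("nofollow" if nofollow else "FOLLOWED", href)
```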
-
Hi - yes, you'd want to clean up the links to that page, ideally in this order of preference:
- Try to get exact match anchors changed on pages where the link quality is OK but the anchors are over-optimized
- Try to get links removed entirely from low quality pages
- If #1 and #2 are not possible, then disavow the links.
- Ideally of course, you'd want to acquire some new trusted links to the page.
At a minimum, you'd want to see the page show up again for site: searches with the term in question. That, to me, would be a sign that the filter being applied has been removed. I'm not sure how long this would take Google to do; it may depend on how successful you are at the steps above.
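If it does come to the disavow step, the file itself is just plain text - full URLs and/or domain: lines, with # comments. A rough sketch (Python, with a hypothetical bad_domains.txt input, one domain per line) for generating one:

```python
# Assumes bad_domains.txt contains one domain per line, e.g. "spamblog.example"
with open("bad_domains.txt") as f:
    domains = sorted({line.strip().lower() for line in f if line.strip()})

with open("disavow.txt", "w") as out:
    out.write("# Domains disavowed due to spammy exact-match anchor links\n")
    for domain in domains:
        out.write(f"domain:{domain}\n")
```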
-
No manual actions are listed within Webmaster Tools; I'd checked previously, as the fact that pages appear in site: searches but not in site: searches containing the term in question (e.g. "sage erp x3 site:datel.info") made me think the search engines were filtering out results in some way.
Links crossed my mind - there is another page on the site that has a large number of very spammy links (hundreds of blog comments with exact match anchors), which I disavowed around 2-3 weeks ago. This page suffers the same issue of appearing in the site: search but not when the term is included, and it doesn't rank for the term, though it used to.
As I mentioned in one of the other comments, the number of links that competitors have for these sorts of terms is very low, and PRWeb seems to be something we've done that competitors haven't. How would I go about finding out if it is the culprit? Disavowing the PRWeb links, or seeing if the articles can be amended to remove the exact match anchor text?
-
This doesn't feel like an on-page thing to me. Perhaps it's the exact match anchor links from press releases? See the Open Site Explorer report. For example, here is one page with such a link: http://www.prweb.com/releases/2013/8/prweb10974324.htm
Google's algo could have taken targeted action on that one page due to the exact match anchor backlinks, and because many are from press releases.
Have you checked webmaster tools for a partial manual penalty?
The suppression of this page when using site: searches could further indicate a link based issue.
-
It's an area with very low volumes of links. Comparing ourselves to two competitors, looking only at external, equity-passing links:
- We have links on two of our microsites (Domain Authority 11) and two press releases (DA 52 and 44)
- One competitor only has links from another site in its group (Domain Authority 12) and 301 redirects from another group site (Domain Authority 22) - no other links.
- Another competitor has guest blog links (Domain Authority 44) and links on one 3rd party site (DA 31)
The only significant difference in backlink profile is that Moz reports we have a vast amount of internal links. I believe this is due to the main navigation having 2 sublayers via dropdowns, available on every page of the site.
In addition, URLs aren't rewritten in any way, so the same page can be accessed:
- with and without www
- with and without .aspx
- with and without the trailing slash
This creates a vast number of combinations, which results in our 400-500 page site having 100,000 internal links.
Though the site is accessible via different URL variants, all links to the site and all links on the site use www and .aspx with no trailing slash, and Google has only indexed these versions; there don't appear to be any duplicates indexed with different URL combinations.
Other than not being an ideal setup - and it is something I want to change (IT are looking at installing the IIS rewrite module) - could this be causing any harm I'm not aware of?
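Until the rewrite module is in place, one stop-gap worth checking is whether every page declares a rel=canonical pointing at the preferred URL, since that at least tells Google which variant to index. A minimal sketch, assuming Python with requests and BeautifulSoup and hypothetical URLs:

```python
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    """Return the rel=canonical href declared by a page, or None if absent."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link else None

# Hypothetical URLs - every variant of a page should report the same canonical href
for url in [
    "http://www.example.com/our-solutions/page.aspx",
    "http://example.com/our-solutions/page.aspx/",
]:
    print(url, "->", canonical_of(url))
```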
-
Thanks everyone - the responses so far have been helpful and have given me some areas to improve.
Unless they are related and I just don't know, the responses so far have addressed how to rank better and why the page may not be ranking for a term. However, as far as I can see, they don't address why the specific page isn't listed in the SERPs when you search for 'sage erp x3 site:datel.info', yet it does appear for 'site:datel.info'.
The page uses the keyword a fair amount, but instead every single sub-page is listed - as are pages with only one use of the keyword in a paragraph of text. Until last week the keyword was used excessively on this page (over 35 uses of the keyword on a relatively short page), which is why I wondered if Google may have suppressed it for being too spammy for that keyword. I've changed it now, so if that were the case it should hopefully change soon, but I just wanted to know if that was a possible cause, or if there was something else that could be causing it.
-
Thanks, I'm now in the process of changing this - for some reason keywords have been over-abundant in every link and subpage, and as you say it may be hard for Google to select the correct page.
-
I checked but nothing that I can find unfortunately (or fortunately depending how you look at it).
-
Have you inspected the backlink profile for your page vs. the top ranking competitors? What are you seeing as far as relevance, quality and quantity of inbound links?
-
Grab a unique string of about 20 words from the page and search for it between quotes. Your page content may have been scraped and republished on other websites.
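A minimal sketch of pulling such a string programmatically (Python with requests and BeautifulSoup; the URL is hypothetical) - it simply grabs a 20-word window from the page's visible text, which you can then paste into Google wrapped in double quotes:

```python
import requests
from bs4 import BeautifulSoup

def sample_phrase(url, start=50, length=20):
    """Return a run of `length` consecutive words from the page's visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    words = soup.get_text(" ").split()
    return " ".join(words[start:start + length])

# Hypothetical page - search Google for the output wrapped in double quotes
print(sample_phrase("http://www.example.com/our-solutions/page.aspx"))
```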
-
Thanks for the info. The page in discussion is as good as or even better than many of the other pages from your website trying to rank for the term 'sage erp x3'. I can clearly see an issue with too many pages from your website trying to rank for the term. My honest suggestion would be to have the page titles and descriptions written optimally. Here is a general rule: one page (internal page) should be optimized for one keyword/phrase.
Moreover, if you look at the page titles of your website in Google's results, you'll see that Google has actually modified the titles. This is a clear hint that it's time to optimize your titles.
Here you go for more: http://searchenginewatch.com/article/2342232/Why-Google-Changes-Your-Titles-in-Search-Results
It's actually very confusing for a search engine like Google to decide which page from your website should rank high for the term 'sage erp x3', as there are many pages competing for the same term, leading to keyword cannibalization.
Best regards,
Devanur Rafi
-
Looking at traffic, it looks like there was a drop in March 2013, but it's hard to pinpoint for certain as January 2013 was lower than January 2012, and there was a general downward trend until May 2013, at which point things leveled out.
Unfortunately I wasn't working at the company at the time, so I'm not sure if any of this would correspond with levels of marketing activity.
The site in question is http://www.datel.info and the specific page I'm looking into at the moment is http://www.datel.info/our-solutions/sage-erp-x3.aspx though it isn't the only page affected. I just find it odd that it appears for the site search, but not the keyword & site search.
The site also has a fair amount of low quality links from comments on Chinese blogs and forums due to the activity of an 'SEO Agency' 3 years ago. As I'm unable to get these removed, I've disavowed them in both Google and Bing webmaster tools. They were all anchor text rich, but none of them for this term or page.
-
Hi, have you seen a drop in the overall organic traffic from Google during the past year? Also, if you are using Google Analytics for your site, you can try the following tool to check whether there has been a hit due to the Panda updates:
http://www.barracuda-digital.co.uk/panguin-tool/
Without knowing the exact domain or the URLs, it's hard to assess the exact issues. If you can share those details, we will be able to comment better.
Best regards,
Devanur Rafi
-
Hi Benita,
If you've been watching this keyword for a while and noticed the trend change, have you noticed any of the other sites that were there with you (competitor sites within the top 10 when you were top 5) also change in rankings? How does their page copy look? Have they updated their content?
Have you looked at your copy on-page and seen that it appropriately addresses the theme of the keyword you're trying to be found for?
What is the ratio of links / images / navigation versus the text copy on the page? Does this seem natural to you when you look at the current top 5 sites / pages that rank?
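If you want a rough number for that rather than eyeballing it, something like the sketch below (assuming Python with requests and BeautifulSoup; the URLs are hypothetical) compares the amount of copy on a page to its anchor text, links, and images:

```python
import requests
from bs4 import BeautifulSoup

def page_ratios(url):
    """Return crude text-to-HTML and anchor-text-to-text ratios for a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = soup.get_text(" ").strip()
    anchor_text = " ".join(a.get_text(" ") for a in soup.find_all("a"))
    return {
        "text_to_html": round(len(text) / max(len(html), 1), 3),
        "anchor_to_text": round(len(anchor_text) / max(len(text), 1), 3),
        "links": len(soup.find_all("a")),
        "images": len(soup.find_all("img")),
    }

# Hypothetical comparison: your page vs. a page that currently ranks in the top 5
for url in ["http://www.example.com/your-page.aspx", "http://www.example.com/competitor-page"]:
    print(url, page_ratios(url))
```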
Since the emergence of Panda and Penguin, the grey area previously allowed by Google for repetitive content and questionable backlinks has significantly shrunk, so if you now find you're badly off, chances are you might have moved from the grey area into the black...
my thoughts...