Noindex,follow - linked pages not showing
-
We have a blog on our site where the homepage and category pages have "noindex,follow" but the articles have "index,follow".
Recently we have noticed that the article pages are no longer showing in the Google SERPs (but they are in Bing!) - this was checked using the "site:" search operator.
Have double-checked our robots.txt file too just in case something silly had slipped in, but that's as it should be...
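For anyone else double-checking robots.txt, Python's standard library can test the rules directly rather than eyeballing the file; a minimal sketch (the Disallow rule here is just an example, not your actual file):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt contents -- substitute your own file's rules.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
]

rp = RobotFileParser()
rp.parse(rules)

# Article pages should be fetchable; disallowed paths should not be.
print(rp.can_fetch("Googlebot", "https://example.com/blog/my-article/"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php")) # False
```

Running each of your article URLs through `can_fetch` rules out a robots.txt block in one go.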
Has anyone else noticed similar behaviour or could suggest things I could check?
Thanks!
-
Well, you're on WordPress and are using Yoast SEO. When a WordPress category is created, a URL is generated for that category.
Your sitemap was created with Yoast:
Sitemap index entries (Last Modified):
- 2018-08-23 08:10 +01:00
- 2018-08-23 08:21 +01:00
- 2018-08-23 08:08 +01:00
- 2018-08-07 10:40 +01:00
- 2018-08-23 08:10 +01:00
- 2018-08-23 08:13 +01:00
I can see your articles are indexed now, but I would still recommend removing the WordPress category URLs from your sitemap. Since the sitemap should list the things you want Google to crawl and index, I would add the article URLs (the "index,follow" pages) directly to your XML sitemap instead of including the category pages you don't want indexed.
(e.g. http://www.genetex.com/sitemap.xml)
Yoast should give you this option in its XML sitemap generation settings. If not, I would recommend using Screaming Frog to generate the sitemap.
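If neither tool cooperates, building a minimal sitemap containing only the article URLs is straightforward with the standard library; a sketch (the URLs and dates are placeholders, not this site's real articles):

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap XML string from (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Only the article pages -- not the noindexed category pages.
articles = [
    ("http://www.example.com/blog/article-one/", "2018-08-23"),
    ("http://www.example.com/blog/article-two/", "2018-08-07"),
]
print(build_sitemap(articles))
```

Save the output as sitemap.xml (with an XML declaration prepended) and submit it in Search Console.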
-
Just as an update to this question: I submitted XML sitemaps pointing directly to the blog articles, and those pages are still not showing in the Google SERPs. It seems that new pages are discovered quite quickly (as per Google Alerts) but are then dropped from the index within a day or so.
The only pages which are consistently returned are the ones which allow comments to be added.
The links which were initially identified as broken were not actually broken, so there was nothing to fix there.
The next step I can think of is to attempt some page sculpting by setting a noindex on the comments pages...
If anyone has any more thoughts or ideas, I'd appreciate your input.
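Before sculpting, it's worth verifying which robots directives each page template actually emits, since a theme can easily apply the wrong one. A small sketch using Python's built-in HTML parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

# Feed in the fetched HTML of any page you want to audit.
html = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
p = RobotsMetaParser()
p.feed(html)
print(p.directives)  # ['noindex,follow']
```

Running this over the article, category, and comment templates shows exactly which ones carry noindex.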
-
Great - thanks for your help
-
I resubmitted the sitemap for the blog in GWT and no errors were found...
I have to say I am very surprised at the number of dead links - we don't have that many blog posts, so unless this is picking up content on our main site (where the pages are still …). Even then, as I mentioned to Alan, the only missing content Google Webmaster Tools picks up on is where event tracking is used and it thinks the label is a link... I did ask Google about these erroneous missing pages and they said there was nothing that can be done to indicate they're not meant to be pages, and that it would not affect the site's quality.
BTW, an article we published a few hours ago is now showing up in the Google results, so it does seem like the rest of the pages have been penalised.
Time to figure out what's going on with the missing pages...
Thanks, Irving
-
I sent the list. I had a bit of a look and it may be that they were timing out.
-
Thanks Alan, have DM'ed you.
-
Submit a sitemap.xml file for the pages you want indexed. If they are linked to on the site and not blocked in robots.txt, they will get indexed again. Definitely fix that sick amount of broken links - Google could be determining that these pages are not worth anything because the links on them are all dead ends.
-
The broken links were found using the Bing API, so Bing will see them as such.
If you give me an email address, I will send you the list.
-
39 no-indexed pages on the blog could be correct, given the category pages.
I'm quite surprised at the number of broken links - is this specific to /blog and are they actual links? GWT usually picks up event tracking as broken links...
Good point about the homepage - I should get a canonical tag on that...
Thanks!
-
I found 39 pages that have been no-indexed - does that add up?
I also found 33,000 broken links.
Another problem you have is that both http://www.abcam.com/blog/ and http://www.abcam.com/blog/index.cfm are linked to on your site, which means the PageRank is split. You should link only to http://www.abcam.com/blog/
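One way to avoid that kind of split is to normalize internal links before they're emitted, collapsing directory-index URLs onto the trailing-slash form. A sketch (the list of index filenames is an assumption about this particular setup):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, index_names=("index.cfm", "index.html", "index.php")):
    """Collapse directory-index URLs onto the trailing-slash form so all
    internal links point at a single canonical address."""
    parts = urlsplit(url)
    path = parts.path
    last = path.rsplit("/", 1)[-1]
    if last in index_names:
        path = path[: -len(last)]  # keep the trailing slash
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(canonicalize("http://www.abcam.com/blog/index.cfm"))  # http://www.abcam.com/blog/
print(canonicalize("http://www.abcam.com/blog/"))           # http://www.abcam.com/blog/
```

A rel="canonical" tag on the index.cfm version pointing at the trailing-slash URL covers any links you can't rewrite.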
-
The blog homepage is http://www.abcam.com/blog
@Alan: The rest of the site is indexable; it's just the blog area where noindex has been used (the blog homepage and category pages are auto-generated and repeat a lot of the content in the articles).
@Shailendra: Yes, they were indexed - the last Google Alert which specifically highlights content from the blog is mid-June.
-
Firstly, you don't need to write index,follow on normal pages - that is the default behaviour. Secondly, as you say the pages are "no longer showing in Google SERPs", this means they were indexed earlier, right? If they are no longer in Google's index, that suggests penalization. Please give the URL of your website.
-
It may have something to do with the homepage being noindex, as that is unusual.
Can we get a URL? I may find what you missed.
-
Hi,
Can you please share the URL?