Help with pages Google is labeling "Not Followed"
-
I am seeing a number of pages that I have set up 301 redirects for showing up under the "Not Followed" category of the Advanced Index Status report in Google Webmaster Tools. Google says this might be because the page is still showing active content or the redirect is not set up correctly. I don't know how to tell whether a page is still showing active content, so if someone can tell me how to determine this it would be greatly appreciated. If you can also suggest how to adjust my pages so the content no longer appears to be active, that would be amazing. Thanks in advance; here are a few links to pages that are experiencing this:
-
Hi Joshua -
If someone is linking to the www version, it doesn't pass as much link juice as it would if it weren't redirected (there's lots of information on this online, with varied opinions). Overall, most SEOs agree that an inbound link that points directly to a page, without being 301 redirected, has more of a positive SEO effect.
With that being said, in your case Google Webmaster Tools may be flagging this double-redirect error simply because an external website somewhere is linking to the 'www' version. You can find this using OSE, or in Webmaster Tools by going to Crawl Errors and looking for the sunny-isles URL. Clicking on it (if it's there) will show who is linking to you and from where.
BTW - when did you set up the redirects, how long ago did you notice the new URL wasn't indexed, and was the old URL indexed?
-
The 301 will preserve some of the authority passed through from links to the www version.
One note - Google sometimes has a rough time with consecutive 301s. Normally it's only a problem when there are several in a row, but here you have two, so you might consider reducing that to one.
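For example, something like this in the .htaccess (just a sketch - I'm assuming Apache with mod_rewrite and guessing at the domain and filenames, so adjust to match your site) would send both versions of the old URL straight to the new page in a single hop:
# Hypothetical example - replace example.com and the filenames with your real domain and paths.
# Keep this above the "# BEGIN WordPress" block so it runs before WordPress's own rewrite rules.
<IfModule mod_rewrite.c>
RewriteEngine On
# The old page, requested with or without 'www', 301s straight to the final non-www URL.
RewriteRule ^sunnyisles\.html$ http://example.com/sunny-isles.html [R=301,L]
</IfModule>
That way a request for the old www URL never has to pass through the intermediate non-www hop first.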
-
Might as well, yes.
-
Hello Ian,
Thanks for your help as well. Question for you: I currently have not set a preferred version in my Google Webmaster Tools account. Do you think I should go ahead and establish the non-www version as my setting?
Thanks.
-
Hi Jared,
Thank you very much for answering my question. So if someone is linking to me from another site but uses the www version of a URL, does it not help my SEO?
And if this is the case, what do you recommend I do?
Thanks.
-
Hi Joshua,
It looks like you're redirecting from the 'www' version to the non-www version. The 301 redirect itself is set up just fine.
Two things to check first:
- In Google Webmaster Tools, do you have the preferred domain set to the 'www' version? That might cause this confusion.
- In robots.txt, you're blocking Googlebot-Image from crawling that folder. I once saw an instance where that screwed up Googlebot as well, and removing the disallow fixed the problem.
Ian
-
The link you referenced has 'www' in it - is that how the link is targeted on your website? If so, it's probably the double redirect that is causing the issue. Since WP is set to non-www, every time there is a request for the www version of a URL, WP automatically 301 redirects it to the non-www version. There is nothing wrong with this.
It's when there is a request for the 'www' version of a URL that has also been redirected, as the one you cited has been, that a double redirect takes place:
http://www.luxuryhome..../sunnyisles.html
to the 'non-www' version:
http://luxuryhome.../sunnyisles.html
then from there to the new html file version:
http://luxuryhome.../sunny-isles.html
The header check shows a normal www to non-www redirect first (WP is doing this), and then the 301 redirect that changes sunnyisles to sunny-isles. Both URLs answer with a 301 status rather than serving the page as live content (which is how you can tell a page is no longer "active" - it returns a redirect instead of a 200 OK), so the redirects themselves seem to be working. What you want to make sure of is:
- Any internal links pointing to the old sunnyisles.html page do not contain 'www' (and in any event, those links should be updated to point directly to the new page anyway).
- Any inbound links from external sources do not reference the 'www' version.
It would be helpful if we could see the .htaccess file as well.
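In the meantime, for reference: if the www to non-www canonicalization were handled in the .htaccess itself rather than by WordPress, the rule would typically look something like this (a generic sketch only, not necessarily what is in your file):
<IfModule mod_rewrite.c>
RewriteEngine On
# Generic host canonicalization: any request on the 'www' host is 301'd to the bare host, keeping the path.
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
</IfModule>
Whichever layer handles the host redirect, the page-specific 301 for sunnyisles.html should either sit above it or point straight at the final non-www URL, so a request for the old www URL resolves in one hop rather than two.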