Help with pages Google is labeling "Not Followed"
-
I am seeing a number of pages that I am doing 301 redirects on coming up under the "Not Followed" category of the advanced index status in Google Webmaster Tools. Google says this might be because the page is still showing active content or the redirect is not correct. I don't know how to tell if the page is still showing active content, so if someone can tell me how to determine this it would be greatly appreciated. Also, if you can suggest how to adjust my pages so the content does not appear to be active, that would be amazing. Thanks in advance; here are a few links to pages that are experiencing this:
-
Hi Joshua -
If someone is linking to the www version, it doesn't pass as much juice as it would if it weren't redirected (there's lots of information on this around the internet, with varied opinions). Overall, most SEOs agree that an inbound link that points directly to a page without being 301 redirected has more of a positive SEO effect.
With that being said, in your case Google Webmaster Tools may be detecting this double redirect error simply because an external website somewhere is linking to the 'www' version. You can find this using OSE, or in Webmaster Tools by going to Crawl Errors and looking for the sunny-isles URL. Clicking on it (if it's there) will show who is linking to you and from where.
BTW - when did you do the redirects, how long ago did you notice the new URL wasn't indexed, and was the old URL indexed?
-
The 301 will preserve some of the authority passed through from the www version of the link.
One note - Google sometimes has a rough time with consecutive 301s. Normally it's only a problem if there are several in a row. Here you have two. You might consider reducing that to 1...?
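If you want to see the chain for yourself, a quick script like the one below will print every hop. This is only a sketch using Python's third-party requests library, and the URL is a placeholder rather than your actual page:

    # Minimal sketch: follow a URL's redirect chain and count the hops.
    # Assumes the third-party "requests" library is installed; the URL
    # below is a placeholder -- swap in the 'www' version of the old page.
    import requests

    def trace_redirects(url):
        # allow_redirects=True follows the whole chain; every intermediate
        # redirect response is kept in resp.history.
        resp = requests.get(url, allow_redirects=True, timeout=10)
        for hop in resp.history:
            print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
        print(resp.status_code, resp.url, "(final)")
        return len(resp.history)

    hops = trace_redirects("http://www.example.com/old-page.html")
    print(hops, "redirect hop(s) before the final page")

If it prints two 301s for the 'www' version of the old URL, that's the double hop - ideally the first response would send visitors (and Googlebot) straight to the final sunny-isles page in a single 301.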
-
Might as well, yes.
-
Hello Ian,
Thanks for your help as well. Question for you: I currently have not set a preferred version in my Google Webmaster Tools account. Do you think I should go ahead and establish the non-www version as my setting?
Thanks.
-
Hi Jared,
Thank you very much for answering my question. So if someone is linking to me from another site but uses the www version of a URL, does it not help my SEO?
And if this is the case, what do you recommend I do?
Thanks.
-
Hi Joshua,
It looks like you're redirecting from the 'www' version to the non 'www' version. The 301 redirect is set up just fine.
2 things to check first:
- In Google Webmaster Tools, do you have the preferred domain set to the 'www' version? That might cause this confusion.
- In robots.txt, you're blocking Google Image Bot from crawling that folder. I once saw an instance where that screwed up Googlebot as well, and removing the disallow fixed the problem.
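If you want to double-check how that robots.txt rule is actually being interpreted, here's a minimal sketch using Python's standard-library robotparser (the domain and path are placeholders, not your real URLs):

    # Minimal sketch: see which crawlers robots.txt allows to fetch a URL.
    # Standard library only; the domain and path below are placeholders.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()

    test_url = "http://example.com/some-folder/your-page.html"
    for agent in ("Googlebot", "Googlebot-Image"):
        allowed = rp.can_fetch(agent, test_url)
        print(agent, "is", "allowed" if allowed else "blocked", "for", test_url)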
Ian
-
The link you referenced has 'www' in it; is that how the link is targeted on your website? If so, it's probably the double redirect that is causing the issue. Since WP is set to 'non-www', every time there is a call for the www version of a URL, WP automatically 301 redirects it to the non-www version. There is nothing wrong with this.
It's when there is a call for the 'www' version of a URL that has also been redirected, as is the case with the one you cited, that a double redirect takes place:
http://www.luxuryhome..../sunnyisles.html
to the 'non-www' version:
http://luxuryhome.../sunnyisles.html
then from there to the new html file version:
http://luxuryhome.../sunny-isles.html
The header check shows a normal www to non-www redirect first (WP is doing this), and then the 301 redirect that changes sunnyisles to sunny-isles. Both server responses look OK, so the redirects themselves seem to be working. What you want to make sure of is:
Any internal links linking to the old sunnyisles.html page do not contain 'www'. (And in any event, these links should be changed to point to the new page anyway).
Any inbound links from external sources do not reference the 'www' version. (A quick way to spot-check a page for these is sketched below.)
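To spot-check a page for those leftover links, a small script like this can help. It's only a sketch - it uses the third-party requests library plus Python's built-in HTML parser, and the page URL and old file name are placeholders you'd need to replace:

    # Minimal sketch: list links on a page that still use the 'www' host
    # or point at the old file name. Placeholders: page_url and old_file.
    from html.parser import HTMLParser
    import requests

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            # Collect the href of every anchor tag on the page.
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    page_url = "http://example.com/"   # placeholder: page to audit
    old_file = "old-page.html"         # placeholder: pre-redirect file name

    collector = LinkCollector()
    collector.feed(requests.get(page_url, timeout=10).text)

    for href in collector.links:
        if href.startswith("http://www.") or old_file in href:
            print("Check this link:", href)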
It would be helpful if we could see the .htaccess file as well.