Why are our site's top landing pages URLs that no longer exist and return 404 errors?
-
Digging through analytics today, I noticed that our site's top landing pages are pages that were part of the old www.towelsrus.co.uk website, which was taken down almost 12 months ago. All of these pages had 301 redirects, which were removed a few months back, but they still have not dropped out of Google's crawl error logs.
I can't understand why this is happening, but the 100% bounce rate on these pages almost certainly means we are losing potential conversions.
How can I identify what keywords and links people are using to land on these pages?
-
I'm glad to help.
If the visits are not organic or referral then I wouldn't worry too much about this.
-
Hi,
Really appreciate all the info, a great help.
I have looked hard at Google Analytics and cannot see any referrals from these links or any organic search results. Having also checked Webmaster Tools, there are no impressions for these pages, so I am clueless as to where they are coming from. One URL has had at least 30 to 40 visitors to the site this month with a 100% bounce rate.
An odd one I think.
-
Hi,
If Google Analytics says that the visits are direct, then these visits did not result from a link on a different website. A direct visit comes from a person either typing the URL directly into the browser, or clicking a link in a PDF (or Word doc), which passes no referrer.
You might have some promotional material or a white paper in circulation, or someone may just be typing in the URL. Use your advanced segments to double-check that the visitors are not landing on your site from referral links or organic search results.
If you really, really want to make sure that these pages are not ranking, you can link your Google Analytics to your Webmaster Tools account and then check whether there are any impressions for those pages. If there are no impressions, then those pages have definitely been de-indexed.
For more info on how to link GA to GWMT visit http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1120006
If you get stuck just give me a shout! (You can get hold of me on Twitter - @shaungambit )
-
Hi,
Thanks for your help, just what I was after. Interestingly, the first landing page I selected showed no keywords set and the traffic was direct, so I can only assume that there is a link on some website/blog that people are using to enter our site via that URL. What do you think?
-
Hi,
You can identify the keywords that people are using to land on those pages by digging into your analytics. I'll explain the process assuming that you are using Google Analytics:
Once you are on the correct profile in Analytics:
1. Click on the "Content" tab, then click on "Site Content".
2. Click on "All Pages" in the "Site Content" dropdown.
3. In the filter box, type the page URL and press Enter on your keyboard.
4. Click on the correct URL in this filtered list.
5. Just above the list you will find a tab called "Secondary dimension"; select this, then select "Traffic Sources" and then "Keywords".
This should give you a list of all the keywords that have been used to find those pages.
As for the links, instead of selecting "Keywords" under Traffic Sources you can select "Source", and it'll give you the domains that the visitors are being referred from.
Tip: when looking for links, you can use advanced segments to include only referral traffic.
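If you prefer to pull this data programmatically rather than click through the interface, here is a minimal sketch against the (legacy) Google Analytics Core Reporting API v3. It is only an illustration: the key file, profile ID, and landing page path are placeholder values you would swap for your own, and it assumes you have already set up a service account with read access to the profile.

# Minimal sketch: list keywords and referring sources for one landing page
# via the Google Analytics Core Reporting API v3. All IDs and paths below
# are placeholders, not real values.
import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service-account.json'        # hypothetical credentials file
PROFILE_ID = 'ga:12345678'               # hypothetical GA profile (view) ID
LANDING_PAGE = '/old-towels-page.html'   # hypothetical landing page path

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE, SCOPES)
analytics = build('analytics', 'v3', http=credentials.authorize(httplib2.Http()))

report = analytics.data().ga().get(
    ids=PROFILE_ID,
    start_date='30daysAgo',
    end_date='today',
    metrics='ga:sessions',
    dimensions='ga:keyword,ga:source,ga:medium',
    filters='ga:landingPagePath==' + LANDING_PAGE,
    sort='-ga:sessions',
).execute()

# Each row comes back as [keyword, source, medium, sessions]
for keyword, source, medium, sessions in report.get('rows', []):
    print(keyword, source, medium, sessions)

Direct visits will show up here with a source of "(direct)" and a medium of "(none)", which matches what you would check in the interface.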
Please let me know if this works or if you have more questions.
-
Related Questions
-
How can we optimize internal linking by increasing the interlinking count of chosen landing pages and decreasing it for less important pages within the site?
We have pulled our interlinking counts (internal links only, not outbound links) from Google Webmaster Tools and discovered that the interlinking count of our most significant pages is lower than that of less significant pages. Our objective is to reverse the existing behavior by increasing the interlinking count of important pages and reducing the count for less important pages, so that maximum link juice is transferred to the right pages, thereby increasing SEO traffic.
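Not part of the original question, but a rough way to sanity-check those numbers outside of Webmaster Tools is to tally internal links yourself. Below is a minimal Python sketch, assuming the requests and BeautifulSoup libraries are available; the page list is a placeholder (in practice you would feed in your sitemap or a full crawl), and it simply counts how many internal links point at each URL across the sampled pages.

# Minimal sketch: tally how many internal links point at each URL across
# a sample of pages. The PAGES list is a placeholder; feed in a sitemap
# or full crawl for real numbers.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGES = [                                   # hypothetical sample of pages
    'http://www.example.com/',
    'http://www.example.com/category/',
    'http://www.example.com/some-article/',
]
SITE_HOST = urlparse(PAGES[0]).netloc

inbound = Counter()
for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, 'html.parser')
    for anchor in soup.find_all('a', href=True):
        target = urljoin(page, anchor['href'])      # resolve relative links
        if urlparse(target).netloc == SITE_HOST:    # count internal links only
            inbound[target] += 1

# Most-linked pages first; compare against your list of important pages.
for url, count in inbound.most_common(20):
    print(count, url)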
-
What can cause a service page to rank in Google's Answer Box?
Hello everyone, I have recently seen a Google result for "vps hosting" showing service page details in the Answer Box. I would really like to know what can cause a service page to appear in the Answer Box. I have attached a screenshot of the result page (CaRiWtQUcAALn9n.png).
-
URL errors in Webmaster Tools for pages that don't exist?
Hello, for some time now we have had URLs showing up in Google Webmaster Tools as 404 errors, but these pages don't exist on our website and never have. Here's an example: cosmetic-dentistry/28yearold-southport-dentist-wins-best-young-dentist-award/801530293. The root is this: goo.gl/vi4N4F. Really confused about this. We have recently moved our website to WordPress. Thanks, Ade
-
What if a page exists for desktop but not mobile?
I have a domain (no subdomains) that serves up different dynamic content for mobile and desktop pages, each having the exact same page URL (a kind of semi-responsive design), and I will be using "Vary: User-Agent" to give Google a heads-up on this setup. However, some of the pages are only valid for mobile or only valid for desktop.
In the case where a page is valid only for mobile (call it mysite.com/mobile-page-only), Google Webmaster Tools is giving me a soft 404 error under Desktop, saying that the page does not exist. Apparently it is doing that because my program is actually redirecting the user/crawler to the home page. It appears from the info about soft 404 errors that Google is saying that since the page "doesn't exist" I should give the user a 404 page, which I can customize to give the user an option to go to the home page, choose links from a menu, etc.
My concern is that if I tell the desktop bot that mysite.com/mobile-page-only is basically a 404 error (i.e. doesn't exist), it could mess up the mobile bot indexing for that page, since it definitely DOES exist for mobile users. Does anyone here know for sure that Google will index a page for mobile that is a 404 not found for desktop, and vice versa? Obviously it is important not to remove something from an index in which it belongs, so whether Google is careful to differentiate the two is a very important issue. Has anybody here dealt with this or seen anything from Google that addresses it? Might one be better off leaving it as a soft 404 error?
EDIT: also, what about Bing and Yahoo? Can we assume they will handle it the same way?
EDIT: a closely related question: in a case like mine, does Google need a separate sitemap for the valid mobile pages and the valid desktop pages, even though most links will be in both? I can't tell from reading several Q&As on this.
Thanks, Ted
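Not an answer from the thread, just a sketch of one possible setup: if the site runs on Apache with mod_headers and mod_rewrite enabled, you can send the Vary: User-Agent header site-wide and return a genuine 404 to desktop user agents for the mobile-only URL instead of redirecting them to the home page. The user-agent pattern and the path below are simplified placeholders.

# Hedged .htaccess sketch -- assumes Apache with mod_headers and mod_rewrite.
# Tell caches and crawlers that responses depend on the user agent.
Header append Vary User-Agent

RewriteEngine On
# Desktop visitors (user agent does not look mobile) get a real 404 for the
# mobile-only page instead of a redirect to the home page.
RewriteCond %{HTTP_USER_AGENT} !(iphone|ipad|android|mobile) [NC]
RewriteRule ^mobile-page-only$ - [R=404,L]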
-
Best solution to get mass URLs out of the search engines' index
Hi, I've got an issue where our web developers have made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e. the URLs generated on a page are relative to the current URL), the problem URLs linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has put tens of thousands of URLs that shouldn't be there into the search engines' indexes. So, say for example the problem URLs look like www.mysite.com/incorrect-directory/folder1/page1/
It seems I can correct this by doing the following:
1/. Use robots.txt to disallow access to /incorrect-directory/*
2/. 301 the URLs like this:
www.mysite.com/incorrect-directory/folder1/page1/
301 to:
www.mysite.com/correct-directory/folder1/page1/
3/. 301 URLs to the root correct directory like this:
www.mysite.com/incorrect-directory/folder1/page1/
www.mysite.com/incorrect-directory/folder1/page2/
www.mysite.com/incorrect-directory/folder2/
301 to:
www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong URLs.
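For what it's worth, a minimal .htaccess sketch of what options 2/. and 3/. could look like, assuming an Apache server with mod_alias (the directory names are the placeholders from the question):

# Option 2/.: pattern-based 301s that preserve the path under the directory.
RedirectMatch 301 ^/incorrect-directory/(.*)$ /correct-directory/$1

# Option 3/.: collapse everything under the bad directory to one URL instead.
# RedirectMatch 301 ^/incorrect-directory/ /correct-directory/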
-
301 Redirect All URLs - WWW -> HTTP
Hi guys, This is part 2 of a question I asked before which got partially answered; I clicked "question answered" before I realized it only fixed part of the problem, so I think I have to post a new question now. I believe I have an Apache server on HostGator. What I want to do is redirect every URL to its corresponding alternative (www redirects to http). So, for example, if someone typed in www.mysite.com/page1 it would take them to http://mysite.com/page1. Here is the code that has made all of my site's links go from WWW to HTTP, which is great, but the problem is that if you try to access the WWW version by typing it, it still works, and I need it to redirect. It's important because Google has been indexing SOME of the URLs as http and some as WWW, and my site was just HTTP for a long time until I made the mistake of switching it, so now I'm having a problem with duplicate content and such. I've updated it in Webmaster Tools, but I need to do this regardless for the other SEs. Thanks a ton!
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www.yourdomain.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
-
Culling 99% of a website's pages. Will this cause irreparable damage?
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with dupe content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that. The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1,000 travel guides in total.
My first question is, would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly?
And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages; would you consider redirecting all of these to the generic hotels page of the site?
Thanks for your time, I know this is quite a long one, Nick
-
Managing 404 errors
What is the best way to manage 404 errors for pages that are no longer on the server? For example, a client deletes their old site from the server and replaces it with a new site, and Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page - http://www.tvsphoto.com/missingurlexample. Is there anything else I can do?
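For reference, a minimal .htaccess sketch of the setup described above, assuming an Apache server; the custom 404 path is the one mentioned in the question, while the redirect line uses hypothetical example pages:

# Serve the custom 404 page for anything that no longer exists.
ErrorDocument 404 /missingurlexample
# Optionally 301 old URLs that have a close equivalent on the new site
# (/old-gallery.html and /new-gallery are hypothetical examples).
Redirect 301 /old-gallery.html http://www.tvsphoto.com/new-gallery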