Find Pages with 0 traffic
-
Hi,
We are trying to consolidate the number of landing pages on our site. Is there any way to find landing pages containing a particular URL substring that have had 0 traffic?
The minimum that appears in Google Analytics is 1 visit.
-
This is a really nice solution! Thanks for sharing. It's super quick as well, so a GA export and a few VLOOKUPs/pivots later and you're sorted - nice one!
-
No problem, my friend :-))
-
My bad. I misunderstood and misread. Thanks for the update.
-
He is trying to find the landing pages that have no traffic at all. Screaming Frog SEO Spider can crawl the entire website (restricted to URLs containing the substring), and then he can subtract the URLs that have driven at least 1 visit. He is not trying to recover historic or old analytics data. The question is pretty straightforward, unless I missed something.
-
Yes, but how does that help him get the traffic data he needs? Crawlers can't know your traffic unless you install the tracking code they give you or verify your site some other way, so I find it unlikely that a crawler alone solves the problem, unless I misunderstood the question. I sure hope they have a Linux host (most are) and can just check the Apache logs while Google Analytics takes a few days to update.
-
What web host are you using? Most keep analytics software enabled by default, or at least let you turn it on (while you wait for Google). Analytics are a key part of SEO, so I use AWStats (free) and Webalizer. With most hosts, if it's not enabled, it's as easy as clicking a button.
Depending on your host, you might be able to get the raw log files, but many hosts don't offer this unless you pay for a plan with root shell access; it differs from host to host.
Google Analytics will only show 1 visit if you are the only visitor, even if you refresh the page or come back; it most likely identifies returning visitors by IP address and browser/hardware profile. Make sure you set the Google Analytics date range to go back as far as possible.
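If you can get the raw access logs mentioned above, counting hits per URL takes only a few lines. Here is a minimal sketch assuming Apache common/combined log format; the sample substring is purely illustrative:

```python
import re
from collections import Counter

# Count requests per path in an Apache common/combined log.
# The regex grabs the path out of the request line, e.g. 'GET /some/path HTTP/1.1'.
REQUEST_RE = re.compile(r'"[A-Z]+ (\S+) HTTP/[\d.]+"')

def hits_per_path(log_lines, substring=""):
    """Return a Counter of request paths containing `substring`."""
    counts = Counter()
    for line in log_lines:
        match = REQUEST_RE.search(line)
        if match and substring in match.group(1):
            counts[match.group(1)] += 1
    return counts
```

Any crawled URL whose path never shows up in the resulting counter has had no logged requests at all, which is exactly the "0 traffic" set GA can't display.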
-
Hi, you can use a crawler like Screaming Frog SEO Spider to get the total list of pages containing the unique string in their URLs, then subtract the URLs that have traffic; the rest are the ones with no traffic.
You will need the paid version of Screaming Frog SEO Spider to crawl more than 500 pages. Here is the section of the user guide that explains how to restrict a crawl with a regex:
http://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/#9
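The subtraction step above (crawl list minus GA landing-page list) can be sketched in a few lines. This is a minimal example assuming you have already exported both reports to plain lists of URL strings; the function name and substring are hypothetical:

```python
# Sketch: zero-traffic pages = crawled URLs (matching a substring)
# minus URLs that appear in the GA landing-page export.
def zero_traffic_urls(crawled_urls, ga_landing_urls, substring=""):
    """Return crawled URLs containing `substring` that GA never recorded."""
    visited = set(ga_landing_urls)
    return sorted(
        url for url in crawled_urls
        if substring in url and url not in visited
    )
```

In practice you would normalize trailing slashes, letter case, and query strings before comparing, since GA typically reports paths while a crawler reports full URLs.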
Best regards,
Devanur Rafi