Filtering Views in GA
-
Hi there,
Does anyone here have any experience filtering views in Google Analytics by TLD? I thought the hostname filter type would do what I was looking for, but it hasn't, and I can only find information online about doing it for subdomains rather than top-level domains.
Many thanks in advance.
-
To get what's shown in the screenshot you gave me, I would imagine you'd need a separate Google Analytics property for each TLD. If that's not possible, you could also check out this article I found (it's not by me; it's just what came up when I searched your query).
Aside from that, I'm not sure what else would work for seeing how many views each TLD gets. If you want a full dashboard for your TLD, the method in that article might work, although it is dated.
-
Yes I do
-
When you say TLD, you mean top-level domain, correct?
Like .com, .net, .ca, etc.?
-
Thanks for your reply; however, I'm not sure I explained my issue properly.
I meant a view as in the top level, as in the attached screenshot. Once you're in Admin you can filter by view, but I can't work out how to do it by TLD.
-
I would go to Behaviour > Site Content, then enter my TLD in the search box, like ".whatever".
That should give you all of the views for content.whatever.
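If the search-box approach is too manual, the same slice can be pulled programmatically by filtering the hostname dimension with a regex anchored on the TLD. Here is a minimal sketch against the Universal Analytics Reporting API v4; the key file name, view ID, and the \.ca$ pattern are all placeholder assumptions rather than details from this thread.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"  # placeholder: your service-account key
VIEW_ID = "123456789"              # placeholder: your GA view ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"]
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

# Pageviews broken down by hostname, restricted to hostnames ending in .ca
response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": VIEW_ID,
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:pageviews"}],
        "dimensions": [{"name": "ga:hostname"}],
        "dimensionFilterClauses": [{
            "filters": [{
                "dimensionName": "ga:hostname",
                "operator": "REGEXP",
                "expressions": [r"\.ca$"],  # swap in the TLD you care about
            }]
        }],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```

The same regex also works in a view-level Include filter on the Hostname field under Admin, if you'd rather bake the TLD restriction into the view itself.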
Related Questions
-
How do Quick View windows affect SEO?
I have the following website that I am building (http://www.jetabout.ca/cruises/). All the items listed link to quick view pop-ups. I was wondering how this affects SEO and whether Google will be able to pick up on this?
Intermediate & Advanced SEO | | cholidays0 -
Best-practice URL structures with multiple filter combinations
Hello, We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters: topics and object types. The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean:
www.domain.com/project?topic=firstTopic
or
www.domain.com/project?object=typeOne
The problems arise when people select multiple topics, potentially across two different filter types:
www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo
I've raised concerns around the structure in general, but it seems to be too late at this point, so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns:
- a ton of near-duplicate content, and hundreds of URLs being created and indexed with various filter combinations added
- over-reacting to the first point above and over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole
Would the best approach be to index each single topic filter individually, and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with), so any advice is greatly appreciated. Thanks!
Intermediate & Advanced SEO | | digitalcrc0 -
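The approach floated at the end of that question (index single-filter pages; canonicalize any combination to the 'view all' page) is simple to express in code. A rough sketch, using the domain, parameter names, and hyphen delimiter from the examples above; none of this is from a real implementation:

```python
from urllib.parse import urlencode

VIEW_ALL = "https://www.domain.com/project"  # the 'view all' page

def canonical_url(topic: str = "", obj: str = "") -> str:
    """Return the canonical target for a filtered URL: exactly one
    active filter value canonicalizes to itself; zero or several
    values (or both filter types at once) point at 'view all'."""
    topics = [t for t in topic.split("-") if t]
    objects = [o for o in obj.split("-") if o]
    if len(topics) + len(objects) != 1:
        return VIEW_ALL
    if topics:
        return f"{VIEW_ALL}?{urlencode({'topic': topics[0]})}"
    return f"{VIEW_ALL}?{urlencode({'object': objects[0]})}"

# canonical_url(topic="firstTopic")                -> self-canonical
# canonical_url(topic="firstTopic-secondTopic")    -> view all
# canonical_url(topic="firstTopic", obj="typeOne") -> view all
```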
What exactly is an impression in Google Webmaster Tools search queries with the image filter turned on?
Is it counted when someone does an image search? Or does it also count a regular search that has images in it? On an image search, does the picture actually have to be viewed on the screen, or can it be further down in the infinite scroll?
Intermediate & Advanced SEO | | EcommerceSite0 -
Best way to handle page filters and sorts
Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets. I have a "root" page about widgets, and then filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns 3 red widgets on the top and 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each of these as a separate URL. Right now we don't do anything special for Google, but I have noticed in the SERPs that if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages sometimes rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about Widgets. Ideally, though, I'd just want to rank for my "Widgets" root page. What is the best way to structure this setup for Googlebot? I think it's maybe one or many of the following, but I'd love any advice:
- put a rel canonical tag on all of the pages with parameters and point it to the "root"
- use the Google parameter tool and have it not crawl any URLs with my parameters
- put a meta robots noindex on the parameter pages
Thanks!
Intermediate & Advanced SEO | | jcgoodrich0 -
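Of the three options in the question above, the rel=canonical route is the most self-contained to sketch. A hypothetical server-side helper (parameter names and domain invented for illustration) that points any filtered or sorted variant back at the clean root URL:

```python
from urllib.parse import urlencode

# Assumed names for parameters that merely filter/sort the same widget set
FILTER_OR_SORT_PARAMS = {"color", "size", "sort"}

def canonical_tag(path: str, params: dict) -> str:
    """Build a canonical tag that drops filter/sort parameters, so
    /widgets?color=red&sort=size canonicalizes to /widgets."""
    kept = {k: v for k, v in params.items() if k not in FILTER_OR_SORT_PARAMS}
    url = "https://www.example.com" + path
    if kept:
        url += "?" + urlencode(kept)
    return f'<link rel="canonical" href="{url}">'

# canonical_tag("/widgets", {"color": "red"})
# -> '<link rel="canonical" href="https://www.example.com/widgets">'
```

Canonical tags are hints rather than directives, so pairing this with consistent internal linking to the root page makes the signal stronger.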
E-commerce category page optimization - filters vs. categories
Hi, We currently have a site where there are several subcategories for every main category. This means that visitors have to click through 3-4 subcategories before reaching products they could have easily found if the site were using filters on category pages. My question is: if a subcategory page with 4 products is currently a category page (optimized heading, description) and I'd want this category to be available through filters, how do I keep it optimized for search engines? Under the category "Cleaners", we have all cleaning products. There are 8 "Cable cleaners" under this category. This is currently a subcategory, but I'd just solve it with a filter on the "Cleaners" screen. I'm not sure what's right from an SEO standpoint here.
Intermediate & Advanced SEO | | JaanMSonberg0 -
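One common pattern for the situation above is to keep a dedicated, indexable landing URL for any filter state that used to be a subcategory, and let every other combination live on a parameterized URL that canonicalizes to the category. A hypothetical sketch; the category and filter names come from the question, but the URL scheme is invented:

```python
# Filter states that keep their own optimized landing page
# (unique heading, description, self-referencing canonical)
LANDING_PAGES = {
    ("cleaners", "cable-cleaners"): "/cleaners/cable-cleaners/",
}

def url_for_filter(category: str, filter_value: str) -> str:
    """Route former subcategories to dedicated landing URLs; all other
    filter states get a parameterized URL canonicalized to the category."""
    landing = LANDING_PAGES.get((category, filter_value))
    return landing if landing else f"/{category}/?filter={filter_value}"

# url_for_filter("cleaners", "cable-cleaners") -> "/cleaners/cable-cleaners/"
# url_for_filter("cleaners", "sprays")         -> "/cleaners/?filter=sprays"
```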
Canonical tags and GA tracking on premium sub-domain?
Hello! I'm launching a premium service on my site that will deliver two fairly distinct user experiences, but with nearly identical page content across the two. I'm thinking of placing the "upgraded" version on a subdomain, e.g. www.mysite.com, premium.mysite.com. Simple enough. I've run into two obstacles, however:
-I don't want the premium site crawled separately, so I'd like to use canonical tags to pull all premium.* back to their www.* parents.
--How different can page content be before canonical tags backfire?
--Is there any other danger in using canonicals across subdomains like this?
-Less importantly: with Google Analytics, if I track against the subdomain my visits will split naturally, and it should generate a second cookie for a new registrant who crosses subdomains. I could also use a visitor-level custom var. Good idea? Bad idea? Thanks! -m
Intermediate & Advanced SEO | | grumbles0 -
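Cross-subdomain canonicals like the ones described above are legitimate; rel=canonical is not limited to a single hostname. A minimal, hypothetical helper for the premium.* to www.* mapping, assuming both subdomains share the same URL structure:

```python
def canonical_for(host: str, path: str) -> str:
    """Point every premium.mysite.com page at its www.mysite.com twin."""
    if host == "premium.mysite.com":
        host = "www.mysite.com"
    return f'<link rel="canonical" href="https://{host}{path}">'

# canonical_for("premium.mysite.com", "/dashboard")
# -> '<link rel="canonical" href="https://www.mysite.com/dashboard">'
```

The caveat on content drift: the more the premium pages diverge from their www twins, the more likely Google is to ignore the hint, since canonicals are advisory rather than binding.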
Rel="prev" and view all question
Okay, I've read the posts by Google about the new prev/next tags and the suggestion to use a view-all option. I've also read the posts here on SEOMoz on the topic, but none of them quite address what we have. First, some of our main categories are very large (over 6,000 pieces of jewelry), so a view-all option would take forever to load and be completely useless to a visitor. Second, our category home pages provide (here's an example):
- a description of the category with links to important sections and articles
- a row of new items
- a dozen of the popular items from the category
- links to related articles if applicable
So we have a real category home page with content, instead of just categories that start immediately with pages of product. Should we set the canonical URL for all of the browse pages to the main category page, create a view-all page, or just use the next and previous rel tags with the category home pages as the first in the series?
Intermediate & Advanced SEO | | IanTheScot0 -
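For the rel=prev/next option, treating the content-rich category home page as page 1 of the series looks roughly like the sketch below (URL scheme invented; the tags belong in each page's head):

```python
def pagination_links(category_url: str, page: int, total_pages: int) -> str:
    """Emit rel=prev/next tags for a paginated category, with the
    category home page itself serving as page 1 of the series."""
    tags = []
    if page > 1:
        prev = category_url if page == 2 else f"{category_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{category_url}?page={page + 1}">')
    return "\n".join(tags)

# Page 2 links back to the category home page as its rel=prev target:
# pagination_links("/jewelry/", 2, 120)
# -> '<link rel="prev" href="/jewelry/">\n<link rel="next" href="/jewelry/?page=3">'
```

With this setup there is no need to canonicalize browse pages to the category home page; each paginated page stays self-canonical and the prev/next chain ties the series together.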
How to prevent Google from crawling our product filter?
Hi All, We have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter the available products. These filters are passed as HTTP GET arguments, and the number of possible filter URLs is virtually limitless. In order to prevent duplicate content, or an insane number of pages in the search indices, our software automatically adds noindex, nofollow and noarchive directives to these filter result pages. However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it, but it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway. What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin
Intermediate & Advanced SEO | | footsteps0
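A meta noindex only takes effect after a page has been crawled, so it never reduces crawling on its own. The usual fix is to block the filter parameters in robots.txt, where Googlebot supports wildcards. A sketch, assuming hypothetical parameter names like color and size (the real names would come from the site's filter form):

```
User-agent: *
# Block any URL whose query string carries a filter parameter (names assumed)
Disallow: /*?*color=
Disallow: /*?*size=
```

One caveat: URLs blocked this way can still be indexed from external links (URL only, no content), and Googlebot will no longer see the noindex on them, so it's best to pick one mechanism per URL rather than stacking robots.txt on top of meta robots.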