Filtering Views in GA
-
Hi there,
Does anyone here have any experience with filtering views in Google Analytics by TLD? I thought the hostname filter type would do what I was looking for, but it hasn't, and I can only find information online about doing it for subdomains rather than top-level domains.
Many thanks in advance.
-
If you want to see what's in the screenshot you gave me, I would imagine you would need a separate Google Analytics property for each TLD. If that's not possible, you could also check out this article I found (it's not mine, but it's what came up when I searched your query).
Aside from that, I'm not sure what else would work if you want to see how many views each TLD gets. If you want a full dashboard for your TLD, the method above might work, although it is dated.
-
Yes, I do.
-
When you say TLD, you mean top-level domain, correct?
Like .com, .net, .ca, etc.?
-
Thanks for your reply; however, I'm not sure I explained my issue properly.
I meant a view as in the top level, as in the attached screenshot. Once you are in Admin you can filter by view, but I can't work out how to do it by TLD.
-
I would go to Behaviour > Site Content, then enter my TLD in the search box, like ".whatever".
That should give you all of the pageviews for content.whatever.
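If the Pages report only shows paths without the hostname, another option is a regex on the Hostname dimension, either as an advanced search in the report or as a view-level custom filter on the Hostname field. A minimal sketch of the kind of pattern involved, in Python (the hostnames here are made up):

```python
import re

# Hypothetical hostnames, as GA's Hostname dimension would report them.
hostnames = [
    "content.example.com",
    "www.example.co.uk",
    "shop.example.ca",
    "www.example.com",
]

# The same unanchored pattern a GA hostname filter could use: keep only .com.
tld_pattern = re.compile(r"\.com$")

for host in hostnames:
    if tld_pattern.search(host):
        print(host)  # content.example.com, www.example.com
```

If I remember the admin screens correctly, that same `\.com$` pattern can go in the Filter Pattern box of a custom Include filter on the Hostname field, which would give you a view that only ever shows traffic for that TLD.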
Related Questions
-
How to canonicalise all filter pages (URL parameters) to the main category
Hi guys, I am working on an e-commerce site that runs on Shopify. I noticed that the filter pages do not have canonical tags pointing to their respective main categories. I doubt the right approach is to canonicalise each filter page to its main category by hand, as it would take time (there are a lot of filter URLs involved). Do you know of any technical way in Shopify to canonicalise all filter pages to their main category? Keen to hear from you. Cheers
Intermediate & Advanced SEO | | brandonegroup0 -
Can someone please tell me if my H1 tag is in the right place when I view source? It looks like it's in the body and not in the head
Hi Mozzers, I've been looking at the view source on my landing pages, and it looks to me like my H1 tag etc. is not in the head but in the body. My developer says it's in the correct place, but can someone please confirm, as it looks wrong to me. Short URL link - http://goo.gl/vfXeut Many thanks Pete
Intermediate & Advanced SEO | | PeteC120 -
When mobile and desktop sites have the same page URLs, how should I handle the 'View Desktop Site' link on a mobile site to ensure a smooth crawl?
We're about to roll out a mobile site. The mobile and desktop URLs are the same; the User-Agent header determines whether you see the desktop or mobile version of the site. At the bottom of the page is a 'View Desktop Site' link that will present the desktop version of the site to mobile user agents when clicked. I'm concerned that when the mobile crawler crawls our site it will crawl our entire mobile site, then click 'View Desktop Site' and crawl our entire desktop site as well. Since mobile and desktop URLs are the same, the mobile crawler will end up crawling both mobile and desktop versions of each URL. Any tips on what we can do to make sure the mobile crawler either doesn't access the desktop site, or that we can let it know which is the mobile version of the page? We could simply not show the 'View Desktop Site' link to the mobile crawler, but I'm interested to hear if others have encountered this issue and have any other recommended ways of handling it. Thanks!
Intermediate & Advanced SEO | | merch_zzounds0 -
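The setup described (same URLs, user-agent detection) is what Google's documentation calls dynamic serving, and the usual advice is to send a `Vary: User-Agent` header so crawlers and caches know the response differs by device. A minimal sketch of that plus the "don't show the link to the crawler" idea floated in the question, in Flask; the `?view=desktop` switch and the user-agent token lists are made up for illustration:

```python
from flask import Flask, request, make_response

app = Flask(__name__)

MOBILE_TOKENS = ("iphone", "android", "mobile")
CRAWLER_TOKENS = ("googlebot", "bingbot")


def render_desktop():
    return "<html><body>Desktop version</body></html>"


def render_mobile(show_desktop_link):
    # The query-parameter switch is hypothetical; a cookie would work too.
    link = '<a href="?view=desktop">View Desktop Site</a>' if show_desktop_link else ""
    return f"<html><body>Mobile version {link}</body></html>"


@app.route("/<path:page>")
def serve(page):
    ua = request.headers.get("User-Agent", "").lower()
    is_crawler = any(t in ua for t in CRAWLER_TOKENS)
    is_mobile = any(t in ua for t in MOBILE_TOKENS)
    wants_desktop = request.args.get("view") == "desktop"

    if is_mobile and not wants_desktop:
        # Suppress the "View Desktop Site" link for crawlers so the mobile
        # crawler never follows it into a second full crawl of the site.
        html = render_mobile(show_desktop_link=not is_crawler)
    else:
        html = render_desktop()

    resp = make_response(html)
    # Signal that the response differs by user agent (dynamic serving).
    resp.headers["Vary"] = "User-Agent"
    return resp
```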
Problems with e-commerce filters causing duplicate content
We have an e-commerce website with 700 pages. Due to the implementation of filters, we are seeing up to 11,000 pages being indexed where the filter tag is appended to the URL. This is causing duplicate content issues across the site. We tried adding "nofollow" to all the filters, and we have also tried adding canonical tags, which it seems are being ignored. So how can we fix this? We are now toying with two other ideas to fix this issue: adding "noindex" to all filtered pages, or making the filters uncrawlable using JavaScript. Has anyone else encountered this issue? If so, what did you do to combat it, and was it successful?
Intermediate & Advanced SEO | | Silkstream0 -
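When canonicals appear to be ignored, it's worth double-checking what the filtered URLs actually serve before layering noindex on top. A small audit sketch (it assumes the third-party requests and beautifulsoup4 packages are installed; the URLs are placeholders):

```python
import requests
from bs4 import BeautifulSoup

# Placeholder filtered URLs; swap in real ones from a crawl export.
urls = [
    "https://www.example.com/dresses?filter=red",
    "https://www.example.com/dresses?filter=blue",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    print(url)
    print("  canonical:", canonical.get("href") if canonical else "missing")
    print("  robots:   ", robots.get("content") if robots else "missing")
```

If the canonical tag is present and correct in the served HTML but Google still indexes the filtered pages, that points to Google treating the pages as sufficiently different, which is when noindex becomes the more reliable option.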
HEADS UP - Did Google grant WMT and GA admin access to your past employees or contractors?
Check your users and permissions in WMT and GA. I noticed that two Gmail accounts from a while back had been given admin access to our accounts! That means someone who used to work for you could go in and remove your site from Google's index. Check your accounts, folks - just a heads up 😉 Here is an article talking about this potentially dangerous issue: http://thenextweb.com/google/2012/11/28/serious-google-security-glitch-gives-webmaster-tools-possibly-analytics-access-to-revoked-accounts
Intermediate & Advanced SEO | | irvingw1 -
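For anyone who would rather audit GA user access programmatically than click through the admin screens, the GA Management API exposes user links per account. A rough sketch (it assumes the google-api-python-client and google-auth packages; the key file and account ID are placeholders, and the service account would itself need to be added as a user on the account first):

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.manage.users.readonly"]

# Placeholder key file for a service account with read access to user links.
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES)
service = build("analytics", "v3", credentials=creds)

response = service.management().accountUserLinks().list(
    accountId="12345678").execute()  # placeholder GA account ID

# Print every user with access to the account and their permissions,
# so stale employee or contractor accounts stand out.
for link in response.get("items", []):
    print(link["userRef"]["email"],
          link.get("permissions", {}).get("effective", []))
```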
URL structure for multiple search filters applied to products
We have a product catalog with several hundred similar products. Our product list allows you to apply filters to hone your search, so that in fact there are over 150,000 different individual searches you could come up with on this page. Some of these searches are relevant to our SEO strategy, but most are not. Right now (for the most part) we save the state of each search in the fragment of the URL, or in other words in a way that isn't indexed by the search engines. The URL (without hashes) ranks very well in Google for our one main keyword. At the moment, Google doesn't recognize the variety of content possible on this page. An example is: http://www.example.com/main-keyword.html#style=vintage&color=blue&season=spring We're moving towards a more indexable URL structure, one that could potentially save the state of all 150,000 searches in a way that Google could read. An example would be: http://www.example.com/main-keyword/vintage/blue/spring/ I worry, though, that giving so many options in our URLs will confuse Google and create a lot of duplicate content. After all, we only have a few hundred products, and inevitably many of the searches will look pretty similar. Also, I worry about losing ground on the main http://www.example.com/main-keyword.html page when it's ranking so well at the moment. So I guess the questions are: Is there such a thing as having URLs be too specific? Should we noindex or set rel=canonical on the pages whose keywords are nested too deep? Will our main keyword's page suffer when it has to share all the inbound links with these other, more specific searches?
Intermediate & Advanced SEO | | boxcarpress0 -
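One way to reason about the duplicate-content risk in a setup like this is to make the filter URLs deterministic (one canonical ordering of facets, so the same search can never produce two paths) and to cap how many facets a page can have before it is canonicalised or noindexed. A sketch of that idea; the facet names, ordering, and threshold are invented for illustration:

```python
# Hypothetical facet whitelist, in the one order facets may appear in paths.
FACET_ORDER = ["style", "color", "season"]
INDEXABLE_MAX_FACETS = 1  # e.g. only single-facet pages stay indexable


def canonical_path(main_keyword, facets):
    """Build one canonical path regardless of the order filters were applied."""
    parts = [facets[f] for f in FACET_ORDER if f in facets]
    return "/" + "/".join([main_keyword] + parts) + "/"


def should_index(facets):
    """Deeply nested combinations get rel=canonical/noindex instead."""
    return len(facets) <= INDEXABLE_MAX_FACETS


facets = {"color": "blue", "style": "vintage", "season": "spring"}
print(canonical_path("main-keyword", facets))  # /main-keyword/vintage/blue/spring/
print(should_index(facets))                    # False
```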
E-commerce Site - Filter Pages
Hi, We have a client who has a fairly large e-commerce site that went live quite recently. The site is near enough fully indexed by Google, but one thing I've noticed is that filtered search results pages are being indexed, all with duplicate page titles. Obviously this is an issue that needs to be looked at ASAP. My question is this: would we be better off tweaking site settings so that page titles are constructed from the filters (brand/price/size) and are therefore unique (and useful for searchers who are after a specific brand or size of a given item)? Or should we rel=canonical the filtered pages so that they are eventually dropped from the index (the safer of the two options)? Thanks in advance for your help!
Intermediate & Advanced SEO | | jasarrow0 -
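On the first option, constructing titles from the active filters is straightforward to prototype. A sketch (the naming scheme and store name are made up):

```python
def filter_page_title(category, brand=None, size=None, price=None):
    """Build a unique, descriptive title from the active filters."""
    parts = [p for p in (brand, size, price) if p]
    if parts:
        return f"{category} - {' '.join(parts)} | Example Store"
    return f"{category} | Example Store"


print(filter_page_title("Running Shoes", brand="Acme", size="UK 9"))
# Running Shoes - Acme UK 9 | Example Store
```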
Recommendation to fix Google backlink anchor text over-optimisation filter penalty (auto)
Hi guys, Some of you may have seen a previous question I posted regarding a new client I started working with. Essentially the client's website steadily lost all non-domain-name keyword rankings over a period of 4-12 weeks, despite content changes and various other improvements. See the following: http://www.seomoz.org/q/shouldn-t-google-always-rank-a-website-for-its-own-unique-exact-10-word-content-such-as-a-whole-sentence After further hair pulling and digging around, I realised that the backlink anchor text distribution was unnatural for the homepage/root. From OSE, only about 55 of the 700 links' anchor texts contain the client's domain or company name - 8%! The distribution of the non-domain keywords isn't too bad (the most repeated keyword has 142 links out of the 700). This is a result of the client submitting to directories over the last 3 years and just throwing in targeted keywords. Is my assumption that it is this penalty/filter correct? If it is, I guess the lesson is that domain name anchor texts should make up more of your links? MY QUESTION: What are some effective ways I can potentially remove this filter and get the client ranking on its homepage again? Ensure all new links contain the company name? Google said there was no manual penalty, so I'm not sure if there's any point submitting another reconsideration request. Any advice or experiences where a fix has worked would be greatly appreciated! Also, if we assume the company is "www.Bluewidget.com", what would be the best way to link most naturally:
Bluewidget
Blue widget
Blue widget .com
www.bluewidget.com
http://www.bluewidget.com ...etc
I'm guessing a mix of the above, but if anyone could suggest a hierarchy that would be great.
Intermediate & Advanced SEO | | Qasim_IMG0
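For anyone wanting to reproduce that 8% figure (55 of 700) on their own link profile, a backlink export (e.g. from OSE) can be tallied in a few lines. A sketch; the CSV filename, the column name, and the brand tokens are assumptions about the export format:

```python
import csv
from collections import Counter

BRAND_TOKENS = ("bluewidget", "blue widget")  # hypothetical brand variants

anchors = Counter()
with open("ose_links_export.csv", newline="") as f:  # hypothetical export file
    for row in csv.DictReader(f):
        anchors[row["Anchor Text"].strip().lower()] += 1  # column name assumed

total = sum(anchors.values())
branded = sum(n for text, n in anchors.items()
              if any(tok in text for tok in BRAND_TOKENS))

# Share of links whose anchor text contains the brand, plus the top anchors,
# which makes over-represented money keywords easy to spot.
print(f"branded anchors: {branded}/{total} ({branded / total:.0%})")
for text, n in anchors.most_common(5):
    print(f"{n:5d}  {text}")
```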