Duplicate page titles in SEOMoz
-
My on-page reports are showing a good number of duplicate title tags, but they are all caused by a URL tracking parameter that tells us which link the visitor clicked on. For example, http://www.example.com/example-product.htm?ref=navside and http://www.example.com/example-product.htm are the same page, but they are treated as two different URLs in SEOMoz. This inflates my reports with "fake" duplicate page titles.
This has not been a problem with Google, but SEOMoz flags it and it's muddying my data. Is there a way to specify this as a URL parameter in the Moz software?
Or does anybody have another suggestion? Should I specify this in GWT and BWT?
-
The best way to handle this, for all crawlers including Google, Yahoo and Moz, is to make sure you have proper canonical tags on those URLs that point to the non-parameterized URL.
So http://www.example.com/example-product.htm?ref=navside will have a canonical that points to http://www.example.com/example-product.htm
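In HTML terms, that's a single link element in the <head> of the parameterized page; a minimal sketch using the example URLs above:
<link rel="canonical" href="http://www.example.com/example-product.htm" />
Any crawler that honors rel=canonical should then consolidate the ?ref= variants onto the clean URL.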
-
My understanding is that the Moz crawler should be checking the canonical, in which case it will not flag duplicate content and title tag issues for those URLs. If you find this is not the case with your crawl, please let our help team know at help@moz.com
-
No, Moz's tool won't check the canonical, so it won't ignore these duplicates on that basis.
-
Do you think that setting a canonical url tag might help fix this?
-
Hi Robert,
Yes, I'm sorry, but you're overlooking something. Telling crawlers to ignore parameters is something you do for SEO purposes; it won't stop Google Analytics from tracking those parameters.
-
I'm not sure why I'd want to ask Google to ignore those parameters... we're explicitly adding the ones that they suggested we use from here:
The issue that I'm having is that Moz Analytics is showing duplicate pages as an issue to resolve when the only difference is that these params exist.
Am I overlooking something here?
-
Hi Robert,
I wouldn't handle this via robots.txt unless you only want to do it for Rogerbot. The best way to tell Google to ignore your UTM parameters is via Google Webmaster Tools: under Crawl > URL Parameters you have the option to add parameters that don't change any of the content and are used solely for tracking purposes.
-
I'm having a similar issue. Is there an example of how to add this to the robots.txt file to ignore the utm stuff for RogerBot?
Our scenario is that we send out PDFs with links to pages on our site, and those links have UTM parameters included... so those pages are showing up as duplicate content.
Thanks in advance,
Robby
-
I think this is the answer I was looking for... Yeah, GWT already has a bunch of our parameters added, and hasn't had a problem with this one. It's not showing these pages as duplicate like SEOMoz does.
Thanks guys!
-
Hi,
What you might want to do to get rid of these issues within SEOMoz is add the parameters to your robots.txt file and specifically target SEOMoz's user agent, Rogerbot, as in the sketch below. That way SEOMoz won't crawl the links with this parameter and therefore also won't warn you about these duplicate titles.
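A minimal sketch of what that robots.txt block could look like, assuming Rogerbot honors * wildcards and that ?ref= and the utm_ parameters are the only tracking parameters in play:
# Applies only to Moz's crawler; Google and other bots keep crawling these URLs
User-agent: rogerbot
# Skip URLs that only differ by tracking parameters
Disallow: /*?ref=
Disallow: /*utm_
Keep in mind that blocked URLs won't be crawled by Rogerbot at all, so any other issues on those pages won't show up in your Moz reports either.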
Hope this helps!
By the way, as James already mentioned, I would also recommend configuring these parameters within Google Webmaster Tools.
-
You could set it up in GWT, but it sounds like you are using UTM tags on internal links so you can see which physical links on a page are driving clicks. If that's the case, a cleaner solution is to upgrade your Google Analytics code for Enhanced Link Attribution. I'm assuming you are using GA; if so, this will let you see which links are driving which clicks without creating tons of duplicate page titles in SEOmoz.
See link: http://support.google.com/analytics/bin/answer.py?hl=en&answer=2558867
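For reference, a minimal sketch of what the upgrade looks like, assuming the Universal Analytics (analytics.js) snippet; 'UA-XXXXX-Y' is a placeholder property ID:
<script>
  // Inside your existing analytics.js snippet:
  ga('create', 'UA-XXXXX-Y', 'auto');      // placeholder property ID
  ga('require', 'linkid', 'linkid.js');    // load the Enhanced Link Attribution plugin
  ga('send', 'pageview');
</script>
You'll typically also need to switch Enhanced Link Attribution on in the GA property settings before the in-page reports start distinguishing multiple links that point to the same URL.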
Let me know if you have questions,
JS
-
Related Questions
-
Why is my page rank disappointing?
Hi, fairly new here and just getting used to everything, so one question please. I just ran a crawl test of the website and this page http://www.livingphilosophy.org.uk/teaching-philosophy/index.htm came back with a page authority of 1. Other pages have a rank of 18 through 26. I scratched my head for a few hours and came up with no ideas. Thanks, Andy
Moz Pro | livingphilosophy
-
Does the Crawl Diagnostics Duplicate Page Content report account for canonical meta tags?
I see the same page listed 3 times (with different query params), but on each I have a meta tag pointing to the correct canonical URL. Since all three are still listed, does that mean there is an error with my meta tag?
Moz Pro | Simantel
-
SEOMOZ Crawling Our Site
Hi there, we get a report from SEOMoz every week which shows our performance within search. I noticed for our website www.unifor.com.au that it looks through over 10,000 pages; however, our website sells fewer than 500 products, so I'm not sure why or how so many pages are crawled. If someone could let me know that would be great. It uses up a lot of bandwidth doing each of these crawls, so if the number of pages being crawled were reduced it would definitely assist. Thanks, Geoff
Moz Pro | BeerCartel75
-
Moz tool and on-page ranking matching
How does the Moz tool compare and filter the search phrases you enter in your campaign? Or more precisely, will it filter out stop words, or is it an exact match? For example, I enter a phrase to track, say "book ski trip austria". Looking in Google, I see that most users search for exactly that: "book ski trip austria". But in content I can't write that, as it is incorrect English, and I want to maybe write something like "When you book a ski trip to Austria you get...". How will this affect my on-page SEO report: will it still match and mark a "V" in Done, or show an error? Even more interesting, what happens if you put the phrase in a different order, like "An Austrian ski trip will make you feel..."?
Moz Pro | Macaper
-
SEOmoz tools: Keyword Difficulty + On-page Analysis
To analyse the current keyword difficulty situation, I would want to see the top 10 results' link metrics (as in Keyword Difficulty & SERP Analysis) and each page's on-page score (as in On-page Analysis) for the keyword. Those two figures would give me a pretty good picture of the current situation. Kind regards,
Moz Pro | OscarSE
-
What is the difference between the Rank Tracker and the On-page Optimization page?
Both of them track keywords. In the Rank Tracker, you add each keyword manually and associate it with a URL. For the On-page Optimization page, are the URLs generated automatically based on searches and traffic?
Moz Pro | ehabd
-
Need to find all pages that link to a list of pages/PDFs
I know I can do this in OSE page by page, but is there a way to do it in a large batch? There are 200+ PDFs for which I need to figure out what pages (if any) link to each PDF. I'd rather not do this page by page, but rather copy and paste the entire list of pages I'm looking for. Do you know of any tools that can do this?
Moz Pro | ryanwats
-
What is the best method to solve duplicate page content?
The issue I am having is that an overwhelmingly large number of pages on cafecartel.com show duplicate page content. But when I check the errors in SEOmoz, it shows that the duplicate content is from www.cafecartel.com, not cafecartel.com. So first of all, does this mean there are two sites? And is this a problem I can fix easily (i.e. redirecting the URL and deleting the extra pages)? Is this going to make all other SEO useless, given that it shows nearly every page as having duplicate page content? Or am I just reading the data completely wrong?
Moz Pro | MarkP_