Tool for scanning the content of the canonical tag
-
Hey All,
Quick question for you: what is your favorite tool/method for scanning a website for specific tags? Specifically (as my situation dictates right now), for canonical tags?
I am looking for a tool that is flexible, hopefully free, and highly customizable (for instance, one where you can specify the tag to look for). I like the concept of using Google Docs with the ImportXML feature, but as you can only use 50 of those commands at a time, it is very limiting (http://www.distilled.co.uk/blog/seo/how-to-build-agile-seo-tools-using-google-docs/).
I do have a campaign set up using the tools, which is great, but I need something that returns a response faster and can pull data from more than 10,000 links. Our CMS unfortunately outputs some odd canonical tags depending on how a page is rendered, and I am trying to catch them quickly before they get indexed and cause problems. Eventually I would also like to be able to scan for other specific tags, hence the customizability concern. If we have to write a VBScript to get the data into Excel, I suppose we can do that.
Cheers,
Josh
-
No idea on that one - it's still pretty new. The developers actually chimed in on the post, so you could ask them in the comments.
-
Thanks Dr. Pete and Marcus.
I just finished reading the post. I have looked at Screaming Frog before but was hoping to find a way to do it myself. I just didn't want to plop money down on something that seemed like it should be doable with tools I already had. But the software does look good. Any thoughts on whether they will come out with a one-time purchase instead of a yearly subscription?
Cheers!
Josh
-
Hey Dr. Pete, Joshua
I was just coming here to say that I had read the Dr. Pete post and this may do the job. It's a paid bit of software, but I will be picking it up later. I have my guys knocking up a canonical checker that will be free for all, but that may take a day or so to get right.
Let me know if you have a play with Screaming Frog!
Marcus
-
I'm pretty sure that Screaming Frog SEO Spider will do it, but you need the paid version to custom-filter on the canonical tag. I've got a post going up about it tomorrow.
-
Great, really appreciate it! Many thumbs up
-
Hey Josh,
Right, cool. I have a few jobs to sort out, but I am going to have a bash at knocking this up this afternoon. Should be easy enough (he said, damning himself to hours of problems).
Leave it with me for 24 hours.
Marcus
-
Hey Marcus,
Thanks for the quick response. That is exactly what I would be looking for. I do have a list of URLs, and that is also simple enough to get from something like Xenu. Would love to work with you on this.
Thanks.
Josh
-
Hey, I am not aware of any such tool, but it should not be too hard to put one together; it could make a useful little tool as well.
If you have all of your pages in a spreadsheet or database, it should be easy enough to write a little script that cycles through them:
Start loop:
- request the page
- parse the code to get the canonical URL
- compare the page URL to the canonical URL
- output problem URLs
End loop
Slightly oversimplified, and it requires a list of all your URLs, but I would be willing to help put something like this together. It could be useful for all of us, especially those (like me) who work with a lot of CMS sites.
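For what it's worth, here is a rough Python sketch of that loop, assuming you already have your URL list (from Xenu, the CMS, or a spreadsheet export). The file name `urls.txt` and the helper names are just placeholders, and a production checker would want rate limiting and smarter URL normalization than the trailing-slash trim shown here:

```python
import urllib.request
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grabs the href of the first <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            d = dict(attrs)
            if (d.get("rel") or "").lower() == "canonical":
                self.canonical = d.get("href")

def extract_canonical(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

def check_urls(urls, fetch=lambda u: urllib.request.urlopen(u, timeout=10)
               .read().decode("utf-8", "replace")):
    """The loop from the post: request page, parse out the canonical,
    compare it to the page URL, and collect problem URLs."""
    problems = []
    for url in urls:
        try:
            html = fetch(url)
        except OSError as e:
            problems.append((url, f"fetch error: {e}"))
            continue
        canonical = extract_canonical(html)
        if canonical is None:
            problems.append((url, "no canonical tag"))
        elif canonical.rstrip("/") != url.rstrip("/"):
            problems.append((url, f"canonical points elsewhere: {canonical}"))
    return problems

if __name__ == "__main__":
    # One URL per line; "urls.txt" is a placeholder for your exported list.
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    for url, issue in check_urls(urls):
        print(f"{url}\t{issue}")
```

The tab-separated output pastes straight into Excel, which should cover the VBScript route Josh mentioned without writing any VBScript.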
Cheers
Marcus
-