Automatically check if a URL has been optimised?
-
Hi guys,
I have a massive list of URLs and want to check whether each URL has been optimised for its primary keyword.
I'm looking for something similar to the Moz on-page grader, which scores a URL against its primary keyword with a single metric, e.g. grade A, B, C.
However, Moz doesn't offer an API to pull this score automatically.
Does anyone know of any tools with an API you can access to do something like this?
Cheers.
-
You could try http://mysiteauditor.com/
I've used them quite a bit. Also maybe try SEMrush.
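If none of those fit, you could also script a rough check yourself: fetch each URL and see whether the primary keyword appears in the title, H1, meta description and the URL slug. A minimal sketch in Python (the grade thresholds are arbitrary placeholders, not how Moz or any other tool actually scores a page):

```python
# Rough DIY sketch: fetch a URL and check whether its primary keyword
# appears in the main on-page elements. The A-D thresholds are arbitrary
# placeholders, not any tool's real scoring model.
import requests
from bs4 import BeautifulSoup

def on_page_grade(url, keyword):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    kw = keyword.lower()

    title = soup.title.get_text() if soup.title else ""
    h1 = soup.h1.get_text() if soup.h1 else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "") if meta else ""

    checks = [
        kw in title.lower(),                  # keyword in <title>
        kw in h1.lower(),                     # keyword in first <h1>
        kw in description.lower(),            # keyword in meta description
        kw.replace(" ", "-") in url.lower(),  # keyword in the URL slug
    ]
    return {4: "A", 3: "B", 2: "C"}.get(sum(checks), "D")

# Example: grade one URL/keyword pair from your list
print(on_page_grade("https://www.example.com/blue-widgets", "blue widgets"))
```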
-
Related Questions
-
URL structure for an international website with subdirectories
Hello, the company I am working for is launching a new ecommerce website (just a handful of products). In the first phase, the website will be English only, but it will be possible to order internationally (20 countries). In a second phase, new languages and countries will be added. I am wondering what the best URL structure is for launch: 1) start with a structure similar to website.com/language/content (later on we will add other languages than English); 2) start with a structure similar to website.com/country/content; 3) start with a structure similar to website.com/country-language/content (at the beginning it would all be website.com/country-en/content). What do you think? Cheers, Luca
Intermediate & Advanced SEO | Lvet
-
URL structure - which one is better?
We are creating a new website and got stuck while deciding the URL structure. Our concern is which URL is better in terms of SEO, i.e. pune.fabogo.com/spa or fabogo.com/pune/spa, and why. Also, which one would rank faster for a search like "spas in Pune" if both pages are the same?
Intermediate & Advanced SEO | fabogo_marketing
-
How to change URL structure in Google Webmaster Tools
Is there any way to ask Google to index the website with the following URL structure: abc.com/category/postname (I have this structure on my website)? Currently Google has indexed my posts as abc.com/postname/category. How can I tell Google to follow the right structure?
Intermediate & Advanced SEO | Michael.Leonard
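A common way to handle a change like this is to 301-redirect the old abc.com/postname/category URLs to abc.com/category/postname and resubmit the sitemap, so Google re-crawls and re-indexes the preferred form. A minimal, hypothetical sketch (Flask is just an example framework, and the category slugs are placeholders used to tell the two URL patterns apart):

```python
# Hypothetical sketch: 301-redirect old /postname/category URLs to the
# preferred /category/postname structure. Category slugs are placeholders.
from flask import Flask, redirect, abort

app = Flask(__name__)

KNOWN_CATEGORIES = {"news", "guides", "reviews"}  # placeholder category slugs

@app.route("/<first>/<second>")
def post_page(first, second):
    # Old structure (/postname/category): permanently redirect to the new form.
    if second in KNOWN_CATEGORIES and first not in KNOWN_CATEGORIES:
        return redirect(f"/{second}/{first}", code=301)
    # New structure (/category/postname): serve the post as normal.
    if first in KNOWN_CATEGORIES:
        return f"Post '{second}' in category '{first}'"
    abort(404)
```
-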
Is there a problem with putting encoding into the subdomain of a URL?
We are looking at changing our URL structure for tracking various affiliates from https://sub.domain.com/quote/?affiliate_id=xxx to https://aff_xxx_affname.domain.com/quote/. Both would allow us to track affiliates, but the second would also allow us to use cookies for tracking. Does anyone know if this could cause SEO concerns? Also, for the site we want to rank, we will use a reverse proxy to change the URL from https://aff_xxx.maindomain.com/quote/ to https://www.maindomain.com/quote/. Would that cause any SEO issues? Thank you.
Intermediate & Advanced SEO | RoxBrock
-
How do you check the Google cache for hashbang pages?
So we use http://webcache.googleusercontent.com/search?q=cache:x.com/#!/hashbangpage to check what Googlebot has cached, but when we try this method for hashbang pages we get x.com's cache, not x.com/#!/hashbangpage. That actually makes sense, because the hashbang fragment is treated as part of the homepage URL in that case, so I get why the cache returns the homepage. My question is: how can you actually look up the cache for a hashbang page?
Intermediate & Advanced SEO | navidash
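If the site follows Google's old AJAX crawling scheme, the crawlable form of x.com/#!/hashbangpage is x.com/?_escaped_fragment_=/hashbangpage, so that is the version to plug into the cache viewer. A quick sketch of building that lookup URL (the domain and path below are placeholders):

```python
# Quick sketch: build the cache-viewer URL for a hashbang page using the
# _escaped_fragment_ form from Google's old AJAX crawling scheme.
from urllib.parse import quote

def hashbang_cache_url(domain, hashbang_path):
    # x.com/#!/page is crawled as x.com/?_escaped_fragment_=/page
    escaped = f"{domain}/?_escaped_fragment_={quote(hashbang_path, safe='/')}"
    return f"http://webcache.googleusercontent.com/search?q=cache:{escaped}"

print(hashbang_cache_url("x.com", "/hashbangpage"))
```
-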
Can I, in Google's good graces, check for Googlebot to turn on/off tracking parameters in URLs?
Basically, we use a number of parameters in our URLs for event tracking, and Google could be crawling an infinite number of these URLs. I'm already using the canonical tag to point at the non-tracking versions of those URLs, but that doesn't stop the crawling. I want to know if I can do conditional 301s, or just detect the user agent, as a way to know when NOT to append those parameters. I'm just trying to follow their guidelines about allowing bots to crawl without things like session IDs, but they don't tell you HOW to do this. Thanks!
Intermediate & Advanced SEO | KenShafer
-
International URL Puzzle
Hello, I have 4 different URLs going to 4 different countries that all contain the same content, and Google is seeing them as duplicate pages. For ecommerce reasons I have to keep these 4 pages separated. Here is an example of the pages so you can see the URL structure: www.example.com/canada, www.example.com/australia, www.example.com/usa, www.example.com/UK. How do I fix this duplicate content problem? Thanks!
Intermediate & Advanced SEO | digitalops
-
URL Error or Penguin Penalty?
I am currently having a major panic as our website www.uksoccershop.com has largely been dropped from Google. We have not made any changes recently and I am not sure why this is happening, but having heard all sorts of horror stories about the Penguin update, I am fearing the worst. If you google "uksoccershop" you will see that the homepage does not rank. We previously ranked in the top 3 for "football shirts" but now we don't, although on pages 2, 3 and 4 you will see one of our category pages ranking (this didn't used to happen). Some rankings are intact, but many have disappeared completely and in some cases been replaced by other pages on our site. I should point out that our existing rankings had been consistently there for 5-6 years until today. I logged into Webmaster Tools and thankfully there is no warning message from Google about spam, etc., but what we do have is 35,000 URL errors for pages which are accessible. An example: URL http://www.uksoccershop.com/categories/5_295_327.html (in sitemaps, last crawled 6/20/12, first detected 6/15/12) with the error "Googlebot couldn't access the contents of this URL because the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request." Is it possible this is the cause of the issue (we are not currently sure why the URLs are being blocked) and, if so, how severe is it and how recoverable? If that is unlikely to be the cause, what would you recommend our next move is? All help is REALLY REALLY appreciated 🙂
Intermediate & Advanced SEO | ukss1984