On-Page Reports showing old URLs
-
While taking a look at our site's on-page reports, I noticed some of our keywords are tied to very old URLs that haven't existed for close to a year. How do I make sure Moz's keyword ranking is finding the correct page, so I'm not being graded on keywords/URLs that no longer exist or have been 301'd to new URLs? Is there a way to clean these out? My on-page reports say I have 62 reports for a total of only 34 keywords in rankings.
As you can see from the image, most of the URLs for "tax folder" have now been 301'd so they no longer include /product or /category, but Moz is still showing them with the old URL structure. By the way, our site is minespress.com.
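In the meantime, a quick way to confirm the 301s themselves are in place is a minimal sketch like the one below, assuming Python with the requests library; the old-style paths are hypothetical examples, not the actual URLs from the report:

```python
# Hedged sketch: verify that old /product//category-style URLs really return 301s
# pointing at the new structure, so redirects can be ruled out as the problem.
# The example paths below are hypothetical -- substitute the URLs from the report.
import requests

OLD_URLS = [
    "https://www.minespress.com/product/tax-folders",   # hypothetical old-style path
    "https://www.minespress.com/category/tax-folders",  # hypothetical old-style path
]

for url in OLD_URLS:
    # Don't follow redirects, so the raw status code and Location header are visible.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location", "(no redirect)"))
```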
-
Glad you're good to go, Steven!
If anything else comes up, definitely let us know at help@moz.com if you wish to keep your info private.
Cheers!
-
Never mind, solved it myself.
-
Related Questions
-
Sitemaps and Indexed Pages
Hi guys, I created an XML sitemap and submitted it for my client last month. The developer of the site has also been messing around with a few things since then. I've noticed on my Moz site crawl that indexed pages have dropped significantly. Before I put my foot in it, I need to figure out whether submitting the sitemap caused this: can a sitemap reduce the number of pages indexed? Thanks, David.
API | | Slumberjac0 -
Crawler unable to access pages
Hi, the crawler is unable to access the site and crawl it properly. It mainly affects the backlink checker, which is producing no results. There is nothing in the robots.txt file blocking crawler access. Any help is much appreciated, as it's driving me crazy!
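As a quick sanity check, here is a minimal sketch using Python's standard-library robotparser to confirm that robots.txt really does allow a given crawler; "rogerbot"/"dotbot" as Moz crawler user agents and the example domain are assumptions to adjust:

```python
# Minimal sketch: confirm robots.txt is not the blocker for a given crawler.
# "rogerbot"/"dotbot" (assumed Moz user agents) and the domain are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for agent in ("rogerbot", "dotbot", "*"):
    print(agent, "allowed:", rp.can_fetch(agent, "https://www.example.com/"))
```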
API | | 2Cubedie0 -
How do batched URL metrics work in terms of rows and rate limit?
I am using the free API plan to get URL metrics and batching my calls like this: https://github.com/seomoz/SEOmozAPISamples/blob/master/php/batching_urls_sample.php How does this work in terms of rows and limits? If I do a batch of 10 URLs, does it count as 1 row or 10? Do I have to wait 10 seconds before calling the next batch?
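For reference alongside the PHP sample, here is a minimal sketch of the same batching pattern, assuming Python with requests; the endpoint and signed-auth scheme mirror the linked sample, while the credentials, column bitmask, and the 10-second pause are placeholders/assumptions rather than confirmed limits:

```python
# Hedged sketch of batched URL-metrics calls with a pause between batches.
# ACCESS_ID/SECRET_KEY, COLS, and the 10 s wait are assumptions, not confirmed values.
import base64, hashlib, hmac, json, time
import requests

ACCESS_ID = "member-xxxxxxxxx"   # placeholder credential
SECRET_KEY = "your-secret-key"   # placeholder credential
ENDPOINT = "http://lsapi.seomoz.com/linkscape/url-metrics/"
COLS = 103079215108              # example column bitmask; pick the flags you need

def signed_params(expires_in=300):
    """Build AccessID/Expires/Signature query parameters (HMAC-SHA1, base64)."""
    expires = int(time.time()) + expires_in
    to_sign = "%s\n%d" % (ACCESS_ID, expires)
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), to_sign.encode(), hashlib.sha1).digest()
    ).decode()
    return {"AccessID": ACCESS_ID, "Expires": expires,
            "Signature": signature, "Cols": COLS}

def chunked(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

urls = ["www.moz.com", "www.minespress.com"]  # whatever needs metrics

for batch in chunked(urls, 10):
    # One POST per batch; the request body is a JSON array of URLs.
    resp = requests.post(ENDPOINT, params=signed_params(), data=json.dumps(batch))
    print(resp.status_code, resp.json() if resp.ok else resp.text)
    time.sleep(10)  # assumed free-tier pacing, as asked about in the question
```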
API | | MWS20 -
Moz Crawl: Can't check page optimization error https
Help needed: when I try to do a page optimization check I get the following error: "The URL you entered does not appear to be returning a page successfully. Please make sure that you've entered the URL of a valid, working page." But I can do a site crawl, so what could the problem be? I checked with Screaming Frog SEO Spider and had no problem, and robots.txt is also clean. Does anyone know what could be wrong? Thanks
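One quick way to see what the checker might be hitting is a minimal sketch like the one below, assuming Python with requests; the URL and User-Agent are placeholders. It requests the page directly and prints the status code and any redirect chain:

```python
# Minimal sketch: show what status code and redirect chain a URL actually returns,
# since "does not appear to be returning a page successfully" usually points at a
# non-200 response. The URL and User-Agent below are placeholders.
import requests

url = "https://www.example.com/some-page"
resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"},
                    allow_redirects=True, timeout=15)

for hop in resp.history:
    print("redirect:", hop.status_code, hop.url)
print("final:", resp.status_code, resp.url)
```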
API | | Luis-Pereira0 -
3 result limit to Top Pages API call
I am using the Moz API to make calls for the top pages for a particular URL. However, when I pass in any limit value greater than 3, the API only returns 3 results. I have even tried URLs like 'www.moz.com' and still get only 3 results. Sample call to the API below: http://lsapi.seomoz.com/linkscape/top-pages/www.moz.com?AccessID=member-xxxxxxxxx&Expires=1419020831&Signature=xxxxxxxxx&Cols=2052&Offset=0&Limit=50
API | | solodev0 -
API - Internal Links to page and related metrics
Hi dear Moz team! I'm currently building a Java application that accesses your API, but there are some metrics I urgently need which I can't get out of the API so far: (1) the total number of internal links to a page, (2) the total number of internal links to a page with partial anchor text match, and (3) the MozRank passed by all internal links with partially matching anchor text (would be nice). For example, my idea was to try this via your links endpoint: http://lsapi.seomoz.com/linkscape/links/http%3A%2F%2Fwww.jetztspielen.de%2F?AccessID=..
&Expires=..
&Signature=..
&Scope=domain_to_page
&Filter=internal
&Sort=domain_authority
&SourceCols=4 (or any other value)
&SourceDomain=www.jetztspielen.de
&Offset=0
&Limit=50
If I try this, the API says: {"status": "400", "error_message": "Cannot set a source domain when filtering for internal links."} Is there any way to get the data I need from your API endpoints? I'm currently writing my master's thesis and it is very important to me to solve this somehow. Thank you very much in advance! Best, Andreas Pollierer
API | | pollierer1 -
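Regarding the 400 error in the question above: the message itself says a source domain cannot be combined with Filter=internal, so a natural thing to try is the same call with SourceDomain dropped. A minimal sketch follows, assuming Python with requests; the credentials are placeholders, and whether this returns the internal-link counts the poster needs is untested:

```python
# Hedged sketch: the same links-endpoint call as in the question, but without
# SourceDomain, since the 400 error says it cannot be combined with Filter=internal.
# AccessID/Expires/Signature must be generated as in the batching sketch further up;
# the values below are placeholders, and the returned columns are untested assumptions.
from urllib.parse import quote
import requests

target = quote("http://www.jetztspielen.de/", safe="")
endpoint = "http://lsapi.seomoz.com/linkscape/links/" + target

params = {
    "AccessID": "member-xxxxxxxxx",  # placeholder
    "Expires": 0,                    # placeholder; use a real future timestamp
    "Signature": "placeholder",      # placeholder; sign AccessID + "\n" + Expires
    "Scope": "domain_to_page",
    "Filter": "internal",
    "Sort": "domain_authority",
    "SourceCols": 4,
    "Offset": 0,
    "Limit": 50,
    # SourceDomain intentionally omitted, per the error message.
}

resp = requests.get(endpoint, params=params)
print(resp.status_code, resp.text)
```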
On page reports
Hi everyone, I have just been going through the on-page reports and I see that I have quite a few keywords with an F grade. I was wondering: if all the keywords had an A grade, would that improve our Moz rating? Also, of the elements below, can anyone tell me which, if any, are more important than the others? Title, URL, Meta Desc, H1, H2-4, Body, B / Strong, IMG ALT
API | | Hardley1110 -
Batch URL Error 413
Hello, I am using the free Mozscape API to get the domain and page authority of URLs. I am batching the URLs as shown in the sample code; however, I have over 500 URLs to check, and when running the request I get the following: stdClass Object ( [status] => 413 [error_message] => Too many urls in batch. Batches must be less than or equal to 200 urls. ) When I change it to 200 URLs the request works fine. Is there a way for me to batch all 500 URLs at once? I did read that the beta API is capable of batching more URLs in one request: http://moz.com/blog/400-higher-throughput-mozscape-api-now-in-beta-and-seeking-testers Has this been implemented in the current API yet? Thanks
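Until larger batches are available, the usual workaround is to split the list client-side into chunks of at most 200 and merge the responses. A minimal sketch follows, in Python rather than the PHP sample as an assumption; send_batch() is a stand-in for the signed POST shown in the earlier batching sketch:

```python
# Minimal sketch: split a large URL list into batches of <= 200 (the limit reported
# by the 413 error) and collect the results from one request per batch.
# send_batch() is a placeholder for the signed url-metrics POST shown earlier.
def chunked(items, size=200):
    """Yield consecutive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def send_batch(batch):
    # Placeholder: perform the signed url-metrics POST for this batch and
    # return the parsed JSON list (one entry per URL).
    raise NotImplementedError("wire this up to the Mozscape call")

def metrics_for(urls):
    results = []
    for batch in chunked(urls, 200):
        results.extend(send_batch(batch))
    return results

# Example: 500 placeholder URLs -> batches of 200, 200 and 100.
urls = ["www.example.com/page-%d" % i for i in range(500)]
print(sum(1 for _ in chunked(urls, 200)), "batches")
```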
API | | pauledwards0