Pagination/Crawl Errors
-
Hi,
I've only just joined SEOmoz, and after it crawled my site it came up with 3,600 crawl errors, mostly duplicate content and duplicate URLs. After researching this it soon became clear it was due to on-page pagination, and after speaking with Abe from SEOmoz he advised me to take action by getting our developers to implement rel="next" & rel="prev". Soon after our developers implemented this code (I have no understanding of this whatsoever), 90% of the keywords I had been ranking for in the top 10 dropped out of the top 50! Can anyone explain this or help me with it?
Thanks
Andy
-
Hi,
Correlation doesn't always equal causation; what you reference is unlikely to be the cause of a ranking decline unless something was done incorrectly, e.g. have they rel="prev"'d your category page by mistake? (Even then I'm not sure that would cause an issue.)
If you want to paste your site and an example URL in here, I'll gladly take a quick look when I have a minute.
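For reference, a correct implementation on, say, page 2 of a paginated series puts something like the following in the <head> - the URLs here are placeholders, not your actual category structure:
<!-- Page 2 points back to page 1 and forward to page 3 -->
<link rel="prev" href="http://www.example.com/category?page=1">
<link rel="next" href="http://www.example.com/category?page=3">
<!-- The first page in the series carries only rel="next"; the last page carries only rel="prev" -->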
Andy
-
Related Questions
-
Crawl Issues / Partial Fetch Via Google
We recently launched a new site that doesn't have any ads, but in Webmaster Tools under "Fetch as Google", under the rendering of the page I see: "Googlebot couldn't get all resources for this page." Here's the list:
URL: https://static.doubleclick.net/instream/ad_status.js | Type: Script | Reason: Blocked by robots.txt | Severity: Low
URL: https://googleads.g.doubleclick.net/pagead/id | Type: AJAX | Reason: Blocked by robots.txt | Severity: Low
Not sure where that would be coming from, as we don't have any ads running on our site. Also, it's stating that the fetch is a "partial" fetch. Any insight is appreciated.
Technical SEO | | vikasnwu0 -
403 Forbidden Errors - How to Solve Them
Hi, I have been using a great tool today called Screaming Frog, which was shown to me by Thomas Zickell. When I used the tool I found some worrying things for my site www.in2town.co.uk. What I have found is that I have a large number of 403 Forbidden statuses on my home page, and I do not know why. Here is an example: http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom - it loads fine in the browser, but the tool shows it as an error and as having no meta tags, even though there are meta tags in there. Can anyone please let me know how to solve this and why it has happened? Many thanks
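(A quick way to check whether the server treats a crawler's user-agent differently from a browser is to compare responses from the command line - a rough sketch, and the Screaming Frog user-agent string below is only an approximation:)
# Headers-only request with a browser-like user-agent, then with a crawler-like one:
curl -I -A "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0" http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom
curl -I -A "Screaming Frog SEO Spider" http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom
# A 200 for the first but a 403 for the second points to user-agent filtering on the server or firewall.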
Technical SEO | | ClaireH-1848860 -
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use, or if for some reason one of them is still indexed in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
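(For reference, the robots.txt route would mean a block along these lines - the paths are hypothetical placeholders, not the shop's real retired category URLs:)
# Hypothetical retired category paths - substitute the actual old URLs
User-agent: *
Disallow: /old-category-one/
Disallow: /old-category-two/
Note that a robots.txt disallow stops crawling but does not by itself remove a URL that is already indexed.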
Technical SEO | | Prime850 -
Rel Canonical errors after SEOmoz crawling
Hi to all, I cannot find what the errors are in my web pages with the rel canonical tag. I have too many errors - over 500 after SEOmoz crawled my domain - and I don't know how to fix them. I share my URL for the root page: http://www.vour.gr My rel canonical tag for this page is: <link rel="canonical" href="http://www.vour.gr"/> Can anyone help me understand why I get an error for this page? Many thanks.
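(For comparison, a self-referencing canonical on an inner page normally points at that page's own URL rather than the root - the path below is only an illustration, not a real page on the site:)
<link rel="canonical" href="http://www.vour.gr/example-page/"/>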
Technical SEO | | edreamis0 -
Site Crawl
I was wondering if there was a way to use SEOmoz's tool to quickly and easily find all the URLs on your site, not just the ones with errors. The site that I am working on does not have a sitemap. What I am trying to do is find all the URLs along with their titles and description tags. Thank you very much for your help.
Technical SEO | | pakevin0 -
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster Tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
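(For reference, search engines can discover and periodically re-fetch a self-hosted sitemap without manual re-submission if it is referenced from robots.txt - a sketch with placeholder URLs, not your real paths:)
# Line added to robots.txt at the site root:
Sitemap: http://www.example.com/sitemap.xml
And a minimal, continually regenerated sitemap.xml would look roughly like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/newly-added-page/</loc>
    <lastmod>2012-03-01</lastmod>
  </url>
</urlset>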
Technical SEO | | askotzko0 -
Most Common Errors & Warnings
Hello there, I would like to ask for some basic tips regarding the common errors & warnings that were found. The list:
Title Element Too Long
Duplicate Page Content
and Duplicate Page Title
How can I fix these? Any help would be greatly appreciated. Regards,
Technical SEO | | Bretly0 -
Google causing Magento Errors
I have an online shop run using Magento. I have recently upgraded to version 1.4, and I installed an extension called Lightspeed, a caching module which makes tremendous improvements to Magento's performance. Unfortunately, a configuration problem meant that I had to disable the module, because it was generating errors relating to the session if you entered the site from any page other than the home page. The site is now working as expected.
I have Magento's error notification set to email - I've not received emails for errors generated by visitors. However, over a 72-hour period I received a deluge of error emails which were being caused by Googlebot. It was generating an error in a file called lightspeed.php. Here is an example:
URL: http://www.jacksgardenstore.com/tahiti-vulcano-hammock
IP Address: 66.249.66.186
Time: 2011-06-11 17:02:26 GMT
Error: Cannot send headers; headers already sent in /home/jack/jacksgardenstore.com/user/jack_1.4/htdocs/lightspeed.php, line 444
So several things of note:
I deleted lightspeed.php from the server before any of these error messages began to arrive.
lightspeed.php was never exposed in the URL at any time. It was referred to in a mod_rewrite rule in .htaccess, which I also commented out.
If you clicked on the URL in the error message, it loaded in the browser as expected, with no error messages.
It appears that Google has cached a version of the page which briefly existed whilst Lightspeed was enabled. But I thought that Google cached generated HTML - since when does it cache a server-side PHP file?
I've just used the Fetch as Googlebot facility in Webmaster Tools for the URL in the above error message, and it returns the page as expected. No errors. I've had no errors at all in the last 48 hours, so I'm hoping it has just sorted itself out. However, I'm concerned about any Google-related implications. Any insights would be greatly appreciated. Thanks Ben
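(For context on the error text itself - this is not a claim about what Lightspeed was doing internally, just an illustration of the error class - PHP reports "headers already sent" whenever something tries to set response headers after output has already been written. A minimal, hypothetical sketch:)
<?php
// headers_demo.php - hypothetical file, not part of Magento or Lightspeed.
// Assuming output buffering is off, this echo commits the HTTP response headers.
echo "some page output";

// Setting a header afterwards now triggers PHP's "headers already sent" warning;
// Magento/Zend surfaces the same underlying condition as the
// "Cannot send headers; headers already sent" message seen above.
header('Content-Type: text/html; charset=UTF-8');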
Technical SEO | | atticus70