One of my campaigns has a problem!
-
Last week my campaign "De Prevención" crawled many pages, but suddenly it only crawled one. Can you help me?
-
We did have an issue where some campaigns were only getting one page crawled. Alan gives some good advice here. I'd also email help@seomoz.org and ask them to take a look.
-
I'm not sure exactly why, but I was able to crawl your site. Here are a few things that could be affecting it, and that will be affecting your rankings.
I'm not sure why you would have your robots.txt in the /new directory (http://www.deprevencion.com/new/robots.txt), or why you would want to block access to all the folders you have blocked. Unless there is a really good reason to do so, you should let everything be crawled.
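For reference, a minimal robots.txt that lets everything be crawled looks like this (a sketch; keep any Disallow rules you genuinely need):

```text
User-agent: *
Disallow:
```

An empty Disallow value blocks nothing, and the file needs to sit at the root of the host (http://www.deprevencion.com/robots.txt), not under /new/.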
Your domain http://www.deprevencion.com 301's to http://www.deprevencion.com/new, and then to http://www.deprevencion.com/new/. You should not have chained 301's; each 301 leaks some link juice, and Bing will only pass link juice through one.

If I type in a rubbish address like http://www.deprevencion.com/new/fasfasdfafsd, I get a 200 (found) response, but there is no such page; it should return a 404. A missing page that returns a 200 is what's called a soft 404, and you may see these errors in GWMT. The page I am sent to has a missing image, and that image does return a 404. Note that even if the error page did return a 404, you would still have a loop: the missing image on the 404 page would cause another 404 to the 404 page, with the same missing image, again and again.

There are 1,348 redirects within your site. You should have no redirects on internal pages at all, since you have the power to point links at their final destination. All redirects leak link juice, so why have them when they are unnecessary?
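The usual Apache fix for the soft 404 is to let unknown URLs fall through to a real error response and point ErrorDocument at a local page. A minimal sketch, assuming an Apache server and a hypothetical /404.html:

```apache
# A local path keeps the real 404 status code.
# (A full http://... URL here would instead send a redirect,
# turning the error back into a soft 404.)
ErrorDocument 404 /404.html
```

Also make sure every image referenced on the error page actually exists, so the 404 page does not itself generate more 404s.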
You have many canonical problems. One in particular: your internal links point to http://www.deprevencion.com/new/index.php, which means your link juice is never returned to your true home page.
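One common way to consolidate this (a sketch, not the only possible fix) is a single 301 from the index.php form to the directory URL; the THE_REQUEST condition stops it looping when Apache serves index.php internally:

```apache
RewriteEngine On
# Only redirect when the visitor explicitly asked for index.php
RewriteCond %{THE_REQUEST} \s/new/index\.php[?\s]
RewriteRule ^new/index\.php$ /new/ [R=301,L]
```

Better still, change the internal links themselves to point at http://www.deprevencion.com/new/ so no redirect is needed at all.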
The Bing API gives a loop error: "The URL 'deprevencion.com/new/index.php' has been crawled more than 500 times. This usually indicates an infinite loop in redirection logic." This would be because your home page has broken image links that get sent to your 404 page, which also has a missing image that 404s.
Altogether, the redirects are a mess, I am afraid to say. You don't need any redirects at all, so the question is: why have them?
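Where a redirect is genuinely unavoidable, such as sending the bare domain to the real home page, it should be a single hop straight to the final destination. A sketch, assuming Apache and /new/ as the intended home page:

```apache
RewriteEngine On
# One hop: / goes straight to /new/
# (not / -> /new -> /new/, which chains two 301s)
RewriteRule ^$ /new/ [R=301,L]
```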
OK, after all that, I would say that you have a loop that RogerBot (SEOmoz's crawler) won't crawl.
-
URL: www.deprevencion.com
Thanks
-
If you give us a URL, I probably can.
Related Questions
-
Ranking drop from 6 to 23 in one day - freaking out
Dear Moz, we have been hard at work doing some off-site and on-site SEO. However, yesterday we got around 1,600 404 errors from Google, and our ranking dropped from 7 on the front page to 25. What we did: I found an error in .htaccess, where my partner had a RewriteBase with a double // and a rule with // - I guess this started creating URLs for Google like www.website.com//category-category-category. OK, but Google says 404s will not affect your rankings? The second thing I found was that some URLs had a canonical tag pointing to a page called "search". That search page (a duplicate of the homepage) we 301 to our main homepage. Can that affect ranking? So we have 404s that have a canonical to a page that itself redirects (301) to the homepage. We also removed the / splash. Nothing more. Below is the .htaccess that had the double // error. Please comment.
On-Page Optimization | | advertisingcloud
Options +FollowSymLinks
RewriteEngine On
RewriteBase //
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [R=301,L]
RewriteCond %{HTTP_HOST} ^www.website.com [NC]
RewriteRule ^(.*)$ http://website.com/$1 [L,R=301]
RewriteCond $1 ^(index.php)?$ [OR]
RewriteCond $1 .(gif|jpg|css|js|png|ico)$ [NC,OR]
RewriteCond %{REQUEST_FILENAME} -f [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.*)$ - [S=1]
RewriteRule .* /index.php [L]
DirectoryIndex index.php
-
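Regarding the .htaccess in the question above: a corrected sketch of the problem lines, assuming the site is served from the web root. The double slashes came from the RewriteBase, which should be a single slash, and the host-name dots are safer escaped:

```apache
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.website\.com [NC]
RewriteRule ^(.*)$ http://website.com/$1 [L,R=301]
```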
Deeplinking problem
Hello, I have a problem with the deep linking between my website (http://www.nobelcom.com) and my app (nobelapp). When I search for the app on Google from a mobile, it sends me to the Play Store, even if the app is installed. Can someone explain, in an easier way than the Google documentation, what I need to do on the website and in the mobile app so the deep linking will work? I have this rel="alternate" on the homepage, but it still doesn't work: <link rel="alternate" href="android-app://com.nobelglobe.nobelapp/http/www.nobelcom.com" /> Thanks, Florin
On-Page Optimization | | Silviu
-
Google Places Problem
This may have been answered before, but I have two questions. When I placed a business in Google Places, the "generic" ranking fell off the map. I now just have the one-line Google Places reference, and that is all I can find. How can I get around that and get my four-line description to show again? Do I have to delete my Places account? Before the Google Places account was built, the company was moving up the SERP ranks; now he is on page 1 for Places, but the other SERP positions have disappeared. This is true for all the keywords we are targeting. If there is no Places reference, he shows on pages 3-5 (given the website is 4 weeks old, I think this is not bad). The same client services many of the surrounding communities. How do I get Google to recognize the various towns he services during a search? He places well for his "home" town but not at all for the other towns. If it helps any, the website is www.myairstat.com. Thanks for the help. Scott
On-Page Optimization | | scott518
-
Crawl Diagnostics - Duplicates and canonical problem
The SEOmoz crawl diagnostic reports duplicate (title, content) issues on this address: http://www.meblobranie.pl/biurowe/fotele-biurowe/promocje. The page already has a canonical tag - is this a bug in the crawler, or is something wrong on the page?
On-Page Optimization | | SITS
-
Moving top-ranking page URLs off my home page and nesting them on one page? Good idea?
I am basically trying to cut down the number of links on my home page to make it less eye-boggling and move things around. I have URLs on my home page that lead to pages that rank very well within Google. My question is: can I move those URLs to a separate page to group them together, and then showcase that one link to that page on my home page? Is that a good idea, or am I going to lose my link juice and position in search? The physical URLs on those pages won't change at all.
On-Page Optimization | | Dante13
-
Is it better to have a blogroll-style homepage on your website, or one main post?
Does a site rank better with a blogroll-style homepage linking to all of the other posts on the website? Or is it better to have one homepage post for the main keyword theme of the website, with the other post titles in a side column?
On-Page Optimization | | nyphenom
-
100 links on one page
We're recommended to have 100 links or fewer on one page. Does the 100-link count include header and footer links?
On-Page Optimization | | jallenyang
-
Problem with fresh content on homepage
On my site, my homepage acts as sort of a landing page geared towards getting the customer to sign up (almost like a PPC landing page, aside from a few navigation options: about, blog, contact, and the legal docs in the footer). My blog is geared towards other businesses in the industry and like-minded tech people. My problem: from a user perspective, I don't feel that blog snippets would add anything useful to the homepage. However, I feel like fresh content would help my SEO endeavors. Suggestions? Note: it should be mentioned that all my social stuff is deeply integrated into my /blog, so importing tweets, for example, is out of the question.
On-Page Optimization | | JasonJackson