Unable to submit sitemap in Google Webmaster Tools - error
-
I recently registered a new EMD (exact-match domain) and installed WordPress on it. I then installed the Yoast SEO plugin and an XML Sitemap plugin. Whenever I try to submit the sitemap, Google Webmaster Tools shows "URL restricted by robots.txt", but my robots.txt contains only the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/

It is still showing the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help.
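Nothing in that robots.txt blocks a sitemap, so the "URL restricted by robots.txt" message is often just Google working from a cached copy of robots.txt. While waiting for a re-crawl, it can also help to declare the sitemap explicitly in robots.txt; a minimal sketch, assuming the sitemap ends up at /sitemap_index.xml (adjust the URL to whatever your sitemap plugin actually generates):

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/

# Assumed sitemap location - replace with the URL your sitemap plugin reports
Sitemap: http://www.freepapertextures.com/sitemap_index.xml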
-
Now Google has accepted the sitemaps for things like tags and images, but it is showing another error for the post_sitemap.xml file. Check this screenshot - http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed, though (according to my spider). Try resubmitting.
-
Here is the .htaccess file
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch ".(css|js|htc|css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch ".(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml|html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch ".(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip|asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(./)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule . - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(/wp-admin/|/xmlrpc.php|/wp-(app|cron|login|register|mail).php|/feed/|wp-.*.php|index.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup.php|wp-links-opml.php|wp-locations.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
-
Hmm, maybe you are blocking robots through .htaccess?
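For reference, a rule that actually blocked crawlers at the .htaccess level would look something like the purely hypothetical sketch below; nothing of the sort appears in the file posted above, so a blocking rule in .htaccess seems unlikely.

# Hypothetical example only - this is the kind of rule that would block crawlers;
# the .htaccess posted in this thread contains nothing like it.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot) [NC]
RewriteRule .* - [F,L]
</IfModule>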
-
Still the same problem. I cleared all caches through W3 Total Cache, but Google is still not detecting the sitemap. Now Google has also indexed pages that clearly show the robots.txt restriction - see site:http://www.freepapertextures.com
Please help!
-
Clear your W3 Total Cache. Your robots.txt looks fine but my spider is picking the sitemap up as "noindex, follow" as well.
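A quick way to see what the sitemap URL actually returns is to fetch it and print the status code and headers. Note that Yoast serves its XML sitemaps with an "X-Robots-Tag: noindex, follow" header by design, so seeing that header is normal and is not what causes the robots.txt error in Webmaster Tools. A rough sketch in Python, using the URL from this thread:

import urllib.error
import urllib.request

# Sitemap URL discussed in this thread; change it for another site.
URL = "http://www.freepapertextures.com/sitemap_index.xml"

req = urllib.request.Request(URL, headers={"User-Agent": "sitemap-check/1.0"})
try:
    with urllib.request.urlopen(req) as resp:
        print("Status:", resp.status)
        print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))
        print("Content-Type:", resp.headers.get("Content-Type"))
except urllib.error.HTTPError as err:
    # A 404 here means the sitemap itself is missing, which is a different
    # problem from a robots.txt restriction.
    print("HTTP error:", err.code)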
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked save, so it is no longer showing a 404 error. However, when I submit the sitemap_index.xml link, Google is still unable to fetch it and shows the same problem. You can see the screenshot here - http://prntscr.com/g57h7
-
Uninstall the XML sitemap plugin. The Yoast plugin has a built-in XML sitemap, which is even better.
Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
Then submit this URL: http://www.freepapertextures.com/sitemap_index.xml
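Before resubmitting, it is worth confirming that the index responds and actually lists child sitemaps. A rough Python sketch, with the URL assumed from this thread and the standard sitemaps.org namespace:

import urllib.request
import xml.etree.ElementTree as ET

# Yoast sitemap index URL from this thread.
URL = "http://www.freepapertextures.com/sitemap_index.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(URL) as resp:
    tree = ET.parse(resp)

# Print each child sitemap referenced by the index.
for loc in tree.findall(".//sm:sitemap/sm:loc", NS):
    print(loc.text)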
-
Yes - first I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The setting is correct. You can check my domain's robots.txt here - http://freepapertextures.com/robots.txt
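One way to double-check is to run roughly the same test Webmaster Tools does: ask a robots.txt parser whether Googlebot may fetch the sitemap URL. A sketch using Python's standard library, with the URLs from this thread:

from urllib import robotparser

rp = robotparser.RobotFileParser("http://www.freepapertextures.com/robots.txt")
rp.read()

# True means this robots.txt does not block the sitemap for Googlebot.
print(rp.can_fetch("Googlebot", "http://www.freepapertextures.com/sitemap_index.xml"))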
-
Are those two separate plugins? I know Yoast has a sitemap plugin built in so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog setting is set to 'Allow search engines to index this site.' (Settings > Privacy). Reset your cache (if you have one).
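If that Privacy setting is the culprit, WordPress adds a noindex robots meta tag to every page, which is easy to spot by fetching the homepage. A crude check in Python, with the domain assumed from this thread:

import urllib.request

# Homepage discussed in this thread.
URL = "http://www.freepapertextures.com/"

with urllib.request.urlopen(URL) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# The "block search engines" setting outputs a tag like:
# <meta name='robots' content='noindex,nofollow' />
if "noindex" in html.lower():
    print("Found 'noindex' in the homepage HTML - check the robots meta tag and the Privacy setting.")
else:
    print("No 'noindex' found in the homepage HTML.")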
Related Questions
-
404 errors & old unused pages
I am using Shopify and I need to delete some old pages which are coming up as 404 errors (product no longer available!). Does anyone know where to go to delete these pages, which are no longer needed?
Web Design | carleyb0
-
ECWID Ecommerce Sitemaps (Lack of)
Does anyone know if the lack of sitemaps on ECWID-built sites is a negative for SEO? Does Google somehow index these sites, and does it penalize them because the sites can't include their URLs in sitemaps? Also, any idea how to build a sitemap that includes ECWID shopping carts?
Web Design | Atlanta-SMO0
-
404 error on phone numbers
Hi, I'm receiving a 404 error on my callto: phone number and wondered if there's a way to fix the problem. We've not experienced it before so I'm not sure if it's something to do with the crawl? Any help massively appreciated! Thanks Anne
Web Design | SeeGreen0
-
Redirects (301/302) versus errors (404)
I am not able to decide convincingly between using redirects and returning 404 errors. People are giving varied opinions. Here are my cases:
1. Coding errors - we put out a bad link.
a. Some people say to redirect to the home page; the user at least has something to do and, more importantly, it does not hurt your SEO ranking.
b. Counter-argument - the page isn't there, so return a 404.
2. Product removed - link1 to product1 was out there. We removed product1, so link1 is also gone. It is either sitting in people's bookmarks or, because of coding errors, left hanging around in some places on our site.
Web Design | proptiger0
-
WordPress SEO Errors / Questions
Hi, my name is Tina. I am new here and I hope you guys can help me out. I thought building my new site with WordPress was going to simplify things; however, I have a ton of errors, and I am not sure what they are or how to fix them. I am hoping someone can share a solution for these errors.

I have 28 rel=canonical errors. I am not sure what this means; I understand it to mean my pages are similar, and that rel=canonical sets a hierarchy between my pages. Please correct me if I am wrong. If I am correct, would this be necessary to add if my main keyword was "widgets", my home page was optimized for "widgets", the next page was "blue widgets", and so on? While my pages are similar, they are all optimized for different versions of my main keyword, some using long-tail keywords. Do you know of a plugin that can help solve this problem?

Also, does anyone have a plugin they recommend for Google+? My Google+ authorship verification is causing an error as well.

I am using HeadSpace2. I have used this SEO plugin numerous times with great success; it has been my favorite SEO plugin. However, we have a portfolio that shows our clients' websites, and on those pages HeadSpace will not let me enter a description tag. What plugin do you recommend that gives more control over each page?

Another interesting issue: I optimized one of our pages for our Canadian clients, and now every page is listed in Google.ca for the keywords it should have in Google.com. We are listed on Google Maps, verified in Google Places, and our address is on the site, so they know we're from the USA; however, the majority of our keywords are only listed in Google.ca. We're on page one for all of them, and in the top three for most, so that's not bad, but we want to be listed in Google.com as well. Any suggestions on this?
Web Design | TinaGammon1
-
Mobile Sitemap for Site with Media Queries
I'm doing SEO for a site. It uses Media Queries and the CSS to automatically resize the site for the screen size in use. I.e. the site detects the screen size of say an iPhone and the CSS knows which elements to hide for that screen size and still make it look good. This is great because it will automatically cut down the content to display nicely on small screens - obviating the need for a separate mobile site. What kind of sitemap should be generated since the urls are for desktop and mobile use? Yoast (sweet SEO) said it should have both regular and mobile style sitemap to get both the regular and mobile bots to visit, but didn't elaborate on how that sitemap should look. Do you have a recommendation for how exactly the sitemap should look? Should the sitemap have the urls all twice, i.e. once regular and once with the mobile indicator?
Web Design | GregoryHaze1
-
Getting tons of duplicate content and title errors on my ASP.NET shopping cart - is there a way to resolve this?
The problem I am having is that web crawlers are seeing all my category pages as the same page, which creates duplicate content and duplicate title errors. At this time I have 270 of these critical errors to deal with. Here is an example:
http://www.baysidejewelry.com/category/1-necklaces.aspx
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=1
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=2
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=3
http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=4
All of these pages are seen as the exact same page by the crawlers. Because these pages are generated from a SQL database, I don't know of a way to fix it.
Web Design | bsj20020
-
Is it necessary to redirect every Error page (404 or 500) found?
If I have hundreds of pages with 404 and 500 errors, should I set up 301 redirects for all of them? Some of the pages have external links, some don't.
Web Design | jmansd0