Unable to submit sitemap in Google Webmaster Tools: "URL restricted by robots.txt" error
-
I recently published a new exact-match domain (EMD) and installed WordPress on it. I then installed the Yoast SEO plugin and an XML sitemap plugin. Now, whenever I try to submit the sitemap, Google Webmaster Tools shows "URL restricted by robots.txt", even though my robots.txt contains only the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
It still shows the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help.
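If you want to double-check how a crawler actually reads that robots.txt before resubmitting, here is a minimal sketch using Python's built-in urllib.robotparser; the domain and sitemap URL are the ones mentioned later in this thread, so treat them as placeholders:

# Check whether the generic "*" rules above block the sitemap URL.
from urllib.robotparser import RobotFileParser

robots_url = "http://www.freepapertextures.com/robots.txt"
sitemap_url = "http://www.freepapertextures.com/sitemap_index.xml"

parser = RobotFileParser(robots_url)
parser.read()  # fetch and parse the live robots.txt

# Googlebot falls under the "*" group here, so test with its user agent.
print(parser.can_fetch("Googlebot", sitemap_url))  # expected: True
print(parser.can_fetch("Googlebot", "http://www.freepapertextures.com/wp-admin/"))  # expected: False

If this prints True for the sitemap URL, the "URL restricted by robots.txt" message is more likely a stale copy of robots.txt cached on Google's side than an actual blocking rule.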
-
Google has now accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot: http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed (according to my spider). Try resubmitting.
-
Here is the .htaccess file
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(/wp-admin/|/xmlrpc.php|/wp-(app|cron|login|register|mail).php|/feed/|wp-.*.php|index.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup.php|wp-links-opml.php|wp-locations.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
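Since the rules above rewrite sitemap requests into the W3 Total Cache page cache, it can help to look at what the sitemap URL actually returns. A rough sketch in Python; the URL is the one from this thread, and the headers checked are the ones the directives above set (X-Powered-By) plus the one that could carry a noindex (X-Robots-Tag):

# Inspect status and headers for the sitemap URL to see whether it is
# being served by W3 Total Cache or carries a noindex directive.
import urllib.request
from urllib.error import HTTPError

url = "http://www.freepapertextures.com/sitemap_index.xml"
try:
    response = urllib.request.urlopen(url)
    print("Status:", response.status)
    for header in ("X-Powered-By", "X-Robots-Tag", "Content-Type"):
        print(header + ":", response.headers.get(header))
except HTTPError as err:
    # A 404 here would match the error reported later in the thread.
    print("HTTP error:", err.code)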
-
Hmm... maybe you are blocking robots through .htaccess?
-
Still the same problem. I cleared all the caches through W3 Total Cache, but Google is still not detecting the sitemap. Google has also indexed pages that clearly show the robots.txt problem in the results for site:http://www.freepapertextures.com
Please help.
-
Clear your W3 Total Cache. Your robots.txt looks fine, but my spider is picking the sitemap up as "noindex, follow" as well.
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked save, so it no longer shows a 404 error. However, when I submit the sitemap_index.xml link, Google still cannot detect it and shows the same problem. You can see the screenshot here: http://prntscr.com/g57h7
-
"First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin."
Uninstall the XML sitemap plugin. The Yoast plugin has a built-in XML sitemap, which is even better.
Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
Then submit this URL: http://www.freepapertextures.com/sitemap_index.xml
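Once that URL loads without a 404, you can also ping Google directly instead of waiting on a manual resubmission. A hedged sketch using the sitemap ping endpoint Google documented around the time of this thread (it has since been retired, so treat it as illustrative):

# Notify Google that the sitemap has been updated.
import urllib.parse
import urllib.request

sitemap_url = "http://www.freepapertextures.com/sitemap_index.xml"
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)

with urllib.request.urlopen(ping_url) as response:
    # A 200 only means the ping was received, not that the sitemap
    # parsed cleanly -- check Webmaster Tools for the actual result.
    print("Ping status:", response.status)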
-
Yes, first I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The settings are correct. You can check my domain's robots.txt here: http://freepapertextures.com/robots.txt
-
Are those two separate plugins? I know Yoast has a sitemap built in, so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog is set to 'Allow search engines to index this site' (Settings > Privacy). Reset your cache (if you have one).
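If you want to verify that privacy setting from the outside, WordPress injects a robots meta tag into the page head when indexing is blocked. A minimal sketch in Python (the URL is the one from this thread):

# Fetch the homepage and look for a robots meta tag; WordPress outputs
# a noindex,nofollow robots meta when search engines are blocked.
import re
import urllib.request

html = urllib.request.urlopen("http://www.freepapertextures.com/").read().decode("utf-8", errors="replace")
match = re.search(r"<meta[^>]*name=['\"]robots['\"][^>]*>", html, re.IGNORECASE)
print(match.group(0) if match else "No robots meta tag found")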