Unable to submit sitemap in Google Webmaster Tools - error
-
I recently registered a new exact-match domain (EMD) and installed WordPress on it. I then installed the Yoast SEO plugin and an XML sitemap plugin. Whenever I try to submit my sitemap, it shows "URL restricted by robots.txt", but my robots.txt contains only the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
But it still shows the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help!
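To show it really isn't robots.txt, here is a quick check I ran (just a sketch using Python's standard-library robots parser; the sitemap URL is the one I submit):
# Does this robots.txt actually restrict the sitemap URL for Googlebot?
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "http://www.freepapertextures.com/robots.txt"
# Assumed sitemap location - adjust if yours differs.
SITEMAP_URL = "http://www.freepapertextures.com/sitemap_index.xml"

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetches and parses the live robots.txt

# True means Googlebot is allowed to fetch the sitemap URL.
print(rp.can_fetch("Googlebot", SITEMAP_URL))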
-
Google has now accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot - http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed, though (according to my spider). Try resubmitting.
-
Here is the .htaccess file:
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
FileETag None
Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
FileETag None
Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
FileETag None
Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:_gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} \/$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(_index)?\.xml(\.gz)?|[a-z0-9_-]+-sitemap([0-9]+)?\.xml(\.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(\/wp-admin\/|\/xmlrpc\.php|\/wp-(app|cron|login|register|mail)\.php|\/feed\/|wp-.*\.php|index\.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup\.php|wp-links-opml\.php|wp-locations\.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress_[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0\.9\.2\.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
-
Hmm, maybe you are blocking robots through .htaccess?
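One way to test that theory (a rough sketch, not Google's actual fetcher): request the sitemap while identifying as Googlebot and see what status code and headers come back. The URL below is the assumed Yoast default - swap in whatever you actually submitted.
# Rough test of the ".htaccess is blocking robots" theory.
import urllib.request

url = "http://www.freepapertextures.com/sitemap_index.xml"  # assumed location
req = urllib.request.Request(
    url,
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)

with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.reason)   # anything other than 200 is suspicious
    for name, value in resp.getheaders():
        print(f"{name}: {value}")     # watch for X-Robots-Tag and cache headers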
-
Still the same problem. I cleared all caches through W3 Total Cache, but Google is still not detecting the sitemap. Now Google has also indexed the pages even though it still reports the robots problem - see site:http://www.freepapertextures.com
PLEASE HELP
-
Clear your W3 Total Cache. Your robots.txt looks fine, but my spider is also picking up the sitemap as "noindex, follow".
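For what it's worth, Yoast serves its sitemaps with an X-Robots-Tag: noindex, follow header on purpose (the sitemap file itself shouldn't show up in search results), so that finding may be normal and doesn't stop Google reading the URLs inside it. A quick way to confirm which header is actually being sent (sketch, standard library only, URL assumed):
# Confirm whether the sitemap response carries an X-Robots-Tag header.
import urllib.request

url = "http://www.freepapertextures.com/sitemap_index.xml"  # assumed location
with urllib.request.urlopen(url) as resp:
    tag = resp.headers.get("X-Robots-Tag")
    # "noindex, follow" here is expected for a Yoast sitemap; it does not
    # block Google from crawling the URLs listed inside the file.
    print(tag or "no X-Robots-Tag header sent")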
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked save, and now it no longer shows a 404 error. However, when I submit the sitemap_index.xml link, Google is still unable to detect it and shows the same problem. You can see the screenshot here - http://prntscr.com/g57h7
-
Uninstall the XML sitemap plugin. The Yoast plugin has a built-in XML sitemap, which is even better.
Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
and submit this URL: http://www.freepapertextures.com/sitemap_index.xml
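Once the index resolves, it's also worth verifying that it is well-formed XML and that the child sitemaps it lists return 200. A minimal sketch using only Python's standard library (the URL is the assumed Yoast default):
# Fetch the sitemap index, parse it, and check the child sitemaps it lists.
import urllib.request
import xml.etree.ElementTree as ET

INDEX_URL = "http://www.freepapertextures.com/sitemap_index.xml"  # assumed
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(INDEX_URL) as resp:
    root = ET.fromstring(resp.read())  # raises ParseError if the XML is broken

for loc in root.findall("sm:sitemap/sm:loc", NS):
    child = loc.text.strip()
    status = urllib.request.urlopen(child).status
    print(status, child)  # each child sitemap should return 200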
-
Yes. First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and hit the same problem again. The settings are correct. You can check my domain's robots.txt here - http://freepapertextures.com/robots.txt
-
Are those two separate plugins? I know Yoast has a sitemap feature built in, so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog is set to 'Allow search engines to index this site' (Settings > Privacy). Reset your cache (if you have one).