Unable to submit sitemap in Google Webmaster Tools: error
-
I recently published a new EMD and installed WordPress on it. I then installed the Yoast SEO plugin and an XML sitemap plugin. Whenever I try to submit the sitemap, it shows "URL restricted by robots.txt", but my robots.txt contains only the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
It still shows the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help.
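(For anyone wanting to sanity-check the same thing locally: Python's standard urllib.robotparser will report whether a given URL is blocked by a robots.txt. A minimal sketch, assuming the Yoast sitemap lives at /sitemap_index.xml and the post sitemap at /post_sitemap.xml; adjust the paths to your install:)
import urllib.robotparser

# Point the parser at the live robots.txt and ask whether Googlebot may fetch each URL.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.freepapertextures.com/robots.txt")
rp.read()

for path in ("/sitemap_index.xml", "/post_sitemap.xml", "/wp-admin/"):
    url = "http://www.freepapertextures.com" + path
    print(path, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")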
-
Google has now accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot - http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed (according to my spider). Try resubmitting.
-
Here is the .htaccess file
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*\/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:_gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} \/$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(_index)?\.xml(\.gz)?|[a-z0-9_\-]+-sitemap([0-9]+)?\.xml(\.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(\/wp-admin\/|\/xmlrpc\.php|\/wp-(app|cron|login|register|mail)\.php|\/feed\/|wp-.*\.php|index\.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup\.php|wp-links-opml\.php|wp-locations\.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0\.9\.2\.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
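(If you want to rule the page cache in or out quickly: the W3TC rules above only serve a cached copy when the query string is empty, so adding a dummy parameter should bypass the cache. A rough sketch, assuming the sitemap URL below; the parameter name is made up and only there to defeat the cache:)
import urllib.request, urllib.error

def fetch(url):
    # Return status code, Content-Type and body length; an HTTPError still carries a status.
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status, resp.headers.get("Content-Type"), len(resp.read())
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Content-Type"), 0

base = "http://www.freepapertextures.com/sitemap_index.xml"
for url in (base, base + "?nocache=1"):  # second request should skip the W3TC page cache
    print(url, fetch(url))
If the two responses differ (different status, Content-Type or size), a stale cached copy is being served and flushing W3 Total Cache should bring them back in line.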
-
Hmm, maybe you are blocking robots through your .htaccess?
-
Still the same problem. I cleared all caches through W3 Total Cache, but Google is still not detecting the sitemap. Google has now also indexed pages that clearly show the robots problem - see site:http://www.freepapertextures.com
Please help!
-
Clear your W3 Total Cache. Your robots.txt looks fine, but my spider is picking the sitemap up as "noindex, follow" as well.
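(Worth knowing where that "noindex, follow" comes from: it can be sent as an X-Robots-Tag HTTP header, or it can sit in a robots meta tag inside whatever document is actually being served at the sitemap URL, e.g. a cached HTML page instead of the XML. A quick check, with the sitemap URL assumed rather than confirmed:)
import urllib.request

url = "http://www.freepapertextures.com/sitemap_index.xml"
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    body = resp.read().decode("utf-8", errors="replace").lower()
    # The noindex may arrive as an HTTP header...
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
    # ...or as a robots meta tag in an HTML page served in place of the sitemap.
    print("robots meta tag in body:", 'name="robots"' in body or "name='robots'" in body)
    print("body looks like XML:", body.lstrip().startswith("<?xml"))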
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked save, so it no longer shows a 404 error. However, when I submit the sitemap_index.xml link, Google is still unable to detect it and shows the same problem. You can see the screenshot here - http://prntscr.com/g57h7
-
Uninstall the XML sitemap plugin. The Yoast plugin has a built-in XML sitemap, which is even better.
Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
and submit this URL: http://www.freepapertextures.com/sitemap_index.xml
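(Once that URL stops returning a 404, you can also ping Google directly rather than waiting on the Webmaster Tools form. At the time of writing Google accepts a plain GET to its sitemap ping endpoint; treat the exact response handling below as an assumption, not gospel:)
import urllib.parse, urllib.request

sitemap = "http://www.freepapertextures.com/sitemap_index.xml"
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")
with urllib.request.urlopen(ping) as resp:
    # Google historically answers 200 with a short confirmation page on success.
    print(resp.status, resp.read()[:80])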
-
Yes. First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The setting is correct. You can check my domain's robots.txt here - http://freepapertextures.com/robots.txt
-
Are those two separate plugins? Yoast has a sitemap feature built in, so you don't need a separate sitemap plugin.
I've never had any issues with it. Make sure your blog setting is set to 'Allow search engines to index this site.' (Settings > Privacy). Reset your cache (if you have one).
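(One quick way to see whether that Privacy setting is biting: when it is switched off, WordPress prints a noindex robots meta tag into the page head, so a fetch of the homepage will show it. A sketch only; it assumes a fairly standard theme that outputs the tag WordPress generates:)
import urllib.request

req = urllib.request.Request("http://www.freepapertextures.com/",
                             headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    head = resp.read(20000).decode("utf-8", errors="replace").lower()

# Privacy mode emits something like <meta name='robots' content='noindex,nofollow' />
if "noindex" in head and "robots" in head:
    print("Homepage appears to carry a noindex robots tag - check Settings > Privacy.")
else:
    print("No obvious noindex tag on the homepage.")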
Related Questions
-
Help with error: Not Found The requested URL /java/backlinker.php was not found on this server.
Hi all, we have had this error for almost a month now. Until recently we were outsourcing the web design and optimization; now we are doing it in house, and the previous company did not give us all the information we should know. We have been trying to find and fix this error with no result. Have you encountered this issue before? Does anyone know a solution? Also, would this affect our website in terms of SEO and in general? We would be very grateful to hear from you. Many thanks. Here is what appears at the bottom of the site (www.manvanlondon.co.uk): Not Found - The requested URL /java/backlinker.php was not found on this server. Apache/2.4.7 (Ubuntu) Server at 01adserver.com Port 80
Web Design | monicapopa
-
When Site:Domain Search Run on Google, SSL Error Appears on One URL, Will this Harm Ranking
Greetings Moz Community: When a site:domain search is run on Google, a very strange URL appears in the search results. The URL is http://www.nyc-officespace-leader.com:2082/ and the page displays a "the site's security certificate is not trusted" warning. This only appears for one URL out of 400. Could this indicate a wider problem with the server's configuration? Is this something that needs to be corrected, and if so, how? Our ranking has dropped a lot in the last few months. Thanks, Alan
Web Design | Kingalan1
-
AJAX endpoints returning 404 errors in GWT. Why!?!?
Hi guys, So I'm working through a large dataset of 404 errors and trying to clean up the site's crawl-ability. A piece of the puzzle I can't seem to wrap my head around has to do with AJAX endpoints. It looks like GWT thinks these are URLs that don't exist and, therefore, is reporting them as 404 errors. Anyone experience this before?
Web Design | brad_dubs
-
Nav / Sitemap Question. Using a "services" page vs just linking directly to individual service page?
Okay, so our company offers video production, web design, and web marketing services. While we do offer these services individually, our goal is to get our clients to integrate these services together. Our nav is currently like so : home - about - video - web design - web marketing - blog - contact Now I've seen businesses and agencies also use a nav with a "services" button instead of listing out their service offerings (if they have more than 1, like us). The services button usually links to a category page or has a drop down with links to the company's individual services. I'm wondering if there is any benefit to having a main services page like this and linking to the individual pages off of it (video ,web design, marketing, etc). Or if we should just keep it the way we have it now (since we've already got some page authority on the individual service pages). I know this may not be the most important aspect of our site and we may be over-thinking it but any thoughts/ideas would be greatly appreciated, thanks!
Web Design | RenderPerfect
-
'Increase in soft 404 errors' Webmasters notification. What to do?
I've received a Webmasters notification about an 'increase in soft 404 errors'. When the new site launched three months ago we did away with some old pages, which we either 301 redirect to new equivalents or serve an 'Oops, that page seems to be missing' 404 page that links to important parts of the site that might be of use to the visitor. Any ideas why Webmasters is issuing the warning? Any suggestions as to what to do? Thanks
Web Design | Martin_S
-
Which Content Causes Duplicate Content Errors
My Duplicate Content list starts off with this URL: http://www.nebraskamed.com/about-us/branding/bellevue-medical-center-logo Then it lists the five below as Duplicate Content: http://www.nebraskamed.com/about-us/branding/fonts http://www.nebraskamed.com/about-us/branding/clear-zone http://www.nebraskamed.com/about-us/social-media http://www.nebraskamed.com/about-us/branding/order-stationery http://www.nebraskamed.com/about-us/branding/logo I notice that most of these pages have images and/or little or no content outside of our site's template. Is this causing SEOmoz to see them as duplicates? Should I use noindex, follow to fix this? The error is happening on branding pages, so noindex is an option. What should I do if that's not an option? Should I change our mega menus to be AJAX driven so the links aren't showing up in the code of every page?
Web Design | Patrick_at_Nebraska_Medicine