Sitemap Problems, or Are They?
-
According to Webmaster Tools, my sitemap contains URLs which are blocked by robots.txt.
Our sitemap is automatically generated and encompasses all web pages, whether or not I have excluded them using the robots.txt file.
As far as I am aware, this has never been an issue until recently. Is this hurting my rankings, and how do I fix it?
Secondly, Webmaster Tools says there are over 5,000 errors/warnings on my sitemap, but the sitemap only has around 1,400 pages submitted. How do I see what is going on?
-
Thanks for the robots.txt help; that's kind of what I figured, it's just never said anything about it before.
Now for the 5,000+ errors/warnings on the sitemap: it only allows me to see a few of them. How do I see the rest? The robots.txt issue is all I can see right now.
-
Usually, you would only put links you want indexed into the sitemap.xml file. GWT is just letting you know that some of the links you submitted are being blocked by robots.txt (and won't be indexed). It shouldn't hurt your rankings if you don't want those pages indexed anyway.
Check out the Optimization > Sitemaps > # errors link in GWT for more details.
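To act on that, a quick way to see exactly which sitemap entries conflict with robots.txt is to diff the two files yourself before submitting. Below is a minimal sketch in Python (standard library only); the domain, sitemap path, and user agent are placeholders, not anything from this thread:

```python
# Rough sketch (not Moz/Google tooling): list sitemap URLs that robots.txt blocks.
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "https://www.example.com"          # placeholder domain
SITEMAP_URL = SITE + "/sitemap.xml"       # assumed sitemap location
USER_AGENT = "Googlebot"

# Load and parse robots.txt
robots = urllib.robotparser.RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()

# Pull every <loc> entry out of the sitemap
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.fromstring(resp.read())
urls = [loc.text.strip() for loc in tree.findall("sm:url/sm:loc", ns)]

# Report URLs that the sitemap lists but robots.txt disallows
blocked = [u for u in urls if not robots.can_fetch(USER_AGENT, u)]
print(f"{len(blocked)} of {len(urls)} sitemap URLs are blocked by robots.txt:")
for u in blocked:
    print("  ", u)
```

Whatever it reports, either drop those URLs from the sitemap generator's output or loosen the robots.txt rules, depending on whether you actually want those pages indexed.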
Related Questions
-
A problem with duplicate content
I'm kind of new at this. My crawl analysis says that I have a problem with duplicate content. I set the site up so that web sections appear in a folder with an index page as a landing page for that section. The URL would look like: www.myweb.com/section/index.php. The crawl analysis says that both that URL and its root, www.myweb.com/section/, have been indexed. So I appear to have a situation where the page has been indexed twice and is a duplicate of itself. What can I do to remedy this? And what steps should I take to get the pages re-indexed so that this type of duplication is avoided? I hope this makes sense! Any help gratefully received. Iain
Technical SEO | iain0 -
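For the duplicate-content question above, the usual remedies are a 301 redirect from /section/index.php to /section/, or a rel=canonical tag on both URLs pointing at whichever version is preferred; either tells Google which URL to keep. Here is a hypothetical diagnostic sketch (Python with the requests package; the domain and section paths are placeholders) to see whether a redirect or canonical is already in place:

```python
# Hypothetical diagnostic: for each section, fetch both URL variants and report
# the HTTP status, any redirect target, and any rel=canonical target.
import re
import requests

BASE = "https://www.example.com"                 # placeholder domain
SECTIONS = ["/section/", "/another-section/"]    # placeholder section paths

# Very rough regex; assumes rel appears before href inside the <link> tag
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I)

def inspect(url):
    # Don't follow redirects, so a 301 shows up as-is
    resp = requests.get(url, allow_redirects=False, timeout=10)
    canonical = None
    if resp.status_code == 200:
        match = CANONICAL_RE.search(resp.text)
        canonical = match.group(1) if match else None
    return resp.status_code, resp.headers.get("Location"), canonical

for section in SECTIONS:
    for path in (section, section + "index.php"):
        status, location, canonical = inspect(BASE + path)
        print(f"{path:30} status={status} redirect={location} canonical={canonical}")
```

If neither a redirect nor a canonical shows up, adding one of the two (most commonly a 301 from the index.php variant to the trailing-slash URL) is the standard way to consolidate the pair.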
Auto-generated content problem?
Hi all, I operate a Dutch website (sneeuwsporter.nl); the website is a database of European ski resorts and accommodations (hotels, chalets etc.). We launched about a month ago with a database of about 1,700+ accommodations. For every accommodation we collected general information like which village it is in, how far it is from the city centre and how many stars it has. This information is shown in a list on the right of each page (e.g. http://www.sneeuwsporter.nl/oostenrijk/zillertal-3000/mayrhofen/appartementen-meckyheim/). In addition, a text about the accommodation is auto-generated based on some of the properties that are also in the list (like distance, stars etc.). Below the paragraph about the accommodation is a paragraph about the village the accommodation is located in; this is a general text that is the same for all the accommodations in that village. Below that is a general text about the resort area; this text is also identical on all the accommodation pages in the area. So a lot of these texts about the village and the area are used many times on different pages. Things went well at first: every day we got more Google traffic and more and more pages were indexed. But a few days ago our organic traffic took a near 100% dive; we are hardly listed any more, and if we are, at very low positions. We suspect that Google gave us a penalty, for two reasons: (1) we have auto-generated text that varies only slightly per page, and (2) we re-use the content about villages and areas on many pages. We quickly removed the content about the villages and resort areas because we are pretty sure that this is definitely something Google does not want. We are less sure about the auto-generated content: is this something we should remove as well? These are normal, readable texts; they just happen to be structured more or less the same way on every page. Finally, once we have made these and maybe some other fixes, what is the best and quickest way to let Google see us again and show them we have improved? Thanks in advance!
Technical SEO | sneeuwsporter0 -
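For the auto-generated accommodation texts above, one rough way to gauge how near-duplicate the pages look is to measure word-shingle overlap between them. Below is a minimal sketch (plain Python; the two sample texts are invented placeholders) using Jaccard similarity over 5-word shingles, a common near-duplicate heuristic:

```python
# Rough near-duplicate check: Jaccard similarity over 5-word shingles.
import re

def shingles(text, k=5):
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa or sb) else 0.0

# Placeholder page texts; in practice you would pull the generated paragraphs
# from two accommodation pages and compare them.
page_a = "Appartementen Meckyheim is a 3-star apartment 400 m from the centre of Mayrhofen."
page_b = "Hotel Alpenhof is a 4-star hotel 200 m from the centre of Mayrhofen."

print(f"Similarity: {jaccard(page_a, page_b):.2f}")
```

Pages that score very close to 1.0 against each other are the ones most likely to be treated as thin or duplicate content.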
Penalties on a brand new site: sandbox time, or rather a problem with the site?
Hi guys, 4 weeks ago we launched a site, www.adsl-test.it. We just did some article marketing and developed a lot of functionality to test and share the results of the speed tests run through the site. We were on the 9th Google SERP page for weeks, then suddenly, for one day (the 29th of February), on the second page; the next day the website's home page disappeared, even for brand searches like "adsl-test". The actual situation is: it looks like we are not banned (site:www.adsl-test.it is still listed); GWT doesn't show any suggestions and everything looks good there; and we are quite high on bing.it and yahoo.it (4th place on the first page) for an "adsl test" search. Could anybody help us understand? Another thing I thought of is that we create a unique ID for each test that we run, and these tests are indexed by Google. Ex: www.adsl-test.it/speedtest/w08ZMPKl3R or www.adsl-test.it/speedtest/P87t7Z7cd9. Actually the content of these URLs is somewhat different (because the speed measured is different), but, being a badge, the other content on the page is pretty much the same. Could this be a possible reason? I mean, does Google just think we are creating duplicate content, even if it is not effectively duplicate content but just the result of a speed test?
Technical SEO | codicemigrazione0 -
Correcting an indexing problem
I recently redirected an old site to a new site. All the URLs were the same except the domain. When I redirected them, I failed to realize the new site had https enabled on all pages. I have noticed that Google is now indexing both the http and https versions of pages in the results. How can I fix this? I am going to submit a sitemap, but I don't know if there is more I can do to get this fixed faster.
Technical SEO | kicksetc0 -
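For the http/https question above, the usual fix is a site-wide 301 from every http URL to its https counterpart (alongside submitting the https sitemap, as mentioned). A small hypothetical spot-check (Python with the requests package; the domain and paths are placeholders, not the site in the question):

```python
# Hypothetical spot check: each http URL should answer with a 301/308 pointing
# at its https twin.
import requests

DOMAIN = "www.example.com"                       # placeholder domain
PATHS = ["/", "/about/", "/products/widget/"]    # placeholder paths to sample

for path in PATHS:
    http_url = f"http://{DOMAIN}{path}"
    resp = requests.get(http_url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code in (301, 308) and target.startswith("https://")
    print(f"{http_url}: {resp.status_code} -> {target or '(no redirect)'} "
          f"{'OK' if ok else 'NEEDS A 301 TO HTTPS'}")
```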
How to handle a merge with another site
I own a gaming site at legendzelda.net. Recently a site, zelda-temple.net, wanted to merge communities and sites. We pretty much scrapped their content (a lot of it was articles I already had topics on), merged the user databases, and redirected the domain to mine. What is the best SEO way to redirect that domain? I tried to set up a 301 on the entire domain to gain all the backlinks that site had (A LOT), but it seems as if a lot of them are not being picked up by Open Site Explorer and other tools. Advice?
Technical SEO | webfeatseo0 -
Mini site links?
Can anyone point me to information about the "mini" site links in the Google search results, or tell me how to get them set up? These aren't the full sitelinks that show 3 by 3 under the first listing, but small text links that appear for certain results. (See attached image for reference.) Are these something that can be controlled/requested? NAj6E.png
Technical SEO | DVanSchepen0 -
Why does my site have a PageRank of 0?
My site (www.onemedical.com) has a PageRank of 0, and I can't figure out why. We did a major site update about a year ago, and moved the site from .md to .com about 9 months ago. We are crawled by Google and rank on the first page for many of our top keywords. We have a MozRank of 4.59. I figured this is something that would just work itself out of the system over time, but nothing seems to change while we patiently wait. One more thing to note: when a user comes to the homepage (city selector) and selects their region, they will be cookied and directed to their relevant city site on subsequent visits. But even our city-specific pages (e.g. www.onemedical.com/sf) have PageRanks of 0. My management team keeps asking me about this, and I suspect there is something silly that we keep overlooking... but for the life of me, I can't figure it out. Any help would be appreciated.
Technical SEO | OneMedical0 -
Problem with my site
The site is casino.pt. We created the site 7-8 months ago and started to push it with good, natural links (http://www.opensiteexplorer.org/www.casino.pt/a!links!!filter!all!!source!external!!target!page), links on content-rich sites, most of them related to gambling and sports topics. During the first 3-5 months the rankings got better and better; after 6 months, the site lost all its rankings. Additional details: http://www.casino.pt/robots.txt; http://www.google.pt/#hl=pt-PT&source=hp&biw=1280&bih=805&q=site:http%3A%2F%2Fwww.casino.pt&aq=f&aqi=&aql=&oq=&fp=2651649a33cd228; no critical errors in Google Webmaster Tools. Any idea how I can fix it? Thanks
Technical SEO | Yaron530