XML Sitemap & Bad Code
-
I've been creating sitemaps with XML Sitemap Generator and have been downloading them to edit on my PC. The sitemaps work fine when viewed in a browser, but when I download one and open it in Dreamweaver, the URLs don't work when I cut and paste them into the Firefox URL bar. I notice the code is different. For example, an "&" is produced like this... "&amp;". Extra characters are inserted, producing the error.
I was wondering if this is normal, because as I said, the map works fine when viewing online.
-
Thanks guys! Upon further research, what's happening is "entity escaping", where certain symbols have to be written as a code, i.e. & = &amp;, so it's all good.
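If it helps to see the escaping in action, here's a minimal Python sketch using the standard library's xml.sax.saxutils (the URL is just a made-up example, not one of yours):

```python
from xml.sax.saxutils import escape, unescape

# A URL with an ampersand in its query string (made-up example).
raw_url = "https://www.example.com/page.php?cat=shoes&size=9"

# What the sitemap generator writes into the XML file.
escaped = escape(raw_url)
print(escaped)            # https://www.example.com/page.php?cat=shoes&amp;size=9

# What a browser (or Google) sees after parsing the XML.
print(unescape(escaped))  # https://www.example.com/page.php?cat=shoes&size=9
```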
-
It's probably normal within Dreamweaver. However, a browser will read the &amp; as a plain &, so that won't be a problem for Google, I'd guess, if you want to submit your sitemap to the search engines.
-
Dreamweaver does funky stuff when you go from visual to code. Try opening the XML sitemap in Notepad, copying/pasting from there, and see if you get the same problem.
But based on my experience with that site, you should be fine.
Related Questions
-
Pending Sitemaps
Hi, all. Wondering if someone could give me a pointer or two, please. I cannot seem to get Google or Bing to crawl my sitemap. If I submit the sitemap in WMT and test it, I get a report saying 44,322 URLs found. However, if I then submit that same sitemap, it either says Pending (in old WMT) or Couldn't fetch in the new version. This "Couldn't fetch" is very puzzling, as it had no issue fetching the map to test it. My other domains on the same server are fine; the problem is limited to this one site. I have tried several pages on the site using the Fetch as Google tool and they load without issue; however, try as I may, it will not fetch my sitemap. The sitemapindex.xml file won't even submit. I can confirm my sitemaps, although large, work fine, please see the following as an example (minus the spaces, of course, didn't want to submit and make it look like I was just trying to get a link) https:// digitalcatwalk .co.uk/sitemap.xml https:// digitalcatwalk .co.uk/sitemapindex.xml I would welcome any feedback anyone could offer on this, please. It's driving me mad trying to work out what is up. Many thanks, Jeff
Intermediate & Advanced SEO | wonkydogadmin
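For a "Couldn't fetch" problem like the one above, one thing worth ruling out is whether the sitemap is reachable and well-formed when requested by a script rather than a browser. A rough sketch using only the Python standard library (the URL is a placeholder, swap in your own sitemap):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder sitemap URL - substitute your own.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL, timeout=30) as resp:
    print("HTTP status: ", resp.status)
    print("Content-Type:", resp.headers.get("Content-Type"))
    body = resp.read()

print("Size:", len(body), "bytes")

# If this raises ParseError, the file isn't valid XML at all.
root = ET.fromstring(body)
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
print("URL entries found:", len(root.findall(f"{ns}url")))
print("Child sitemaps found:", len(root.findall(f"{ns}sitemap")))
```
-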
In Search Console, why is the XML sitemap "issue" count 5x higher than the URL submission count?
Google Search Console is telling us that there are 5,193 sitemap "issues" - URLs that are present on the XML sitemap but are blocked by robots.txt. However, there are only 1,222 total URLs submitted on the XML sitemap. I only found 83 instances of URLs that fit their example description. Why is the number of "issues" so high? Does it compound over time as Google re-crawls the sitemap?
Intermediate & Advanced SEO | FPD_NYC
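One way to see which submitted URLs actually match a robots.txt rule is to cross-check the sitemap against robots.txt yourself and compare that count with what Search Console reports. A rough sketch with a placeholder domain (note that Python's urllib.robotparser doesn't handle every pattern extension Googlebot does, so treat the count as approximate):

```python
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

# Placeholder domain - swap in your own site.
SITE = "https://www.example.com"

rp = urllib.robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()

with urllib.request.urlopen(f"{SITE}/sitemap.xml", timeout=30) as resp:
    root = ET.fromstring(resp.read())

ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
blocked = [loc.text for loc in root.iter(f"{ns}loc")
           if not rp.can_fetch("Googlebot", loc.text)]

print(f"{len(blocked)} sitemap URLs appear to be blocked by robots.txt")
for url in blocked[:20]:
    print(url)
```
-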
Should I implement Structured Data Markup before implementing AMP?
I am about to implement AMP and structured data markup on my site. Which one should be done first?
Intermediate & Advanced SEO | Leebi
-
Thin Content, Ecommerce & Reviews
I've been reading a lot today about thin content and what constitutes thin content. We have an ecommerce site and have to compete with large sites in Google. Our product pages are low on content, and competitors obviously all have similar variations of the same product descriptions. Does Google still consider ecommerce sites with thin content to be low quality? A product page surely shouldn't have too much content that doesn't help the user. My plan to start was to get our customer reviews added to the product pages to improve the amount of quality content on those pages, then move on to adding video etc. when we have the resource. Thanks
Intermediate & Advanced SEO | BeckyKey
-
Canonical code set up correctly?
Please let me know if this makes sense. I have very limited knowledge of technical SEO, but I am almost positive that my web developer did something wrong. I have a WordPress blog, and he did add canonical code to some of the pages. However, he points the canonical to the same URL! Does this mean that the canonical code is set up incorrectly and actually harming my SEO performance? Also, if I have one webpage with just the first paragraph of a blog post I wrote and a completely separate page for the blog post itself, could this be considered duplicate content? Thanks!!
Intermediate & Advanced SEO | DR70095
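On the first part of the question above: a canonical tag that points to the page's own URL (a self-referencing canonical) is normal and not harmful in itself; it would only be a problem if every page's canonical pointed at one single URL. To see exactly where a page's canonical points, here's a minimal sketch using only the Python standard library (the page URL is a placeholder):

```python
import urllib.request
from html.parser import HTMLParser

# Placeholder URL - use one of your own blog pages.
PAGE_URL = "https://www.example.com/blog/some-post/"

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

with urllib.request.urlopen(PAGE_URL, timeout=30) as resp:
    html = resp.read().decode("utf-8", errors="replace")

finder = CanonicalFinder()
finder.feed(html)

print("Page:     ", PAGE_URL)
print("Canonical:", finder.canonical)
```
-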
PR & DA
What are the best ways to increase a website's page rank and domain authority?
Intermediate & Advanced SEO | WebMarkets
-
Is hidden content bad for SEO?
I am using this plugin to enable Facebook comments on my blog: https://wordpress.org/plugins/fatpanda-facebook-comments/ This shows the comments in a Facebook iframe. The plugin author claims it's SEO friendly because the comments are also integrated into the WordPress database. They are included in the post but hidden. Is that bad for SEO?
Intermediate & Advanced SEO | soralsokal
-
Tool to check XML sitemap
Hello, Can anyone help me find a tool to take a closer look at an XML sitemap? Thanks in advance! PP
Intermediate & Advanced SEO | PedroM
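If a quick script will do instead of a dedicated tool, here's a minimal sketch that downloads a sitemap and lists what's inside it (placeholder URL; Python standard library only):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder sitemap URL - point this at your own file.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

with urllib.request.urlopen(SITEMAP_URL, timeout=30) as resp:
    root = ET.fromstring(resp.read())

# Regular sitemap: list each URL with its lastmod (if any).
for entry in root.findall(f"{NS}url"):
    loc = entry.findtext(f"{NS}loc", default="").strip()
    lastmod = entry.findtext(f"{NS}lastmod", default="-")
    print(lastmod, loc)

# Sitemap index: the children are <sitemap> elements pointing at child maps.
for child in root.findall(f"{NS}sitemap/{NS}loc"):
    print("child sitemap:", child.text)
```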