RSS feed issue
-
My WordPress blog's RSS feed is not working correctly and I can't figure out why. This is the error I am getting in my sidebar where the RSS feed used to work properly. My blog is http://www.seadwellers.com/key-largo-diving-blog/
RSS Error: This XML document is invalid, likely due to invalid characters. XML error: not well-formed (invalid token) at line 274, column 32
Any insight would be appreciated greatly!
Rob
-
Ok - I wasn't willing to let this go until I could definitively track down the issue. I've never come across this before.
Rob, you've somehow managed to get a Unicode Escape character entered at the beginning of one of your paragraphs in your "Photography Series Part 1 Picking your Camera" post. This character is invisible to the reader, but because it's considered an invalid XML character, your RSS feed (which is just a specifically formatted XML file) is erroring out, as you noted.
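For anyone hitting the same symptom: you can reproduce the parser's complaint with a few lines of Python's standard library. This is just a sketch - the `\x08` control character below stands in for whatever invisible character landed in the post, and the markup is a stripped-down stand-in for a real feed:

```python
import xml.etree.ElementTree as ET

def find_invalid_token(doc: str):
    """Return the (line, column) of the first XML parse error,
    or None if the document is well-formed."""
    try:
        ET.fromstring(doc)
        return None
    except ET.ParseError as e:
        return e.position

# An invisible control character (U+0008 here, as a stand-in)
# embedded in otherwise normal feed markup. It's illegal in XML 1.0,
# so the parser reports "not well-formed (invalid token)" at its position.
bad_feed = ("<rss><channel><item><description>"
            "You can \x08buy a waterproof housing"
            "</description></item></channel></rss>")

print(find_invalid_token(bad_feed))                      # a (line, column) tuple
print(find_invalid_token("<rss><channel></channel></rss>"))  # None
```

Running something like this against the raw feed bytes is how you pin down exactly which character the "line 274, column 32" error is pointing at.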
To fix this, you're going to have to go back into your WordPress dashboard to edit that post.
- Scroll to the second paragraph in the Smart Phone section that begins "You can buy a waterproof housing..."
- Place your cursor at the beginning of the word "buy" and use your backspace key to delete all the way back to the word "plenty" in the paragraph before.
- Retype the deleted words at the end of that previous sentence.
- Hit the Enter key to recreate the paragraph break, then retype the words you deleted from the "You can buy..." sentence.
- Click Update to save the new version of the post.
Believe it or not, that should fix the RSS feed validation problem.
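The delete-and-retype routine above works because it physically removes the bad byte. If you ever need to do the same cleanup programmatically, the equivalent is to strip every character the XML 1.0 spec forbids - a generic sketch, not anything WordPress-specific:

```python
import re

# Characters the XML 1.0 spec allows: tab, LF, CR, and the normal
# printable Unicode ranges. Anything outside these is invalid in a feed.
_INVALID_XML_CHARS = re.compile(
    "[^\x09\x0A\x0D\x20-\uD7FF\uE000-\uFFFD\U00010000-\U0010FFFF]"
)

def strip_invalid_xml_chars(text: str) -> str:
    """Remove characters that are not legal in an XML 1.0 document."""
    return _INVALID_XML_CHARS.sub("", text)

print(strip_invalid_xml_chars("You can \x08buy a waterproof housing"))
# -> "You can buy a waterproof housing"
```

The regular text is untouched; only the invisible control character disappears.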
Now, the next issue will be that many places where the RSS feed is used will cache it (i.e. store a copy of it). The only way to bust/refresh that cache is to publish another post, so the cached version gets updated to show the new post and the fix to the old one.
If you are adding the RSS feed somewhere it hasn't been used before (e.g. to a newly created sidebar widget) it should be pulling the corrected version of the feed and work as expected.
Hope all that makes sense - if not, just holler. Sorry for all the previous confusion as I tried to work out what was actually happening.
Paul
[Ignore the original suggestion below - the fix required more research]
Rob, on your blog post titled Photography Series Part 1, Picking the Right Camera, the first subhead is "Which Camera is Right for you? Sea Dwellers Dive Center of Key Largo..."
Try removing the extra space after the question mark and before the words "Sea Dwellers".
[EDITED TO ADD: Also remove the exclamation mark from the alt tag (alt="Dive and shoot!") for the associated photo.]
Then clear the site's cache, if it has one, to flush any cached copy of the RSS feed.
Lemme know how you get on.
Paul
-
Thank you, Dan, for the response...
This is the error message I am getting when I drag my RSS widget to my sidebar...
RSS Error: This XML document is invalid, likely due to invalid characters. XML error: not well-formed (invalid token) at line 274, column 32
Here is my blog address..
http://www.seadwellers.com/key-largo-diving-blog/
I'm entering my blog address in the widget as
http://www.seadwellers.com/key-largo-diving-blog/feed/
Any help is greatly appreciated. Does your company offer a one-time on-page SEO service by any chance?
Rob
Sea Dwellers Dive Center of Key Largo -
Hi Rob
I don't currently see the error on that page, and I don't see an RSS feed linked in your source code - did something change, or can you provide more info? The URL of your RSS feed, if possible?
Thanks!