RSS Hacking Issue
-
Hi
I checked our original RSS feed by adding it to Google Reader, and all the links go to the correct pages. I have also set up the feed in FeedBurner. However, when I click the links in FeedBurner (which should go to my own website's pages), they all go to spam sites, even though each link's title and excerpt are correct.
This isn't a WordPress blog RSS feed either, and we are on a very secure server.
Any ideas whatsoever? I can't find any information about this online, and our developers haven't seen it before.
Thanks
-
Thanks so much for your help - I think this should fix it. You've saved me hours of time. It's our own CMS, so I should be able to fix it today.
-
I don't think you're being linked to spam, specifically. What you're seeing is the FeedBurner page linking your post titles to feeds.feedburner.com/[whatever the guid of the post is] -- URLs that happen to belong to entirely different feeds from different sites.
I believe this is the problem referenced in the FeedBurner FAQ - http://www.google.com/support/feedburner/bin/answer.py?hl=en&answer=79014&topic=13190 - "Why don't my feed content item links work?"
If so, the isPermaLink attribute on the feed guids should be set to false. I'd post about this on the support forum for your CMS.
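For example, an item whose guid is an internal database ID rather than a URL should look something like this (the values here are illustrative):

```xml
<item>
  <title>Example post</title>
  <link>http://www.example.com/blog/example-post</link>
  <!-- The guid is an internal ID, not a URL, so isPermaLink must be "false" -->
  <guid isPermaLink="false">129</guid>
</item>
```

With isPermaLink="false", readers and FeedBurner fall back to the item's <link> element for the article URL instead of treating the guid itself as one.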
-
Hmm, actually, maybe if I change the isPermaLink value on that guid entry that came up in the validator to false, that will fix it?
-
Some answers to your checks:
- Feed is correct - it's still my feed
- No FeedMedic reports; it says everything is fine
- The FeedBurner URL and the URL people are directed to from the blog are the same
- No malware reports
- Ran the scan tool on the blog article page, the RSS feed, the FeedBurner page, and the FeedBurner article link page - it doesn't pick up any malware
- The validity check brings up one issue: the guid must be a full URL unless the isPermaLink attribute is false. The current guid entry for one article is:
  <guid isPermaLink="true">129</guid>
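For anyone checking their own feed, the validator's rule can be reproduced with a short script; a minimal sketch using only Python's standard library (the sample XML mirrors the flagged guid and is illustrative):

```python
import xml.etree.ElementTree as ET

# Sample feed fragment: one guid like the validator flagged (numeric ID,
# isPermaLink not false) and one correctly marked guid for comparison.
rss = """<rss version="2.0"><channel>
<item><title>Article 1</title>
<guid isPermaLink="true">129</guid></item>
<item><title>Article 2</title>
<guid isPermaLink="false">130</guid></item>
</channel></rss>"""

root = ET.fromstring(rss)
for guid in root.iter("guid"):
    # Per the RSS 2.0 spec, isPermaLink defaults to "true" when omitted
    is_permalink = guid.get("isPermaLink", "true")
    value = (guid.text or "").strip()
    if is_permalink != "false" and not value.startswith(("http://", "https://")):
        print(f"guid {value!r} is not a full URL but isPermaLink is not false")
```

Running this against a feed that uses internal IDs as guids flags exactly the items the validator complains about.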
Sure, here's the feed: http://feeds.feedburner.com/EnjoyTravelBlog (check it in Chrome or IE; for some reason someone looking in Firefox didn't see the issue)
Here are screencasts of what I see if I click on any of the article titles:
- http://screencast.com/t/PNvrItea3ky - see articles 1 & 2
- http://screencast.com/t/bZI8qlg74 - what I see if I click on article 1 - clicking the link goes to a spam site
- http://screencast.com/t/cER9Fm9RTunm - what I see if I click on article 2
It's like this for every single article - there are even links to Baidu, eBay, and all sorts in there.
I'd welcome suggestions on other forums to post on if this goes beyond technical SEO!
-
A few avenues to check out:
- Log into your FeedBurner account and make sure the feed it's processing is still your blog's actual feed.
- Under FeedBurner's "Troubleshootize" tab, check whether there are any FeedMedic reports, and under Tips and Tools run the feed validity checks.
- Check that the FeedBurner URL shown in your account is the same one people are being directed to on the blog.
- Go to Google Webmaster Tools and, under Diagnostics, check whether there are any malware reports.
- Run a malware scan on the site URL and the FeedBurner URL through a tool like http://sitecheck.sucuri.net/scanner/
Can you provide more information, such as screenshots showing the links and the URLs they direct you to?