Crawl errors vs. RSS feed content
-
Hi,
I have a BuddyPress Multisite with sites dedicated to specific RSS feeds. I did this to pull in content for my users, but those sites generate thousands of SEOmoz errors/warnings for duplicate content, duplicate titles, missing meta tags, etc.
So does keeping the content help my site's SEO less than the errors hurt it? The content is not under my control, so I have no way to relate it to my keywords.
Any opinions?
I can recreate these RSS sites, so I will probably delete them to see what happens.
Larry
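If the RSS subsites are kept rather than deleted, one common mitigation is to mark the feed-driven pages noindex so the syndicated content stops triggering duplicate-content and duplicate-title warnings. A minimal sketch for a WordPress/BuddyPress subsite, assuming the snippet goes in that subsite's active theme functions.php (the hook is standard WordPress, but treat this as an illustration, not a tested fix for this install):

```php
<?php
// Ask search engines not to index pages on this RSS-driven subsite,
// while still letting them follow outbound links to the original sources.
add_action( 'wp_head', function () {
    echo '<meta name="robots" content="noindex,follow">' . "\n";
} );
```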
Related Questions
-
Internal blog with history and some SEO value versus new external blogs with specialized content?
We operate a blog inside a folder on our site and are considering launching 4 highly focused blogs with specialized content, which are currently categories on the internal blog. We're wondering whether there is more value in using the new external blogs or in just continuing to grow the internal blog's content. Does the fact that the internal blog is buried among millions of pages have any impact if we want the content indexed and value passed through the links from the blog content to our main site pages?
Content Development | | CondoRich0 -
Why is content getting longer?
I find it odd that, with the way life is today -- the gotta-have-it-now, instant gratification, can't-hold-someone's-attention-for-longer-than-3-seconds way of things -- Google wants content to be REALLY long. I've read articles saying content should be as long as 2,000 words per page. This just seems nuts to me. No one wants to read anymore. Look at how short Twitter posts are and how prevalent video is now. Any thoughts?
Content Development | | SEOhughesm0 -
Free Duplicate Content Checker Tools?
Hi Moz, I am really looking for free tools that can handle my content duplication issue. I visited http://moz.com/community/q/are-there-tools-to-discover-duplicate-content-issues-with-the-other-websites, which suggested Copyscape, but that is paid. I want a FREE tool to handle my duplication issue. Thanks in advance. Best,
Content Development | | Futura Teginder1 -
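Not a replacement for a paid service like Copyscape, but as a rough free check the visible text of two URLs can be compared directly. A minimal PHP sketch (hypothetical command-line script; assumes allow_url_fopen is enabled and that a crude character-level similarity is good enough):

```php
<?php
// dup-check.php: rough duplicate-content check between two pages.
// Usage: php dup-check.php <url-a> <url-b>
list( , $urlA, $urlB ) = $argv;

function visible_text( $url ) {
    $html = file_get_contents( $url );           // fetch the raw HTML
    $text = strip_tags( $html );                 // drop markup, keep visible text
    return preg_replace( '/\s+/', ' ', $text );  // collapse whitespace
}

// similar_text() is slow on very long strings, but fine for single pages.
similar_text( visible_text( $urlA ), visible_text( $urlB ), $percent );
printf( "Pages are roughly %.1f%% similar\n", $percent );
```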
Duplicate Legal Content
Oftentimes lawyer websites will publish laws (codes, statutes, regulations, case law, etc.). They add no value to the text; it's just copy-pasted. Therefore, the same text/content may be on potentially hundreds of websites. Does Google interpret this as duplicate content, or does it recognize government content as special? I want to have the laws on my website as well, but I am debating whether or not to add nofollow tags. I'm also thinking about adding value to the content by breaking down the specific law. However, even then at least 50% of the content on the page will still be the law itself, and I'm not sure if that is enough to be considered duplicate content.
Content Development | | irnikij0 -
Nearly identical content
Hi Everybody, I'm just checking the warnings from SEOmoz and realized that there are a lot of duplicate page content problems on our site. In fact, some of them are not really duplicate content because there are subtle differences, e.g. the colour or pack size of products: http://www.szepsegbolt.hu/termekek/david_beckham_intimately_yours_for_man_eau_de_toilette_30_ml.html http://www.szepsegbolt.hu/termekek/david_beckham_intimately_yours_for_man_eau_de_toilette_50_ml.html What do you suggest: ignore this warning or change something on the site? Thank you in advance, Balint
Content Development | | SanomaMediaseo0 -
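One common alternative to simply ignoring the warning is to point near-identical variant pages (such as the 30 ml and 50 ml versions above) at a single preferred URL with rel=canonical. A minimal sketch, assuming the shop's head template is PHP-based (an assumption; the canonical tag itself is the point):

```php
<?php
// In the <head> template: point every size/colour variant of a product at one
// preferred URL so crawlers treat the variants as a single page.
// $canonical_url would normally come from the product record; hardcoded here.
$canonical_url = 'http://www.szepsegbolt.hu/termekek/david_beckham_intimately_yours_for_man_eau_de_toilette_50_ml.html';
echo '<link rel="canonical" href="' . htmlspecialchars( $canonical_url ) . '">' . "\n";
```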
Is it advisable to have unique pages for different cities/states even though there wouldn't be any actual differentiation in the content?
Is it advisable to have unique pages for different cities/states even though there wouldn't be any actual differentiation in the content? For example, should we have separate pages for "hammers in california" and "hammers in new york"? The product is the same and the content is more or less the same. The search volume for the individual queries is low, but collectively they add up to a large number. The unique title tags alone will automatically generate traffic. So does it make sense to make 50 such pages? Otherwise, is there any way to uniquely target 50 such queries/month/city?
Content Development | | DYo0 -
Duplicate content - 6 websites, 1 IP. Is the #1 site knocked down too?
Yes, I know, running multiple websites on 1 IP isn't smart, and 6 websites with duplicate content on 1 IP is even worse. It's a technical issue we can't solve quickly. Thing is, our #1 website, which has the highest DA and PR, was the first website with all this content. All the other websites we're running were launched a few months, and some a few years, later. All content was copied from the #1 website. I'd say the other websites would get knocked down by Google because they duplicated the content; Google should see that our #1 website was the first to publish this content, so our #1 website should rank normally. Question is: what does Google think of duplicate content when all the websites are on 1 IP? Is, or will, our #1 website get punished as well?
Content Development | | Webprint0 -
404 errors
I'm experiencing a different problem. I created a website using WordPress, "www.xxxxx.com". When I publish any post (e.g. "www.xxxxxx.com/abc/"), along with the post it also creates another page like "www.xxxxxx.com/abc/www.xxxxx.com", which shows up as a 404 error in SEOmoz reports. The website name is automatically being added to the URL (once at the start and once at the end), creating the 404. Please let me know why this is happening and how to redirect all 404 and misspelt pages to the home page. Thank you, Raja
Content Development | | rajaletstalk0
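For the "redirect all 404 pages to the home page" part of the question, a minimal WordPress sketch, assuming it goes in the active theme's functions.php (the hook and functions are standard WordPress APIs; whether a blanket 404-to-home redirect is wise is a separate judgement call):

```php
<?php
// Send any request that WordPress resolves to a 404 back to the home page.
add_action( 'template_redirect', function () {
    if ( is_404() ) {
        wp_redirect( home_url(), 301 ); // permanent redirect to the front page
        exit;
    }
} );
```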