False Soft 404s, Shadow Bans, and Old User Generated Content
-
What are the best ways to keep old user generated content (UGC) pages from being falsely flagged by Google as soft 404s? I have tried HTML sitemaps to make sure no page is orphaned, but that has not solved the problem.
Could "crawled, currently not indexed" be explained by a shadow ban from Google? I have had problems with Google removing pages from the SERPs without telling me about it.
It looks like a lot of content is not ranking due to its age. How can one go about refreshing UGC without altering the user's original work?
-
I solved most of the soft 404s: I changed the URLs, submitted a sitemap with the new URLs, and 301-redirected the old URLs to the new ones. This affected 85 URLs, and as of now 60 are back in Google's index while the rest are listed as "discovered, currently not indexed."
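For anyone doing a similar migration, the redirect rules and the sitemap can be generated from a single old-to-new URL mapping so they never drift apart. A minimal sketch in Python (the domain and paths here are hypothetical, not the actual site):

```python
# Generate Apache-style 301 redirect rules and an XML sitemap for a
# batch of moved UGC pages. All URLs below are hypothetical examples.

url_map = {
    "/ugc/old-post-1": "/community/old-post-1",
    "/ugc/old-post-2": "/community/old-post-2",
}

def redirect_rules(mapping):
    """One 'Redirect 301 <old> <new>' line per moved URL (Apache mod_alias syntax)."""
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]

def sitemap_xml(mapping, domain="https://example.com"):
    """Sitemap listing only the NEW URLs, ready to submit in Search Console."""
    urls = "\n".join(
        f"  <url><loc>{domain}{new}</loc></url>" for new in mapping.values()
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n</urlset>"
    )

print("\n".join(redirect_rules(url_map)))
print(sitemap_xml(url_map))
```

Driving both outputs from the one mapping means a URL can't end up redirected but missing from the sitemap, or vice versa.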
I also added an HTML sitemap just for those URLs so that they are not orphaned.
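The HTML sitemap page can be built from the same list of new URLs, so the de-orphaning stays in sync with the migration. A sketch, again with hypothetical paths:

```python
# Build a bare-bones HTML sitemap fragment linking every new UGC URL,
# so none of the migrated pages are orphaned. Paths are hypothetical.

new_urls = [
    "/community/old-post-1",
    "/community/old-post-2",
]

def html_sitemap(paths, domain="https://example.com"):
    """Return a <ul> of crawlable links, one per migrated page."""
    links = "\n".join(
        f'    <li><a href="{domain}{p}">{p}</a></li>' for p in paths
    )
    return f"<ul>\n{links}\n</ul>"

print(html_sitemap(new_urls))
```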
I am still puzzled by the initial soft 404 status, however. The new URLs differ only in the URL itself; everything else is identical.