Sites with dynamic content - GWT redirects and deletions
-
We have a site that has extremely dynamic content.
Every day they publish around 15 news flashes, each of which is set up as a distinct page of around 500 words. The URL structure is bluewidget.com/news/long-news-article-name, with no timestamp in the URL.
After a year, that's a lot of news flashes. The database was getting inefficient (it's managed by a ColdFusion CMS), so we started automatically deleting old news flashes from the database, which sped things up.
The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404s. There are so many that it's hard to spot the non-news 404s, and I understand that having that many missing pages can be a negative quality signal to Google.
We toyed with setting up redirects, but the volume would be so large that loading a huge htaccess file on every request would slow the site down again.
Because there isn't a datestamp in the URL, we can't create a mask in the htaccess file that automatically redirects all bluewidget.com/news/yymm* URLs to bluewidget.com/news.
These long-tail pages do send traffic, but for speed we only want to keep the last month of news flashes at most.
What would you do to avoid Google thinking it's a poorly maintained site?
-
Get someone to look at the database queries in the ColdFusion code. Unless you have tens of millions of flashes, even a reasonably modest server should handle it at your traffic levels. It doesn't sound like it should be taxing.
It sounds like your real problem is some badly structured queries. The good news is that this is probably quicker and easier to fix than upgrading hosting, coding new removal behaviour, or any other workaround.
What would you do to avoid Google thinking it's a poorly maintained site?
Sorry to sound glib, but the answer is "maintain it better".
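To illustrate the kind of query fix this answer points at, here is a hedged sketch; the datasource, table, and column names are assumptions for illustration, not the asker's actual schema:

```cfml
<!--- Hypothetical sketch: a parameterized, cached article lookup.
      Combined with an index on the slug column (one-off DDL, run once
      against the database):
          CREATE INDEX idx_news_slug ON news (slug);
      a query like this stays fast even with years of rows, so nothing
      needs deleting for performance reasons. --->
<cfquery name="article" datasource="site"
         cachedwithin="#createTimespan(0, 1, 0, 0)#">
    SELECT title, body, published_at
    FROM news
    WHERE slug = <cfqueryparam value="#url.slug#"
                               cfsqltype="cf_sql_varchar"
                               maxlength="200">
</cfquery>
```

The cfqueryparam tag also lets the database reuse the query plan across requests, which is often the difference between a "slow CMS" and a fast one.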
-
Well, to be honest Chris, it is a poorly maintained site. Deleting past news is not the solution. The right fix is on the hosting side: enhance the database, or the problem will keep recurring.
Using that many redirects will cause another problem, and at the same time it will slow down your site.
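One way around the redirect-volume concern, as a hypothetical sketch rather than a description of the asker's setup: instead of one htaccess line per deleted article, a single catch-all handler in Application.cfc can intercept requests for missing /news/ templates and 301 them to the news index (this relies on onMissingTemplate firing, which depends on how the CMS routes its URLs):

```cfml
// Application.cfc (script syntax, CF11+/Lucee). Hypothetical sketch:
// one catch-all handler replaces thousands of per-article redirect rules.
component {
    this.name = "bluewidget";

    public boolean function onMissingTemplate(required string targetPage) {
        if (reFindNoCase("^/news/", arguments.targetPage)) {
            cfheader(statusCode = 301, statusText = "Moved Permanently");
            cfheader(name = "Location", value = "/news/");
            return true;  // handled; suppress the default 404
        }
        return false;     // anything else falls through to the 404 page
    }
}
```

Because the logic lives in one handler, the cost per request stays constant no matter how many articles have been deleted.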
The 404s are being generated because the URLs are still in Google's index but the corresponding pages no longer exist on your site. The quickest way to reduce them is to send removal requests to Google through Google Webmaster Tools.
And for the future: instead of removing pages, invest in the database and better hosting.
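Alongside removal requests, the CMS itself can answer 410 Gone for permanently deleted articles; Google generally treats a 410 as a stronger "drop this URL" signal than a 404. A minimal sketch, with assumed datasource, table, and column names:

```cfml
<!--- In the news article handler: if the slug no longer exists in the
      database, answer 410 Gone instead of a generic 404 so the URL
      drops out of the index sooner. --->
<cfquery name="article" datasource="site">
    SELECT id
    FROM news
    WHERE slug = <cfqueryparam value="#url.slug#"
                               cfsqltype="cf_sql_varchar">
</cfquery>
<cfif article.recordCount EQ 0>
    <cfheader statuscode="410" statustext="Gone">
    <cfabort>
</cfif>
```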