Temporary Blog Removal Process Question
-
Hello,
We need to temporarily remove the entire blog from one of our sites. The site relies very heavily on the blog content, as the rest of the site does not have much else. Is there a 'best' process for going about this? Should we put a 302 redirect on the entire blog, and if so, should it point to the homepage or an under-construction type page?
Any extra insight or suggestions would be helpful and appreciated.
Looking forward to hearing from you!
Thank you in advance for the help.
Best,
-
No problem at all.
The legal implications certainly make it more complicated! Unfortunately, the only suggestion I can really offer here is to block out a time when you can get through them all at once, or within a day or two at most.
The longer they're down, the greater the chance you'll drop in rankings and not see an easy recovery.
It may even be worth doing them in batches of 10-20 posts and re-publishing them as you finish each batch. At least this way you're steadily minimising the number of pages that are offline, rather than taking an all-or-nothing approach.
-
Thank you, Chris. We were thinking of taking the 302 redirect approach. The issue is actually with the images used in the blog articles. There are hundreds of posts, and many from a few years ago need to be evaluated for copyright concerns. While we sort through all of them, we will need to take the blog down temporarily.
Let us know if you have any other suggestions/approaches you think would be best. The extra insight is appreciated.
Looking forward to hearing from you!
Thanks again.
Best,
-
That's a very tough situation. Unless by "temporary" you mean a day or two, removing the only section of the site that's providing real value will almost certainly cause a pronounced downturn in your rankings.
I suppose the best way to handle it is the 302 to a "Sorry, our blog is undergoing some maintenance and will be back in March" type of page.
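If you go the 302 route, the redirect itself is only a couple of lines. A minimal sketch for an Apache .htaccess file, assuming the blog lives under /blog/ and the holding page sits at /blog-maintenance.html (both paths are hypothetical):

    # Hypothetical sketch: temporarily 302 every blog URL to a holding page.
    # Assumes Apache with mod_rewrite; adjust both paths to your own site.
    RewriteEngine On
    RewriteRule ^blog(/.*)?$ /blog-maintenance.html [R=302,L]

Using a 302 rather than a 301 tells search engines the move is temporary, so the blog URLs are more likely to keep their place in the index while they're offline.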
What I would actually suggest to a client in this scenario, though, is to build the new blog (assuming that's what's happening here) and have it ready to go before taking the current one down. You've probably thought of that, though, so I imagine there's something in this scenario that makes it impossible?
Related Questions
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of that junk was in the initial backup, i.e. from before 1 June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1 June 2012. Therefore: my dynamic sitemap now contains only articles with a release date between 1 June 2012 and now, and any article with a release date prior to 1 June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article.
The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal here: https://www.google.com/webmasters/tools/removals The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.
Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong. As far as I know, it will cause problems, because the search engines will try to access non-existent content that the sitemap declares to exist, and will report errors in Webmaster Tools.
Should I submit a deleted-items sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think that is for custom search engines only, not the generic Google search engine.
The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but rather ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and real articles alike) take the form http://www.example.com/docid=123456 So, how can I bulk remove all the junk from the Google index relatively fast?
Intermediate & Advanced SEO | ioannisa
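One widely used approach for this situation (a hedged sketch, not something from the question above) is to answer the dead article URLs with "410 Gone" rather than a soft 404 page, since Google treats a 410 as a somewhat stronger and faster removal signal than a 404. A minimal .htaccess sketch, assuming Apache with mod_rewrite, that requests arrive as query strings like ?docid=123456, and that every ID below 100000 happens to predate the cutoff (the threshold and URL form are purely hypothetical):

    # Hypothetical sketch: serve "410 Gone" for pre-cutoff article IDs.
    # Assumes Apache + mod_rewrite, URLs like /?docid=123456, and that
    # IDs of one to five digits (i.e. below 100000) all predate 1 June 2012.
    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^docid=[0-9]{1,5}$
    RewriteRule ^$ - [G,L]

With the dead URLs answering 410 and kept out of the sitemap, the index entries generally drop out on their own as Google recrawls; the removals tool is then only needed for URLs you want hidden immediately.
-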
Location in URLs question
Hi there, my company is a national theater news publisher. Quick question about a particular use case. When an editor publishes a story, they can assign several discrete locations, allowing it to appear in each of those locations within our website. This article (http://www.theatermania.com/denver-theater/news/full-casting-if-then-tour-idina-menzel_74354.html), for example, appears in the Los Angeles, San Francisco, and Denver sections. We force the author to choose a primary location from that list, which controls the location displayed in the URL. Is this a bad practice? I'm wondering if having 'Denver' in the URL is misleading and hurts SEO value, particularly since that article features several other cities.
Intermediate & Advanced SEO | TheaterMania
-
SEO of blogging websites
What are the best practices for doing SEO for article/blogging websites?
Intermediate & Advanced SEO | Obbserv
-
Scraping / Duplicate Content Question
Hi all, I understand that the way to protect content such as a feature-rich article is to create authorship by linking to your Google+ account. My question: you have created a webpage that is informative but not worthy of being an article, hence there is no need to create authorship in Google+. If a competitor comes along and steals this content word for word (or something similar) and creates their own Google+ page, can you be penalised? Is there any way to protect yourself without authorship and Google+? Regards, Mark
Intermediate & Advanced SEO | Mark_Ch
-
Unnatural Links Removal Strategy
Hi all, I want to discuss a strategy I applied on my website to dilute the value of toxic links pointing at it.
Issue: poor, spammy links had been built to the home page and some inner pages. Google considered those links unnatural and took manual action, and all rankings disappeared.
Strategy: I deleted all the landing pages that were over-linked from spam websites and created new landing pages with some modifications and new content. Because the home page (index.html) was also penalized by Google, I made changes to index.html and added a "noindex, nofollow" tag so that the bad links' value couldn't pass from index.html to the other inner pages (the newly created pages and the pages that were not over-optimized). I created a new index.php page and gave users the option to enter the website from index.html (the default home page) through to index.php. Finally, I blocked all the bad URLs (the unnatural link targets) through the .htaccess file, so that when a user or Googlebot arrives through one of those blocked URLs, the server responds with 403 (Access Denied). The domain I ran this experiment on is http://www.thebaildepot.com/index.php
Now I have doubts on two points: will blocking the unnatural link targets (403, Access Denied) from the .htaccess file really work? And is "noindex, nofollow" on the default page, with users given the option to navigate to the newly created index.php, a sound approach? I did this experiment around 10 days ago and the rankings still haven't returned to Google's top 100.
Intermediate & Advanced SEO | RuchiPardal
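For what it's worth, the 403 blocking described above usually amounts to a couple of mod_rewrite lines matching the over-linked pages. A hedged sketch, assuming Apache and two made-up landing-page paths:

    # Hypothetical sketch of the 403 approach described above: deny the
    # over-linked landing pages outright. The paths here are made up.
    RewriteEngine On
    RewriteRule ^(old-landing-1|old-landing-2)\.html$ - [F,L]

Whether it helps is another matter: a 410 (the [G] flag) is a clearer "this page is permanently gone" signal than a 403, and for a manual action the documented path is to clean up or disavow the offending links and file a reconsideration request, rather than to hide the penalized pages.
-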
Ecommerce Internal Linking Questions
I am a bit confused about internal linking for ecommerce sites. Is it wise to link, say, every instance of the term "boots" in the review section to the boots page? Zappos is doing this. Wouldn't this incur a Penguin penalty, since all of the internal anchor text to that page is "boots"? Scroll down to the bottom and check out their reviews: http://www.zappos.com/tony-lama-6071l Is this the wise way to go about internal linking? Thanks
Intermediate & Advanced SEO | WayneRooney
-
Duplicate Content Question
Hey everyone, I have a question regarding duplicate content. If your site is penalized for duplicate content, is it just the pages containing that content that are affected, or is the whole site affected? Thanks 🙂
Intermediate & Advanced SEO | jhinchcliffe
-
Separate IP Address for Blog
Our developers are recommending we sign up for a cloud-based LAMP (Linux, Apache, MySQL, & PHP) server to install 3rd-party software (WordPress). They said "the blog will be on a separate IP address and potentially can have some impact with SEM/SEO." Can anyone expand on what impact this might have versus having it on the same IP?
Intermediate & Advanced SEO | pbhatt