Unpublish and republish old articles
-
This might be a dumb question, but we had an incident where a new SEO guy thought it would be a good idea to unpublish and republish all of our 200+ blog posts, which we had carefully scheduled over the last 6 months. He did not update the content and did not change anything; his intention was to send Google a signal to recrawl the pages, or something along those lines. Now the entire blog looks like it went live in one day, which I don't think is good. Should we load a backup and get our old publishing dates back, or should we keep the new publishing dates? What are the consequences? Will it affect our SEO?
-
You guys are awesome!! Thank you so much! That is exactly what we thought! Thank you for confirming; we loaded a backup.
-
From a user standpoint, you should definitely roll back to the old dates. Otherwise it's going to raise suspicion as to why they've all been published on the same date, and for current-event articles, having the wrong date is not helpful at all for readers looking at the relevant time frame.
If the content was updated, on the other hand, then it's worth updating the publish date on the article to the date it was updated. And the article will probably be all the better for it, in terms of SEO!
However, as it is, I don't think it's going to have any noticeable effect on the SEO to have the dates as they were or as they are now. But think of the users.
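If rolling back isn't as simple as restoring a full backup, one way to audit which dates changed is to diff an export from the backup against the live site. Below is a minimal sketch, assuming hypothetical CSV exports (backup_posts.csv and live_posts.csv) with slug and publish_date columns; a real CMS export will look different:

```python
import csv

# Hypothetical exports: one from the backup, one from the live site,
# each with "slug" and "publish_date" columns (e.g. 2023-11-04).
BACKUP_CSV = "backup_posts.csv"
LIVE_CSV = "live_posts.csv"

def load_dates(path):
    """Map each post slug to its publish date."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["slug"]: row["publish_date"] for row in csv.DictReader(f)}

backup = load_dates(BACKUP_CSV)
live = load_dates(LIVE_CSV)

# List every post whose live date no longer matches the backup,
# so the original dates can be restored in the CMS.
for slug, original_date in sorted(backup.items()):
    current_date = live.get(slug)
    if current_date and current_date != original_date:
        print(f"{slug}: live {current_date} -> restore {original_date}")
```
-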
Related Questions
-
After you remove a 301 redirect that Google has processed, will the new URL retain any of the link equity from the old URL?
Let's say you 301 redirect URL A to URL B, and URL A has some backlinks from other sites. Say you left the 301 redirect in place for a year, and Google had already replaced the old URL with the new URL in the SERPs. Would the new URL (B) retain some of the link equity from URL A after the 301 redirect was removed, or does the redirect have to remain in place forever?
Technical SEO | johnwalkersmith
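A quick way to check whether a 301 like the one described above is still in place (and where it points) is to request the old URL without following redirects. This is a minimal sketch using the requests library and hypothetical URLs standing in for URL A and URL B:

```python
import requests

# Hypothetical URLs standing in for "URL A" and "URL B" in the question.
OLD_URL = "https://example.com/url-a"
EXPECTED_TARGET = "https://example.com/url-b"

# Fetch without following redirects so we can inspect the first hop itself.
resp = requests.get(OLD_URL, allow_redirects=False, timeout=10)

if resp.status_code in (301, 308):
    target = resp.headers.get("Location", "")
    print(f"{OLD_URL} still permanently redirects to {target}")
    print("matches expected target" if target == EXPECTED_TARGET else "unexpected target")
elif resp.status_code in (302, 307):
    print(f"{OLD_URL} redirects, but only temporarily ({resp.status_code})")
else:
    print(f"No redirect in place ({resp.status_code}); URL A resolves on its own again")
```
-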
Google rejected my reconsideration request for an unnatural-link manual action, and listed one blog article twice as an example?
Hi Moz Community, On April 22 my site received a manual action in Google Webmaster Tools for unnatural links. After a deep cleaning of all the sitewide links, which I think were the major problem with my external links, I submitted a reconsideration request on May 4. Google rejected it on May 29 and listed one blog article twice as an example, which is quite weird to me. Is it normal for Google to list one URL twice as an example in the feedback? I don't quite see the reason for that. Does anybody have any idea? This is really quite frustrating. And to be honest, I don't see many problems with the article Google listed either. Yes, it's all about our product and it has 3 do-follow links to our site, but it contains no words such as sponsor, advertisement, or rewards... The blog itself is quite healthy as well, and the post gets rather high engagement, with organic comments and shares. How did Google flag it? I don't think it's possible that Google goes through all our site's links one by one... Hope you guys can help me with this. Thanks in advance! Ben
Technical SEO | Ben_fotor
-
Updating product pages with new images - should I redirect old images?
Hello,
We have approx 900 products on our website. Over the coming months we will be replacing the product images. At the moment they have file names like 'green_widget_54eb3a78620be.jpg'; the random jumble at the end of the filename was apparently there to keep file names unique. We have removed the jumble part and will have file names like 'black_widget_with_stripe_001.jpg'. The CMS removes the old main image when a new main image is uploaded, but we could change this so the old image is left in place, just not used. My question is, should we: redirect the old file name to the new file name? Upload the new image and leave the old image in place? Or do we just ignore it?
Technical SEO | ninjahippo
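If the decision is to redirect the old file names to the new ones, the rules can be generated from a mapping export rather than written by hand. Here's a minimal sketch assuming a hypothetical image_renames.csv (two columns, old and new file name, no header) and an assumed /images/ path; it prints Apache-style rules to adapt as needed:

```python
import csv

# Hypothetical CSV exported from the CMS mapping old image file names to new ones,
# e.g. "green_widget_54eb3a78620be.jpg,green_widget_001.jpg" (no header row).
MAPPING_CSV = "image_renames.csv"
IMAGE_PATH = "/images/"  # assumed folder the product images live in

with open(MAPPING_CSV, newline="", encoding="utf-8") as f:
    for old_name, new_name in csv.reader(f):
        # Emit one Apache-style rule per image; adapt for nginx or your CMS as needed.
        print(f"Redirect 301 {IMAGE_PATH}{old_name} {IMAGE_PATH}{new_name}")
```
-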
Old pages - should I remove them from the SERPs?
Hi guys, I need some advice from you, a recommendation. I have some old landing pages from old campaigns, around 70 pages indexed in Google, for campaigns that are not available anymore. I have removed them from my DB, but they still remain on the server, so Google still sees them as URLs on my site, which makes sense. What should I do with these pages? Should I remove them completely (URL removal tool) or use rel=canonical? How will this affect my domain authority and rankings? These pages don't bring traffic anymore, maybe a view now and then, but overall they don't bring traffic.
Technical SEO | catalinmoraru
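Whatever option is chosen, it helps to know what those old landing-page URLs currently return before reaching for the removal tool. A minimal sketch that checks each one's status code; the URLs are placeholders, not from the question:

```python
import requests

# Hypothetical list of the old campaign landing-page URLs still indexed.
OLD_LANDING_PAGES = [
    "https://example.com/campaign-spring-2013",
    "https://example.com/campaign-summer-2013",
]

for url in OLD_LANDING_PAGES:
    try:
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    status = resp.status_code
    if status == 200:
        print(f"{url}: still live; decide whether to 301, 410, or noindex it")
    elif status in (301, 302, 307, 308):
        print(f"{url}: already redirects to {resp.headers.get('Location', '?')}")
    else:
        print(f"{url}: returns {status}")
```
-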
Syndicated content outranks my original article
I have a small site and write original blog content for my small audience. There is a much larger, highly relevant site that is willing to accept guest blogs, and they don't require original content. It is one of the largest sites within my niche, and many potential customers of mine are there. When I create a new article I first post it to my blog, then share it on G+, Twitter, Facebook, and LinkedIn. I wait a day; by this time Google has seen the links that point to my article and has indexed it. Then I post a copy of the article on the much larger site. I have a rel=author tag within the article, but the larger site adds "nofollow" to that tag. I have tried putting a rel=canonical link tag in the article, but the larger site strips that tag out. So Google sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog, but if not, will my blog get labeled as a scraper? Second: when I Google the exact blog title, I see my article on the larger site shows up as the #1 search result, but (1) there is no rich snippet with my author creds (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was stripped out as a duplicate). There are benefits to my article being on the larger site, since many of my potential customers are there and the article does include a link back to my site (the link is nofollow). But I'm wondering if (1) I can fix things so my original article shows up in the search results, or (2) am I hurting myself with this strategy (having Google possibly label me a scraper)? I do rank for other phrases in Google, so I know my site hasn't had a wholesale penalty of some kind.
Technical SEO | scanlin
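Since the larger site strips tags on publish, it's worth verifying programmatically whether a rel=canonical actually made it into the rendered page rather than trusting the editor. A minimal sketch with hypothetical URLs; it fetches the syndicated copy and reports what its canonical (if any) points to:

```python
import requests
from html.parser import HTMLParser

# Hypothetical URLs: the syndicated copy and the original post it should point to.
SYNDICATED_URL = "https://bigsite.example.com/guest-post"
ORIGINAL_URL = "https://myblog.example.com/original-post"

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag in the page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if tag == "link" and "canonical" in rel.split():
            self.canonical = attrs.get("href")

html = requests.get(SYNDICATED_URL, timeout=10).text
finder = CanonicalFinder()
finder.feed(html)

if finder.canonical == ORIGINAL_URL:
    print("Canonical tag survived and points at the original post")
elif finder.canonical:
    print(f"Canonical present but points elsewhere: {finder.canonical}")
else:
    print("No canonical tag in the syndicated copy (it was stripped)")
```
-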
Old massive site, should I nofollow all outgoing links?
The company I work for is in the process of rebuilding our entire website profile. Our biggest site, FrenchQuarter.com, ranks pretty well for our main term "French Quarter Hotels", and we use that to drive business directly to our hotel businesses. The site is very old and one I inherited; rebuilding it won't be a priority for another 9 months or so. The site acts as a bit of a directory for the city. There are links everywhere and it's probably passing link juice to a lot of businesses scot-free. In the meantime, would it benefit or hurt us if I went through and nofollowed most of the links? Would nofollowing links help frenchquarter.com rank any better than it does? And could I then direct some of that link juice directly at our hotel websites to boost those as well? My goal is to get our hotel websites to rank on the first page. We get exposure in the local pack for 2 of our 5 hotels, but placing below that seems near impossible with all the competition from the OTAs (Expedia, Bookit.com, etc.), no matter what my backlink profile looks like. Thanks for any feedback, Cyril
Technical SEO | Nola504
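Before deciding whether to nofollow anything, it helps to see how many outbound links a page actually passes link juice to today. This is a minimal sketch that lists the followed external links on a single page; the URL is a placeholder and the parsing is deliberately simple (relative links are treated as internal):

```python
import requests
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hypothetical page to audit; swap in a real directory page of the site.
PAGE_URL = "https://example.com/directory-page"
OWN_HOST = urlparse(PAGE_URL).hostname

class LinkAuditor(HTMLParser):
    """Collect external <a> links and whether each one carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.external = []  # (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        host = urlparse(href).hostname
        if host and host != OWN_HOST:
            rel = (attrs.get("rel") or "").lower()
            self.external.append((href, "nofollow" in rel))

auditor = LinkAuditor()
auditor.feed(requests.get(PAGE_URL, timeout=10).text)

followed = [href for href, nofollow in auditor.external if not nofollow]
print(f"{len(auditor.external)} external links, {len(followed)} of them followed:")
for href in followed:
    print(" ", href)
```
-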
Changed CMS - Google indexes old and new pages
Hello again, after posting the problem below I received the following answer and changed the sitemap name. Still I receive many duplicate titles and metas, as Google still compares old URLs to new ones and sees duplicate titles and descriptions... We have redirected all pages properly, we have changed the sitemap name, and the new sitemap is listed in Webmaster Tools; the old sitemap includes ONLY new sitemap files... The answer was: "When you deleted the old sitemap and created a new one, did you use the same sitemap xml filename? They will still try to crawl old URLs that were in your previous sitemap (even if they aren't listed in the new one) until they receive a 404 response from the original sitemap." If anyone can give me an idea why, after 3 months, Google still lists the old URLs, I'd be more than happy. Thanks a lot.
The original question: Hello, we have changed the CMS for our multi-language website and redirected all old URLs properly to the new CMS, which is working just fine. Right after the first crawl, almost 4 weeks ago, we saw in Google Webmaster Tools and SEOmoz that for almost every single page Google indexes the old URL as well as the new one, and reports duplicate meta tags for them. We deleted the old sitemap and uploaded the new one and thought that Google would then stop indexing the old URLs, but we still see a huge amount of duplicate meta tags. Does anyone know what else we can do so that Google does not index the old URLs anymore, but only the new ones? Thanks so much, Michelle
Technical SEO | Tit
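Given the advice quoted above (Google keeps recrawling the URLs from the previous sitemap until it gets a 404 for them), two things are worth verifying with a script: that the old URLs really do 301 to their new counterparts, and that the old sitemap file itself is gone. A minimal sketch with placeholder URLs and a hypothetical old sitemap filename:

```python
import requests

# Hypothetical examples: a few old-CMS URLs with their expected new locations,
# plus the filename the old sitemap used to live at.
OLD_TO_NEW = {
    "https://example.com/old-cms/page-1": "https://example.com/new/page-1",
    "https://example.com/old-cms/page-2": "https://example.com/new/page-2",
}
OLD_SITEMAP = "https://example.com/sitemap_old.xml"

for old_url, new_url in OLD_TO_NEW.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == new_url
    print(f"{old_url}: {resp.status_code} -> {location or 'no redirect'}"
          f" ({'OK' if ok else 'check this one'})")

# The old sitemap file itself should be gone (404/410); otherwise Google
# keeps re-requesting the URLs it once listed.
sitemap_status = requests.get(OLD_SITEMAP, timeout=10).status_code
print(f"{OLD_SITEMAP}: {sitemap_status}")
```
-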
Old Blog
I have an old blog that I started long ago and it has tons of content. I'm thinking about migrating it to my current blog but am worried about Panda and bringing over mediocre content. The content is fine; not bad, not good. Should I bring it over, or should I just delete the blog?
Technical SEO | tylerfraser