Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Any Tips for Reviving Old Websites?
-
Hi,
I have a series of websites that have been offline for seven years. Do you guys have any tips that might help restore them to their former SERPs glory?
Nothing about the sites themselves has changed since they went offline: same domains, same content, only a different server. What has changed is the SERPs landscape. I've noticed that competitive terms these sites used to rank on the first page for now return far more results. I have also noticed that some terms return thesaurus-like, similar-language results from traditionally more authoritative websites instead of the exact phrase searched for. This concerns me because I could see a less relevant page outranking me simply because it sits on a .gov domain with similar vocabulary, even though it isn't what people searching for the term are actually looking for.
The sites have also lost numerous backlinks but still have some really good ones.
-
We would highly recommend writing very high-quality, evergreen content.
We would also recommend building very high-quality backlinks, both dofollow and nofollow.
You must also make sure that your web design company builds a website that offers a good user experience, so it's simple for shoppers to use.
-
Content Refresh: Update outdated content, add new information, and improve formatting to make it more engaging and relevant to current trends.
SEO Audit: Conduct a thorough SEO audit to identify and fix issues such as broken links, outdated keywords, and poor site structure.
Mobile Optimization: Ensure your website is mobile-friendly, as more users are accessing the internet through mobile devices.
Speed Optimization: Improve page loading speed by optimizing images, minifying CSS and JavaScript files, and using caching techniques.
Backlink Analysis: Review and disavow low-quality or spammy backlinks while seeking opportunities to acquire high-quality backlinks from reputable sources.
User Experience Enhancement: Enhance user experience by improving navigation, implementing clear calls-to-action, and optimizing for readability.
Social Media Integration: Promote your website through social media channels to increase visibility and attract more traffic.
Update Design: Modernize the website design to reflect current design trends and improve overall aesthetics.
Regular Updates: Commit to regularly updating the website with fresh content, news, or blog posts to keep visitors engaged and encourage return visits.
Analytics Monitoring: Use website analytics tools to monitor traffic, user behavior, and conversion rates, and make data-driven decisions to optimize performance.
By implementing these strategies, you can breathe new life into your old website and improve its visibility, usability, and overall effectiveness.
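For the SEO-audit step, even a short script can surface broken internal links before a full crawl tool is brought in. This is just a minimal sketch using only Python's standard library; the URLs in the usage example are placeholders, and a real audit tool would cover far more than link status codes.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags, resolved against the page URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links ("/about") against the page URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return every link found in an HTML document, in source order."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def check_link(url, timeout=5):
    """Return the HTTP status code for a URL, or None if the request fails."""
    import urllib.request
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except Exception:
        return None
```

Feeding each page's HTML through `extract_links` and then passing the results to `check_link` gives a rough list of dead links to fix or redirect.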
-
Improving the organic SEO of an old company website is the same as the SEO you would apply to a brand-new company website: white-hat SEO.
You do need high-quality content marketing and good-quality backlinks. We own a summerhouse company, and this is how we got the business onto the first page of Google.
-
If you are reviving an old website, make sure it is mobile-friendly. Then refresh the content and update page titles and meta descriptions. Also make sure you add new content regularly.
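When updating titles and meta descriptions across an old site, a small script can flag pages that are missing them or have overly long ones. A sketch using only the standard library; the roughly-60-character title and roughly-160-character description limits are common rules of thumb, not hard requirements.

```python
from html.parser import HTMLParser


class TitleMetaParser(HTMLParser):
    """Pull the <title> text and meta description out of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def audit_page(html, max_title=60, max_desc=160):
    """Return a list of title/description issues found in one page's HTML."""
    p = TitleMetaParser()
    p.feed(html)
    issues = []
    if not p.title.strip():
        issues.append("missing title")
    elif len(p.title) > max_title:
        issues.append("title too long")
    if not p.description:
        issues.append("missing meta description")
    elif len(p.description) > max_desc:
        issues.append("meta description too long")
    return issues
```

Running this over every restored page gives a quick worklist of which titles and descriptions still need attention.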
-
That's a good question and I'd agree - I imagine that references to your website in published books online could be treated similarly to mentions across the web. Whether Google gives it any extra weight or not is unclear, but I'd agree that the implication is that a mention in a published book could carry some weight.
-
Thank you for the replies. They give me more hope because I was thinking along similar lines.
I certainly plan on reaching out to the authors of old articles that lost links, but I am not so sure sometimes. One of the old websites got its coverage specifically from being controversial, so I am not sure whether they unlinked because the site was down or because of complaints from people pointing out that linking to it was helping it. I have been noticing articles like https://moz.com/learn/seo/backlinks and I would hate to risk losing mentions on high-quality sites by drawing the attention of new editors who might just delete the articles entirely.
Another question I have relates to mentions in books. I have noticed a site of mine showing up in Google Books, in a couple of published books discussing it. Does that help SEO the way a brand mention on a high-quality site does?
I would think that Google would consider sites mentioned in published books to be more authoritative than ones just mentioned in blogs or news stories.
-
Hi there,
I'd suggest a few things:
1. If you have old analytics data or log file data to show you which content performed best when the site was last live, take a look at that and prioritise restoring and updating the content which worked well previously.
2. Go through the content and update it with fresh information, data, images, links etc. to give everything a freshen-up. Don't worry if content is still relevant and evergreen, but do some checks to make sure.
3. Once you've updated the content and you're happy with it, generate some new XML sitemaps and submit to Google Search Console to prompt Google to crawl the pages again and get them into the index.
4. In addition, perhaps submit the homepage and a few key pages to Google Search Console for crawling and indexing.
5. Once the pages are indexed, keep an eye on Search Console to see how pages are performing and use this data to update the most popular pages.
6. In terms of links, if you can restore any valuable lost ones by reaching back out to the websites, letting them know that the site has relaunched and seeing if they can restore the links, that may give it a nudge too.
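For the sitemap step, if the relaunched site has no CMS that generates one automatically, a minimal XML sitemap can be built by hand. A rough sketch; the URL list and date in the usage example are placeholders, and large sites would want the full protocol (index files, per-URL lastmod dates, etc.).

```python
from datetime import date
from xml.sax.saxutils import escape


def build_sitemap(urls, lastmod=None):
    """Build a minimal XML sitemap string for the given list of URLs."""
    lastmod = lastmod or date.today().isoformat()
    entries = "\n".join(
        "  <url><loc>{}</loc><lastmod>{}</lastmod></url>".format(escape(u), lastmod)
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries
        + "\n</urlset>\n"
    )
```

Write the result to `sitemap.xml` at the site root and submit that URL in Search Console.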
I hope that helps!
Paddy
-
Hi,
As previously stated by seotoolshelp5, with the addition of:
1. Check for any issues with dead links pointing to these websites
2. Check for crawling errors
3. Check website speed and improve it if necessary
4. Prioritize the mobile version (if you don't have one, create it)
That's all I can think of for now.
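For the speed check in point 3, you can get a first rough number without any tooling. A sketch using only the standard library; the 2-second threshold is an arbitrary assumption for illustration, not a figure from Google, and a real audit would use something like PageSpeed Insights.

```python
import time
import urllib.request


def time_page(url, timeout=10):
    """Fetch a URL and return (status, seconds taken, bytes received)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
        return resp.status, time.monotonic() - start, len(body)


def slow_pages(timings, threshold=2.0):
    """Given (url, seconds) pairs, return the URLs slower than the threshold."""
    return [url for url, seconds in timings if seconds > threshold]
```

Collect `(url, seconds)` pairs with `time_page` across the site, then feed them to `slow_pages` to get a prioritized list of pages to optimize.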
-
Related Questions
-
Website Snippet Update in Search Console?
I have a company that I started working with that has an outdated and inaccurate snippet coming up. See the link below. They changed their name from DK on Pittsburgh Sports to just DK Pittsburgh Sports several years ago, but the snippet still shows the old info, including an outdated and incorrect description. I'm not seeing that title or description anywhere on the site or in a schema plugin. How can we get it updated? I have updated titles, etc. for the home page, and done a Fetch to get re-indexed. Does the snippet have a different type of refresh that I can submit or edit? Thanks in advance https://g.co/kgs/qZAnAC
Intermediate & Advanced SEO | jeremyskillings0
-
Redirected Old Pages Still Indexed
Hello, we migrated a domain onto a new Wordpress site over a year ago. We redirected (with plugin: simple 301 redirects) all the old urls (.asp) to the corresponding new wordpress urls (non-.asp). The old pages are still indexed by Google, even though when you click on them you are redirected to the new page. Can someone tell me reasons they would still be indexed? Do you think it is hurting my rankings?
Intermediate & Advanced SEO | phogan0
-
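One way to sanity-check a migration like the one described above is to confirm that each old URL answers with a single permanent 301 (not a 302 or a redirect chain) pointing at the right new URL. A sketch, stdlib only; `redirect_status` makes a live request, while `is_clean_301` is just the comparison on the result.

```python
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect them."""

    def redirect_request(self, *args, **kwargs):
        return None  # makes urllib raise HTTPError for 3xx responses


def redirect_status(url, timeout=10):
    """Return (status, Location header) for a single request, no follow."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=timeout)
        return resp.status, None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")


def is_clean_301(status, location, expected):
    """True only for a single permanent redirect to the expected target."""
    return status == 301 and location == expected
```

Old URLs staying in the index for a while after a migration is normal; 301s are recrawled and consolidated gradually, so the main thing to verify is that the redirects themselves are correct.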
How to de-index old URLs after redesigning the website?
Thank you for reading. After redesigning my website (5 months ago), in my crawl reports (Moz, Search Console) I still get tons of 404 pages, which all seem to be URLs from my previous website (same root domain). It would be nonsense to 301 redirect them as there are too many URLs (or would it be nonsense?). What is the best way to deal with this issue?
Intermediate & Advanced SEO | Chemometec0
-
Website completely delisted - reasons?
Hi, I got a request from a potential client who does not understand why his website cannot be found on Google. I checked and found that the complete website is not listed at all (a complete delisting) - except for one PDF file.
I've checked his robots.txt - that is OK. I've checked the META robots - they are index,follow ... OK so far. I've checked his backlinks but could not find any massive linking from bad pages - just six backlinks, and only four of them from designdomains.com, which looks like a link list or so. I've requested access to their GWT account, if available, in the hope of finding more info, but does anyone have a quick idea what else it could be? What could be the issue? I think they got delisted for some bad reason ... Let me know your ideas 🙂 THANX 🙂 Sebi
Intermediate & Advanced SEO | TheHecksler
-
Archiving a festival website - subdomain or directory?
Hi guys, I look after a festival website whose program changes year in and year out. There are a handful of mainstay events in the festival which remain each year, but there are a bunch of other events which change each year around the mainstay programming. This often results in us redoing the website each year (a frustrating experience indeed!) We don't archive our past festivals online, but I'd like to start doing so for a number of reasons:
1. These past festivals have historical value - they happened, and they contribute to telling the story of the festival over the years. They can also be used as useful windows into the upcoming festival.
2. The old events (while no longer running) often get many social shares, high-quality links and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative (so these redirects often go to the homepage).
Anyway, I've noticed some festivals archive their content into a subdirectory - i.e. www.event.com/2012. However, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com - and always use the www.event.com URL for the current year's event. I'm thinking universally redirecting the content would be easier, as would cloning the site / database etc. My question is - is one approach (i.e. directory vs. subdomain) better than the other? Do I need to be mindful of using a subdomain for archival purposes? Hope this all makes sense. Many thanks!
Intermediate & Advanced SEO | cos20300
-
Effects of having both http and https on my website
You are able to view our website as either http or https on all pages. For example: you can type "http://mywebsite.com/index.html" and the site will remain on http: as you navigate. You can also type "https://mywebsite.com/index.html" and the site will remain on https: as you navigate. My question is: if you can view the entire site using either http or https, is this being seen as duplicate content/pages? Does the same hold true for "www.mywebsite.com" and "mywebsite.com"? Thanks!
Intermediate & Advanced SEO | rexjoec1
-
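To check a setup like the one described above, it helps to first enumerate all four scheme/host variants of a URL and then fetch each one to see whether all but the canonical version redirect. A small helper sketch, stdlib only; the domain comes from the placeholder in the question.

```python
from urllib.parse import urlsplit, urlunsplit


def scheme_variants(url):
    """Return the http/https and www/bare-host variants of a URL, sorted."""
    parts = urlsplit(url)
    host = parts.netloc
    # Strip a leading "www." to get the bare host, then build both forms.
    bare = host[4:] if host.startswith("www.") else host
    hosts = {bare, "www." + bare}
    return sorted(
        urlunsplit((scheme, h, parts.path, parts.query, ""))
        for scheme in ("http", "https")
        for h in hosts
    )
```

Fetching each variant (e.g. with `urllib.request`) and confirming that three of the four answer with a 301 to the remaining one is the usual way to rule out the duplicate-content concern; a canonical tag on every page is the common complement to those redirects.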
Transfer link juice from old to new site
Hi seomozzers, The design team is building a new website for one of our clients. My role is to make sure all the link juice is kept. My first question: should I just set up 301s, or is there another technique for preserving all the link juice from the old site to the new one that I should be focusing on? Second question: is it OK to transfer link juice using dev URLs like www.dev2.example.com (new site) or 182.3456.2333, or should I wait for the real URLs to be created before doing the link juice transfer? Thank you 🙂
Intermediate & Advanced SEO | Ideas-Money-Art0
-
SeoMoz Crawler Shuts Down The Website Completely
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what was causing it. Unfortunately, the direct answer was that it was coming from "rogerbot" (see sample below). Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to shut down the SeoMoz service if I can't resolve this immediately. I guess my question is: would it be possible to make sure this doesn't happen because of rogerbot? Please let me know if anyone has an answer for this. I use your service a lot and I really need it. Here are the error lines that caused it:
216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
Intermediate & Advanced SEO | Jury0
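Log lines like the ones quoted above can be filtered with a short script to see exactly which requests came from a given crawler. A sketch that parses the combined log format; the regex assumes that exact format, and the bot name to match on is case-insensitive.

```python
import re

# Apache/nginx "combined" log format: ip, identd, user, [time],
# "request", status, size, "referer", "user-agent".
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\d+) "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)


def bot_hits(lines, bot_name="rogerbot"):
    """Return (ip, request) pairs for log lines from the given crawler."""
    hits = []
    for line in lines:
        m = LOG_LINE.match(line)
        if m and bot_name.lower() in m.group("agent").lower():
            hits.append((m.group("ip"), m.group("request")))
    return hits
```

Feeding the access log through `bot_hits` shows how often the crawler is hitting the site, which is the first step toward deciding whether rate-limiting it (or the server configuration) is the real problem.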