How to Implement Massive SEO Modifications
-
Hi everyone,
I'm implementing some fairly significant changes on a client's website and wanted to know whether it's better to implement all the changes at once or to roll them out gradually.
The changes are:
1. Amended information architecture
2. Completely new URLs
3. New meta data and some new on page content
4. Meta robots 'noindex, follow' on approximately 90% of the site
Can I make all these changes in one go (that would be my preference), or should I gradually implement? What are the risks?
Many thanks
James
-
Hi Joe,
Thanks for the response. Having had a variety of different opinions, and still not being 100% sure of the right answer, I spent a LOT of time crawling through SEOmoz Q&A.
Takeaways from my digging around are:
- Changes to title tags and URLs should be implemented separately. As you state above, the reason for this is so that you can pinpoint problems if they arise (see point 3 of the answer): http://www.seomoz.org/qa/view/49136/revising-urls
- Title tag changes should also be implemented in stages: homepage, top 50 pages, then everything else (again, see point 3 of the answer): http://www.seomoz.org/qa/view/39946/title-tags-global-changes. (As an interesting aside, Dr Pete clearly states that when making sitewide changes, don't make more than one set of changes per page, as it could cause an over-optimisation penalty.)
- URL structure changes should be implemented all in one go: http://www.seomoz.org/qa/view/45183/update-url-structure (this link is an amazing guide from Everett Sizemore on exactly how to implement URL changes - recommended reading!)
I appreciate there's no right and wrong answer, but I think that with the above in mind, the approach I'm going to take to these changes is a scientific one. Make a change, assess results, move forward.
1. Implement title tag changes in stages (monitoring site performance at every stage). Homepage/Category Pages/Everything else.
2. Add new on-page content.
3. Add new information architecture (a couple of new categories - nothing significant)
4. Implement URL changes through 301 redirects all in one go. Keep the old XML sitemap in place; once the site has been crawled (and the new pages found), move to the new sitemap and update internal links.
5. Implement meta robots 'noindex, follow' on various sections of the site - not all in one go, but section by section, monitoring results and moving on if no issues arise.
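For reference, the 'noindex, follow' directive in the last step is just a meta tag placed in the head of each page to be de-indexed - a minimal sketch (the surrounding markup is illustrative only):

```html
<head>
  <title>Example category page</title>
  <!-- Tells search engines not to index this page, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

Each engine's crawler has to re-fetch the page to see the tag, which is one reason to monitor section by section rather than expecting an immediate drop from the index.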
I'd be interested to know what you think of that as a plan. Also, I need to send out love to Dr Pete and Everett Sizemore for their Q&A answers!
James
-
#2 - completely new URLs says it all for me. The others are all subsets of that change. If possible, you need to address these changes with some form of 301 redirect so that the spiders can follow your changes. Update the .htaccess file, or even create static PHP redirect headers or similar if you have to. This should prevent the search engines from reporting the dreaded 404 and the pages getting dumped from the index.
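To illustrate the .htaccess approach mentioned above, here is a minimal sketch. The paths are hypothetical and would need to match your actual old-to-new URL mapping:

```apache
# Enable mod_rewrite (the module must be loaded on the server)
RewriteEngine On

# Pattern-based 301: map a whole old section onto its new equivalent
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]

# Or simple one-to-one 301s for individual pages (mod_alias)
Redirect 301 /old-page.html /new-page.html
```

After deploying, it's worth spot-checking a handful of old URLs (e.g. with `curl -I`) to confirm they return a single 301 hop to the new URL rather than a chain of redirects or a 302.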
#4 - The noindex is not something you have to worry about, as you are removing pages from the SERPs, not trying to get them ranked. Any page that gets noindexed is out of the SEO equation at this point.
#3 - This will improve rankings/searchability, so you are not looking at a negative effect here. Updates on these pages, if done correctly, generally have favourable results and at worst show 'no change' in the SERPs.
#1 - I would need to know more detail on this one, but the new architecture will probably be reflected in #2's URLs, so if that is solved, so is #1. Again, a clearer, more easily accessible architecture hopefully allows the spiders to effectively categorise the sections of your site. The new IA will probably be more pleasing to users too, which will have its own benefits.
---- And the final vote... all at once; just address the 404s and you should be OK.
-
NoIndex 90% of a site? I'd be interested to hear why that makes sense in any situation. Maybe only implement half of those noindex tags at first to see if you get the desired result.
As for the title, meta and content, all at once is fine. Hopefully your new stuff is better than the old! Best of luck!
-
I came here to tell him to do the exact opposite! I was going to suggest doing one change at a time to measure and/or A/B test results, to make sure each change delivered its maximum benefit. After reading your response and his issues, I've changed my opinion and agree with you that it's probably best to do all of these at once in one MAJOR revision and then tweak after that.
-
Considering how massive the changes are, I'd say it's best to do them all at once. This will let you start rebuilding as soon as possible. Making one big change and then waiting to start ranking again, followed by another big change that could drop them out of the rankings again would likely cause a longer period of your client not getting traffic. I wouldn't say that the on-page and metadata changes need to be made at the same time, if there are limited resources.
One problem with doing this all at once is that it will be more difficult to evaluate the effect of each change. This might not be a huge deal to you, but sometimes it is nice to know what return came from each change.