Best method to update navigation structure
-
Hey guys,
We're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me.
I am setting up 1:1 301 redirects from the old URLs to the new navigation structure so that incoming links get where they need to go. What I'm wondering is: what's the best way to make sure the SERPs are updated quickly without trashing my domain quality, while ensuring my page and domain authority are maintained?
The old links won't be anywhere on the new site. We're swapping the DNS record to the new site so the only way for the old URLs to be hit will be incoming links from other sites.
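Not that you'd necessarily implement it this way, but the core of a 1:1 redirect setup is just an exact-match lookup from old path to new path. A minimal sketch in Python (the paths are made-up placeholders, not your actual URLs); anything not in the map falls through to a genuine 404, so truly dead URLs still report their real status:

```python
# Minimal sketch of a 1:1 redirect map. The paths are hypothetical
# placeholders; in practice the mapping would come from your CMS or a
# spreadsheet of old -> new URLs.
REDIRECT_MAP = {
    "/old/widgets.html": "/products/widgets/",
    "/old/about-us.html": "/about/",
}

def resolve(path):
    """Return (status, location): 301 for mapped URLs, 404 otherwise."""
    new_path = REDIRECT_MAP.get(path)
    if new_path is not None:
        return 301, new_path
    return 404, None
```

The important design choice is that it's exact-match only: a blanket catch-all to the root would mean nothing ever 404s, which comes up again below.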
I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one. Currently we don't have one at all (we're kind of starting from the bottom with SEO).
Also, we could use the old URLs for a few weeks on the new site to ensure they all get updated as well. It'd be a bit of work, but may be worth it.
I read this article and most of that seems to be covered, but just wanted to get the opinions of those who may have done this before. It's a pretty big deal for us.
http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
Am I getting into trouble if I do any of the above, or is this the way to go?
PS: I should also add that we are not changing our domain. The site will remain on the same domain. Just with a completely new navigation structure.
-
It all depends on what you're trying to achieve. If you want people to see a 404 page, then serve them a useful 404 page.
If you're trying to pass on link value, then you should 301 each old URL to the new page most relevant to what that URL used to contain.
For me, a 404 error is a great opportunity to catch the visitor and give them something of use.
If you redirect URLs that would otherwise 404, you'll also reduce the errors reported for your site, which can only be a positive thing, right?
-
Thanks for the reply.
About redirecting pages that don't exist anymore: I thought of doing that, but isn't that what the 404 page is for? I was going to redirect all other pages to the root, but that would likely mean we'd never return a 404 response at all.
Maybe I'm not understanding the programming logic involved in something like that.
-
We changed our domain a few months back, so here are a few observations:
- Where possible, ensure effective 301s are in place.
- If a page URL does not have to change, don't change it. It is possible to create a better website structure/navigation without altering URLs.
- Ensure a full sitemap is submitted when you roll out the new design.
- Be patient; you may see a drop for a short while, as the 301s take time to pass value from old to new URLs.
- Get any sites linking to old URLs (the non-homepage ones) updated to the new URLs once you know them.
- In a few months, if you still have any old URLs in Google's index (do a site:www.website.com search), use the URL removal tool in GWT to get rid of them.
- You may want to consider redirecting any pages that no longer exist at all to your home page, or to the next nearest match in terms of content.
Hope this helps to get you started!
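On that last point, one way to pick the "next nearest match" for old URLs with no exact equivalent is fuzzy string matching against the new URL list. This is only an illustrative sketch using Python's difflib, with hypothetical URLs; a hand-built mapping based on actual page content will usually beat it, but it can produce a first draft to review:

```python
import difflib

# Hypothetical list of URLs on the new site.
NEW_URLS = ["/products/towels/", "/products/sheets/", "/about/"]

def nearest_match(old_path, cutoff=0.6):
    """Return the closest new URL for an old path, or None to serve a 404."""
    matches = difflib.get_close_matches(old_path, NEW_URLS, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

The cutoff matters: anything below it returns None and should 404 rather than be force-redirected somewhere irrelevant.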