How to Recover From Unstable Site Structure History
-
I have a site that has suffered several phases of restructuring. Its owners were apparently unsure which direction to take with their content and URL schema and subjected the site to several rounds of poorly thought-out implementations (e.g. example.com/content/page-title, example.com/page-title, example.com/"silo"/page-title, etc.), all within an 8-month period. I posted the originating question here on this Q&A forum, and I want to thank EGOL and Cody for taking a stab at it.
What would be a good strategy to help a site like the one I describe above begin ranking again?
-
Hello Lance, please pardon the belated reply.
Your answer is very precise; thanks for elaborating in such detail. Number 7 - "410 errors are reportedly a bit faster at getting content removed" - is a gem. I see the logic behind #8, which makes a lot of sense, and so does number 9. Nice recipe for a recovery.
Again, thank you for the insight.
-
I have had direct experience with this issue. Here are a few of the things I have done:
1. Make sure the current URLs are rock solid and can be long-lived.
2. Ensure all links to the old structure are completely purged from the content. There's no point in propping up the old patterns.
3. Get a clear picture of the off-site backlinks. There's no sense worrying about pages that have no value; if the URLs changed that fast, there won't be many to worry about.
4. For pages that do have good backlinks, set up a direct 301 redirect to the new page (a rough sketch follows just after this list).
5. Once you hit the point of low ROI, redirect the rest on pattern matches. There could be a couple of double jumps here, but they won't mean much given #3. (Side note: double jumps leak extra link equity, so they should be avoided - the second sketch below can help flag them.)
6. Ensure your entire site can be fully and easily spidered - before resorting to XML sitemaps.
7. Ensure you have a helpful, even compelling, 404 page that returns the proper status code. 410 errors are reportedly a bit faster at getting content removed, so use them where you can (also covered in the sketch below).
8. Remove any restrictions on the old structure in your robots.txt file until the 404s and 301s take full effect.
9. Submit removal requests for pages and folders. This is particularly important if the site is very large (compared to its domain authority) and search engines won't get a full picture of the changes for weeks or months due to a low crawl allowance.
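To make #4, #5, and #7 concrete, here is a minimal sketch of a pattern-based 301/410 layer. It assumes a hypothetical Python/Flask front end and made-up URL patterns, so treat it as an illustration of the logic rather than a drop-in implementation; on Apache or nginx you would express the same thing with rewrite rules.

```python
# Minimal sketch only: pattern-based 301s plus 410s for content that is gone
# for good. The Flask stack and the URL patterns below are hypothetical.
import re
from flask import Flask, abort, redirect

app = Flask(__name__)

# Old URL patterns mapped to the new scheme. One-to-one 301s for pages with
# strong backlinks should be handled explicitly before falling back to these.
PATTERN_REDIRECTS = [
    (re.compile(r"^/content/(?P<slug>[^/]+)$"), "/{slug}"),
    (re.compile(r"^/silo/(?P<slug>[^/]+)$"), "/{slug}"),
]

@app.errorhandler(404)
def not_found(error):
    # A helpful custom 404 page that still returns the proper status code.
    return "That page has moved. Try the search box or the sitemap.", 404

@app.route("/<path:old_path>")
def legacy(old_path):
    # Fallback handler for old-structure URLs only; real pages would be
    # served by their own routes in an actual application.
    path = "/" + old_path
    for pattern, template in PATTERN_REDIRECTS:
        match = pattern.match(path)
        if match:
            # Redirect straight to the final URL in one hop - no double jumps.
            return redirect(template.format(**match.groupdict()), code=301)
    # Content retired with no replacement: 410 is reportedly dropped faster.
    abort(410)

if __name__ == "__main__":
    app.run()
```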
Doing these got my sites back on track within a couple weeks.
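If you want to verify that the pattern-match redirects in #5 didn't leave any double jumps behind, a quick audit script helps. This is only a sketch using the Python requests library, and the URLs are placeholders for whatever your old structure looked like.

```python
# Quick double-jump check: follow each old URL and count the redirect hops.
# Anything above one hop leaks extra link equity and should be collapsed.
import requests

OLD_URLS = [
    "https://example.com/content/page-title",   # placeholder URLs -
    "https://example.com/silo/page-title",      # substitute your old structure
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(resp.history)                    # one entry per redirect hop
    chain = " -> ".join([r.url for r in resp.history] + [resp.url])
    status = "OK" if hops <= 1 else "COLLAPSE THIS CHAIN"
    print(f"{status}: {hops} hop(s), final {resp.status_code} | {chain}")
```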
EDIT: forgot a couple...
- Remove any old XML sitemaps.
- Submit multiple sitemaps for different sections of the site. This makes it easier to narrow down problem spots (a small sketch follows below).
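On that last point, splitting sitemaps by section is easy to script. Below is a rough sketch in Python; the domain and section names are made up, and each per-section sitemap file would still need to be generated from your real URLs.

```python
# Rough sketch: build a sitemap index that points to one sitemap per section,
# so crawl/indexation problems can be isolated section by section.
# The domain and section names are hypothetical.
from xml.etree import ElementTree as ET

DOMAIN = "https://example.com"
SECTIONS = ["products", "blog", "support"]
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
for section in SECTIONS:
    entry = ET.SubElement(index, "sitemap")
    ET.SubElement(entry, "loc").text = f"{DOMAIN}/sitemap-{section}.xml"

# Writes sitemap_index.xml; submit it (or the individual files) in
# Google Search Console so each section's indexation can be tracked.
ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8",
                            xml_declaration=True)
```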
-
I am very glad you asked this question. Go to http://distilled.net/u and start at a level that is relevant to what you are doing, so you can be efficient. However, I do recommend finishing the entire lesson plan. This is not a lesson plan for people who don't know anything; it is a highly effective lesson plan built for the people who work for distilled.net.
I've found it most enlightening and extremely valuable whenever you are on the fence about something. I could not recommend it more.
-
Related Questions
-
Proper URL Structure: Feedback on a Vendor's Recommendation
Urgent! We're doing a site redesign and our vendor recommended a new URL structure as follows: website.com/folder/word1word2word3. Our current structure is website.com/word1-word2. They said that from an SEO perspective it doesn't make a difference whether there are dashes between words or not, and Google can read either URL. Is that true? I need experts to weigh in on the above, as well as on the SEO implications if we were to implement their suggestion.
On-Page Optimization | bluejay78780
-
SEO audit on a beta site
Hi there, Is there much point conducting an SEO site audit on a site that has not yet launched and is protected behind a login? Presumably none of the usual SEO tools (Moz, Screaming Frog, etc.) can crawl this site because it is all locked behind a login. Would it be better to launch it and then do a site audit? Thanks
On-Page Optimization | CosiCrawley0
-
"Turning off" content to a site
One site I manage has a lot of low quality content. We are in the process of improving the overall site content but we have "turned off" a large portion of our content by setting 2/3 of the posts to draft. Has anyone done this before or had experience with doing something similar? This quote from Bruce Clay comes to mind: “Where a lot of people don’t understand content factoring to this is having 100 great pages and 100 terrible pages—they average, when the quality being viewed is your website,” he explained. “So, it isn’t enough to have 100 great pages if you still have 100 terrible ones, and if you add another 100 great pages, you still have the 100 terrible ones dragging down your average. In some cases we have found that it’s much better, to improve your ranking, to actually remove or rewrite the terrible ones than add more good ones.” What are your thoughts? Thanks
On-Page Optimization | ThridHour0
-
Do https sites rank as well as http sites?
2 Questions: Question 1 - We currently have our entire site running on https (the http pages 301-redirect to the https versions). Assuming that the https pages load as quickly as the http versions, is it a problem that the entire site is https? The only official answer I've been able to find is this 2011 video where Matt Cutts basically says "I don't know" - http://www.youtube.com/watch?v=xeFo4ytOk8M Question 2 - Is there any problem with having half our site running on https only (with the http pages redirected), and the other half (our blog) running on http only (with all https blog pages redirected to the http versions)? Thanks in advance for any input! Justin
On-Page Optimization | JustinClark0
-
Question Regarding Site Structure
I have a quick question regarding site structure that I hope some of you could share your opinion on. I watched a Whiteboard Friday from Rand a little while back where he explains that you should try to make the site structure as flat as possible - he was saying to try having no more than 3 links from the home page to get to the desired location. My question is this: I am looking at a site that has a pretty complex structure, which I am trying to clean up as much as possible without making any of its rankings suffer. So they have www.domian.com/general-category/district/town/ and sometimes www.domian.com/general-category/district/town/item-specifics. Now I know it is not good as it is, but they are hesitant about changing too much as they have some serious traffic coming to the site. My question is that all the pages can be found from the home page through the menus/sub-menus - but do these count as direct links from the home page? Also, a problem is that because of this, mozbot has detected that there are too many links from the home page and suggested that it should be below 200. Should I make these menu links noindex or nofollow? Obviously, if the link does count as direct from the home page, it won't after doing this. Thanks Jenson
On-Page Optimization | jensonseo0
-
Directory site with a URL structure dilemma
Hello, We run a site which lists local businesses and tags them by their nature of business (similar to Yelp). Our problem is that our category and sub-category pages (i.e. www.example.com/budapest/restaurant or www.example.com/budapest/cars/spare-parts) are extremely weak and get almost no traffic; most of the traffic (95+ percent) goes to the actual business pages. While this might be a completely normal thing, I would still like to strengthen our category (listing) pages as well, as these should be the ones targeted by some of the general keywords, like ‘restaurant’ or ‘restaurant+budapest’. One of the issues I have identified as a possible problem is that we do not have a clear hierarchy within the site: while the main category pages are linked from the homepage (and the sub-categories from there), there is no bottom-up linking from the business pages back to the category pages, as the business page URLs look like this: www.example.com/business/onyx-restaurant-budapest. I think that a good site and URL structure for the above would be like this: www.example.com/budapest/restaurant/hungarian/onyx-restaurant. My only issue - perhaps not with the restaurants but with others - is that some of the businesses have multiple tags, so they can be tagged, for example, as car saloon, auto repair and spare parts at the same time. Sometimes they even have 5+ tags on them. My idea is that I will try to identify a primary tag for all the businesses (we maintain 99 percent of them right now), and the rest of their tags would be secondary ones. I would then use canonicalization and mark the page with the primary tag in the URL as the preferred one for that specific content. With this scenario, I might have several URLs with the same content (complete duplicates), but they would point to one page only as the preferred one, while our visitors could still reach the businesses in any preferred way, whether by looking for car saloons, auto repair or spare parts. This way, we could also have breadcrumbs on all the pages, which we now miss completely. Can this be a feasible scenario? Might it have a side-effect? Any hints on how to do it a better way? Many thanks, Andras
On-Page Optimization | Dilbak0
-
I changed my site from HTML to PHP and I need to get some help.
Ok... so the other day I converted every part of my website from HTML to PHP. I want to know the best option for redirecting my pages from HTML to PHP. I had my site scanned with SEOMoz and I was given many 404 errors, which is not at all good. I do not have any pages of my site linking to any of these HTML pages; all of the site links have been updated - I have checked 3 times. I have never created a robots.txt file, so I would love to get a little help with this part. I was thinking it would be best to tell Google not to worry about these pages in the file. I kept the pages up and I plan to remove all code from them so that no content shows up if someone visits, but the issue with that is my site is already indexed as HTML. I want the HTML pages to redirect to the PHP ones without worrying that my visitors will land on an HTML page via Google. I hope I am making sense. What is the best advice you can give me? I need all pages to redirect to PHP. I used an htaccess redirect from all HTML to PHP, but once I added so many of them I got an error on my site saying too many redirects. Seriously need help.
On-Page Optimization | TrendyHost0
-
Is there any benefit in on-site duplicate content?
I have about 50 internal pages on my site that I want to add a "Do it yourself tutorial" to in an effort to build the quality of the pages. Is this going to de-value the content if I put it on all 50 pages? It's difficult to write similar content 50 different ways.
On-Page Optimization | BradBorst0