Condensing content for web site redesign
-
We're working on a redesign and are wondering if we should condense some of the content (as recommended by an agency), and if so, how that will affect our organic efforts. Currently a few topics have individual pages for each section, such as (1) Overview (2) Symptoms and (3) Treatment. For reference, the site has a similar structure to http://www.webmd.com/heart-disease/guide/heart-disease-overview-fact.
Our agency has sent over mock-ups that show these topics condensed into a single page, with a script/AJAX displaying only the content the visitor clicks on. If we chose this option, we would have to implement redirects, because only one page would exist instead of three.
Can anyone provide insight into whether we should keep the topic structure as is, or take the agency's advice and merge all the topic content? *Note: The reason the agency is pushing for the merge is that they say it helps with page load time.
Thank you in advance for any insight!
-
I think the general idea is a good one. One very thorough, authoritative page about the common cold should be more powerful than three weaker pages that all compete for the same keywords. In fact, we did something similar last year when we pulled coupons, deals, and reviews into a single page. Our review pages hadn't quite taken off, and we knew people don't really search for deals the way they search for coupons, so consolidating made sense: it beefed up the content in a single authoritative place.
However, in the medical niche I'd be very wary of losing the traffic that would have gone to the symptom and treatment pages. I haven't looked anything up, but I can guess how often those topics are searched for specifically, and we've had indexing issues with content tucked inside collapsible divs. John Mueller has said before that if that content were really so important, you wouldn't be hiding it behind a click. It's a really big risk. If there's a way to test this on a handful of pages before rolling out any sitewide changes, I would absolutely do that.
-
Hey Vanessa. I'd ask a few additional questions about the pages before making a decision...
-
If you were to implement redirects, would the redirect go from Treatment (the page) to Treatment (the AJAX-loaded content)? Or would it go from Treatment (the page) to the topic page, where people would have to click a link to view the treatment content? If the redirect goes from the page to the related content of the page, then maybe this isn't too terrible an idea. That would mean the AJAX-loaded page section for treatment would need some kind of unique URL associated with it (like /topic-name#treatment).
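To make that first question concrete, here's a minimal sketch of the kind of redirect map you'd want to build before the migration. All the paths here are made up for illustration; the point is that each old section page should map to the specific fragment of the consolidated page, not just to the bare topic URL.

```python
# Hypothetical mapping from the old section pages to fragment URLs
# on the consolidated topic page. Path names are invented for illustration.
REDIRECTS = {
    "/common-cold/overview": "/common-cold",
    "/common-cold/symptoms": "/common-cold#symptoms",
    "/common-cold/treatment": "/common-cold#treatment",
}

def resolve_redirect(path):
    """Return the 301 target for an old section URL, or None if unmapped."""
    return REDIRECTS.get(path)

print(resolve_redirect("/common-cold/treatment"))
```

One caveat worth weighing: the `#fragment` never reaches the server, and Google generally treats `/common-cold#treatment` as the same URL as `/common-cold`, so the fragment helps the visitor land on the right section but doesn't give the treatment content its own indexable URL.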
-
Next question, though: how much traffic does this affect? Of the traffic those pages get individually right now, how much of it enters the site on those pages (from any source: direct, referral, social, organic, paid)? If almost everybody comes into the site via an overview page and then clicks through to Symptoms or Treatment, it's probably okay to consolidate those into a single page. That said, if all three pages are landing pages for a reasonable number of visitors, I'd be reluctant to make a change that disrupts that traffic, especially if the answer to question #1 is no.
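The entrance check above is easy to run against an analytics export. This is just a sketch with invented page paths and numbers; the field names (`pageviews`, `entrances`) are assumptions standing in for whatever your analytics tool actually exports.

```python
# Sketch: estimate how often each section page is the entrance (landing) page,
# using a hypothetical analytics export. Paths and numbers are made up.
pages = [
    {"page": "/common-cold/overview",  "pageviews": 12000, "entrances": 9500},
    {"page": "/common-cold/symptoms",  "pageviews": 8000,  "entrances": 6200},
    {"page": "/common-cold/treatment", "pageviews": 5000,  "entrances": 3100},
]

def entrance_share(row):
    """Fraction of a page's views where the visitor entered the site on it."""
    return row["entrances"] / row["pageviews"]

for row in pages:
    print(f"{row['page']}: {entrance_share(row):.0%} of views are entrances")
```

If the symptom and treatment rows show high entrance shares like these, they're pulling in their own visitors and merging them away is riskier.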
-
What about links? Do you have a lot of links pointing to the individual pages within each section? Yes, redirects will help retain the link equity, but with any redirect you lose some. So if a large percentage of the links to your site point to these pages, I'd be hesitant to make any kind of change without further testing and research into the weight and importance of those links.
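That "large percentage" is also easy to quantify from a backlink export. A minimal sketch, assuming the export boils down to a list of target URLs (the URLs and page paths here are invented):

```python
# Sketch: from a hypothetical backlink export (a list of target URLs),
# work out what share of your links point at the pages slated for merging.
backlinks = [
    "https://example.com/common-cold/treatment",
    "https://example.com/common-cold/symptoms",
    "https://example.com/",
    "https://example.com/flu/overview",
    "https://example.com/common-cold/treatment",
]

AT_RISK = ("/common-cold/overview", "/common-cold/symptoms",
           "/common-cold/treatment")

def at_risk_share(links):
    """Fraction of backlinks whose target is one of the pages being merged."""
    hits = sum(1 for url in links if url.endswith(AT_RISK))
    return hits / len(links)

print(f"{at_risk_share(backlinks):.0%} of backlinks point at the merged pages")
```

The higher that share, the more link equity you're routing through redirects, and the more cautious I'd be.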
Along with those questions, I'm also wondering why the agency thinks this would help with load time. Why can't they improve load time on the individual pages? Are they talking about the load time of clicking through to the Treatment page from Symptoms? If so, there are probably better ways to address that than removing pages from the site. When you run a speed test, what is actually slowing the page down? Is it something with the server, or content that can be tweaked? I'd start there before consolidating pages and risking disruption to your existing traffic.
I hope that helps as you work toward a decision.