How does having multiple pages on similar topics affect SEO?
-
Hey everyone,
On our site we have multiple pages with similar content. For example, we have a section on Cars (in general) and then specific pages for Used Cars, European Cars, Remodeled Cars, etc. Much of the content on these pages is similar; the only differences are some of the content and the additional term in the URL (for example, cars.com/remodeled-cars and cars.com/european-cars).
In the past few months, we've noticed a dip in our organic rankings and started doing research. We also noticed that in the SERPs, Google shows the general page (cars.com/cars) rather than the specific page (cars.com/european-cars), even when the specific page has more content. Can having multiple pages with similar content hurt SEO? If so, what is the best way to remedy this? We can consolidate some of the pages and make the differences between them a little clearer, but does it make that much of a difference for rankings?
Thanks in advance!
-
Makes a lot of sense, thank you.
-
Some great points there, Devanur. There is also the option of a canonical tag, but the trade-off is that you would have fewer pages indexed, while one page (the original) would be stronger.
Duplicate content can hurt you, but on the other hand, Matt Cutts has mentioned a few times that it won't hurt you unless it's spammy, and you can also normally get away with boilerplate terms. For a good night's sleep, though, it's easier just to fix it and know it's one less thing to worry about.
Good luck.
-
Hi,
Having multiple pages with similar or identical content confuses the search engines, and the outcomes in the SERPs will be undesirable. Here is the deal: no two unique URLs should serve substantially similar or identical content. If they do, you should decide which URL you would like to rank in the search engines and make the others point to it via the rel=canonical attribute. In general, the page that targets the most-searched keyword/phrase can be made the canonical (preferred) URL.
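Here is a minimal sketch, assuming cars.com/cars is the URL you want to rank (the domain and paths are just the examples from this thread):

```html
<!-- Placed in the <head> of each near-duplicate page, e.g. cars.com/european-cars. -->
<!-- It tells search engines that cars.com/cars is the preferred (canonical) URL, -->
<!-- so ranking signals are consolidated onto that one page. -->
<link rel="canonical" href="http://www.cars.com/cars" />
```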
If I were you, I would add unique content to each existing page, targeting the main keyword for that page.
For example, if the page talks about 'used cars', that would become my target term for the page.
I would also go ahead with thorough keyword research and analysis to find which keywords/phrases are searched most in your geo-location or target market, and add corresponding pages with highly targeted content for each of those keywords/phrases (if not added already).
The key here is content that is unique, up-to-date, highly relevant, and useful to your visitors. Such content can bring dramatic improvements to your overall SEO ROI; search engines like Google love it, and such pages will be awarded good positions in the SERPs going forward. As you know, high-quality content is a natural link magnet.
Here is the action plan I would follow if I were you:
1. Make these pages unique by adding unique content.
2. Do thorough keyword research and analysis to find new content opportunities in your niche.
3. Add new pages with unique content based on the outcome of step 2.
4. Update the sitemap.xml file and submit it to webmaster tools (a sample sitemap is sketched after this list).
5. Repeat steps 2 to 4 once every six months based on the results, or as and when required.
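For step 4, a minimal sitemap.xml sketch might look like this (the domain, paths, and dates are placeholders based on the examples in this thread; list only the canonical URLs you want indexed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap.xml for the example site in this thread. -->
<!-- Each <url> entry lists one canonical page; near-duplicates that -->
<!-- point elsewhere via rel=canonical are left out. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.cars.com/cars</loc>
    <lastmod>2014-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.cars.com/used-cars</loc>
    <lastmod>2014-01-15</lastmod>
  </url>
</urlset>
```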
Those were my two cents, my friend. Good luck.
Best regards,
Devanur Rafi