How can I avoid duplicate content for a new landing page which is the same as an old one?
-
Hello mozers!
I have a question about duplicate content for you...
One of my client's pages has been losing search traffic for a while now, and I've discovered it's because the search term it targets isn't as popular as it used to be. So... we need to create a new landing page targeting a more popular search term.
The page that's losing traffic is built around the search query "Can I put a solid roof on my conservatory", which only gets 0-10 searches per month according to the Keyword Explorer tool. However, if we targeted "replacing conservatory roof with solid roof" instead, that gets up to 500 searches per month. Muuuuch better!
The issue is, I don't want to close down and redirect the old page, because it's got a featured snippet and sits in position 1. So I'd like to create another page instead... however, as the two would effectively have the same content, I'd land myself with a duplicate content issue.
If I were to put a rel="canonical" tag on the original "can I put a solid roof...." page, but say the master page is now the new one, would that get around the issue?
-
@Virginia-Girtz To avoid duplicate content issues when creating a new landing page that is similar to an old one, consider the following strategies:
-
301 Redirect: If the old landing page is no longer needed, you can redirect its URL to the new landing page using a 301 redirect. This tells search engines that the old page has permanently moved to the new location.
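For reference, on an Apache server this can be a single line in the site's .htaccess file. This is only a minimal sketch; the domain and URL paths below are hypothetical placeholders, not the actual pages being discussed:

```apache
# Minimal sketch of a 301 (permanent) redirect in an Apache .htaccess file.
# The domain and paths are hypothetical placeholders.
Redirect 301 /can-i-put-a-solid-roof-on-my-conservatory https://www.example.com/replacing-conservatory-roof-with-solid-roof
```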
-
Canonical Tags: Implement canonical tags on the new landing page pointing to the old landing page URL. This informs search engines that the content on the new page is a duplicate of the old page and should be indexed under the old page's URL.
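As a rough sketch, the tag sits in the head of whichever page you want treated as the duplicate and points at the URL you want indexed (the URL below is a hypothetical placeholder):

```html
<!-- Sketch: canonical tag in the <head> of the duplicate page.
     The href is a hypothetical placeholder for whichever URL you want treated as the master. -->
<link rel="canonical" href="https://www.example.com/replacing-conservatory-roof-with-solid-roof" />
```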
-
Content Variation: Rewrite the content on the new landing page to make it sufficiently different from the old one. This could involve changing the wording, adding new information, or altering the layout.
-
Noindex Tag: If the old landing page is still relevant but you want to prioritize the new one, you can use a noindex tag on the old page. This prevents search engines from indexing the old page while still allowing users to access it.
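A minimal sketch of that tag, placed in the head of the page you want kept out of the index (using "follow" so the page's links are still crawled; swap in "nofollow" if you don't want that):

```html
<!-- Sketch: robots meta tag on the page you don't want indexed.
     "follow" still lets crawlers follow the page's links. -->
<meta name="robots" content="noindex, follow" />
```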
-
Consolidate Content: Consider consolidating the content from both landing pages into a single, comprehensive page. This helps avoid duplication and can improve user experience by providing all relevant information in one place.
-
Robots.txt: Use the robots.txt file to block search engines from crawling one of the landing pages. However, this approach should be used cautiously as it may also prevent search engines from discovering other valuable content on your site.
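A minimal sketch of such a rule (the path is a hypothetical placeholder). Bear in mind that a page blocked in robots.txt can still show up in results if other sites link to it, because crawlers can't see a noindex tag on a page they're not allowed to fetch:

```text
# Sketch: block a single landing page from being crawled.
# The path is a hypothetical placeholder.
User-agent: *
Disallow: /can-i-put-a-solid-roof-on-my-conservatory
```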
I've applied all of these approaches on one of my clients' sites.
By implementing one or a combination of these strategies, you can effectively address duplicate content concerns while maintaining the visibility and relevance of your landing pages.
-
So what you want for every page and blog post on your website is unique, high-quality, white hat content.
We applied this white hat SEO method to a U.K. garden room company's website, and after we rewrote the pages, organic visitor numbers increased.
-
What I've usually seen with canonicals is that Google either removes the non-canonical page from its index, or it ignores your canonical and treats the two as separate pages. I haven't seen an example where a canonical lets you get the best of both worlds.
I agree with Nozzle - you can tweak your existing content to target both phrases! Google understands synonyms, so if anything, you're just creating a page that's more relevant all around.
Good luck!
Kristina
-
Since it is effectively the same content you should be able to rank the same page for both phrases.
You just need to include the new keyword within the existing content and test out a few title tag variations to find one that helps you move up the rankings for the new keyword without dropping your ranking for the old keyword.
The first thing I'd test would be to change your title tag from "Can I put a solid roof on my conservatory?" to "Replacing Conservatory Roof with Solid Roof - Can I put a solid roof on my conservatory?". Wait until Google re-crawls the page and check how your rankings fared. If you lose your snippet or drop in rankings for the low volume phrase you can always test out the reverse, "Can I put a solid roof on my conservatory? Replacing Conservatory Roof with Solid Roof", and see what happens then.
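For clarity, those two variations are just different title elements in the page's head; here's a sketch of the markup only:

```html
<!-- Sketch of the two title tag variations described above -->
<!-- Variation 1: new phrase first -->
<title>Replacing Conservatory Roof with Solid Roof - Can I put a solid roof on my conservatory?</title>

<!-- Variation 2: original phrase first -->
<title>Can I put a solid roof on my conservatory? Replacing Conservatory Roof with Solid Roof</title>
```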
Don't be scared to test many variations, even long title tags that seemingly don't follow best practice. You can always change it back to the original and your rankings will go back to what they were before you tested (assuming your competitors didn't gain some awesome backlinks and overtake you in the meantime).
Don't mess with the section of content that's being pulled into the featured snippet, though, so you don't lose it.