How to combine 2 pages (same domain) that rank for the same keyword?
-
Hi Mozzers,
A quick question. In the last few months I have noticed that for a number of keywords, two different pages on my domain show up in the SERP, always right next to each other (for example, positions #7 and #8, or #3 and #4). So in the SERP it looks something like:
1) www.mycompetition1.com
2) www.mycompetition2.com
3) www.mywebsite.com/page1.html
4) www.mywebsite.com/page2.html
5) www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different - but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO Juice of both pages, I can push my way up to position 2 or 1.
Does anybody have any experience in this? Any advice is much appreciated.
-
Hi there,
Realistically, the rel="canonical" tag should be used for duplicates, yes. How "duplicated" a page needs to be is subjective: a page sharing 50% of its content with another page will probably count as a duplicate as far as Google is concerned, but exactly where Google draws that line isn't something any of us really knows.
For pages where the content is totally different besides the header and footer, you technically shouldn't use canonicalisation. However, experiments have shown that Google honours the tag even if the pages aren't duplicates. Dr. Pete ran an experiment when the tag came out (admittedly a few years ago now) showing that you could radically reduce the number of pages Google had indexed for a site by canonicalising everything to the home page. I personally had a client do this by accident a couple of years ago, and sure enough, their number of indexed pages dropped very quickly, along with all the rankings those pages had. For an ecommerce site that was ranking for clothing terms, this was very, very bad. The tags were fixed within about five days (it should have been quicker, but our urgent request went into a dev queue), and it then took about six weeks for those rankings to come back.
So the answer would be that Google seems to honour the tag no matter the content of the pages, but I am pretty sure that if you asked a Googler, they'd tell you that it should only be used for dupes or near-dupes.
-
Hi Jane,
Thanks for the advice. One question: I was under the impression that the rel="canonical" tag was for two pages with the same content, to let Google know that the page it points to is the original and should be the one to rank. Do you have any experience using it between two pages that have totally different content (minus the header and footer)?
Thanks again.
-
If you are happy for the second page to still exist but not rank, you should use the canonical tag to point the second page to the first one. This will lend the first page the majority of the strength of the second page and perhaps improve its authority and ranking as a result. However, the second page will no longer be indexed because the canonical tag tells Google: "ignore this page over here; it should be considered the same as the canonical version, here."
Again, this can benefit the first page, but it does mean that the second page will no longer rank at all. Only do this if you are okay with that scenario.
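To make that concrete, here is a minimal sketch of what the tag could look like, borrowing the hypothetical URLs from the question above (page2.html being the page you are happy to have drop out of the index). The tag goes in the <head> of the page being canonicalised:

```html
<!-- In the <head> of www.mywebsite.com/page2.html (the page that should stop ranking) -->
<link rel="canonical" href="https://www.mywebsite.com/page1.html" />
```

Google treats the tag as a strong hint rather than a strict directive, so it can take a crawl or two before the change shows up in the index.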
Cheers,
Jane
-
I'm afraid that there isn't a perfect solution, but there are various options to consider.
1.) The only way to "combine the SEO juice of both pages" is to 301 redirect one of the pages to the other (and add the content from the old page to the remaining one). However, this means that the second page will no longer exist for your website visitors (coming from organic search or not).
2.) You can use a rel=canonical tag pointing from the secondary page to the preferred one to encourage Google to list only the preferred page in search results. In addition, you could use a noindex meta tag or the robots.txt file to keep the page out of search results (the meta tag is the preferred option, since robots.txt only blocks crawling rather than indexing); a minimal example of the meta tag is sketched just below. However, this will not "combine the SEO juice."
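For reference, a minimal sketch of the noindex route from option 2, again using the hypothetical page2.html from the question as a placeholder; the meta tag goes in that page's <head>:

```html
<!-- In the <head> of the page you want kept out of search results (e.g. page2.html) -->
<!-- "follow" keeps the page's outgoing links crawlable even though the page itself is noindexed -->
<meta name="robots" content="noindex, follow" />
```

In practice you would normally pick either the canonical tag or noindex for a given page rather than both, since combining them sends Google somewhat mixed signals.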
Assuming that it is crucial that the second page still exists on your website, I would probably not do anything. You appear twice on the first page of results -- great! Why mess with that? I would just focus on following SEO best practices and earning more links to those two pages to push them higher over time. (Of course, if I knew your exact situation, I would probably have additional suggestions.)