How to combine 2 pages (same domain) that rank for the same keyword?
-
Hi Mozzers,
A quick question. Over the last few months I have noticed that, for a number of keywords, two different pages on my domain show up in the SERP, always right next to each other (for example, positions #7 and #8, or #3 and #4). So the SERP looks something like this:
1) www.mycompetition1.com
2) www.mycompetition2.com
3) www.mywebsite.com/page1.html
4) www.mywebsite.com/page2.html
5) www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different - but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO Juice of both pages, I can push my way up to position 2 or 1.
Does anybody have any experience in this? Any advice is much appreciated.
-
Hi there,
Realistically, the rel="canonical" tag should be used for duplicates, yes. How "duplicated" a page is, is subjective: a page sharing 50% of its content with another page is probably going to count as duplicated as far as Google is concerned... but exactly where Google draws that line isn't something any of us really knows.
For pages where the content is totally different besides the header and footer, you technically shouldn't use canonicalisation. However, experiments have shown that Google honours the tag even when the pages aren't duplicates. Dr. Pete ran an experiment when the tag came out (admittedly a few years ago now) showing that you could radically reduce the number of pages Google had indexed for a site by canonicalising everything to the home page. I personally had a client do this by accident a couple of years ago, and sure enough, their number of indexed pages dropped very quickly, along with all the rankings those pages had. For an ecommerce site ranking for clothing terms, this was very bad. It took about six weeks to get those rankings back after we fixed the tags, and the tags themselves were fixed within about five days (it should have been quicker, but our urgent request went into a dev queue).
So the answer would be that Google seems to honour the tag no matter the content of the pages, but I am pretty sure that if you asked a Googler, they'd tell you that it should only be used for dupes or near-dupes.
-
Hi Jane,
Thanks for the advice. One question: I was under the impression that the rel="canonical" tag was for two pages with the same content, to let Google know that the page it points to is the original and should be the one to rank. Do you have any experience using it between two pages that have totally different content (minus the header and footer)?
Thanks again.
-
If you are happy for the second page to still exist but not rank, you should use the canonical tag to point the second page to the first one. This will lend the first page the majority of the strength of the second page and perhaps improve its authority and ranking as a result. However, the second page will no longer be indexed because the canonical tag tells Google: "ignore this page over here; it should be considered the same as the canonical version, here."
Again, this can benefit the first page, but it does mean that the second page will no longer rank at all. Only do this if you are okay with that scenario.
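To illustrate, using the example URLs from the question, the tag would look something like this (a minimal sketch, placed in the head of the secondary page):

```html
<!-- Sketch using the example URLs from the question: goes in the <head> of page2.html -->
<!-- Tells Google that page1.html is the preferred (canonical) version of this content -->
<link rel="canonical" href="http://www.mywebsite.com/page1.html" />
```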
Cheers,
Jane
-
I'm afraid that there isn't a perfect solution, but there are various options to consider.
1.) The only way to "combine the SEO juice of both pages" is to 301 redirect one of the pages to the other (and add the content from the old page to the remaining one). However, this means that the second page will no longer exist for your website visitors (coming from organic search or not).
2.) You can use a rel=canonical tag pointing from the secondary page to the preferred one to encourage Google to list only the preferred page in search results. In addition, you could use the robots.txt file or a noindex meta tag (the meta tag is the preferred option) to keep search engines from indexing the page and showing it in search results (see the sketch after this list). However, this will not "combine the SEO juice."
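To illustrate option 2, a minimal sketch of the noindex meta tag, assuming the example page from the question; it goes in the head of the secondary page:

```html
<!-- Sketch for option 2, in the <head> of www.mywebsite.com/page2.html -->
<!-- Asks search engines not to index this page; "follow" still lets them crawl its links -->
<meta name="robots" content="noindex, follow" />
```

Note that robots.txt blocks crawling rather than indexing, which is part of why the meta tag is the preferred option here.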
Assuming that it is crucial that the second page still exist on your website, I would probably not do anything. You appear twice on the first page of results -- great! Why mess with that? I would just focus on SEO best practices and on earning more links to those two pages to push them higher over time. (Of course, if I knew your exact situation, I would probably have additional suggestions.)