2 pages competing
-
Hi all,
My website currently has two pages that both address the themes 'property investment in Manchester' and 'buy to let Manchester':
https://www.knightknox.com/developments/manchester/
https://www.knightknox.com/investments/manchester
I am a bit concerned that we are competing against ourselves for these keywords. In my opinion the /investments page provides better content, but the /developments page ranks higher in Google.
What do you think would be the best course of action?
- Leave as is
- Merge the contents of both pages
- Redirect /developments to /investments (see the sketch after this list)
- Or something else?
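For reference, here is a minimal, purely illustrative sketch (Python with the third-party requests library) of how the current behaviour of both URLs could be checked before and after any change - if the redirect option were chosen, you'd expect /developments/ to return a 301 with a Location header pointing at /investments:

```python
# Illustrative only: check what each URL currently returns.
# Requires the third-party "requests" package (pip install requests).
import requests

URLS = [
    "https://www.knightknox.com/developments/manchester/",
    "https://www.knightknox.com/investments/manchester",
]

for url in URLS:
    # allow_redirects=False shows the raw status code and any Location
    # header instead of following a redirect to its destination.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location", "(no redirect)"))
```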
Any ideas welcome.
Thanks
-
Hello!
I wanted to make you a screen video as it's a bit more nuanced. Here's the link:
https://www.useloom.com/share/ebddf32363444e6a9cbc2ddfab639612
(Sorry for my voice - I had a cold!)
It's 5 minutes - definitely worth watching!
But in very brief summary: Google is ranking two types of content for these queries - property-listing-style pages and articles. You can have two pages, one of each type, and they won't cannibalize each other.
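If it helps, a quick way to sanity-check how differently the two pages currently present themselves (listing-style page vs. article) is to compare their title tags and H1s. This is only a sketch - it assumes the pages are server-rendered HTML and that the third-party requests and beautifulsoup4 packages are installed; a crawl in a tool like Screaming Frog would surface the same information:

```python
# Sketch: compare the <title> and first <h1> of the two pages.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.knightknox.com/developments/manchester/",
    "https://www.knightknox.com/investments/manchester",
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    h1 = soup.find("h1")
    print(url)
    print("  title:", title)
    print("  h1:   ", h1.get_text(strip=True) if h1 else "(no h1)")
```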
-
Hi Brian,
Keyword cannibalization is always a dilemma. I've checked those URLs in SEMrush and /investments/ doesn't rank highly for any relevant keywords.
My approach would be to keep improving /developments/, since it already holds the higher rankings, and to use keyword research to refocus /investments/ on other search terms. I don't recommend redirecting one to the other or merging them, because you'd give up the potential gains from other keywords and long-tail terms.
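One way to see exactly which search terms the two URLs genuinely overlap on is to group a Search Console performance export by query. A minimal sketch, assuming a hypothetical CSV export named search_console_export.csv with 'query' and 'page' columns (the real export format may differ):

```python
# Sketch: flag queries for which both URLs are appearing in search.
import csv
from collections import defaultdict

PAGES = {
    "https://www.knightknox.com/developments/manchester/",
    "https://www.knightknox.com/investments/manchester",
}

pages_by_query = defaultdict(set)
with open("search_console_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["page"] in PAGES:
            pages_by_query[row["query"]].add(row["page"])

# Queries where both pages pick up impressions are the ones actually at
# risk of cannibalizing each other; everything else can be targeted
# separately.
for query, pages in sorted(pages_by_query.items()):
    if len(pages) > 1:
        print(query)
```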
It could also be beneficial to write some blog posts on one or several topics related to those pages and link from them to /developments/ or /investments/ using the focus keyword as anchor text, to tell Google that the content is relevant to that search term.
Hope it helps.
Best of luck,
GR
PS: It would be awesome to hear back from you with the option you chose and the results!
PS2: Some resources:
- How to Identify & Eliminate Keyword Cannibalization to Boost Your SEO - Search Engine Journal
- Keyword Cannibalization and SEO: What You Need to Know - Stone Temple
- Why You Might Be Cannibalizing Your Own Keywords – Here’s Why #125 - Stone Temple