Why does the On-Page report flag a full-path link as a cannibalized link?
-
On the SEOmoz On-Page report I get a cannibalization error. It is caused by a link that uses a full (absolute) path. When I change the link to a relative path, the cannibalization error goes away. Should I change the site's internal links to relative paths? I would appreciate your help.
-
Hey Antony! I checked out your account and can't replicate the issue that you're seeing. I think you're talking about running an On-Page report for a URL/keyword combination. As far as I could tell, the only type of cannibalization you could get a warning for is keyword self-cannibalization, where it looks like keywords are being targeted for multiple pages instead of a single page on your site. As you probably read, we suggest staying away from linking internally to another page with the target keyword(s) as the exact anchor text, as well.
We should be crawling a full path URL in the same way we crawl a relative URL, so if it's changing solely based on you modifying the URL in your search, I'll need to see what's going on. Can you either post an example of the URLs and keywords you're seeing this for or send that information over to help (help@seomoz.org)? The more information and screenshots you can send, the better. Thanks Antony!
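For anyone running into the same thing, the crawler behavior described above can be sanity-checked locally: a relative link resolves against the page's own URL to exactly the same destination as the full-path version, so the two link styles should be interchangeable to a crawler. A quick sketch using Python's standard library (the URLs are made up for illustration):

```python
from urllib.parse import urljoin

# Hypothetical page containing the link
page = "http://www.example.com/blog/post.html"

# The same target expressed as a full-path link and as a relative link
absolute_link = "http://www.example.com/blog/other-post.html"
relative_link = "other-post.html"

# A crawler resolves relative links against the current page's URL,
# so both forms yield the identical destination URL.
resolved = urljoin(page, relative_link)
print(resolved == absolute_link)  # True
```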
Related Questions
-
Why are http and https pages showing different domain/page authorities?
My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain because we didn't want to change too much too quickly by moving to https; only our shopping cart uses the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled, though, is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version, and the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks,
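As a reference point, the canonical tags described above would look something like this in the `<head>` of each https page (the page path here is hypothetical, for illustration only):

```html
<!-- On https://www.aquatell.com/water-softeners, declare the http version canonical -->
<link rel="canonical" href="http://www.aquatell.com/water-softeners" />
```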
On-Page Optimization | Aquatell
-
Is it better to try and boost an old page that ranks on page #5 or create a better new page
Hello everyone, We have been looking into our placements recently and see that one of our blog posts shows on page #5 for a popular keyword phrase with a lot of search volume. Let's say the keyword is "couples fitness ideas": we show on page 5 for the post /couples-fitness-ideas-19-tips-and-expert-advice/. We want to try to get on the first page for that phrase and wanted to know which of the following would be better:
1. Create a new page at a new URL (thinking /couples-fitness-ideas) with over 100 ideas and a few thousand more words.
2. Create a new page at a new URL (thinking /couples-fitness-ideas) with the same content as the currently ranking post. We would want to do this for more freedom with the layout and design of the page than our current blog post template allows. Add more content, let's say 100 more ideas, then forward the old URL to the new one with a 301 redirect.
3. Add more content to the existing post without changing the layout or the URL.
Look forward to your thoughts
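If option 2 is chosen, the 301 from the old post to the new URL could be done with a one-line rule. A sketch assuming an Apache server (other stacks have equivalent mechanisms):

```apache
# .htaccess — permanently redirect the old blog post to the new page
Redirect 301 /couples-fitness-ideas-19-tips-and-expert-advice/ /couples-fitness-ideas
```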
On-Page Optimization | MobileCause
-
Header Links vs. In Page Links
We have lost considerable rank for some of our top search terms (department names) and the rank loss correlates to a change we made on our homepage. That change was to remove a secondary navigation to the major departments in the content of our homepage. Now all we have is the global header navigation on the homepage (and all other pages on the site). I have read that in-page links pass more value than sitewide header links and I'm wondering if this is really true. These were text links (not linked images) and our header also contains text links (and some javascript). We did not make any other changes on our site at this time and this was not around the time of any major algorithm updates. The site is www.ebags.com.
On-Page Optimization | SharieBags
-
404 links | How do I remove a link so it is not found?
My report has listed a few internal links with 404 errors; they point to pages that are not found. Is there a way to remove those links so they are not found again? Thanks
On-Page Optimization | SavingSense
-
Old pages
I have a site where I add 5,000 new products each year. I never wanted to delete the old pages because of the links pointing to them and their keywords, but I now have 20,000+ pages. Does having that many pages spread out my link juice, or does it affect me in any other way compared with a 5,000-page site? Or should I keep not deleting old pages so I don't lose any links? Along with that, I currently do not link to my old pages from my site, so I'm guessing Google does not get to them very often, if at all. If you agree I should still keep them, should I link to them somewhere? Because the products are not that similar and they do bring added value, I don't think canonical would work here.
On-Page Optimization | Dirty
-
How could I avoid the "Duplicate Page Content" issue on the search result pages of a webshop site?
My webshop site was just crawled by Roger, and it found 683 "Duplicate Page Content" issues. Most of them are result pages from different product searches, which are not really identical but are very similar to each other. Do I have to worry about this? If yes, how could I make the search result pages different? Is there any solution for this? Thanks, Zoltan
On-Page Optimization | csajbokz
-
Prevent link juice to flow on low-value pages
Hello there! Most websites have links to low-value pages in their main navigation (header or footer), so those pages are reachable from every other page. I'm thinking especially of "Conditions of Use" or "Privacy Notice" pages, which have no SEO value. What I would like is to prevent link juice from flowing into those pages, while still keeping the links for visitors. What is the best way to achieve this?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Put a "Disallow" for those pages in the robots.txt file?
- Use JavaScript links that crawlers won't be able to follow?
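For concreteness, the first two options mentioned look like this in markup (the /privacy-notice path is hypothetical; note these directives are hints to crawlers rather than guarantees about how link juice flows):

```html
<!-- Option 1: rel="nofollow" on the individual footer link -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 2: robots meta tag in the <head> of the low-value page itself -->
<meta name="robots" content="noindex,nofollow" />
```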
On-Page Optimization | jonigunneweg
-
Max # of recommended links per page?
I've heard it said that Google may choose to stop following links after the first 100 on a page. The landing/category pages for my site's product catalog have earned quite a respectable PR and positioning in search results, and I'm currently paginating their product listings (about 200 products in a category) so that only a couple dozen products are shown on the first page, with links to "next page" and "previous page" being accomplished via query string (i.e. "?page=3"). An alternative option I have is to link to 100% of the contained products within the category's landing page (which would increase my on-page link count to ~300) and use CSS/Javascript to allow the user to simulate browsing between pages on the client side. My goal is to see as many of my product pages indexed as possible. Is this done better using my current scheme (where Googlebot would have to navigate to, say, Landing Page -> Page 6 -> Deeply Buried Product Page) or in the alternative method above, where all the links are in a single page? Since my landing pages are currently treated pretty well by search engines, would that "trust" cause them to follow more links than might normally be done? Thank you!
On-Page Optimization | cadenzajon