How to Remove Joomla Canonical and Duplicate Page Content
-
I've attempted to follow advice from the Q&A section.
Currently on the site www.cherrycreekspine.com, I've edited the .htaccess file to help with 301s - all pages redirect to www.cherrycreekspine.com.
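The rule I added is roughly along these lines (from memory, so the exact directives on the server may differ):
# Force every request onto the www hostname with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.cherrycreekspine\.com$ [NC]
RewriteRule ^(.*)$ http://www.cherrycreekspine.com/$1 [R=301,L]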
Secondly, I added the canonical statement in the header of the web pages.
I have cut the Duplicate Page Content in half ... now I have 40 remaining pages to fix up.
This is my practice site to try and understand what SEOmoz can do for me.
I've looked at some of your videos on YouTube ... I feel like I'm scrambling around the Q&A and the internet to understand this product. I'm reading the Beginner's Guide ... any other resources would be helpful.
-
Hi Todd,
Oh no, looks like all your canonicals are pointing towards the homepage....
<link rel="canonical" href="http://www.cherrycreekspine.com<"/>
Also looks like there's a stray "<" character at the end of the URL. Looks like it's coming from the wonky code.
Regardless, this isn't what you want. This basically tells search engines that the homepage is the canonical version of every page - and that all these other pages aren't important on their own. It's likely search engines will start to drop these pages out of their index unless this tag is removed immediately. Reminds me of Dr. Pete's canonical nightmares.
In short... remove that canonical ASAP. It's probably better to have some duplicate content than a sitewide canonical that points to your homepage.
Have you tried the Joomla Canonicalisation Plugin? I haven't tried it myself, but it might be a smidgen easier than trying to code the PHP yourself. You can find it here: http://extensions.joomla.org/extensions/site-management/seo-a-metadata/url-canonicalization-/5355
My guess is you can completely remove whatever manual canonical code you wrote, and the plugin will take care of the rest.
Remember, when the code is working properly, each page will point to itself (without extra parameters and so on), the way a proper canonical should.
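For example, on a hypothetical page like /massage-therapy.html (just an illustration - I'm not assuming that exact page exists on your site), the rendered tag should come out as:
<link rel="canonical" href="http://www.cherrycreekspine.com/massage-therapy.html"/>
In other words, each page declares its own clean URL as the canonical, not the homepage.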
Best of luck!
-
View page source only shows the rendered HTML. If I can't see the PHP file that generates the HTML, it's impossible for me to say how to fix it. I'm not very clear on this. You can PM the file to me if you want - but please don't send me any passwords.
-
Yikes... so, I checked the template HTML file and everything, including the PHP, is there...
View page source did not show all the code.
So, still stuck.
-
Hold on .... not all the copy was properly entered by me... I'll fix and retest.
-
I replaced the line in the HTML, but no change. I'm completely new to this, so please forgive me - all of these concepts are new to me.
Any other thoughts or direction? I looked at one of the Whiteboard videos... but it's over my head.
Thanks in advance... Todd
-
You need to code it again:
In the HTML I can see on Line 15:
rel="canonical" href="http://www.cherrycreekspine.com<?php echo parse_url($canonical, PHP_URL_PATH); ?>"/>
You need to write everything after href=" and before "/> in PHP; probably:
<link rel="canonical" href="<?php echo="" '<a="" class=" " href="http://www.cherrycreekspine.com%26lt%3B/?php%20echo%20parse_url($canonical,%20PHP_URL_PATH);%20?%3E" target="_blank">http://www.cherrycreekspine.com<'.parse_url($canonical, PHP_URL_PATH); ?>"/></link rel="canonical" href="<?php>
If it doesn't sort it, post again and I'll try to help you further.
Also, you either apply canonical values or remove duplicate content (the post title is slightly confused, I think).
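And if it turns out $canonical isn't actually defined anywhere in your template, a plain-PHP fallback along these lines would build the path from the current request (just a sketch based on an assumption about your setup - it doesn't use any Joomla-specific API):
<?php
// Assumption: the site is only ever served from www.cherrycreekspine.com.
// parse_url() keeps just the path, so query strings never end up in the canonical.
$canonicalPath = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
?>
<link rel="canonical" href="http://www.cherrycreekspine.com<?php echo htmlspecialchars($canonicalPath); ?>"/>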
Related Questions
-
Will I be flagged for duplicate content by Google?
Hi Moz community, Had a question regarding duplicate content that I can't seem to find the answer to on Google. My agency is working on a large number of franchisee websites (over 40) for one client, a print franchise, that wants a refresh of new copy and SEO. Each print shop has their own 'microsite', though all services and products are the same, the only difference being the location. Each microsite has its own unique domain. To avoid writing the same content over and over in 40+ variations, would all the websites be flagged by Google for duplicate content if we were to use the same base copy, with the only changes being to the store locations (i.e. where we mention Toronto print shop on one site may change to Kelowna print shop on another)? Since the print franchise owns all the domains, I'm wondering if that would be a problem since the sites aren't really competing with one another. Any input would be greatly appreciated. Thanks again!
Intermediate & Advanced SEO | | EdenPrez0 -
Same product in different categories and duplicate content issues
Hi, I have some questions related to duplicate content on e-commerce websites.
1) If a single product goes into multiple categories (e.g. a black elegant dress could be listed in two categories like "black dresses" and "elegant dresses"), is it considered duplicate content even if the product URL is unique?
www.website.com/black-dresses/black-elegant-dress - duplicated > same content from two different paths
www.website.com/elegant-dresses/black-elegant-dress - duplicated > same content from two different paths
www.website.com/black-elegant-dress - unique URL > this is the way my product URLs look
Does Google perceive this as duplicated content? The path to the content is only one, so it shouldn't be seen as duplicated content, though the product is repeated in different categories. This is the most important concern I actually have. It is a small thing, but if I set this wrong the whole website would be affected and thus penalised, so I need to know how I can handle it.
2) I am using WordPress + WooCommerce. The website is built with categories and subcategories. When I create a product in the product page backend, is it advisable to select just the lowest subcategory, or is it better to select both the main category and the subcategory the product belongs to? I usually select the subcategory alone.
Looking forward to your reply and suggestions. Thanks
Intermediate & Advanced SEO | | cinzia091 -
Category Pages & Content
Hi, does anyone have any great examples of an ecommerce site with great content on category pages or product listing pages? Thanks!
Intermediate & Advanced SEO | | BeckyKey1 -
Case Sensitive URLs, Duplicate Content & Link Rel Canonical
I have a site where URLs are case sensitive. In some cases the lowercase URL is being indexed and in others the mixed case URL is being indexed. This is leading to duplicate content issues on the site. The site is using link rel canonical to specify a preferred URL in some cases, however there is no consistency in whether the URLs are lowercase or mixed case. On some pages the link rel canonical tag points to the lowercase URL, on others it points to the mixed case URL. Ideally I'd like to update all link rel canonical tags and internal links throughout the site to use the lowercase URL, however I'm apprehensive! My question is as follows: If I were to specify the lowercase URL across the site, in addition to updating internal links to use lowercase URLs, could this have a negative impact where the mixed case URL is the one currently indexed? Hope this makes sense! Dave
Intermediate & Advanced SEO | | allianzireland0 -
No-index pages with duplicate content?
Hello, I have an e-commerce website selling about 20 000 different products. For the most used of those products, I created unique high quality content. The content has been written by a professional player that describes how and why those are useful which is of huge interest to buyers. It would cost too much to write that high quality content for 20 000 different products, but we still have to sell them. Therefore, our idea was to no-index the products that only have the same copy-paste descriptions all other websites have. Do you think it's better to do that or to just let everything indexed normally since we might get search traffic from those pages? Thanks a lot for your help!
Intermediate & Advanced SEO | | EndeR-0 -
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content to confirm the content was removed (as opposed to just recrawling the site and not finding the content anywhere). This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way… I see that this is basically the exact opposite of Dr. Pete's advice and opposite what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | | Eric_R
-
Capitals in url creates duplicate content?
Hey guys, I had a quick look around, however I couldn't find a specific answer to this. Currently, the SEOmoz tools show a heap of duplicate content on my site, and there's a fair bit of it. However, many of those errors relate to random capitals in the URLs. For example, "www.website.com.au/Home/information/Stuff" is being treated as duplicate content of "www.website.com.au/home/information/stuff" (note the difference in capitals). Does anyone have any recommendations as to how to fix this server-side (keeping in mind it's not practical or possible to fix all of these links) or to tell Google to ignore the capitalisation? Any help is greatly appreciated. LM.
Intermediate & Advanced SEO | | CarlS0 -
How to resolve Duplicate Page Content issue for root domain & index.html?
SEOMoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the htaccess file, but for some reason this has not had an impact and we have since removed it. What's the best way (in an HTML website) to ensure all index.html links are automatically redirected to the root domain and these aren't seen as two separate pages?
Intermediate & Advanced SEO | | ContentWriterMicky0