How to set canonical link rel in CS-Cart
-
I want to specify a rel=canonical link for each category page. Is there a way to do this without changing the code (just from the admin section)? The filter and sorting/search parameters are creating duplicate content across the site.
If there is a way, please describe the method;
I want to avoid hours of working on a script for this.
Thanks.
-
Thanks, Tom
-
Take a look at the SEO section of the CS-Cart forums:
http://forum.cs-cart.com/forumdisplay.php?f=45
I bet they know the best way!
-
Thank you, Thomas.
I think I'll have to find another way. I already specified the parameters in Google Webmaster Tools, but it doesn't seem to work in the case of AJAX filters.
Best,
Ion
-
Hi there,
I am afraid there is no way around working with a script when using rel=canonical. Either you develop a script that generates the canonical URL automatically, or you insert the link tag with the right URL manually on each page.
I would also strongly encourage you to spend the time to implement rel=canonical properly, as getting it wrong can mess up your site's rankings completely. Read this post by Dr. Pete if you need convincing to spend some hours on the proper development: http://www.seomoz.org/blog/catastrophic-canonicalization
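To illustrate what such a script would do, here is a minimal sketch in Python. The parameter names are hypothetical, not CS-Cart's actual ones; the idea is simply to drop the filter/sort parameters and emit the clean category URL:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Query parameters that create duplicate content and should not appear
# in the canonical URL. These names are placeholders -- swap in whatever
# your cart actually uses for filtering, sorting, and pagination.
NON_CANONICAL_PARAMS = {"sort_by", "sort_order", "features_hash", "page"}

def canonical_url(url: str) -> str:
    """Return the URL with filter/sort parameters stripped."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in NON_CANONICAL_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print('<link rel="canonical" href="%s" />'
      % canonical_url("http://example.com/shoes/?sort_by=price&page=2"))
# -> <link rel="canonical" href="http://example.com/shoes/" />
```

In a real shop this logic would live in the template layer, so the tag is printed in the head of every category page automatically.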
Best,
Thomas
Related Questions
-
Can one backlink fluctuate the ranking of a website with thousands of backlinks?
It happened to our website. We have seen major ranking fluctuations for our website because of one backlink. What kind of links can those be? Why is Google not stopping them, even though they claim that such backlinks will be taken care of?
Intermediate & Advanced SEO | vtmoz
-
Link Building Strategy?
If I use unique content written by writers and post it to good sites (free blogs, bookmarking, directories, article sites, etc.) with good cache status, good PR, and different IPs, do I still risk being hit by Google's spam actions? I'm planning to do just 30-50 a month, all with unique content, or say one piece of unique content then rewritten and used no more than 3 times. If not this, then what else would you suggest? One more thing to add: I have 1000+ pages, of which about 80-90 matter to me (important pages). How do I spin the anchor text between the pages? Should I spin it across all 1000+ pages or use only the 80-90 important pages? And if the content is 300 words, how many anchor tags should I have?
Intermediate & Advanced SEO | welcomecure
-
Questions About Link Detox
Greetings: In April of 2014 an SEO firm ran a link removal campaign (identified spammy links and uploaded a disavow). The overall campaign was ineffective; Moz domain rank has fallen from about 30 to 24 in the last year, and traffic is 20% lower. I purchased a basic package for Link Detox and ran a report today (see enclosed) to see if toxic links could be contributing to our mediocre rankings. As a novice I have a few questions for you regarding the use of Link Detox:
- We scored a domain-wide detox risk of 1,723. The site has referring root domains with 7,113 links to our site. 121 links were classified as high audit priority and 56 as medium audit priority. 221 links were previously disavowed, and we uploaded a spreadsheet containing the names of the previously disavowed links. We had Link Detox include an analysis of nofollow links, as they recommend this. Is our score really bad? If we remove the questionable links, should we see some benefit in ranking?
- Some of the links we disavowed last year are still linking to our site. Is it worthwhile to include those links again in our new disavow file?
- Prior to filing a disavow we will request that webmasters remove the offending links. Link Detox offers a package called Superhero for $469.00 that automates the process. Does this package effectively help with the entire process of writing and tracking the removal requests? Do you know of any other good alternatives?
- A feature called "Boost" is included in the Link Detox Superhero package. It is supposed to expedite Google's processing of the disavow file. I was told by the staff at Link Detox that with Boost, Google will process the disavow within a week. Do you have any idea if this claim is valid? It would be great if it were true.
- We never experienced any manual penalty from Google. Will uploading a disavow help us under the circumstances?
Thanks for your feedback, I really appreciate it! Alan
Intermediate & Advanced SEO | Kingalan1
-
Link from Google.com
Hi guys, I've just seen a website get a link from Google's rich snippets testing tool. Basically, they've linked to a results page for their own website's test. Here's an example of what this would look like for a result on my website: http://www.google.com/webmasters/tools/richsnippets?q=https%3A%2F%2Fwww.impression.co.uk There's a meta nofollow, but I just wondered what everyone's take is on trust, etc., passing down? (Don't worry, I'm not encouraging people to go out spamming links to results pages!) Looking forward to some interesting responses!
Intermediate & Advanced SEO | tomcraig86
-
DNS Settings went wrong....
Hi, I'm going to have to give you a little bit of a back story here...
In July last year we launched a brand new website, www.turnkeylandlords.co.uk, on a new domain. The IT department set it live and unfortunately messed up the DNS settings, so the site was launched under the wrong domain, smartloan.co.uk. The error was rectified within hours, but in those few hours Google indexed it! I then had to set up Webmaster Tools for both domains so I could use the 'Remove URLs' tool to remove all the URLs from the smartloan domain. That all worked fine; the Landlords site was probably set back a bit, but we're now achieving some quite good results for it.
2 weeks ago we launched Smartloan as a product, and of course launched the website we've been working on for months... You guessed it! Google is now looking for all those old Landlords pages under Smartloan.
My first thought is that we should do 301s. Would that be the best course of action, do you think? Webmaster Tools has found 25 of them so far, but I know there are more; the Landlords site launched with about 90 pages. And where should I send the 301s: to the Landlords site, or to the Smartloan root? Is there anything else I should do?
Thanks for your help! Amelia
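Since the mis-indexed URLs are just the Landlords pages under the wrong host, the 301 mapping can be mechanical: same path, correct domain, one redirect per page rather than sending everything to the new homepage. A rough sketch of that mapping logic (the domains are from the question; the example path is made up):

```python
from typing import Optional
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "smartloan.co.uk"
NEW_HOST = "www.turnkeylandlords.co.uk"

def redirect_target(url: str) -> Optional[str]:
    """Map a mis-indexed smartloan URL to its 301 target on the
    Landlords site, or return None if no redirect is needed."""
    parts = urlsplit(url)
    host = parts.netloc
    if host.startswith("www."):
        host = host[4:]
    if host != OLD_HOST:
        return None
    # Keep path and query intact: a 1:1 redirect per page preserves
    # relevance better than pointing everything at the homepage.
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, ""))

print(redirect_target("http://smartloan.co.uk/buy-to-let/"))
# -> http://www.turnkeylandlords.co.uk/buy-to-let/
```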
Intermediate & Advanced SEO | CommT
-
Site Wide Link Situation
Hi, we have clients who are using an e-commerce cart that sits on a separate domain and appears to be providing site-wide links to our clients' websites. Would you recommend disallowing bots from crawling/indexing these via a robots.txt file, a nofollow meta tag on the specific pages where the shopping cart links are implemented, or a nofollow attribute on every shopping cart link? Thanks!
Intermediate & Advanced SEO | RezStream8
-
Use rel=canonical to save otherwise squandered link juice?
Oftentimes my site has content which I'm not really interested in having included in search engine results. Examples might be a "view cart" or "checkout" page, or old products in the catalog that are no longer available in our system. In the past, I'd blocked those pages from being indexed by using robots.txt or nofollowed links. However, it seems like there is potential link juice being lost by removing these from search engine indexes. What if, instead of keeping these pages out of the index completely, I use rel=canonical to reference the home page (http://www.mydomain.com) of the business? That way, even if the pages I don't care about accumulate a few links around the Internet, I'll be capturing the link juice behind the scenes without impacting the customer experience as they browse our site. Is there any downside to doing this, or am I missing any potential reasons why this wouldn't work as expected?
Intermediate & Advanced SEO | cadenzajon
-
Switching to masked affiliate links
Hi there, I run a content affiliate website where I introduce products in articles and then link to merchants where users can buy the respective product. Currently I am using regular affiliate links with the "nofollow" attribute. With the growing size of the site, I would like to switch to masked affiliate links, so instead of a link like "jdoqocy.com/click-123" I want to use "mydomain.com/recommend/123". My question: when switching to masked affiliate links, does it make sense to also convert all the older unmasked affiliate links? If yes, what would be the best way to do that: convert all the old links at once, or convert them over time (e.g. over a few months)? Currently about 2/3 of my site's outbound links are unmasked external affiliate links, so I am afraid that changing this relatively large share of links from unmasked to masked doesn't look natural at all... Thank you for your advice!
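For what it's worth, a masked link like mydomain.com/recommend/123 is usually just a table lookup plus an HTTP redirect. A minimal sketch of the resolver (the ID-to-URL table is invented for illustration; only the "123" example comes from the question):

```python
# Hypothetical ID -> destination table; in practice this would live in
# a database or config file.
AFFILIATE_LINKS = {
    "123": "http://jdoqocy.com/click-123",
}

def redirect_response(link_id: str):
    """Build an HTTP (status, headers) pair for /recommend/<link_id>.

    A 302 is common here so the masked URL is not treated as a
    permanent move; blocking /recommend/ in robots.txt on top of the
    nofollow keeps the masked links out of the index entirely.
    """
    target = AFFILIATE_LINKS.get(link_id)
    if target is None:
        return "404 Not Found", []
    return "302 Found", [("Location", target)]

print(redirect_response("123"))
```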
Intermediate & Advanced SEO | FabRag