Canonical for blog tag or search pages
-
Dear all,
I have a problem with duplicate content on my site: SEOmoz crawls flag it as "duplicate content". Maybe I'm not clear on how to apply "canonical", but the problem is mostly with my blog, tags, and categories. Some links that are actually different tags come back with the same result, like:
http://www.livingwordfreelutheran.org/news-events/blog/tag/Gymnastics
and
http://www.livingwordfreelutheran.org/news-events/blog/tag/God's Power
Both will show the same result. The problem is that all of these pages are dynamic, so what should I put as the canonical for them? Both links use the same page/controller. If I put a self-referencing canonical on each result, will that fix it? Or what should I do instead?
I'm also confused about how to handle search result pages, like ?query=keywords, which show the same result. How do I put a canonical there?
Sorry if this is a duplicate question... I would very much appreciate any help. Thank you!
Best regards,
Harrison -
Yes, use noindex, follow for that.
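A sketch of what that could look like in the &lt;head&gt; of each tag page template (the exact template file depends on Harrison's custom CMS, so this is illustrative, not a drop-in fix):

```html
<!-- On tag pages such as /news-events/blog/tag/... :
     keep the page out of the index, but let crawlers
     still follow the links to the posts it lists -->
<meta name="robots" content="noindex, follow">
```

Because the tag pages are generated dynamically by one controller, the meta tag only needs to be added in that one template to cover every tag URL.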
-
Sorry, I've got it: noindex, follow, so it will still follow the links. Please ignore my question. Thanks again!
-
Oh yes, I built it custom, with the admin on an uncommon URL name.
I've decided to not index all the tag URLs, just like you said, and I will wait for my next crawl. By the way, if I do that, will those pages also drop out of Google's index? My client seems to want to see all URLs in Google Analytics. Sorry, I am new to this and I really appreciate your help, René Hansen.
As for the conflicts: OK, got it. So I'd better keep just GTM and remove the GA tag. Got it. You are the best!
-
You are welcome. I don't recognize a CMS on your site, but it should be something your programmer is able to fix.
On another note, you are outputting Google Analytics as well as Google Tag Manager at the same time. Beware that this might skew your Analytics data. If you want to use GTM, then fire a GA tag from within it instead of outputting the GA snippet directly on the page.
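A minimal sketch of the difference (the container ID GTM-XXXXXXX and property ID UA-XXXXXXX-1 are placeholders, not the site's real IDs): the page should carry only the GTM container snippet, and GA is then configured as a tag inside GTM rather than pasted on the page.

```html
<!-- KEEP: the GTM container snippet, placed once, high in <head> -->
<script>(function(w,d,s,l,i){ /* standard GTM loader */ })
  (window,document,'script','dataLayer','GTM-XXXXXXX');</script>

<!-- REMOVE: a hardcoded GA snippet like the one below. With both
     present, pageviews can be double-counted. Instead, create a
     Google Analytics tag inside GTM that fires on all pages. -->
<script>
  ga('create', 'UA-XXXXXXX-1', 'auto');
  ga('send', 'pageview');
</script>
```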
-
Oh, thank you, René Hansen! Thank you so much! This helps me explain it too. Thank you!
-
Well, your issues are with tags, and I would rather noindex them than use a canonical tag.
There is no reason for search engines to index tag categories, and they don't help a searcher either (on-site they can be useful, however).
rel="canonical" should be used for content that is similar, like a product with different attributes, a product with multiple URLs serving the same content, or content you syndicate from another site.
Remember that rel="canonical" is only worth using if you cannot hardcode your way out of it. If you can remove the need for the tag, by either blocking indexation of the tag category or making the blog function differently, then that would be the ideal solution.
Read about rel="canonical" here: https://support.google.com/webmasters/answer/139066?hl=en
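On Harrison's follow-up about ?query=keywords search pages: a self-referencing canonical on each result page would not fix the duplication, because each variant would just declare itself canonical. The two usual options look like this (URLs are illustrative examples, not recommendations for specific pages on the site):

```html
<!-- Option 1 (usually preferred for internal search results):
     keep every /search?query=... variant out of the index -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: if the same content also lives at one preferred URL,
     point all query-string variants at that single URL -->
<link rel="canonical"
      href="http://www.livingwordfreelutheran.org/news-events/blog/">
```

Since internal search result pages rarely deserve to rank on their own, option 1 matches the noindex advice given for the tag pages above.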