Canonical for 80-90% duplicate content help
-
Hi. I seem to spend more time asking questions at the moment.
I have a site I have revamped www.themorrisagency.co.uk
I am working through sorting out the 80-90% duplicated content, where each page just swaps in a smattering of geographical and band-style terms, e.g.:
http://www.themorrisagency.co.uk/band-hire/greater-manchester/ with 'manchester' being changed to:
http://www.themorrisagency.co.uk/band-hire/oxfordshire/ etc.
So I am going through this slow but essential process atm.
I have a main http://www.themorrisagency.co.uk/band-hire/ page
My question is:
Would it be sensible (using the Yoast SEO plugin) to add a rel=canonical tag as a temporary solution, pointing these duplicate pages to http://www.themorrisagency.co.uk/band-hire/, rather than remove them?
What are your thoughts? I am aware that misusing rel=canonical could make things worse.
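(For reference: rel=canonical is not a redirect but a hint placed in the page's head; Yoast exposes a per-page canonical URL field for this. As a sketch, the tag it would output on one of the duplicate county pages might look like this:)

```html
<!-- In the <head> of a duplicate page such as /band-hire/oxfordshire/ -->
<!-- Suggests to Google that /band-hire/ is the preferred version to index -->
<link rel="canonical" href="http://www.themorrisagency.co.uk/band-hire/" />
```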
Thanks as always
Daniel
-
OK, seen the video, thanks Wiqas. I understand that rel=canonical is a thorny thing, so I am not going to opt for that, Brett, thanks!
The suggestion is to noindex/nofollow the duplicate pages until they are made unique, concentrate on the most important pages first, and then submit those bit by bit. Is that necessary?
Is it more beneficial to remove the crap pages than to have them there, even though they are indexed, albeit at a low level?
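(For illustration: the noindex directive mentioned above is a single meta tag in the page head, which Yoast can set per page. Note that a common variant is noindex,follow rather than noindex,nofollow, so crawlers still follow the page's internal links; which to use is a judgment call, not something settled in this thread:)

```html
<!-- Meta robots tag on a duplicate page: keeps it out of the index but
     lets crawlers follow its links so internal link equity still flows -->
<meta name="robots" content="noindex,follow" />
```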
-
Did you see this video by Rand: http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls
It will give you quite a good idea of what to do.
-
Hi, I would be mindful of using rel=canonical.
I would use a simpler, more logical approach: restructure the pages with original content and use 301 redirects for the URLs you retire. This follows Google's guidelines on moving and merging pages, available here:
https://support.google.com/webmasters/answer/93633?hl=en
In particular:
- You're merging two websites and want to make sure that links to outdated URLs are redirected to the correct pages.
Which is essentially what you have done/are doing.
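(As a sketch of the 301 approach, assuming an Apache server with .htaccess enabled, which is typical for WordPress hosts but not confirmed in this thread; a retired duplicate URL can be permanently redirected like this:)

```apacheconf
# Hypothetical rule in the site-root .htaccess: permanently (301) redirect
# a retired duplicate page to the main band-hire page, preserving link equity
Redirect 301 /band-hire/oxfordshire/ http://www.themorrisagency.co.uk/band-hire/
```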
Hope this helps