Canonical for 80-90% duplicate content help
-
Hi. I seem to spend more time asking questions than answering at the moment.
I have a site I have revamped: www.themorrisagency.co.uk
I am working through sorting out the 80-90% duplicated content, where each page just swaps in a smattering of geographical and band-style terms, e.g.:
http://www.themorrisagency.co.uk/band-hire/greater-manchester/ with 'manchester' being changed to:
http://www.themorrisagency.co.uk/band-hire/oxfordshire/ etc.
So I am going through this slow but essential process at the moment.
I have a main http://www.themorrisagency.co.uk/band-hire/ page.
My question is:
Would it be sensible (using the Yoast SEO plugin) to add a canonical tag as a temporary solution, pointing these duplicate pages at http://www.themorrisagency.co.uk/band-hire/, rather than removing them?
What are your thoughts? I am aware that misusing rel=canonical could make things worse.
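To be concrete, my understanding is that this would mean Yoast outputting something like the following in the head of each duplicate page (just my reading of how the tag works, so correct me if I have it wrong):

```html
<!-- On e.g. /band-hire/oxfordshire/ — tells search engines that the main
     band-hire page is the preferred version of this duplicate page -->
<link rel="canonical" href="http://www.themorrisagency.co.uk/band-hire/" />
```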
Thanks as always
Daniel
-
OK, seen the video, thanks Wiqas. I understand that rel=canonical is a thorny thing, so I am not going to opt for that. Thanks Brett!
The suggestion is to noindex/nofollow each page until it is made unique, concentrate on the most important pages first, and then submit those bit by bit. Is that necessary?
Is it more beneficial to remove poor pages than to leave them there, even though they are indexed, albeit at a low level?
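Just to check I understand the suggestion: that would mean adding something like this to each duplicate page until its content is rewritten (my interpretation, happy to be corrected):

```html
<!-- In the head of each duplicate county page: ask search engines not to
     index the page or follow its links until the content is made unique -->
<meta name="robots" content="noindex, nofollow" />
```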
-
Did you see this video by Rand: http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls
It will give you quite a good idea of what to do.
-
Hi, I would be mindful of using rel=canonical.
I would use a simpler, more logical approach: restructure the pages with original content and just use 301 redirects for the old URLs. This follows Google's guidelines on 301 redirects, available here:
https://support.google.com/webmasters/answer/93633?hl=en
In particular:
- You're merging two websites and want to make sure that links to outdated URLs are redirected to the correct pages.
Which is essentially what you have done/are doing.
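As a rough sketch (assuming an Apache server; the paths are illustrative examples based on Daniel's URLs, so adjust to your actual structure), a retired county page could be redirected in .htaccess like this:

```apacheconf
# .htaccess — permanently redirect each retired county page to the
# main band-hire page (only for pages you remove rather than rewrite)
Redirect 301 /band-hire/greater-manchester/ http://www.themorrisagency.co.uk/band-hire/
Redirect 301 /band-hire/oxfordshire/ http://www.themorrisagency.co.uk/band-hire/
```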
Hope this helps