Is this Duplicate content?
-
Hi all,
This is now popping up in Moz after using this for over 6 months.
It is saying this is now duplicate site content. What do we think? Is this a bad strategy? It works well in the SERPs, but could it be damaging the root domain's page ranking? I guess this is a little shady.
http://www.tomlondonmagic.com/area/close-up-magician-in-crowborough/
http://www.tomlondonmagic.com/area/close-up-magician-in-desborough/
http://www.tomlondonmagic.com/area/close-up-magician-in-didcot/
Thanks.
-
If what you've got right now is working for you and bringing in relevant (converting) traffic then I would be cautious about doing anything too drastic. There's always a risk associated with any changes you make like this and the last thing you want to do is kill your own traffic.
I wouldn't immediately tear down the duplicate pages, but I would start to think about how I could update some of the content and maybe create new pages that better engage with your visitors and help to increase your conversion rate (I don't know what your conversion rate is). That may help offset any impact caused by a potential loss of rankings for those duplicate pages. If the pages continue to rank, then it'll still help!
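(Purely as an illustration on my part, not something the thread settled on: one lower-risk option often used for near-duplicate pages is a `rel="canonical"` tag in each duplicate's `<head>`, pointing search engines at a preferred URL instead of deleting the pages outright. The target URL below is just one of the area pages listed above; whether canonicalising is right for these pages is a separate judgement call.)

```html
<!-- Illustrative sketch only: placed in the <head> of a near-duplicate
     area page to tell search engines which URL to treat as the original. -->
<link rel="canonical" href="http://www.tomlondonmagic.com/area/close-up-magician-in-crowborough/" />
```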
I've got some thoughts that might be useful (please take this as constructive criticism and recognise that I don't know your niche as well as you do!)
For example, the copy on your home page is "all about you" and very little about your visitor. What do I get if I book you for an event? What's your value proposition, what are the benefits of your particular service, and how can you differentiate yourself from the competition?
A great place to start is to speak to your last 10 customers and find out why they hired you, what convinced them to hire you, and what concerns/doubts they had.
I'm guessing here (you'll need to talk to your real customers) but if I was hiring you for my wedding, I wouldn't be so worried about the price, or the quality of your routines (I don't know what ground-breaking magic is!) but more concerned with questions like:
- "What if it's all going to be a bit cheesy?"
- "Is this going to annoy my guests?"
- "Is it going to be intrusive?"
- "Can he work with the venue?"
- "Can the performance be tailored to the theme of my event or the location?"
If you can figure out what really matters to people, you can quickly put them at ease and even turn these concerns into benefits.
You might also want to look at how you're using images. It can be hard on the ego, but it's not you that's the important thing here. If you can show more of the reactions and atmosphere that you create, that may help people feel "yes, I want some of that for my wedding/party etc."
Don't bury your testimonials away on a testimonials page. You've got some great comments there about "delighting guests", "making birthdays special"... I'd use those on your relevant pages. (Personally I think they're more compelling than the "celeb" testimonials.)
Segment your customers and address each group's particular needs/concerns. I'm sure you know the kind of specific issues that come up when you're dealing with corporate customers.
I really do think it would help to write the content in the first person, using language that's as natural as possible. As it stands, the site comes across as a bit cold and doesn't let your personality come through.
Hope this helps.
-
Doug,
Thank you for your response; it solidifies what I have been thinking for the last few months about removing the on-site keyword optimisation.
Yes, I do get a lot of work from those pages, and they do seem to convert fairly well. I guess I need to change the title of the website and the copy for human eyes, not Google's.
The only fear there is that I drop out of rankings. I guess that is the price to pay if you want to play by the rules!
With regards to the duplicate pages, what should I do then? Everyone in my niche is doing it. Shall I get rid of them all and bite the bullet?!
-
Nice!
Tom, out of interest, do these pages get much search traffic? What is the conversion rate like on them? Do they actually get you any work? If you're not getting any traffic/conversions, then just showing up in the SERPs for your keyword is a vanity thing.
If the tactic is getting you work then you obviously don't want to tear it all down, although I'm sure you understand that it's not exactly the kind of thing Google's terms of service are trying to encourage. These kinds of tactics are still working, but there's a risk attached, and it's not something I'd feel comfortable recommending.
You've got to look at your competition too - and I see that it's a pretty common (almost ubiquitous) tactic used in your niche.
Do you detail the areas you cover on your home page? I'm worried that seeing "Magician London" at the start of your page title and the keywords "Magician London" all over the copy could put off people looking for something local.
How can people find out if you cover their area when they visit your site?
The page copy doesn't read very naturally! Have you tried reading it out loud? I'm not sure you'd talk to someone like this face to face. I would try to make the text more natural and use the first person. After all, you're trying to sell yourself, aren't you, and it's your personality that makes you different from your competition.
My general advice would be to think less about optimising for search engines and start thinking about optimising for your visitors: what information are they looking for, and what are they trying to achieve on your site?
-
Hi there, this is definitely not a good idea from an SEO standpoint. I strongly recommend having the content written uniquely for each of those pages. I have seen methods like these make websites vanish from the index, as well as make websites pass safely under Google's radar. But we should stick to best practices and see to it that all the pages on our websites have substantially unique content, so that they can find and secure their place in the SERPs. Quality content that is unique, fresh, highly relevant, interesting, and link- and share-worthy can literally spell magic for your SEO efforts. Just my two cents, my friend.
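(A rough sketch of my own, not something from the thread: before deciding which area pages to rewrite first, you could measure how much of the copy actually overlaps between any two of them. The `copy_similarity` helper and the sample strings below are made up for illustration.)

```python
# Estimate how much of the copy two near-duplicate pages share.
from difflib import SequenceMatcher

def copy_similarity(text_a: str, text_b: str) -> float:
    """Return a 0-1 ratio of how much two blocks of page copy overlap."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical copy from two area pages, differing only in the town name.
crowborough = "A close-up magician available in Crowborough for weddings and parties."
desborough = "A close-up magician available in Desborough for weddings and parties."

ratio = copy_similarity(crowborough, desborough)
print(f"Copy overlap: {ratio:.0%}")  # anything near 100% is a rewrite candidate
```

Pages scoring highest against each other would be the most urgent candidates for genuinely unique copy.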
Best of luck to you,
Devanur Rafi.