Strategies for revising my duplicate content?
-
New to SEO and SEOmoz. I tried searching for this first and I'm sure it's on here but I could not find it.
I have a site that markets fishing charters in a few dozen cities. Up to now I've been relying on PPC and using each city page as a landing page of sorts. Each city page is very similar (there are only so many ways to write about a type of fish or fishing). What would be the recommended way to optimize this, keeping in mind that the duplicate information we provide on each page seems to be important to people?
Site is www.vipfishingcharters.com
Thanks!
-
No, because I'm guessing each page has a title tag specific to the location; if you merge two pages together you'll end up with one title tag rather than two.
Personally, I would have a page that lists all the locations, called locations.php, then link to each city from that page. If you build a few links to the city pages, you might find they get crawled more often.
So you'd have:
vipfishingcharters.com/locations/boca-raton-fishing-charters/
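If the city pages are actually generated by a single script, a clean-URL rewrite along these lines would do it. This is just a sketch: it assumes an Apache server with mod_rewrite enabled and a hypothetical `city` query parameter on locations.php — adjust for however your site is actually built:

```apache
# Map /locations/boca-raton-fishing-charters/ to a single handler script.
# The capture group grabs the city slug and passes it as a query parameter.
RewriteEngine On
RewriteRule ^locations/([a-z0-9-]+)-fishing-charters/?$ locations.php?city=$1 [L,QSA]
```

Each friendly URL then resolves internally to something like locations.php?city=boca-raton, while visitors and search engines only ever see the clean version.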
-
Wow, thanks for all this feedback guys. Yes, I sensed that the best solution is to write original content, but it's true: there really are only a finite number of ways to explain what a sailfish is. I guess I can reword, but how does Google's comparison engine work?
Assuming I could rewrite all the text, would it help to create separate file names for the images on each page (I'm not going to have 40 different pictures of sailfish)? Does this matter at all?
Searches for my site nearly always include a city name, so my original logic was to create a "landing page" for every city. Right now nearly 90% of my traffic is CPC, but I'm trying to change that.
Would it make more sense to consolidate cities and group them on a smaller number of pages, or does this then kill the localized search?
-
Hi Noah!
If there is to be a page for each location then there should be something useful to say about each location too.
Consider what would actually be useful to users specifically searching for the page about their area.
As well as the content about the activities you offer (which, as previous answerers have said, should be unique for each page and show you to be the 'fishing guy'), I would suggest:
- Information about how to get to each location
- Local places to stay if people are to travel for the fishing activities
- Anything that makes that particular city unique/interesting in terms of the experience you offer
That type of location-specific information will be unique to each landing page.
As previous answerers have also suggested, customer testimonials and other forms of user generated content would be great.
-
Nice... great idea. Get enthusiastic customers to share, and to send a photo if they caught a nice one. They will probably link to it from their Facebook page if a photo of them on a fishing boat with a monster fish is up on your site.
-
Great suggestions, William.
If you want to be the "fishing guy" you have to earn it.
-
You could start by collecting customer testimonials: just have a form that captures the location they fished plus a comment, and collect them. Each one will be unique, can be added to the page for the beach they fished, and has the added benefit of instilling trust in potential new business.
Send the customer an email the day after the fishing trip with a link to the form.
That's just one way; I'm sure there are others.
-
First off, you have a very nice website (I am also an avid angler). I agree with EGOL that you can build unique content for different cities. On one of my sites I promote the bass industry, and I can target areas via lake location in different metros. Do you have only one launch point for your service, or many? Another thing you could do is partner with tackle shops within your target areas and then reference them in your article for that metro, which ties the page to the location you are targeting.
Or you could develop a contest for the best fishing photo in each metro. All in all, there are many ideas for how you can get listed for these types of metro keywords related to your industry.
One person you could learn from is Wil Reynolds, as he spoke about this stuff in one of his seminars: http://www.youtube.com/watch?feature=player_detailpage&v=hSQ0DZdSDMI#t=368s
You can also reference his last blog post on SEOmoz:
http://www.seomoz.org/blog/never-worry-about-an-algorithm-update-again-a-history
All very good stuff, and welcome to SEOmoz; there are a ton of experts in here to help.
Hope this gives you some ideas and gets your gears turning.
-
Each city page is very similar (there are only so many ways to write about a type of fish or fishing). What would be the recommended way for optimizing this, keeping in mind the duplicate information we provide on each page seems to be important to people.
I am going to be straight and honest.
Write unique content for each city. Just do it.
(there are only so many ways to write about a type of fish or fishing)
I don't believe this... get to writing. If you want to be the "fishing guy" you should be able to do this no sweat. It just takes time.
I know you will not like my answer. But you know I am right.