Is it advisable to have unique pages for different cities/states even though there wouldn't be any actual differentiation in the content?
-
Is it advisable to have unique pages for different cities/states even though there wouldn't be any actual differentiation in the content? For example, should we have separate pages for "hammers in california" and "hammers in new york"? The product is the same and the content is more or less the same. The search volume for each individual query is low, but collectively it adds up to a large number. A unique title tag alone will generate some traffic. So does it make sense to make 50 such pages? If not, is there any other organic way to target 50 such queries/month per city?
-
No worries.
As some of the previous answers were suggesting, it depends on many other things too, for example:
- Are you selling those 'hammers' online, or do you have shops in each city?
If online only, then have one page for hammer(s) and add tags for the cities. The question to ask then is: what is the user experience, and what content do I have for the city tag pages?
Have a look at the document below from SEOmoz, if you haven't read it yet.
If there are shops in each city, then you will have the information for those shops on the tag page for the city: address, phone, opening hours, etc.
Hope this helps you further.
-
Jim, can you please elaborate on that?
-
Thanks Andy. Agree completely!
-
A good approach in cases like these is to link the content via tags to your cities.
That way you will be able to link content to locations!
-
The Penguin update, as described by Google, is intended to hit 'over-optimized' sites. Creating duplicate pages with minimal word changes purely for SEO is exactly the kind of thing Penguin targets. So no, it's not an easy opportunity; it's not an opportunity at all.
-
Thanks Andy. I tend to agree with you, hence the concern. But isn't it an easy opportunity that we are just letting go?
Having built some domain authority, should we let go of such easy opportunities? Is there some organic way to target these?
PPC is always an option...
-
No is the short answer. Having almost identical pages with only minimal differences (for the purposes of SEO) is exactly the kind of spammy thing the recent Penguin and Panda updates are clamping down on. If you don't have stores in California, New York, etc., Google doesn't want to rank you for those terms.
PPC would be a better solution for those kinds of terms, if they can get past the minimum search threshold.
Optimising product pages for types of hammers, e.g. claw hammers, is the way to go.
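If near-identical city pages already exist, the standard cleanup is to consolidate them rather than leave them competing with each other. As a minimal sketch (the URLs are hypothetical, not from this thread): point each city variant at the main product page with a rel=canonical tag, or keep a variant reachable for users but out of the index with a robots meta tag.

```html
<!-- On a near-duplicate city page, e.g. /hammers/california (hypothetical URL): -->

<!-- Option 1: tell search engines the main product page is the canonical version -->
<link rel="canonical" href="https://www.example.com/hammers" />

<!-- Option 2: keep the page for visitors but exclude it from the index -->
<meta name="robots" content="noindex, follow" />
```

Use one or the other per page, not both: canonical consolidates ranking signals onto the main page, while noindex simply removes the duplicate from search results.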