Is the "Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local-Business Segment
-
The Problem
I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but their businesses are truly 'local', with a local service area, local phone/address, a unique business name, and virtually complete control over their web presence (URL, site design, content) apart from a few branding guidelines.
Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area.
Lately my white-hat link-building strategies have not been yielding the results they did a year ago; these include legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least consistent with recent articles on SEO trends and directory/article strategies.
I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using SEOmoz toolbar and Site Explorer stats, and factoring in general quality-vs.-quantity dynamics).
Questions
Assuming general on-page optimization and linking factors are equal:
- Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
If I choose to differentiate each client's website, how much differentiation makes sense? Specifically:
-
Even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'?
-
Are images as important as copy in differentiating content?
-
From a 'machine' or algorithmic perspective on evaluating unique content, I wonder whether strategies like these would be effective: saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code).
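For intuition on the 'machine' perspective, near-duplicate detection is often illustrated with word shingling and Jaccard similarity. The sketch below is a simplified, hypothetical illustration of that classic technique -- it is not Google's actual algorithm, and the sample sentences are made up -- but it shows why swapping only the city name leaves most of a page's "fingerprint" intact:

```python
# Simplified sketch of near-duplicate detection via word shingles.
# Illustrative only -- NOT Google's actual algorithm.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets, from 0.0 to 1.0."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two hypothetical template pages that differ only in the city/state variables:
page_a = "Our maid service in Austin TX offers weekly and biweekly cleaning"
page_b = "Our maid service in Denver CO offers weekly and biweekly cleaning"

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"{similarity:.2f}")  # prints 0.38 -- much of the overlap survives the city swap
```

Longer pages with more shared boilerplate would score even higher, which is the intuition behind differentiating the primary copy rather than just the geo variables.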
Considerations
My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level, and choosing which result to show from a pool of duplicates. My clients' search terms most often contain client-specific city and state names.
Despite the "original content" mantra, I believe my clients, as local businesses that have opted for a template website (an economical choice), still represent legitimate and relevant matches for their target users' searches. It is in this spirit that I ask these questions, not to 'game' Google with malicious intent.
In an ideal world each of my clients would have their own unique website developed, but these are Main Street business owners balancing solutions against economics, and I'm trying to provide them with scalable options.
Thank You!
I am new to this community; thank you for any thoughts, discussion, and comments!
-
Since you're generally doing all the right things, I'd recommend comparing inbound link quality, volume, and diversity for each of your sites against its individual market competitors. Beyond that, it would take a case-by-case evaluation to better nail down specific issues.
On a final note, social has become a big signal and should be strongly encouraged as well (Twitter engagement, for example), though I know that's a challenge in this type of market.
-
Hi Alan,
The template site is fairly basic static HTML. Address/contact info is repeated on every page in an 'About Us' sidebar box, prominent phone numbers appear throughout, and a 'Service Area' table listing cities is on every page. The site totals about 27 HTML pages averaging ~25 KB per page.
We could definitely differentiate the image alt tags further.
Geographic information is included in the title tags for the home page and all service-related pages, but not in the title tags for pages like the privacy policy.
Google Places, Yelp, Yahoo/Bing Local etc. are all in place.
Thank you for your feedback!
-
When you ask about templatized repetitiveness, I have to wonder how much code exists underneath the visible content. An overwhelming ratio of code to on-page content can, by itself, negatively impact a site's uniqueness when there are dozens, hundreds, or thousands of identical templates. However, it should be a minor concern if there's enough unique content specific to the geo-location and the individual site owner.
So, for example: is geographic information included in every page title and within every page's content? Are site owners able to include their own unique image alternate-attribute text? Is their address and contact info on every page? Do they have their own Google Places pages (properly optimized, and pointing back to their site's contact page)? Do they also have Yelp, CitySearch, Bing Local, or Yahoo Local listings set up similarly?
All of these can help.
As far as the template repetition goes, if all of the above is properly utilized, it shouldn't be a major problem. I'd start by looking at those considerations and go from there.
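The code-to-content ratio mentioned above can be roughly estimated with a short standard-library script. This is just an illustrative sketch (the sample page below is made up), not a production SEO tool, and real crawlers use more sophisticated extraction:

```python
# Rough estimator of visible-text-to-markup ratio for an HTML page.
# Illustrative sketch using only the Python standard library.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style>, whose contents aren't visible text

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def text_to_code_ratio(html):
    """Characters of visible text divided by total characters of the document."""
    parser = TextExtractor()
    parser.feed(html)
    visible = " ".join(parser.chunks)
    return len(visible) / len(html)

# Hypothetical miniature page: heavy on markup, light on visible copy.
page = "<html><head><style>p{color:red}</style></head><body><p>Maid service in Austin</p></body></html>"
print(f"{text_to_code_ratio(page):.2f}")  # prints 0.23 -- most bytes here are template code
```

Running something like this across each templated page can flag which pages carry too little unique visible copy relative to the shared markup.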