"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
-
The Problem
I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but their business is truly 'local': a local service area, local phone and address, a unique business name, and virtually complete control over their web presence (URL, site design, content) apart from a few branding guidelines.
Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area.
Lately my white-hat link-building strategies have not been yielding the results they were a year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies.
I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using SEOmoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).
Questions
Assuming general on-page optimization and linking factors are equal:
- Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
If I choose to differentiate each client's website, how much differentiation makes sense? Specifically:
-
Even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'?
-
Are images as important as copy in differentiating content?
-
From a 'machine' or algorithm perspective evaluating unique content, I wonder whether strategies such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code) would be effective.
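To get a rough sense of how an algorithm might score two templated pages as 'the same', you can compare their visible copy with word shingles, a common technique behind near-duplicate detection. This is only an illustrative sketch, not Google's actual method, and the two sample pages are made up:

```python
# Rough near-duplicate check using word shingles: two pages that
# share most of their boilerplate will share most of their shingles.
# The sample pages below are hypothetical template copy.

def shingles(text, k=4):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b):
    """Jaccard similarity of two shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "Acme Maids offers house cleaning in Austin TX with weekly and biweekly plans"
page_b = "Best Maids offers house cleaning in Denver CO with weekly and biweekly plans"

# Only the business name and city differ, so a chunk of shingles still match.
print(f"similarity: {jaccard_similarity(page_a, page_b):.2f}")
```

Swapping a few 'content' variables (name, city) into otherwise identical copy still leaves long runs of matching shingles, which is why variable substitution alone may not read as unique to a machine.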
Considerations
My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level, choosing which result to show from a pool of duplicates. My clients' search terms most often contain client-specific city and state names.
Despite the "original content" mantra, I believe my clients, as local businesses that have opted for a template website (an economical choice), still represent legitimate and relevant matches for their target users' searches. It is in this spirit that I ask these questions, not to 'game' Google with malicious intent.
In an ideal world each of my clients would have their own unique website developed, but these are Main Street business owners balancing solutions against economics, and I'm trying to provide them with scalable options.
Thank You!
I am new to this community; thank you for any thoughts, discussion, and comments!
-
Since you're generally doing all the right things, I'd recommend looking at the inbound link quality, volume, and diversity for each of your sites compared to its individual market competitors. Beyond that, it would need to be a case-by-case evaluation to better nail down issues.
On a final note, social has become a big signal and should be strongly encouraged as well (Twitter engagement, for example), though I know that's a challenge in this type of market.
-
Hi Alan,
The template site is fairly basic static HTML. Address/contact info is repeated on every page in an 'About Us' sidebar box, with prominent phone numbers throughout, and a 'Service Area' table listing cities appears on every page. The site totals about 27 HTML pages averaging ~25 KB each.
We could definitely differentiate the image alt attributes further.
Geographic information is included in title tags for home page and all service-offered related pages, but not in title tags for pages like 'privacy policy.'
Google Places, Yelp, Yahoo/Bing Local etc. are all in place.
Thank you for your feedback!
-
When you ask about the templatized repetitiveness, I have to wonder how much code exists underneath the visible content. An overwhelming ratio of code to on-page content can, by itself, hurt a site's uniqueness when there are dozens, hundreds, or thousands of identical templates. It should be a minor concern, though, if there's enough unique content specific to the geo-location and the individual site owner.
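That code-to-content ratio is easy to eyeball with a script: extract the visible text from a page and compare its length to the raw HTML. A minimal sketch using only Python's standard library; the sample markup is invented for illustration:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def text_to_html_ratio(html):
    """Visible-text length divided by total HTML length (0.0 to 1.0)."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join("".join(parser.parts).split())
    return len(text) / len(html) if html else 0.0

sample = ("<html><head><style>body{margin:0}</style></head>"
          "<body><h1>Maid Service in Austin, TX</h1>"
          "<p>Weekly and biweekly cleaning plans.</p></body></html>")
print(f"text/HTML ratio: {text_to_html_ratio(sample):.2f}")
```

There's no magic threshold, but a very low ratio across dozens of near-identical templates would be the warning sign described above.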
So for example: is geographic information included in every page title and within every page's content? Are site owners able to include their own unique image alt attribute text? Is their address and contact info on every page? Do they have their own Google Places pages (properly optimized, and pointing back to their site's contact page)? Do they also have Yelp, CitySearch, Bing Local, or Yahoo Local listings similarly set up?
All of these can help.
As for the template repetition: if everything above is properly utilized, it shouldn't be a major problem, so I'd start with those considerations and go from there.
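The per-page checks above (geo terms in titles, unique alt text) are also scriptable, which helps when you're auditing a few dozen franchise sites. A sketch with the standard library; the sample page and city name are placeholders:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Pull the <title> text and all img alt attributes from one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "img":
            self.alts.append(dict(attrs).get("alt", ""))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_page(html, city):
    """Report whether the city appears in the title, and count empty alts."""
    audit = PageAudit()
    audit.feed(html)
    return {
        "city_in_title": city.lower() in audit.title.lower(),
        "empty_alts": sum(1 for a in audit.alts if not a.strip()),
    }

sample = ('<html><head><title>House Cleaning in Austin, TX</title></head>'
          '<body><img src="crew.jpg" alt="">'
          '<img src="van.jpg" alt="Austin cleaning crew van"></body></html>')
print(audit_page(sample, "Austin"))
```

Run over every page of every client site, this turns the checklist into a report rather than a manual spot-check.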