Is it bad to have the same template for all of my EMDs?
-
I have been working on EMDs and have more than 40 of them, all two-keyword EMDs with DA around 35.
They all use the same template. Could that be a problem in the future? (They don't have similar content, though.)
-
Yeah, fair enough. Thanks, Andy.
-
Consider this: a huge chunk of the websites out there use off-the-shelf templates (think WordPress). Even Matt Cutts' blog is an off-the-shelf (if edited) template.
-
Hey Gerd, thanks, that's helpful. I searched for it and found some nice reads! I didn't know about manual reviews. But I guess a manual review is sometimes a good thing: if you have good-quality brand sites (EMDs), then you don't have to worry about a machine algorithm picking your site by mistake.
-
That's a good example. Thanks for sharing.
Though I guess if something like this happens, it won't be a template update alone; it might be a combination of low quality, a similar pattern in the content, and a few other factors, like similar EMDs with the same template all interlinked, and so on.
-
Thanks for sharing your opinion.
-
This is already happening: do a search for "google manual review" and have a look at the manual review process. Google currently employs companies to perform manual website reviews based on search terms to classify websites.
So although you have the same template with different content across different websites, the danger is that, from a link-building perspective, your sites interlink, and a manual review might demote all of them.
Chances are slim, as others said, but it's certainly possible.
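If you want to sanity-check that interlinking risk yourself, here is a minimal sketch of flagging reciprocal links among your own domains. The domains and link lists below are invented; in practice you would export them from your crawler of choice.

```python
from itertools import combinations

# Outbound links per owned site, keyed by domain (hypothetical data:
# replace with an export from a crawl of your own EMDs).
outbound = {
    "best-widgets.example": ["cheap-widgets.example", "widget-deals.example"],
    "cheap-widgets.example": ["best-widgets.example"],
    "widget-deals.example": ["best-widgets.example", "cheap-widgets.example"],
}

def interlink_pairs(outbound):
    """Return pairs of owned domains that link to each other in both directions."""
    pairs = []
    for a, b in combinations(sorted(outbound), 2):
        if b in outbound[a] and a in outbound[b]:
            pairs.append((a, b))
    return pairs

print(interlink_pairs(outbound))
```

Any pair this prints is a reciprocal link between two of your own domains, which is exactly the pattern a manual reviewer would notice first.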
-
No clue, seriously! Google is in its teenage years, doing all the smart things a super-awesome kid would do to prove itself the best among others (sometimes it doesn't go well...).
I won't say it's impossible, but at the moment there is no such thing!
-
IMHO, no, I don't believe so.
Consider an ecommerce platform like OpenCart. The default template is probably being used tens of thousands of times with unique content, and penalizing all those sites because of it probably would not improve the end user's experience.
The purpose of algorithm updates is to improve the user's experience, so the content is the key, not necessarily the template (unless it's extremely poor and affects usability).
-
Yeah, I thought so, but do you think any future update might include anything like that?
-
I am assuming you are talking about the design template of the EMDs (websites). In that case, from an SEO point of view there won't be any problem, but from a user's point of view, people might get frustrated by going to different domains and finding the same kind of website... then again, every niche's behavior toward websites is a bit different.
From a technical standpoint, there is no problem with having the same template!
-
Hi,
Don't worry about using the same template; if your sites have unique content, the same design doesn't harm your rankings.
-
Yes, my sites are ranking in the top 5 for most of their keywords, but I am afraid some future update might change things!
-
Yeah, I believe there is no problem at the moment, but I am assuming a future Google algorithm update might introduce something like that, where it uses the same template as one factor to detect that sites are all the same, hosted on the same servers, and linked with each other, and hence might treat them as spam.
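That's pure speculation, of course, but a same-template signal would be cheap for an engine to compute. Here's a toy sketch of one way a crawler could fingerprint a page's template by hashing its tag sequence while ignoring the copy (the HTML snippets are invented):

```python
import hashlib
from html.parser import HTMLParser

class TagSequence(HTMLParser):
    """Collect the sequence of opening tags, ignoring text content."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def template_fingerprint(html):
    """Hash only the markup structure, so identical templates with
    different copy produce identical fingerprints."""
    parser = TagSequence()
    parser.feed(html)
    return hashlib.sha1(" ".join(parser.tags).encode()).hexdigest()

# Two pages with different copy but identical markup structure:
page_a = "<html><body><h1>Blue widgets</h1><p>Buy blue widgets.</p></body></html>"
page_b = "<html><body><h1>Red gadgets</h1><p>Buy red gadgets.</p></body></html>"

print(template_fingerprint(page_a) == template_fingerprint(page_b))  # True
```

Matching fingerprints alone prove nothing, which is presumably why a template would only ever be one weak signal among many.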
-
You could face a risk if your sites are interlinked and a manual review flags them as similar and demotes some. I think this is a very rare case and unlikely to happen. Just remember, Google has a better understanding of link graphs than any tool available. I have seen some sites drop due to a manual review (though the demotion was not because of the same UI).
I honestly would not worry too much about it as long as your copy, brand, keywords, and on-page SEO differ.
-
I am not sure, but in my experience it does not affect your SEO if you have the same template but different content. If all of your sites have different content, then Google does not consider them duplicate sites. My client has a multilingual site with the same template, and he still ranks in the top 10.
Related Questions
-
Are sitewide links bad for SEO?
I have 11 real estate sites and have had links from one to another for about 7 years, but someone just suggested I take them all out because I might get penalized or affected by Penguin. My main site was affected in July 2012, and organic visits have dropped 43%. I've been working on many aspects of my SEO, but it's been difficult to come back. Any suggestions are very welcome, thanks 🙂
Technical SEO | | mbulox0 -
Website credits for designers - good or bad?
Hi, My core service is web design and development. I often place a credit on my clients' websites pointing back to my web design or web development pages. Is this a wise practice given the Penguin and Panda updates? Could it also pull my rankings down?
Technical SEO | | Cocoonfxmedia0 -
Are 404 Errors a bad thing?
Good morning... I am trying to clean up my e-commerce site, and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if for some reason one of them is still indexed in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on one of these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
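For what it's worth, blocking the old URLs in robots.txt doesn't remove them from the index; letting them return 404 (or 410, which signals the removal is deliberate) is usually cleaner. A minimal sketch of that routing decision, with made-up paths standing in for the real category URLs:

```python
# Hypothetical paths; in practice the retired set would come from the
# store's old category list.
RETIRED = {"/category/old-parts", "/category/clearance"}
LIVE = {"/category/new-parts"}

def status_for(path):
    """Pick an HTTP status: 410 for deliberately removed pages (tells
    crawlers the removal is intentional), 200 for live pages, 404 otherwise."""
    if path in RETIRED:
        return 410
    if path in LIVE:
        return 200
    return 404

print(status_for("/category/old-parts"))  # 410
```

In practice this logic would live in the shop's routing or server config rather than application code, but the decision is the same.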
Technical SEO | | Prime850 -
Is changing the anchor text of a large number of links at once bad for SEO?
Hi there, Our service at fotograf.de is a shop system for professional photographers. Customers can build their own website with our tool, including an online shop to sell their pictures. We have a lot of links from our customers pointing to our homepage. The links come from subdomains of our domain and from external domains. We are now thinking about changing the anchor text of half of the links (roughly 300,000 links). Do we have to fear a penalization by Google for changing so many anchor texts at once? Would we get better rankings if we chose more optimized anchor text, or does this have no effect because most of the links are from subdomains (each customer has their own subdomain) of our domain? Thanks for answering! Sebastian
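If the change does go ahead, one common precaution (risk management rather than anything Google documents) is to stagger the rollout instead of flipping all ~300,000 anchors at once. A toy sketch of splitting the work into weekly batches:

```python
def rollout_batches(total_links, weeks):
    """Split a bulk anchor-text change into roughly equal weekly batches."""
    base, extra = divmod(total_links, weeks)
    # Spread any remainder across the first batches so sizes differ by at most 1.
    return [base + (1 if i < extra else 0) for i in range(weeks)]

batches = rollout_batches(300_000, 12)
print(batches[0], sum(batches))  # 25000 300000
```

Twelve weeks is an arbitrary choice here; the point is simply that a gradual change looks more natural than a single overnight swap.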
Technical SEO | | Sebastian230 -
Client with Very Very Bad Onsite SEO
So one of my clients has a really, really bad website from a technical perspective. I am talking over 75k violations and warnings. Granted, the tagging is done well, but any other SEO violation you can think of is occurring. In any case, they are building a new website, and I am on a retainer for a couple of hours a week to do some link building. I feel like I am not getting anywhere. What is your advice? Should I keep on keeping on, or advise the client to put SEO on hold until the technical issues are resolved? I feel like all of this link building doesn't have the value it could have with a site like this.
Technical SEO | | runnerkik0 -
Bad Duplicate content issue
Hi, for grappa.com I have about 2,700 warnings of duplicate page content. My CMS generates long URLs like http://www.grappa.com/deu/news.php/categoria=latest_news/idsottocat=5 and http://www.grappa.com/deu/news.php/categoria%3Dlatest_news/idsottocat%3D5 (this is duplicated content). What's the best solution to fix this problem? Do I have to set up a 301 redirect for all the duplicated pages, or insert rel=canonical or rel=prev/next? It's complicated because it's a multilingual site, and it's my first time dealing with this stuff. Thanks in advance.
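The two URLs in the question differ only in percent-encoding (`=` vs. `%3D`), so they serve the same page; a rel=canonical on both variants, pointing at the decoded form, is often the lightest fix when the CMS can't be changed. A quick sketch verifying that the variants collapse to one URL:

```python
from urllib.parse import unquote

url_a = "http://www.grappa.com/deu/news.php/categoria=latest_news/idsottocat=5"
url_b = "http://www.grappa.com/deu/news.php/categoria%3Dlatest_news/idsottocat%3D5"

def canonical_form(url):
    """Decode percent-escapes so encoding variants collapse to one URL."""
    return unquote(url)

print(canonical_form(url_a) == canonical_form(url_b))  # True
```

The decoded form returned here is what the canonical tag (or the 301 target, if redirects are preferred) would point at.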
Technical SEO | | nico860 -
Moz Crawl reporting duplicate content on "template"-styled pages
We have a lot of detail pages on our site that reference specific scholarships. Each page has a different title and description. They also have unique information, all covering the same data points. The pages are displayed in a similar structure so the data is easy for the user to read. My problem is that a lot of these pages are being reported as duplicate content when they certainly are not. Most of them are reported as duplicates when they have the same sponsor, and they may have the same contact information listed. These two are being reported as duplicates of each other, even though they share some data but are definitely different scholarships: http://www.collegexpress.com/scholarships/adelaide-mcclelland-garden-club-scholarship/9254/ and http://www.collegexpress.com/scholarships/mary-wannamaker-witt-and-lee-hampton-witt-memorial-scholarship/10785/. Would it help to add a canonical tag on each page pointing to itself? Any other suggestions would be great. Thanks
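Crawl tools generally flag a pair of pages once the similarity of their body text crosses some threshold, which thin pages sharing sponsor and contact boilerplate can easily trip. A rough sketch of that kind of check, using invented snippets in place of the real scholarship pages:

```python
from difflib import SequenceMatcher

# Made-up stand-ins for two scholarship pages that share boilerplate
# (contact details, deadlines) but describe different awards.
page_a = ("Adelaide McClelland Garden Club Scholarship. Contact: PO Box 1, "
          "Columbia, SC. Deadline in spring. Open to horticulture students.")
page_b = ("Mary Wannamaker Witt Memorial Scholarship. Contact: PO Box 1, "
          "Columbia, SC. Deadline in spring. Open to horticulture students.")

def similarity(a, b):
    """Ratio of matching text between two pages, from 0.0 to 1.0."""
    return SequenceMatcher(None, a, b).ratio()

print(round(similarity(page_a, page_b), 2))
```

Because the shared boilerplate dominates the short unique portion, the ratio comes out high; adding more unique body copy per scholarship lowers it, which is usually a better fix than canonical tags for pages that genuinely differ.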
Technical SEO | | GeorgeLaRochelle0 -
Schema for Price Comparison Services - Good or Bad?
Hey guys, I was just wondering what the schema.org markup means for people who run search engines (i.e., for a niche or certain products) or price comparison engines in general. The intent behind schema.org was to help the engines better understand a page's content. Well, I guess such services don't necessarily want Google to understand that they're just another search engine (and thus might get thrown out of the index for polluting it with search result pages). I see two possible scenarios: either don't implement the markup, or implement it in a way that makes the site not look like an aggregator, i.e. by only marking up certain products with unique text. Any thoughts? Does the SEOmoz team have any advice on that? Best,
schuon
Technical SEO | | derderko0