Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
-
Hi All,
I'll preface this by saying that we stick to white hat SEO as much as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).
So, we are an IT and management training course provider. We have 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, to really hone our local SEO by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, these dynamic pages are being crawled and ranking well in Google. We ran a very small-scale test in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well for "sharepoint training in new york/dc," etc., with just those two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's just over 700 pages of content to maintain (34 × 21 = 714) - A LOT more than just the two we tested. Our engineers have offered to create standard title tags, meta descriptions, H1s, H2s, etc., with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph:
“Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city so content won't be the same everywhere, just slightly the same.
It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren’t technically individual pages, although they seem like that on the web. If we don’t standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain depending on what you want customized.
Another option is to have several standardized paragraphs, such as:
"Our [Topic Area] training is easy to find in the [City, State] area." Followed by other content specific to the location.
"Find your [Topic Area] training course in [City, State] with ease." Followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages.
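To make his idea concrete, here's a rough sketch of the kind of templating he's describing (the venue fields, function names, and the hash trick below are just my own illustration, not our actual code). Picking the paragraph variant deterministically from the venue code means a given page always shows the same text, rather than flipping between versions on every crawl:

```typescript
// Sketch only: one page template filled from a venue record, with the intro
// paragraph chosen deterministically per venue code (hypothetical fields/names).

interface Venue {
  venueCode: string;
  city: string;
  state: string;
  topicArea: string;
}

const paragraphTemplates = [
  (v: Venue) =>
    `Our ${v.topicArea} training is easy to find in the ${v.city}, ${v.state} area.`,
  (v: Venue) =>
    `Find your ${v.topicArea} training course in ${v.city}, ${v.state} with ease.`,
];

// Simple string hash so the "random" variant is stable for a given venue code.
function hashCode(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) | 0;
  }
  return Math.abs(h);
}

function buildIntroParagraph(venue: Venue): string {
  const template =
    paragraphTemplates[hashCode(venue.venueCode) % paragraphTemplates.length];
  return template(venue);
}

// Example with one of the venues from our test (made-up venue code).
console.log(
  buildIntroParagraph({
    venueCode: "NYC01",
    city: "New York",
    state: "NY",
    topicArea: "SharePoint",
  })
);
```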
So, mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check.
Sorry for the long message. Hopefully someone can help. Thank you!
Pedram
-
Having duplicate content isn't the issue so much as having enough unique content for each page to be seen as valuable on its own.
A single template paragraph probably isn't enough, but if you can include other information such as the address, driving directions, phone number, photos of the facility, class sizes, school hours, etc., that should be enough unique content for each location.
You can even make the schedule an image or iframe if the duplicate content issue is a concern. Or if the schedule is identical for every location anyway, create a single schedule page, and link to it from each of the locations.
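Just to sketch what I mean (all of the fields and URLs below are made up for the example): keep the location-specific details in each page's own HTML and pull the identical schedule in from one shared URL, so the duplicated part isn't repeated in every indexed page:

```typescript
// Illustrative sketch: location details live in the page, the shared schedule
// is embedded from a single URL instead of being duplicated per location.

interface LocationPage {
  city: string;
  address: string;
  phone: string;
  directions: string;
  scheduleUrl: string; // the one shared schedule page
}

function renderLocationPage(loc: LocationPage): string {
  return `
    <h1>Training in ${loc.city}</h1>
    <p>${loc.address} &middot; ${loc.phone}</p>
    <p>${loc.directions}</p>
    <!-- Shared schedule embedded rather than copied into every page -->
    <iframe src="${loc.scheduleUrl}" title="Course schedule"></iframe>
  `;
}

console.log(
  renderLocationPage({
    city: "New York",
    address: "123 Example Ave, New York, NY",
    phone: "(555) 555-0100",
    directions: "Two blocks from Penn Station.",
    scheduleUrl: "https://www.example.com/schedule",
  })
);
```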
-
Trenton, thanks for the quick reply. We actually did try that suggestion, with little success. I'd assume it didn't work out because of all the locations - there was no "honed" location-specific content, which is what spurred the test I launched initially and that is now showing results. Also, the pages are dynamic, meaning the content is always going to change as course dates pass, so that does make each page somewhat unique in and of itself.
I think I will launch the standardized, generated content, but I will watch it closely. I certainly won't just flood the web with all of our pages at once - maybe add 20-50 a month to the site and see how that goes. This is a long-term strategy, so the patience will be worth it.
Thanks for your help. Everyone's input has really helped here. It's definitely a sticky topic. I'll try to update you all on how it goes after a few months.
-
I totally understand what you're trying to do. What I'm trying to say is that there may be another way to get this location-specific information to your users. Perhaps if you had one "sharepoint training" page, you could include all the locations there, with a schedule that changes when you hover or click on a location but keeps you on the same page. This would likely be much safer with Google and would reduce the amount of work significantly. However, you may lose potential SEO value without the individual pages for each location. Again, it's a balance: if you are able to create the pages without them being seen as duplicate content, then you're safe. If you can't make them unique, think about another method.
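Roughly what I have in mind (the IDs, data attributes, and course data below are placeholders, not a working implementation): one page, with the visible schedule swapped client-side when a location is clicked:

```typescript
// Sketch of a single "SharePoint training" page whose schedule panel updates
// per location, so there is only one URL to index (all data is invented).

const schedulesByCity: Record<string, string[]> = {
  "New York": ["Mar 3 - SharePoint Essentials", "Mar 17 - SharePoint Admin"],
  "Washington DC": ["Mar 3 - SharePoint Essentials", "Mar 17 - SharePoint Admin"],
};

function showSchedule(city: string): void {
  const panel = document.getElementById("schedule-panel");
  const courses = schedulesByCity[city];
  if (!panel || !courses) return;
  panel.innerHTML = courses.map((course) => `<li>${course}</li>`).join("");
}

// Wire one click handler per location link, e.g. <a data-city="New York">.
document.querySelectorAll<HTMLElement>("[data-city]").forEach((el) => {
  el.addEventListener("click", () => showSchedule(el.dataset.city ?? ""));
});
```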
-
Definitely not trying to game Google. We offer classes all around the country at different schedules and times. However, if a student lives in, say, Atlanta but would like to take a class that is offered in New York, they can take the New York class by going to a local center that broadcasts it live, online, as if they were in the actual classroom, with the ability to interact with the instructor and students via our patented technology. Thus, the schedules for our courses are all basically the same, because students can take them from almost anywhere. This is where the content question comes into play.
Say someone is searching for "sharepoint training in new york," they'd be taken to a custom page for all of our SharePoint course trainings in the New York area for the next couple of months. The page will have location specific content. Now, if someone searches for "sharepoint training washington dc," they'd be taken to a custom page for all of our SharePoint courses in DC. However, the schedule would be the same as the one seen for NY, simply because of the student's option to take a course locally even if the course is offered somewhere else - that's the only "duplicate content" I'm worried about, even though each page does have location specific content for each topic area. Hopefully that all makes sense.
My objective is simply to let a user know we have the courses they're looking for in the areas where we're located. It's not like we're creating pages for Houston, for example - even though a student could technically take a course from Houston, we simply are not located there.
-
Head over to oDesk and hire someone part time to write unique and relevant stuff for those pages. The standardization will help get you started, but doing work like this will take you to the next level. We just did this for about 1800 product pages and have seen significant organic traffic gains and have reduced or eliminated the thin content on those pages.
-
I would agree with the other two commenters here: you don't need to worry about duplicate meta descriptions, but each page needs to be unique to a certain extent. I'll try to add something different to this discussion. If we're talking about Google and Matt Cutts, and we're interested in white-hat-only techniques, then I don't think he would suggest you create so many different pages if they aren't going to be very different. If you have many pages that aren't very different, then what value is that giving the user? Or are you actually attempting to game Google (black hat) by creating all these pages strictly for SEO purposes? If so, perhaps you should reevaluate your strategy.
However, if each and every location and topic page is different and contains unique content, such as completely different schedules and topic content, then I don't think you have much to worry about. Just make sure the actual content of each page is unique. Once you start creating dozens of near-duplicate pages, it may make more sense to figure out a simpler way to build out your site. Weigh the risk of duplicate content against the benefit of having so many pages. Focus on different content for each location and topic and you should be fine. In fact, Moz will flag duplicate content for you in your Crawl Diagnostics.
-
Takeshi,
I think you are OK. While it is always better to write completely unique content, I would say you are fine in this scenario.
I would implement this and watch your rankings, along with other indicators, to make sure.
Ron
-
There is no problem with standardizing meta tags, titles, h1s, etc. This is standard practice for large sites.
What can be problematic is if ALL the content on your pages is just templatized/madlibs. Having some randomized content is obviously better than nothing, but it's not going to do well if that's all the content you have on those pages. Having some standardized paragraphs with words filled in is fine, but make sure you have unique content on all of those pages as well.
If you have user reviews, that can be one good way to get some free UGC onto your pages. 700 pages also isn't terribly many; you can get decent unique content written for that many pages for under $10k. If that's out of your budget, start by focusing on the highest-value pages, and calculate how many pages it makes sense to write unique content for based on ROI.
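As a back-of-the-envelope example of that ROI calculation (every number and page name below is invented), you could rank pages by their expected monthly value against the cost of the copy and work down the list:

```typescript
// Rough sketch of ROI-based prioritization for unique content (all figures invented).

interface PageEstimate {
  page: string;
  monthlySearches: number;      // estimated local search volume
  conversionRate: number;       // share of visits that become enrollments
  revenuePerEnrollment: number; // value of one enrollment
  contentCost: number;          // cost of unique copy for this page
}

// Expected monthly value of ranking this page, minus the one-time copy cost.
function firstMonthPayoff(p: PageEstimate): number {
  const expectedRevenue =
    p.monthlySearches * p.conversionRate * p.revenuePerEnrollment;
  return expectedRevenue - p.contentCost;
}

const candidates: PageEstimate[] = [
  { page: "sharepoint-new-york", monthlySearches: 400, conversionRate: 0.02, revenuePerEnrollment: 500, contentCost: 15 },
  { page: "sharepoint-boise",    monthlySearches: 30,  conversionRate: 0.02, revenuePerEnrollment: 500, contentCost: 15 },
];

// Write unique content for the highest-payoff pages first.
candidates
  .sort((a, b) => firstMonthPayoff(b) - firstMonthPayoff(a))
  .forEach((p) => console.log(p.page, firstMonthPayoff(p).toFixed(0)));
```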