Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
-
Hi All,
I'll preface this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).
So, we are an IT and management training course provider with 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, to really hone our local SEO by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now, these dynamic pages are being crawled and ranking well in Google. We ran a very small-scale test in our Washington, DC and New York markets with our SharePoint course offerings, and it was a great success: we are ranking well for "sharepoint training in new york/dc" and similar queries with just two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph:
“Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city, so the content won't be identical everywhere, just similar.
It works better this way because HTFU is actually a single page; we just pass the venue code to it and build the page dynamically based on that code. So they aren't technically individual pages, although they appear that way on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes, for all cities, for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized.
Another option is to have several standardized paragraphs, such as:
“Our [Topic Area] training is easy to find in the [City, State] area.” Followed by other content specific to the location.
“Find your [Topic Area] training course in [City, State] with ease.” Followed by other content specific to the location.
Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
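To make the engineer's proposal concrete, here is a minimal sketch of the idea: one dynamic handler that builds a page from a venue code and topic by filling standardized templates. This is purely illustrative; the venue data, template strings, and function names are hypothetical, not our actual HTFU code. Note that it seeds the "randomization" on the venue/topic pair, so a given page always renders the same variant instead of showing crawlers different text on every visit.

```python
# A minimal sketch of the engineer's approach, not the actual HTFU code:
# one dynamic handler builds a page from a venue code and topic by filling
# standardized templates. All data and names here are hypothetical.
import hashlib

PARAGRAPH_TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

# Hypothetical venue records; in practice these come from the database,
# keyed by the venue code passed to the page.
VENUES = {
    "NYC01": {"city": "New York", "state": "NY"},
    "DC01": {"city": "Washington", "state": "DC"},
}

def build_page_copy(venue_code: str, topic: str) -> dict:
    venue = VENUES[venue_code]
    # Deterministic "randomization": hash the venue/topic pair and use the
    # result to pick a template, so the same page always shows the same variant.
    digest = hashlib.md5(f"{venue_code}:{topic}".encode()).hexdigest()
    template = PARAGRAPH_TEMPLATES[int(digest, 16) % len(PARAGRAPH_TEMPLATES)]
    paragraph = template.format(topic=topic, city=venue["city"], state=venue["state"])
    return {
        "title": f"{topic} Training in {venue['city']}, {venue['state']}",
        "meta_description": paragraph,
        "h1": f"{topic} Training in {venue['city']}, {venue['state']}",
        "paragraph": paragraph,
    }

print(build_page_copy("DC01", "SharePoint"))
```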
So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check.
Sorry for the long message. Hopefully someone can help. Thank you!
Pedram
-
Having duplicate content isn't the issue so much as having enough unique content for each page to be seen as valuable on its own.
A single template paragraph probably isn't enough, but if you can include other information such as the address, driving directions, phone number, photos of the facility, class sizes, school hours, etc., that should be enough unique content for each location.
You can even make the schedule an image or iframe if the duplicate content issue is a concern. Or if the schedule is identical for every location anyway, create a single schedule page, and link to it from each of the locations.
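As a rough sketch of that last idea (the URL and helper name below are hypothetical, and the iframe option assumes, per the suggestion above, that framed content is not treated as the host page's own text):

```python
# A hedged sketch of the "single shared schedule" idea: instead of inlining
# the identical schedule on every location page, each page either links to
# one canonical schedule URL or embeds it in an iframe.
SCHEDULE_URL = "https://example.com/schedule/sharepoint"  # hypothetical

def schedule_block(city: str, use_iframe: bool = False) -> str:
    if use_iframe:
        return f'<iframe src="{SCHEDULE_URL}" title="Course schedule"></iframe>'
    return (f'<p>See the full <a href="{SCHEDULE_URL}">SharePoint course '
            f'schedule</a> for classes near {city}.</p>')

print(schedule_block("New York"))
print(schedule_block("Washington, DC", use_iframe=True))
```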
-
Trenton, thanks for the quick reply. We actually did try that suggestion, with little success. I'd assume it didn't work out because of all the locations: there was no "honed" location-specific content, which is what spurred the test I launched initially and that is now showing results. Also, the pages are dynamic, meaning the content is always going to change as course dates pass, so that does make each page somewhat unique in and of itself.
I think I will launch the standardized, generated content, but I will watch it closely. I'll certainly not just flood the web with all of our pages at once - maybe add them to our site 20-50 a month and see how that goes. This is a long-term strategy, so the patience will be worth it.
Thanks for your help. Everyone's input has really helped here. It's definitely a sticky topic. I'll try to update you all on how it goes after a few months.
-
I totally understand what you're trying to do. What I'm trying to say is that there may be another way to get this location-specific information to your users. Perhaps if you had one "sharepoint training" page, you could include all the locations there, with a schedule that changes when you hover over or click on a location but keeps you on the same page. This would likely be much safer with Google and would reduce the amount of work significantly. However, you may lose potential SEO value without the individual pages for each location. Again, it's a balance: if you can create the pages without them being seen as duplicate content, you're safe. If you can't make them unique, think about another method.
-
Definitely not trying to game Google. We offer classes all around the country at different times and on different schedules. However, if a student would like to take a class that is offered in New York but lives in, say, Atlanta, they have the option of taking the New York class by going to a local center that broadcasts the class live online, as if they were in the actual classroom, with the ability to interact with the instructor and students via our patented technology. Thus, the schedules for our courses are all basically the same, because students can take them from almost anywhere. This is where the content comes into play.
Say someone is searching for "sharepoint training in new york": they'd be taken to a custom page listing all of our SharePoint training courses in the New York area for the next couple of months. The page will have location-specific content. Now, if someone searches for "sharepoint training washington dc," they'd be taken to a custom page for all of our SharePoint courses in DC. However, the schedule would be the same as the one seen for NY, simply because students have the option to take a course locally even if it is offered somewhere else - that's the only "duplicate content" I'm worried about, even though each page does have location-specific content for each topic area. Hopefully that all makes sense.
My objective is to really let users know we have the courses they're looking for in the areas where we are located. It's not like we're creating pages for Houston, for example; even though a student could technically take a course from Houston, we simply are not located there.
-
Head over to oDesk and hire someone part time to write unique and relevant stuff for those pages. The standardization will help get you started, but doing work like this will take you to the next level. We just did this for about 1800 product pages and have seen significant organic traffic gains and have reduced or eliminated the thin content on those pages.
-
I would agree with the other two commenters here: you don't need to worry about duplicate meta descriptions, but each page needs to be unique to a certain extent. I'll try to add something different to this discussion. If we're talking about Google and Matt Cutts, and we're interested in white-hat-only techniques, then I don't think he would suggest you create so many different pages if they aren't going to be very different. If you have many pages that aren't very different, then what value is that giving to the user? Or are you actually attempting to game Google (black hat) by creating all these pages strictly for SEO purposes? If so, perhaps you should reevaluate your strategy.
However, if each and every location and topic is different and contains unique content, such as completely different schedules and topic content, then I don't think you have much to worry about. Just make sure that the actual content of each page is unique. Once you start creating dozens of duplicate pages, it may make more sense to figure out a simpler way to build out your site. Weigh the risk of duplicate content against the benefit of having so many pages. Focus on different content for each location and topic and you should be fine. In fact, Moz will flag duplicate content for you in its Crawl Diagnostics.
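If you want a rough way to sanity-check similarity yourself before a crawl flags it, here's a hedged sketch that compares two pages' word shingles with Jaccard similarity. It's purely illustrative, and the 0.8 threshold is arbitrary, not a number Google publishes:

```python
# A rough sketch of a do-it-yourself near-duplicate check: compare two
# pages' overlapping 3-word "shingles" with Jaccard similarity.
def shingles(text: str, size: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard_similarity(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical body text from two generated pages.
page_ny = "Our SharePoint training is easy to find in the New York, NY area."
page_dc = "Our SharePoint training is easy to find in the Washington, DC area."

score = jaccard_similarity(page_ny, page_dc)
print(f"similarity: {score:.2f}")
if score > 0.8:  # arbitrary illustrative threshold
    print("These pages are probably too similar to stand on their own.")
```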
-
Takeshi,
I think you are OK. While it is always better to write completely unique content, I would say that in this scenario you are fine.
I would implement it and watch your rankings, as well as other indicators, to make sure.
Ron
-
There is no problem with standardizing meta tags, titles, h1s, etc. This is standard practice for large sites.
What can be problematic is if ALL the content on your pages is just templatized/madlibs. Having some randomized content is obviously better than nothing, but it's not going to do well if that's all the content you have on those pages. Having some standardized paragraphs with words filled in is fine, but make sure you have unique content on all of those pages as well.
If you have user reviews, those can be a good way to get some free UGC onto your pages. 700 pages also isn't terribly many; you can get decent unique content written for that number of pages for under $10k. If that's out of your budget, start by focusing on the highest-value pages, and calculate how many pages it makes sense to write unique content for based on ROI.
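As a back-of-the-envelope illustration of that ROI math (every number below is a made-up placeholder; substitute your own traffic estimates, conversion rate, course value, and writing cost):

```python
# A rough sketch of the ROI calculation suggested above. All figures
# are hypothetical placeholders.
COST_PER_PAGE = 15.0          # hypothetical cost of unique copy per page
VALUE_PER_CONVERSION = 500.0  # hypothetical revenue per enrollment
CONVERSION_RATE = 0.01        # hypothetical visit-to-enrollment rate
PAYBACK_MONTHS = 6            # arbitrary payback window

# Hypothetical pages with estimated monthly organic visits.
pages = [
    ("sharepoint-training-new-york", 400),
    ("sharepoint-training-washington-dc", 350),
    ("itil-training-boise", 15),
    ("crystal-reports-training-fargo", 0),
]

# Write unique content only where the estimated value over the payback
# window exceeds the cost of the copy.
for slug, visits in sorted(pages, key=lambda p: p[1], reverse=True):
    monthly_value = visits * CONVERSION_RATE * VALUE_PER_CONVERSION
    verdict = "write" if monthly_value * PAYBACK_MONTHS > COST_PER_PAGE else "skip"
    print(f"{slug}: ~${monthly_value:.0f}/mo -> {verdict}")
```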