Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
-
Hi All,
I'll preface this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).
So, we are an IT and management training course provider. We have 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, in order to really hone our local SEO by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, those pages are dynamic, and they're being crawled and ranking well in Google. We ran a very small-scale test in our Washington, DC and New York areas with our SharePoint course offerings, and it was a great success. We are ranking well for "sharepoint training in new york/dc," etc., with just two custom pages.
So, with 34 locations across the states and 21 course topic areas, that's over 700 pages of content to maintain (34 × 21 = 714) - A LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:
"Regarding the pages for specific topic areas, do you have a specific format in mind for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph:
“Our [Topic Area] training is easy to find in the [City, State] area.” As a note, other content such as directions and course dates will always vary from city to city, so the content won't be identical everywhere, just similar.
It works better this way because HTFU is actually a single page; we just pass the venue code to it and build the page dynamically based on that code. So they aren't technically individual pages, although they appear to be on the web. If we don't standardize the text, then someone will have to maintain custom text for every active venue code, for every city, for every topic. So you could be talking about over a thousand records to maintain, depending on what you want customized.
Another option is to have several standardized paragraphs, such as:
“Our [Topic Area] training is easy to find in the [City, State] area.” Followed by other content specific to the location.
“Find your [Topic Area] training course in [City, State] with ease.” Followed by other content specific to the location.
Then we could randomize which one is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
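To make the engineer's idea concrete, here's a rough sketch of how the single dynamic page could pick and fill one of those standardized paragraphs from a venue code (the venue records, template strings, and function names below are illustrative, not our actual implementation):

```python
import hashlib

# Illustrative venue records -- in reality these would come from the venue database.
VENUES = {
    "NYC01": {"city": "New York", "state": "NY"},
    "DC01":  {"city": "Washington", "state": "DC"},
}

# The standardized paragraph variants described above.
PARAGRAPH_TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(venue_code: str, topic: str) -> str:
    """Pick a paragraph variant for this venue/topic and fill in the blanks."""
    venue = VENUES[venue_code]
    # Hash the venue code + topic so a given page always shows the same variant;
    # true per-request randomization would make the copy change on every crawl.
    digest = hashlib.md5(f"{venue_code}:{topic}".encode()).hexdigest()
    template = PARAGRAPH_TEMPLATES[int(digest, 16) % len(PARAGRAPH_TEMPLATES)]
    return template.format(topic=topic, city=venue["city"], state=venue["state"])

print(intro_paragraph("NYC01", "SharePoint"))
# -> one of the two variants, filled in for New York, NY
```

The deterministic pick means a given city/topic page reads the same on every visit, while the copy still varies across the 700+ pages.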
So, mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check.
Sorry for the long message. Hopefully someone can help. Thank you!
Pedram
-
Having duplicate content isn't the issue so much as having enough unique content for each page to be seen as valuable on its own.
A single template paragraph probably isn't enough, but if you can include other information such as the address, driving directions, phone number, photos of the facility, class sizes, school hours, etc., that should be enough unique content for each location.
You can even make the schedule an image or iframe if the duplicate content issue is a concern. Or, if the schedule is identical for every location anyway, create a single schedule page and link to it from each location page.
-
Trenton, thanks for the quick reply. We actually did try that suggestion, with little success. I'd assume it didn't work out because of all the locations - there was no "honed," location-specific content, which is what spurred the test I launched initially and is now showing results. Also, the pages are dynamic, meaning the content is always going to change as course dates pass. So that does make it somewhat unique in and of itself.
I think I will launch the standardized, generated content, but I will watch it closely. I certainly won't just flood the web with all of our pages at once - maybe we'll add them to our site 20-50 a month and see how that goes. This is a long-term strategy, so the patience will be worth it.
Thanks for your help. Everyone's input has really helped here. It's definitely a sticky topic. I'll try to update you all on how it goes after a few months.
-
I totally understand what you're trying to do. What I'm trying to say is that there may be another way to get this location-specific information to your users. Perhaps if you had one "sharepoint training" page, you could include all the locations there, with a schedule that changes when you hover over or click a location but keeps you on the same page. This would likely be much safer with Google and would reduce the amount of work significantly. However, you may lose potential SEO value without the individual pages for each location. Again, it's a balance: if you are able to create the pages without them being seen as duplicate content, then you're safe. If you can't make them unique, think about another method.
-
Definitely not trying to game Google. We offer classes all around the country, on different schedules and at different times. However, if a student would like to take a class that is offered in New York but lives in, say, Atlanta, they have the option of taking the New York class by going to a local center that broadcasts the class live, online, as if they were in the actual classroom, with the ability to interact with the instructor and students via our patented technology. Thus, the schedules for our courses are all basically the same, because students can take them from almost anywhere. This is where the content comes into play.
Say someone is searching for "sharepoint training in new york": they'd be taken to a custom page listing all of our SharePoint course offerings in the New York area for the next couple of months. The page will have location-specific content. Now, if someone searches for "sharepoint training washington dc," they'd be taken to a custom page for all of our SharePoint courses in DC. However, the schedule would be the same as the one seen for NY, simply because of the student's option to take a course locally even if it is offered somewhere else - that's the only "duplicate content" I'm worried about, even though each page does have location-specific content for each topic area. Hopefully that all makes sense.
My objective is really to let users know we have the courses they're looking for in the areas where we are located. We're not creating pages for Houston, for example; even though a student could technically take a course from Houston, we simply are not located there.
-
Head over to oDesk and hire someone part-time to write unique, relevant content for those pages. The standardization will help get you started, but doing work like this will take you to the next level. We just did this for about 1,800 product pages and have seen significant organic traffic gains while reducing or eliminating the thin content on those pages.
-
I would agree with the other two commenters here: you don't need to worry about duplicate meta descriptions, but each page needs to be unique to a certain extent. I'll try to add something different to this discussion. If we're talking about Google and Matt Cutts, and we're interested in white-hat-only techniques, then I don't think he would suggest you create so many different pages if they aren't going to be very different. If you have many pages that aren't very different, then what value does that give the user? Or are you actually attempting to game Google (black hat) by creating all these pages strictly for SEO purposes? If so, perhaps you should reevaluate your strategy.
However, if each location and topic page is different and contains unique content, such as completely different schedules and topic content, then I don't think you have much to worry about. Just make sure the actual content of each page is unique. Once you start creating dozens of near-duplicate pages, it may make more sense to figure out a simpler way to build out your site. Weigh the risk of duplicate content against the benefit of having so many pages. Focus on different content for each location and topic and you should be fine. In fact, Moz will flag duplicate content in your Crawl Diagnostics.
-
Takeshi,
I think you are OK. While it is always better to write completely unique content, I would say you're fine in this scenario.
I would implement this and watch your rankings, as well as other indicators, to make sure.
Ron
-
There is no problem with standardizing meta tags, titles, h1s, etc. This is standard practice for large sites.
What can be problematic is if ALL the content on your pages is just templatized/madlibs. Having some randomized content is obviously better than nothing, but it's not going to do well if that's all the content you have on those pages. Having some standardized paragraphs with words filled in is fine, but make sure you have unique content on all of those pages as well.
If you have user reviews, that can be a good way to get some free UGC onto your pages. 700 pages also isn't terribly many; you can get decent unique content written for that volume for under $10k. If that's out of your budget, start by focusing on the highest-value pages, and calculate how many pages it makes sense to write unique content for based on ROI.