Does Google index dynamically generated content/headers, etc.?
-
To avoid duplicate content, we are moving away from a model where we have 30,000 pages, each with a separate URL that looks like /prices/<product-name>/<city><state>. There is a lot of duplicate content because the product overlaps from city to city, and it's hard to keep 30,000 pages unique when sometimes the only distinction is the price and the city/state.
We are moving to a model with around 300 unique pages, where some of the information that used to be in the URL will move to the page itself (headers, etc.) to cut down on duplicate content across those 300 pages.
My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content?
The question behind this one is, how do we continue to rank for searches for that product in the city-state being searched without having that info in the URL?
Any best practices we should know about?
-
Hi there,
Not sure I have enough information to weigh in on the first part of your question - Google will index whatever it sees on the page. If you deliver the content to Google, they will index it. The problem comes when you deliver different content to different users. Try a tool like SEO Browser to see how Googlebot views your site.
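If you want a quick programmatic first pass as well, something like the sketch below can compare what a crawler and a regular visitor receive (Python with the requests library; the URL and user-agent strings are placeholders, not your actual setup):

    # Compare the HTML served to a browser-like visitor and to Googlebot.
    # Requires the `requests` library; the URL below is a placeholder.
    import requests

    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")
    BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

    url = "http://www.example.com/prices/widget"

    as_bot = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}).text
    as_user = requests.get(url, headers={"User-Agent": BROWSER_UA}).text

    # Large differences suggest you are serving different content to
    # crawlers and users, which is where indexing problems start.
    print("Same content for bot and user:", as_bot == as_user)

This doesn't execute JavaScript the way Google can, so treat it as a rough check rather than a definitive answer.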
To answer your second question, it's often hard to rank near-duplicate pages for specific cities/states without running into massive duplicate content problems. Matt Cutts actually addressed this a while back. He basically said that if you have multiple pages all targeting different locations, it's best to include a few lines of unique content on each page (I recommend near the top) to make each one unique.
“In addition to address and contact information, 2 or 3 sentences about what is unique to that location and they should be fine.” (source)
But this technique would be very hard to apply with only 300 product pages. The alternative, stuffing these pages with city/state information for every possible combination, is not advised.
http://www.seomoz.org/q/on-page-optimization-to-rank-for-multiply-cities
So in the end, it's actually not hard to rank for city-state keywords without having them in the URL, as long as the information appears in the content or in other places like the title tag or internal link structure. But doing this for thousands of locations with only 300 pages, without keyword stuffing, is near impossible.
The best thing to do is figure out how to create unique content for every page you want to rank for, and take that route.
For example, I might create a "Seattle" page, write unique content for the top of the page, then list 50 or so products with their unique Seattle prices. (This is a rough strategy - you'd have to refine it greatly to work for your situation.)
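To make the idea concrete, here's a minimal sketch of how such a city page could be assembled (Python; all the city slugs, copy, products, and prices below are invented for illustration):

    # Minimal sketch of the "city page" idea: hand-written unique intro
    # copy plus the product list with local prices. All data is invented.

    city_intro = {
        "seattle-wa": "Our Seattle warehouse ships same-day across King County.",
        "portland-or": "Portland customers get free delivery inside the metro area.",
    }

    local_prices = {
        "seattle-wa": [("Widget A", 19.99), ("Widget B", 24.50)],
        "portland-or": [("Widget A", 18.99), ("Widget B", 25.00)],
    }

    def render_city_page(slug):
        """Build the HTML body for one city page: unique intro, then local prices."""
        rows = "".join(
            f"<li>{name} - ${price:.2f}</li>" for name, price in local_prices[slug]
        )
        return f"<p>{city_intro[slug]}</p><ul>{rows}</ul>"

    print(render_city_page("seattle-wa"))

The point is that the unique, human-written intro is what differentiates each page; the price table on its own wouldn't be enough.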
Hope this helps! Best of luck with your SEO.
-
I see. To get the city-state pages indexed, they must have their own URLs. If the content can only be accessed by posting a form (as I assume is the case with your search feature), then a search engine can't see it.
To get around this, you could put links underneath the search box pointing to popular searches. This will get those pages indexed.
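As a rough illustration (the paths and anchor text here are made up), that "popular searches" block could be generated from a simple list and rendered as ordinary anchor tags, which crawlers can follow:

    # Generate plain crawlable links to popular city-state searches.
    # Paths and labels are illustrative only.
    popular_searches = [
        ("/prices/widget/seattle-wa", "Widget prices in Seattle, WA"),
        ("/prices/widget/portland-or", "Widget prices in Portland, OR"),
        ("/prices/widget/boise-id", "Widget prices in Boise, ID"),
    ]

    links_html = "\n".join(
        f'<a href="{path}">{label}</a>' for path, label in popular_searches
    )
    print(links_html)  # render this block underneath the search box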
Does that answer your questions?
Thanks
Iain - Reload
-
Thanks for the reply. The city-state content wouldn't be driven by the URL; it would be driven by the city-state that the user searched for. I.e., if a person searched for <product> <city> <state>, I would want our /product/ page to show up and show them content for their local city and state.
-
Hi Editable Text,
In short, if you show Google a crawlable link to the page with the dynamic header/content, and that content is driven by the unique URL, then yes, it will index it.
As with any SEO/life question, there are a few caveats with this:
- The pages are unique enough not to be classed as duplicate content
- The content is intelligently linked internally
- External links point deep into the site
- The site has a decent architecture
To answer your second question, you'll need unique pages for each location, unless the content would be so thin that you'd need to group locations together. The URL doesn't have to include the keyword, but it's damn helpful if it does.
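For what "content driven by the unique URL" can look like in practice, here is a minimal sketch using Flask (the route pattern and data are assumptions for illustration, not your actual stack):

    # Minimal Flask sketch: the city-state lives in the URL path, so each
    # combination is a distinct, crawlable, indexable page.
    # Routes and data are assumptions, not the poster's actual setup.
    from flask import Flask, abort

    app = Flask(__name__)

    LOCAL_DATA = {
        ("widget", "seattle-wa"): {"city": "Seattle, WA", "price": 19.99},
        ("widget", "portland-or"): {"city": "Portland, OR", "price": 18.99},
    }

    @app.route("/prices/<product>/<location>")
    def price_page(product, location):
        data = LOCAL_DATA.get((product, location))
        if data is None:
            abort(404)
        # The dynamic header comes from the URL, not from a form post or a
        # visitor's search session, so Googlebot can reach every variant
        # it finds a link to.
        return (f"<h1>{product.title()} prices in {data['city']}</h1>"
                f"<p>${data['price']:.2f}</p>")

If the same 300 URLs instead swap their headers based on a search the visitor performed, the crawler never sees those variants, which is why each location either needs its own URL or its own crawlable page.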
Hope that helps
Iain - Reload Media