Does Google index dynamically generated content/headers, etc.?
-
To avoid duplicate content, we are moving away from a model with 30,000 pages, each with its own URL that looks like /prices/<product-name>/<city><state>. Those pages often duplicate each other because products overlap from city to city, and it's hard to keep 30,000 pages unique when sometimes the only distinction is the price and the city/state.
We are moving to a model with around 300 unique pages, where some of the info that used to be in the URL will move into the page itself (headers, etc.) to cut down on duplicate content across those 300 pages.
My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content?
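To make the setup concrete, here's a rough sketch of the kind of thing I mean (simplified, not our actual code; the route, parameters, and price lookup are all made up):

```python
# Sketch of one of the ~300 product pages: the page has a single canonical
# URL, and the city/state shown in the headers comes from the visitor
# (query string here, but it could be a geolocation lookup), not the URL path.
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """
<title>{{ product }} prices{% if city %} in {{ city }}, {{ state }}{% endif %}</title>
<h1>{{ product }} prices{% if city %} in {{ city }}, {{ state }}{% endif %}</h1>
<p>Current price: {{ price }}</p>
"""

def look_up_price(product, city, state):
    # Stub for illustration; the real version would query our pricing data.
    return "$99"

@app.route("/prices/<product>")
def product_page(product):
    # A crawler requesting the bare URL just sees the generic version of
    # the page, with no city or state in the headers.
    city = request.args.get("city")
    state = request.args.get("state")
    return render_template_string(
        PAGE, product=product, city=city, state=state,
        price=look_up_price(product, city, state),
    )
```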
The question behind this one is: how do we continue to rank for searches for that product in the city/state being searched, without having that info in the URL?
Any best practices we should know about?
-
Hi there,
Not sure I have enough information to weigh in fully on the first part of your question, but in general Google will index whatever it sees on the page. If you deliver the content to Googlebot, it gets indexed. The problem comes when you deliver different content to different users. Try a tool like SEO Browser to see how Googlebot views your site.
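If you want a quick do-it-yourself check as well, you can fetch a page with a Googlebot user-agent string and compare it to what a normal browser gets. A rough sketch (the URL is just a placeholder; run it against your own pages):

```python
# Quick sanity check: does the server return the same HTML to "Googlebot"
# as it does to a regular browser?
import requests

URL = "http://www.example.com/prices/widgets"  # placeholder

HEADERS_GOOGLEBOT = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"
}
HEADERS_BROWSER = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"
}

as_googlebot = requests.get(URL, headers=HEADERS_GOOGLEBOT).text
as_browser = requests.get(URL, headers=HEADERS_BROWSER).text

print("Same HTML served to Googlebot and to browsers:",
      as_googlebot == as_browser)
```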
To answer your second question, it's often hard to rank near-duplicate pages for specific cities/states without running into massive duplicate content problems. Matt Cutts actually addressed this a while back. He basically said that if you have multiple pages targeting different locations, it's best to include a few lines of unique content on each page (I recommend putting it at the top) to make each one unique.
“In addition to address and contact information, 2 or 3 sentences about what is unique to that location and they should be fine.” (Source)
But this technique would be very hard to apply with only 300 product pages. The alternative, stuffing those pages with city/state information for every possible combination, is not advised.
http://www.seomoz.org/q/on-page-optimization-to-rank-for-multiply-cities
So in the end, it's actually not hard to rank for city-state keywords without having them in the URL, as long as the information appears in the content or in other places like the title tag and internal link structure. But doing that for thousands of locations with only 300 pages, without keyword stuffing, is near impossible.
The best thing to do is figure out how to create unique content for every location you want to rank for, and take that route.
For example, I might create a "Seattle" page, write unique content for the top of the page, then list 50 or so products with their Seattle-specific prices. (This is a rough strategy - you'd have to refine it greatly for your situation.)
Hope this helps! Best of luck with your SEO.
-
I see. To get the city-state pages indexed, they must have their own URLs. If a page can only be reached by posting a form (which I assume is how your search feature works), then a search engine can't see it.
To get around this, you could put links to popular searches underneath the search box. This will get them indexed.
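As a rough illustration (the routes and cities are just examples, not your actual site), the idea is to render plain links under the form so spiders have something to follow even though users normally submit a search:

```python
# Sketch: crawlable links to popular city/state searches rendered underneath
# the search form. Routes, product, and cities are illustrative only.
from flask import Flask, render_template_string

app = Flask(__name__)

POPULAR = [("Seattle", "WA"), ("Portland", "OR"), ("Austin", "TX")]

HOME = """
<form action="/search" method="post">
  <input type="text" name="q" placeholder="Find prices near you">
  <button type="submit">Search</button>
</form>

<h2>Popular searches</h2>
<ul>
{% for city, state in popular %}
  <li><a href="/prices/widgets/{{ city|lower }}-{{ state|lower }}">
    Widget prices in {{ city }}, {{ state }}</a></li>
{% endfor %}
</ul>
"""

@app.route("/")
def home():
    return render_template_string(HOME, popular=POPULAR)
```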
Does that answer the question?
Thanks
Iain - Reload
-
Thanks for the reply. The city-state content wouldn't be driven by the URL; it would be driven by the city-state that the user searched for. I.e., if the person searched for <product> <city> <state>, I would want our /product/ page to show up and show them content for their local city and state.
-
Hi there,
In short: if you show Google a crawlable link to the page with the dynamic header/content, and that content is driven by a unique URL, then yes, it will index it.
As with any SEO (or life) question, there are a few terms and conditions attached:
- The pages need to be unique enough not to be classed as duplicate content
- Make sure it's intelligently linked internally
- You have external links pointing deep into the site
- You have a decent site architecture
To answer your second question, you'll need a unique page for each location, unless the content would be so thin that you'd need to group locations together. The URL doesn't have to include the keyword, but it's damn helpful if it does. A rough sketch of that approach is below.
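For example, a minimal sketch of the "unique URL per location" approach might look like this (routes, copy, and prices are all made up for illustration):

```python
# Sketch: each city/state gets its own crawlable URL with its own title, H1,
# and a few sentences of genuinely unique copy, rather than one page that
# swaps content based on what the visitor searched for.
from flask import Flask, abort, render_template_string

app = Flask(__name__)

LOCATIONS = {
    "seattle-wa": {
        "name": "Seattle, WA",
        "intro": "Two or three sentences written specifically about Seattle...",
        "products": [("Widget A", "$19"), ("Widget B", "$24")],
    },
    # ...one entry per location you want a page for
}

PAGE = """
<title>Widget prices in {{ loc.name }}</title>
<h1>Widget prices in {{ loc.name }}</h1>
<p>{{ loc.intro }}</p>
<ul>
{% for product, price in loc.products %}
  <li>{{ product }}: {{ price }}</li>
{% endfor %}
</ul>
"""

@app.route("/prices/widgets/<location>")
def location_page(location):
    loc = LOCATIONS.get(location)
    if loc is None:
        abort(404)
    return render_template_string(PAGE, loc=loc)
```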
Hope that helps
Iain - Reload Media