How to implement rich snippets on dynamic pages?
-
I'm working on an e-commerce website and want to recommend rich snippets to the client. However, only the main menu pages have actual static landing pages; the rest of the content is pulled dynamically from a database.
How can I implement rich snippets in this situation, or are they only applicable to static pages?
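For context: rich snippets don't distinguish between static and dynamic pages. Search engines only see the final HTML your server renders, so markup emitted from a database-driven template works exactly like markup in a hand-written page. As a sketch, a product template could output schema.org microdata like the following (the product name, price, and rating values are placeholders to be filled from your database fields):

```html
<!-- schema.org Product markup (microdata); values are placeholders
     that the template would populate from the database record -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="19.99">$19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars,
    <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```

Because the markup lives in the template, every dynamically generated product page gets it automatically.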
Also, when I optimized the title tags and meta description tags, the dynamic pages inherited these tags from their main categories, so SEOmoz reports a large number of duplicate title tags and meta descriptions. Any tips for this?
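The usual fix for the duplicate-tag problem is the same template approach: generate a unique title and meta description for each item from its database fields instead of inheriting the category's tags (in ASP.NET Web Forms, that typically means setting Page.Title and adding an HtmlMeta control in the code-behind). A minimal language-agnostic sketch in Python, with hypothetical field names:

```python
# Sketch: build a unique <title> and meta description per product
# from database fields, instead of inheriting the category's tags.
# The field names ("name", "category", "summary") are hypothetical.

def build_meta(product):
    title = f"{product['name']} | {product['category']} | Example Store"
    # Keep the description snippet-length; truncate long summaries.
    description = (
        f"Buy {product['name']} from our {product['category']} range. "
        f"{product['summary'][:120]}"
    )
    return title, description

title, desc = build_meta({
    "name": "Blue Widget",
    "category": "Widgets",
    "summary": "A durable blue widget for everyday use.",
})
```

As long as every product record produces a distinct name/summary combination, each dynamic URL ends up with distinct tags and the duplicate warnings disappear.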
-
ASP.NET
-
What CMS/language are you using?