Microdata and dynamic data
-
Hi, everybody!
We're starting up a local services website in Brazil. Something like redbeacon.com or thumbtack.com, but obviously different.
So we are developing version 2.0 of the site, and I want to put microdata on every provider page to mark up users' ratings of that provider, along with his geographic information. We want to use microdata on several page types, but the provider pages are the most important.
This data (geo and ratings) will be generated dynamically from our database.
On Schema.org, I only found examples that use static data for what I want to do.
My question is: do Google, Bing, Yahoo, etc. index dynamically generated data? Is there anything I can do with sitemap.xml or robots.txt to get this data indexed by search engines? Our front-end developer handles the HTML, and our codemaster codes in pure PHP.
Thanks!
-
You can add them to the sitemap, but they will be found through links anyway, unless your site is huge.
I would add them to the sitemap if it's easy to do, but I wouldn't lose sleep over it.
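For reference, a minimal sitemap.xml entry for a dynamically generated provider page might look like the sketch below (the domain and path are made up; you would generate one `<url>` entry per provider from the database):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per provider page, generated from the database -->
  <url>
    <loc>http://www.example.com.br/providers/joao-silva</loc>
    <lastmod>2013-05-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```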
-
That's all?
Just "don't worry about it"?
No sitemap changes, nothing?
-
Yes, they index dynamic data, but try to keep the URLs friendly. You can dynamically generate microdata just like any other markup; this shouldn't be a problem.
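To illustrate (the provider name and values are made up), the HTML your PHP emits for a provider page could carry schema.org microdata for the rating and geo information like this:

```html
<!-- Provider page snippet: schema.org LocalBusiness with rating and geo microdata -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">João Silva - Encanador</h1>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
  <div itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="-23.5505" />
    <meta itemprop="longitude" content="-46.6333" />
  </div>
</div>
```

To a crawler, this markup is identical whether it was hand-written or echoed by PHP from a database row; just remember to escape user-supplied values (e.g. with `htmlspecialchars()`) before printing them into the page.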