
Best posts made by Martijn_Scheijbeler
-
RE: Schema description wordcount guidelines?
Hi Dan, I could explain a lot of things here, but I'd recommend visiting and reading the content on this page: https://support.google.com/webmasters/answer/164506?hl=en. Google explains there what you need to do to get event snippets.
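The linked page covers Google's guidelines for event snippets. As a rough illustration (a sketch of schema.org Event markup in JSON-LD, not taken from the linked doc; all values are placeholders), the markup looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Conference",
  "startDate": "2025-06-01T09:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": "123 Example Street, Example City"
  }
}
</script>
```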
-
RE: How can I find the redirect that removes special parameter from Adwords?
Hi,
Start by installing the Redirect Path plugin for Chrome so you can more easily track what's happening with the redirects on your site. Hopefully that provides you with some valuable insights.
It's likely that every extra parameter on a URL will cause a redirect on your site. So go ahead and enter a URL on your site, add a parameter such as the GCLID parameter, and see what happens. Use Redirect Path to determine from where to where the page is redirecting. Then you can tell your developer, or figure out yourself, where the redirect is triggered and how you can overcome it.
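If you want to see the full chain outside the browser as well, here's a minimal sketch in Python (assuming the requests library; the URL and parameter value are placeholders) that prints every hop for a URL with a GCLID appended:

```python
import requests

# Placeholder URL: append a dummy gclid to one of your landing pages.
url = "https://www.example.com/landing-page?gclid=TEST123"
response = requests.get(url, allow_redirects=True)

# Print each redirect hop in order, then the final destination.
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", response.status_code, response.url)
```

If the gclid is missing from the final URL, one of the hops in between is where it gets stripped.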
Good luck!
-
RE: How do you de-index and prevent indexation of a whole domain?
Hi TCE,
In my opinion, there are two options that work best here (there are several other options, including meta robots, etc.):
- Robots.txt: Exclude your portal via your robots.txt file (see the example below).
- HTTP Auth: Have your developers add HTTP authentication so you have to log in to see the portal. Google won't be able to see the pages and so will de-index them.
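For the robots.txt route, a minimal example, assuming the portal lives under /portal/ (a placeholder path; adjust it to your own setup):

```
User-agent: *
Disallow: /portal/
```

Keep in mind that robots.txt blocks crawling rather than indexing, so pages that are already indexed can linger for a while; the HTTP auth option is the more definitive of the two.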
Hope this helps.
-
RE: XML SiteMap - Right Away or Delay?
For our company/websites we've used different methods of submitting our sitemaps, but so far I prefer submitting the XML sitemaps after a couple of weeks/months (it varies with the kind of website). We had a couple of reasons for this; these were the main ones:
- Insight into your ("natural") crawl stats: by waiting a couple of weeks/months before submitting your sitemaps to Google, we were able to see the "natural" crawl stats of our websites.
- Analyse the differences: this ties into the reason above. If you wait, you can see the difference in your crawl stats after submitting your sitemap to Google. Of course, you can't be certain that a change is directly caused by submitting your sitemap, but at least you can take a look.
But there are certainly also reasons for submitting a sitemap right away: you can immediately segment your URLs, so you can see which segments of your website need to be optimized, and you can compare the number of pages crawled vs. indexed.
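To illustrate that segmentation: a sitemap index can split your URLs into one sitemap per section (a sketch; the file names and domain are placeholders), so Search Console reports crawled vs. indexed numbers per segment:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-blog.xml</loc></sitemap>
</sitemapindex>
```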
I'd also like to know your own answer to the question: why do you prefer submitting an XML sitemap right away?
-
RE: Do You Like Live Chat Pop-Ups... Please comment!
- Love Live Chat Pop Ups
I like it as a consumer, with the right timing.
-
RE: SVG image files causing multiple title tags on page - SEO issue?
Could you share the example here? I doubt it would create any issues; it seems more like an issue on the side of most SEO crawlers. A title element on an SVG doesn't come close to being a second title tag.
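For reference, this is the pattern in question (a minimal sketch): the title inside the SVG is an accessibility label scoped to that graphic, not a second document title.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Page title: the only real title tag</title>
</head>
<body>
  <svg viewBox="0 0 100 100" role="img">
    <!-- Labels the graphic for screen readers; it is not a page title. -->
    <title>Company logo</title>
    <circle cx="50" cy="50" r="40" />
  </svg>
</body>
</html>
```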
-
RE: Should I include unnecessary pages in the sitemap.xml
That clearly changes my ideas about this ;-). As we're talking about a couple of million pages, I wouldn't include them in the sitemaps, and I'd make absolutely sure that they're noindexed.
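For reference, noindexing can be done per page with the meta robots tag, or with the equivalent X-Robots-Tag HTTP header for non-HTML responses:

```html
<meta name="robots" content="noindex">
```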
-
RE: Searching for a keyword on html source code of a website via Google
Hi,
Ultimately, the answer to your question is no. Google doesn't have an operator to search within the source code of a document/website. As far as I know, those and a couple of other operators are the only ones available within Google. I also checked Bing and Blekko, but neither of them seems to provide an operator like the one you're looking for.
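If you only need this for a known set of pages, a DIY alternative is to fetch the pages and search the raw HTML yourself. A minimal sketch in Python (assuming the requests library; the URLs and keyword are placeholders):

```python
import requests

# Placeholder list of URLs to check; swap in your own.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
]

keyword = "analytics.js"
for url in urls:
    html = requests.get(url).text
    if keyword in html:
        print(f"Found '{keyword}' in the source of {url}")
```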
Hope this helps!
-
RE: Brackets vs Encoded URLs: The "Same" in Google's eyes, or dup content?
Hi,
I wouldn't worry too much about this issue. It's true that you don't want to depend on how capable Googlebot is to find out whether this could be a problem, but I think the encoding of the characters will make sure you're fine. As a suggestion, I would use canonical tags on these pages to direct Google and other search engines to the right page; that way you'll never run into a duplicate content issue. However, I really doubt this will turn into an issue.
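For example, a canonical tag placed on both the bracket version and the percent-encoded version of a URL (a sketch; the URL is a placeholder, with %5B and %5D being the encoded brackets) would point search engines at the preferred page:

```html
<link rel="canonical" href="https://www.example.com/page?filter=%5Bshoes%5D">
```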