Best way to manage SEO for a massive events listing website.
-
I run a website that tracks entertainment for the entire state of South Dakota. While I've made some fantastic strides in gaining traffic, I feel lost on how to manage all those entries in an SEO-friendly manner. I have a TON of errors showing in my crawl diagnostics, and I just don't know what to do. The nature of the website is such that there are going to be duplications all over the place. I know that I can help some of this by getting my canonical links set up properly (that's coming in the next version of the site's theme), but what else should I do to make those event listings friendly for the search engines?
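For reference, the canonical setup I'm referring to is a single tag in the head of each duplicate listing that points at the one preferred URL for that event. A minimal sketch, with placeholder URLs:

```html
<!-- In the <head> of a duplicate listing such as /events/concert-2013?ref=calendar -->
<!-- (URLs are placeholders; the tag points search engines at the preferred version) -->
<link rel="canonical" href="http://example.com/events/concert-2013" />
```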
-
You are very welcome, JC. Don't give up on your rich snippets. I think it was Richard Baxter who recently said that the most common problem he's seeing with microdata is people giving up too soon when they don't see Google picking it up and displaying it right away. It could take six months. Just hang in there, and by all means try the tool. I bet it works.
-
Thanks Dana! I have been using microformats in the code and hoping that Google would start using that in the snippets, but so far they haven't thrown me a bone. I'll check into this and see what I can do with it! Thank you so much.
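For context, the microformats markup I've been using on the listings is along the lines of this hCalendar sketch (event details are placeholders):

```html
<!-- hCalendar microformat on an event listing; all values are illustrative -->
<div class="vevent">
  <a class="url summary" href="http://example.com/events/rapid-city-rodeo">Rapid City Rodeo</a>
  <abbr class="dtstart" title="2013-02-09">February 9, 2013</abbr>
  <span class="location">Rapid City, SD</span>
</div>
```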
-
Hi JC, you're in luck. Google just rolled out a new tool, specifically for events, that helps you get them set up with structured data without having to go in and code your pages. It's called the Data Highlighter tool. Here's a link to the official post announcing it: http://googlewebmastercentral.blogspot.com/2012/12/introducing-data-highlighter-for-event.html
Here is a quote: "Data Highlighter is a point-and-click tool that can be used by anyone authorized for your site in Google Webmaster Tools. No changes to HTML code are required. Instead, you just use your mouse to highlight and 'tag' each key piece of data on a typical event page of your website."
To get started: log in to your Google Webmaster Tools and click on "Structured Data" in the left nav, then click the subcategory "Data Highlighter." Watch the brief video, then click "Start Highlighting."
I am hoping (for the love of God!) they add the same kind of simple tool to allow webmasters to mark up other types of data, like product videos and reviews.
I hope this helps you out a little. Given that this tool is specifically designed for sites with events, it should be a perfect fit!
Dana
Related Questions
-
Keyword SEO
Hi everyone! I am pretty new to SEO, so any help would be great. Does every webpage on our website need a focus keyword, for example like tructiepbongda? Just to note that I am using Yoast on WordPress. Many thanks,
-
What is SEO-friendly best practice for URLs filtered by 'tagged'?
EX: https://www.STORENAME.com/collections/all-deals/alcatel (tagged "Alcatel"). When I run audits, I come across these URLs, which give me duplicate content and missing H1 warnings. This is the canonical: https://www.STORENAME.com/collections/all-deals/alcatel Any advice on how to tackle these? I have about 4k in my store! Thank you
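One common approach, assuming the tag-filtered pages add no unique value of their own, is to have each tagged URL declare the main collection as its canonical, roughly like this (URLs follow the pattern in the question):

```html
<!-- In the <head> of a tag-filtered URL such as /collections/all-deals/alcatel -->
<link rel="canonical" href="https://www.STORENAME.com/collections/all-deals" />
```

Whether the canonical should point at the unfiltered collection or at the tag page itself depends on whether the tag pages are worth ranking in their own right.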
-
What is the best way to handle Product URLs which prepopulate options?
We are currently building a new site which has the ability to pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this. My current thoughts are:
1. Sessions and parameters: On-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag to the page without parameters.
2. Text-based parameters: Make the parameters into text-based URLs (product/blue/100cm/), still use the "noindex, follow" meta tag, and add a canonical tag to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but they won't get indexed and the link juice would still pass to the main product page.
3. Standard parameters: After thinking more today, I am considering that the best way may be the simplest: simply use standard parameters (product?colour=blue&size=100cm) so that I can tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag to the page without parameters.
What do you think the best way to handle this would be? (The noindex/canonical markup common to these options is sketched below.)
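To make the options concrete, the noindex/canonical combination described in options 1 and 2 would look roughly like this on a parameterised page (URLs follow the examples in the question):

```html
<!-- In the <head> of product?colour=blue&size=100cm (illustrative URL) -->
<meta name="robots" content="noindex, follow" />
<link rel="canonical" href="http://example.com/product" />
```

One caveat: some SEOs advise against combining noindex with a canonical tag, since the two can send conflicting signals.
-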
On our site, some wrong links were entered by mistake and Google crawled them. We have fixed those links, but they still show up in Not Found errors. Should we just mark them as fixed, or what is the best way to deal with them?
A parameter was not being sent, so the links were read as null/city and null/country instead of cityname/city.
-
SEO impact on a classifieds website
Hi, I'm part of an organization running a classifieds platform in Spain (Mercadonline.es). We have been hit by Google penalties for a few weeks now, possibly caused by the numerous errors we are experiencing. The most frequent errors are 404s and duplicate content (title tags etc.), since the nature of our website is dynamic. Many ads change daily, or are added or removed, causing Googlebot (and others) to flag us and preventing it from seeing our more unique content. How deep into our platform should we be indexed? Since we have 34,000+ pages indexed (mostly due to internal filter pages), I would need a systematic solution to display relevant and unique content, with enough usage of keywords, that can bring us back up; we are actually ranked <50 on Google for most of our main keywords. It is costing us precious time and money, since we can only acquire our visitors through paid channels (AdWords etc.) and cannot attract any organically. I can go into more detail with someone who can give me a bit more direction. Your answer is much appreciated! Ivor
-
Should I use Event Schema for a page that reports on an event?
I have a question about using Schema data. Specifically: Should I use Event Schema for a page that reports on an event? I provide high-quality coverage (reporting) about new products being introduced at an industry trade show. For the event, I create a single page using the event name, and provide a great deal of information on how to attend the show, the best places to stay and other insider tips to help new attendees. Then during the show, I list the new products being introduced along with photos and videos. Should I use event schema data for this page, or does Google only want the event organizer to use that data? Any benefits or drawbacks to using event schema? Thanks! Richard
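For reference, the event markup being discussed would be along these lines as schema.org microdata (names, dates, and venue are placeholders):

```html
<!-- schema.org/Event microdata sketch; all values are illustrative -->
<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">Industry Trade Show 2013</span>
  <time itemprop="startDate" datetime="2013-03-01">March 1, 2013</time>
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Example Convention Center</span>
  </div>
</div>
```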
-
Is Adobe Acrobat the best for making PDF documents, in terms of SEO and price?
As we add PDF documents to our website, I want to take it up a notch. In terms of SEO and software price, is Adobe Acrobat the only choice? Thanks! No Mac here. I should clarify that I can convert files to PDFs with Microsoft Word and add some basic info for the search engines, such as title, keywords, author, and links. This article inspired me: www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search I can add links back to the page when I create the PDF, but we also have specific product PDFs that suppliers let us copy and serve from our server (why use their bandwidth?). Much as you would stamp your name on a hard-copy brochure the vendor supplies, I want to add a link to our page from those PDFs. That makes me think I should ask our supplier to give me a version with a link to our page. Then there is the question: is that OK to do? In the meantime, I will check TriviaChicken's suggestions and dream about a Mac, Allan. Thanks
-
Website has been penalized?
Hey guys, We have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'Google sandbox' sort of thing. That was fine; we dealt with that.

Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10, for just over a month. On January 13th, 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company name search. We do, however, come up when searching for our domain name in Google, and we are being cached regularly.

Before we dropped off the rankings in January, we did make some semi-major changes to our site: changing the meta description, changing content around, and adding a disclaimer to our pages with click-tracking parameters (this is when SEOmoz prompted us that our disclaimer pages were duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and also told Google to ignore these parameters in Google Webmaster Central.

We have fixed the duplicate content side of things now, we have continued to link build, and we have been adding content regularly. Do you think the duplicate content (for over 13,000 pages) could have triggered a loss in rankings? Or do you think it's something else? We updated our index page's meta description and some subpages' page titles and descriptions. We also fixed HTML errors flagged in Google Webmaster Central and SEOmoz.

The only other reason I think we could have been penalized is having a link exchange script on our site, where people could add our link to their site and add theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help me get our rankings back would be greatly appreciated!