Best Practice for Out-of-Date Sales and Events
-
Hi Everyone,
http://www.palaceresorts.com/cozumelpalace/en/cozumel-ironman-triathlon
I found an out-of-date event in the client's crawl report that returns a 404 Not Found status code. I remember reading an article advising webmasters never to remove an event landing page, but instead to note on the page that the sale/event has expired and add some information about the upcoming event.
Has anyone had this experience before? Could you provide me with a real-world example?
Thank You
-
Ok, that works too. You are most welcome to ask further questions.
-
Thank you, Vijay,
This is pretty helpful. We may not have the 2017 landing page URLs ready yet, but it could be a good idea to set them up so they're ready to receive information at any time.
Thank you
-
Hi There,
We have a travel client for the Caribbean who ranks very well for Caribbean events, carnivals, etc. We made it a practice, once an event has passed, to 301 redirect the past event's page to the next year's event page, which carries information about this year's event. This has helped us build much more authority for next year's (2017) events and carnivals.
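In server terms, the practice described above boils down to a small map from expired event URLs to next year's pages. Here is a minimal sketch in Python; the URLs and the `resolve` function are hypothetical, purely to illustrate the idea:

```python
# Hypothetical redirect map: expired event URL -> next year's event page.
# All paths are invented for illustration.
EXPIRED_EVENT_REDIRECTS = {
    "/events/caribbean-carnival-2016": "/events/caribbean-carnival-2017",
    "/events/cozumel-ironman-2016": "/events/cozumel-ironman-2017",
}

def resolve(path: str) -> tuple[int, str]:
    """Return an (HTTP status, location) pair for a requested path."""
    if path in EXPIRED_EVENT_REDIRECTS:
        # 301 (permanent) redirect: passes most link equity to the new URL.
        return 301, EXPIRED_EVENT_REDIRECTS[path]
    # Anything else is served as-is.
    return 200, path
```

In practice you would express the same map as 301 rules in your server or CMS configuration rather than in application code.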
I also found an article that you can refer to:
Event Wrapup Page: Often it’s good to have a wrapup page for the event. Here's an example from a recent link building presentation I gave: Why Links Matter to Small Businesses and How to Get Them. This is a dedicated page on your website that has references from the event (such as a Slideshare embed of presentations or perhaps links to other resources you mentioned) and other pertinent information. It’s also a great place for event attendees to link after the event.
When I send out the copy of my slides to event attendees, I email them the link to this page and encourage them to share it freely and link to it.
Speaking of links after the event...
https://moz.com/blog/the-complete-guide-to-link-building-with-local-events
I hope this helps; let me know if you have further questions.
Regards,
Vijay
Related Questions
-
How best to fix 301 redirect problems
Hi all, wondering if anyone could help out with this one. The Roger Bot crawler has just performed its weekly error crawl on my site and I appear to have 18,613 temp redirect problems! Rather, the same one problem 18,613 times. My site is a Magento store, and the errors it is giving me are due to the wishlist feature on the site. For example, it is trying to crawl links such as index.php/wishlist/index/add/product/29416/form_key/DBDSNAJOfP2YGgfW (which would normally add the item to one's wishlist). However, because Roger isn't logged into the website, all of these requests are being sent to the login URL with the page title "Please Enable Cookies". Would the best way to fix this be to enable wishlists for guests? I would rather not do that, but cannot think of another way of fixing it. Any other Magento people come across this issue? Thanks, Carl
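If enabling wishlists for guests is off the table, one alternative (a hedged sketch, not from the thread; the paths are taken from the example URL above) is to keep well-behaved crawlers away from the wishlist URLs with robots.txt:

```
User-agent: *
# Block the add-to-wishlist URLs the crawler is tripping over.
Disallow: /index.php/wishlist/
Disallow: /wishlist/
```

This stops the crawl requests (and the resulting cascade of "Please Enable Cookies" redirects) for compliant bots; note it is a crawl directive, not an indexing one.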
-
Does Google differentiate between a site with spammy link building practices and a victim of a negative SEO attack?
I've been tasked with figuring out how to recover our rankings, as we are likely being hurt by an algorithmic penalty. I have no idea if this was the workings of a previously hired SEO or the result of negative SEO. How does Google differentiate between a site with bad/spammy link building practices and a victim of a negative SEO attack?
-
Best Way to Break Down Paginated Content?
(Sorry for my English.) I have lots of user reviews on my website, and in some cases there are more than a thousand reviews for a single product/service. I am looking for the best way to break these reviews down into several sub-pages. Here are the options I thought of:
1. Break the reviews down into multiple pages/URLs:
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be indexed by search engines. Pros: all the reviews get indexed. Cons: it will be harder to rank for "blue widget review", as there will be many similar pages.
2. Break the reviews down into multiple pages/URLs with noindex + a canonical tag:
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc. In this case, each page would be set to noindex and the canonical tag would point to the first review page. Pros: only one URL can potentially rank for "blue widget review". Cons: the subpages are not indexed.
3. Load all the reviews into one page and handle pagination using JavaScript. Each page of reviews would be loaded into a different container, which would be shown or hidden using JavaScript when browsing through the pages. Could that be considered cloaking? Pros: all the reviews get indexed. Cons: large page size (KB) - maybe too large for search engines?
4. Load only the first page and load sub-pages dynamically using AJAX. Display only the first review page on initial load, then use AJAX to load additional reviews into the page. It would be similar to some blog commenting systems where you have to click on "Load more comments" to see all the comments. Pros: fast initial loading time + faster loading time for subpages = better user experience. Cons: only the first review page is indexed by search engines.
My main competitor, who's achieving great rankings (no black hat, of course), is using technique #3. What's your opinion?
-
I need help defining the best friendly URL structure
Hi, I need some help defining the best friendly URL structure for my new project. I'm in doubt about a few cases; could anyone help me decide which would be best?
domain.com/buy-online/0-1,this-cool-model or
domain.com/buy-online/this-cool-model,0-1 or
domain.com/buy-online/0-1/this-cool-model or
domain.com/buy-online/this-cool-model/0-1 or
domain.com/buy-online/this-cool-model_0-1 or
domain.com/buy-online/this-cool-model?Model=0&OtherParam=1
Thanks! Best regards,
Leonardo Lima
-
Redirect from old WordPress site to new PHP site? Best approach
Hi, I have two websites: one legacy site built on WordPress, the other in PHP. I would like to merge the two and retire the WordPress site. However, it has a good link profile and its pages rank well. What is the best approach - a 301 redirect from the old site, with all of its pages pointing to the homepage of the new site? If so, what's the best way to do this in WordPress? Many thanks
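If the old site runs on Apache (as most WordPress installs do), a catch-all 301 can be sketched in its .htaccess; the domain below is hypothetical, so adjust it to your setup:

```
# Old WordPress site's .htaccess: permanently (301) redirect every
# request to the new site's homepage.
RewriteEngine On
RewriteRule ^ http://www.newsite.com/ [R=301,L]
```

That said, where old posts have close equivalents on the new site, page-to-page 301s generally preserve more of the link profile than sending everything to the homepage; a redirect plugin inside WordPress can map those individually before you retire the install.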
-
What is the best SEO URL design for keywords with a period?
Quick question: assume my keyword is the German soccer team "1.FC Nuremberg". What is the best URL design to target that keyword? domainname.com/1fc-nuremberg or domainname.com/1-fc-nuremberg? Any thoughts? Does it make - even a tiny - difference? /Thomas
-
What is the best method for indexing blog pages?
I have a client whose blog has hundreds, if not thousands, of entries. My question is: does it help his site if each unique blog entry gets indexed on Google? Can we do this dynamically? And what role does the canonical tag play in blog entries, if at all? Thanks, Chris
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
1. Add the meta noindex,follow tag to each URL you want de-indexed
2. Use GWT to help remove the pages
3. Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW */beerbottles/
or this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question." Thanks! Dave
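On the wildcard question: Google's crawler does honor `*` wildcards in robots.txt (they're a Google extension, not part of the original robots.txt standard, so other bots may ignore them), and the directive needs a colon either way. A sketch matching the sample URLs above:

```
User-agent: Googlebot
# "*" matches any characters, including slashes, so this blocks
# /beers/brandofbeer/beerbottles/ and anything past it.
Disallow: /*/beerbottles/
```

And for the de-indexing half: run the noindex pass first and only then add the robots.txt block, because once crawling is disallowed, Googlebot can no longer see the noindex tag on those pages.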