Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
What is the best way to deal with an event calendar?
-
I have an event calendar with multiple items repeating into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions.
Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions and overly dynamic URLs). I'm assuming it's flagging duplicate elements far into the future.
I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable.
Thanks,
-
Sorry for all the posts, but maybe this will also help you get rid of the dynamic URLs:
http://www.webconfs.com/url-rewriting-tool.php
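As a hedged illustration (not output from that tool), an Apache mod_rewrite rule in .htaccess could map a clean, static-looking path to the dynamic calendar script - the paths and parameter names here are hypothetical, so adjust them to whatever your calendar actually uses:
RewriteEngine On
# Internally rewrite /calendar/month/2014-06/ to calendar.php?view=month&date=2014-06
RewriteRule ^calendar/([a-z]+)/([0-9-]+)/?$ calendar.php?view=$1&date=$2 [L,QSA]
You'd then link only to the clean URLs so the dynamic versions stop showing up in crawls.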
Thomas
-
I agree completely, and this is a good example of the type of difference changing the robots.txt file can make.
I would read all the information you can on it, as the situation seems to be constantly developing.
I'm using the info below as an example of a happy ending, but to see the problems people have run into, read the stories you'll find at this link:
http://wordpress.org/support/topic/max-cpu-usage/page/2
CPU usage went from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB including cache/buffers.
My setup is as follows:
Linode 2GB VPS
Nginx 1.4.1
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching db requests and opcode via W3 Total Cache
WordPress 3.5.2
All in One Event Calendar 1.11
All the Best,
Thomas
-
I got the robots.txt file - I hope this will help you.
This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company). The reason they did this was the same reason Dan described above.
I'm not saying this is a perfect fix, but after speaking with the founder of GetFlywheel I know they place this in the robots.txt file for every website they host, in order to try to get rid of the crawling issue.
This is an exact copy of the default robots.txt file from getflywheel.com:
Default Flywheel robots file:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/
As found on a brand-new website. If you Google "Max CPU All in One Calendar" you will see more about this issue.
I hope this is of help to you,
Thomas
PS: The maker of the All-in-One Event Calendar has also listed a fix on their site.
-
Hi Landon
I had a client with a similar situation. Here's what I feel is the best goal:
Calendar pages (weeks/months/days etc) - don't crawl, don't index
Specific event pages - crawl and index
Most likely the calendar URLs have not been indexed, but you can check with some site: searches. Assuming they have not been indexed, the best solution is to block crawling of those URLs with robots.txt - calendars can go on into infinity, and you don't want to send crawlers off into a black hole, as that's bad for crawl budget and for directing them to your actual content.
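For example, a minimal robots.txt along these lines would block the calendar views while leaving individual event pages crawlable - the /calendar/ path is an assumption, so adjust it to your actual URL structure:
User-agent: *
# Block the infinite calendar views (day/week/month listings)
Disallow: /calendar/
Individual event pages at, say, /event/beginner-yoga/ would remain crawlable because they don't live under the blocked path.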
-
Is this the All-in-One Event Calendar for WordPress?
If so, I can give you the information, or you can just Google "CPU Max WordPress".
Essentially you have to change the robots.txt file so the crawlers don't have the huge issues with it that they do now.
GetFlywheel has that built into their robots.txt file; if that is your issue, I can go in and grab it for you.
Sincerely,
Thomas
-
Besides this, take a look at the schema markup for Events - it might help you mark up the page better so Google will understand what the page/event is about: http://schema.org/Event
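As a sketch, minimal JSON-LD for one class might look like this - the names, dates and URLs below are placeholders, not taken from your site:
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Beginner Yoga Class",
  "startDate": "2014-06-01T18:00",
  "endDate": "2014-06-01T19:00",
  "url": "http://www.example.com/event/beginner-yoga-class/",
  "location": {
    "@type": "Place",
    "name": "Example Studio",
    "address": "123 Example St, Springfield"
  }
}
</script>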
-
Do the same classes in the future link to the same page? Are you using canonical tags correctly? Sharing your URL would help us diagnose the problem and guide you better.
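If each future occurrence of a class gets its own URL, one option (assuming the repeats are essentially the same content) is to point every occurrence at a single canonical class page - the URLs below are hypothetical:
<!-- On /classes/beginner-yoga/2014-06-08/ and every other dated occurrence -->
<link rel="canonical" href="http://www.example.com/classes/beginner-yoga/" />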