What is the best way to deal with an event calendar?
-
I have an event calendar that has multiple repeating items into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions.
Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming that it's showing duplicate elements way into the future.
I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable.
Thanks,
-
Sorry for all the posts, but maybe this will also help you get rid of the dynamic URLs:
http://www.webconfs.com/url-rewriting-tool.php
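For instance, a rewrite rule along these lines (a hypothetical Apache mod_rewrite sketch; the actual query parameters depend on your calendar plugin, so treat the names here as placeholders) maps a clean static-looking path onto the dynamic query string:

```apache
# Hypothetical example: serve /calendar/month/2014-05/ from the
# dynamic URL /calendar.php?action=month&date=2014-05
RewriteEngine On
RewriteRule ^calendar/month/([0-9-]+)/?$ calendar.php?action=month&date=$1 [L]
```

The visitor and the crawler only ever see the clean path; the dynamic parameters stay internal to the server.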
Thomas
-
I completely agree, and this is a good example of the type of difference changing the robots.txt file can make.
I would read all the information you can on it, as it seems to be constantly updating.
I used the info below as an example of a happy ending, but to see the problems people have run into, read the stories you'll find if you check out this link:
http://wordpress.org/support/topic/max-cpu-usage/page/2
CPU usage dropped from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB including cache/buffers.
My setup is as follows:
Linode 2GB VPS
Nginx 1.4.1
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching db requests and opcode via W3 Total Cache
WordPress 3.5.2
All in One Event Calendar 1.11
All the best,
Thomas
-
I got the robots.txt file; I hope this will help you.
This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company).
The reason they did this is the same reason Dan described above.
I'm not saying this is a perfect fix, but after speaking with the founder of GetFlywheel, I know they place this in the robots.txt file for every website they host in order to try to get rid of the crawling issue.
This is an exact copy of the default robots.txt file from getflywheel.com:
Default Flywheel robots file
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/
As found on a brand-new website. If you Google "Max CPU All in one calendar" you will see more about this issue.
I hope this is of help to you,
Thomas
PS: Here is what the maker of the All in One Event Calendar has listed on their site as a fix.
-
Hi Landon
I had a client with a similar situation. Here's what I feel is the best goal:
Calendar pages (weeks/months/days etc.) - don't crawl, don't index
Specific event pages - crawl and index
Most likely the calendar URLs have not been indexed, but you can check with some site: searches. Assuming they have not been indexed, the best solution is to block crawling of those URLs with robots.txt. Calendars can go off into infinity, and you don't want to send the crawlers off into a black hole; it's not good for crawl budget, or for directing them to your actual content.
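As a sanity check before deploying rules like these, you can simulate them with Python's standard-library robots.txt parser. The rules and URLs below are illustrative (modeled on the calendar-view paths discussed in this thread), not your actual site:

```python
import urllib.robotparser

# Illustrative rules: block calendar views, leave event pages crawlable
rules = """\
User-agent: *
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Calendar views are blocked for all crawlers...
print(rp.can_fetch("*", "http://www.example.com/calendar/action:month/"))   # False
# ...but an individual event page remains crawlable
print(rp.can_fetch("*", "http://www.example.com/events/beginner-pottery/")) # True
```

This lets you confirm the Disallow prefixes catch every calendar view without accidentally blocking the event pages you want indexed.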
-
Is this the All-in-One Event Calendar for WordPress?
If so, I can give you the information, or you can just Google "CPU Max WordPress".
Essentially you have to change the robots.txt file so the crawlers don't have the huge issues they do now with it.
Flywheel has that built into their robots.txt file; if that is your issue, I can go in and grab it for you.
Sincerely,
Thomas
-
Besides this, take a look at the schema markup for Events; it might help you mark up the page better so Google will understand what the page/event is about: http://schema.org/Event
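For example, a minimal Event snippet in JSON-LD might look like this (the class name, date, and venue are made-up placeholders, not values from this thread):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Beginner Pottery Class",
  "startDate": "2014-06-12T18:00",
  "location": {
    "@type": "Place",
    "name": "Example Studio",
    "address": "123 Main St, Springfield"
  }
}
</script>
```

Each individual class occurrence can carry its own startDate while sharing the same name, which helps Google treat them as distinct events rather than duplicate pages.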
-
Do the same classes in the future link to the same page? Are you using canonical tags correctly? Sharing your URL would help diagnose the problem and guide you better.
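To illustrate the canonical approach: every repeated occurrence of the same class could point back to a single master page for that class, so the duplicate titles and descriptions consolidate onto one URL. The path here is a made-up placeholder:

```html
<!-- In the <head> of each repeat occurrence's page -->
<link rel="canonical" href="http://www.example.com/classes/beginner-pottery/" />
```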