What is the best way to deal with an event calendar?
-
I have an event calendar with multiple items repeating into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions.
Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming it's picking up the duplicate elements repeated way into the future.
I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable.
Thanks,
-
Sorry for all the posts, but maybe this will help you get rid of the dynamic URLs as well:
http://www.webconfs.com/url-rewriting-tool.php
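For example, if the calendar produces query-string URLs, a couple of mod_rewrite rules in .htaccess can map them to clean paths. This is only a minimal sketch, assuming an Apache server and a hypothetical /event.php?id= URL pattern; your plugin's actual parameters will differ:
RewriteEngine On
# 301 the old dynamic URL to the clean one (hypothetical pattern);
# THE_REQUEST is matched instead of QUERY_STRING to avoid a redirect loop
RewriteCond %{THE_REQUEST} \?id=([0-9]+)
RewriteRule ^event\.php$ /event/%1/? [R=301,L]
# Serve the clean URL from the original script internally
RewriteRule ^event/([0-9]+)/$ /event.php?id=$1 [L]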
Thomas
-
Agree completely, and this is a good example of the type of difference changing the robots.txt file can make.
I would read all the information you can find on it, as it seems to be constantly updating.
I've used the info below as an example of a happy ending, but to see the problems, read the stories you'll find if you check out this link:
http://wordpress.org/support/topic/max-cpu-usage/page/2
CPU usage dropped from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB, including cache/buffers.
My setup is as follows:
Linode 2GB VPS
Nginx 1.4.1
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching db requests and opcode via W3 Total Cache
WordPress 3.5.2
All in One Event Calendar 1.11
All the Best,
Thomas
-
I got the robots.txt file; I hope this will help you.
This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company). The reason they did this was the same reason Dan described above.
I'm not saying this is a perfect fix, but after speaking with the founder of GetFlywheel, I know they place this in the robots.txt file for every website they host in order to try to get rid of the crawling issue.
This is an exact copy of the default robots.txt file from getflywheel.com:
Default Flywheel robots file
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/
As found on a brand-new website. If you Google "Max CPU All in One Calendar" you will see more about this issue.
I hope this is of help to you,
Thomas
PS: Here is what the maker of the All in One Event Calendar has listed on their site as a fix.
-
Hi Landon
I had a client with a similar situation. Here's what I feel is the best goal:
Calendar pages (weeks/months/days, etc.) - don't crawl, don't index
Specific event pages - crawl and index
Most likely the calendar URLs have not been indexed, but you can check with some site: searches. Assuming they have not been indexed, the best solution is to block crawling of those URLs with robots.txt. Calendars can go off into infinity, and you don't want to send the crawlers off into a black hole; it's not good for crawl budget, or for directing them to your actual content.
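As a sketch, assuming the calendar views live under /calendar/ and individual events under /event/ (hypothetical paths; check your plugin's actual URL structure), the check and the rules might look like this:
# Check in Google first whether the calendar views are already indexed:
#   site:yourdomain.com inurl:calendar
# robots.txt - block the infinite calendar views, leave event pages crawlable
User-agent: *
Disallow: /calendar/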
-
Is this the All-in-One Event Calendar for WordPress?
If so, I can give you the information, or you can just Google "CPU Max WordPress".
Essentially, you have to change the robots.txt file so the crawlers don't have the huge issues they do now with it.
Flywheel has that built into their robots.txt file; if that is your issue, I can go in and grab it for you.
Sincerely,
Thomas
-
Besides this, take a look at the schema markup for Events; it might help you mark up the page better so Google will understand what the page/event is about: http://schema.org/Event
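As a rough sketch, JSON-LD markup for a single class page might look like the snippet below; the class name, dates, URL, and location are hypothetical placeholders:
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Beginner Pottery Class",
  "startDate": "2014-06-05T18:00",
  "endDate": "2014-06-05T19:30",
  "url": "http://www.example.com/event/beginner-pottery/",
  "location": {
    "@type": "Place",
    "name": "Main Studio",
    "address": "123 Example Street, Anytown"
  }
}
</script>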
-
Do the same classes in the future link to the same page? Are you using canonical tags correctly? Your URL would help us diagnose the problem and guide you better.
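For instance, if each repeated class date gets its own URL but the content is identical, a canonical tag in the <head> of each instance could point at one main class page (made-up URL for illustration):
<link rel="canonical" href="http://www.example.com/classes/beginner-pottery/" />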