Webmaster Tools - Clarification of what the top directory is in a calendar URL
-
Hi all,
I had an issue where it turned out a calendar was used on my site historically (a couple of years ago), but the pages were still present, crawled, and indexed by Google to this day.
I want to remove them from the index now, as they really cloud my analysis, and as I have been trying to clean things up (e.g. by turning modules off), Webmaster Tools is throwing up more and more errors due to these pages.
Below is an example of the URL of one of the pages:
The closest question I have found on the topic on SEOmoz is:
http://www.seomoz.org/q/duplicate-content-issue-6
I want to remove all these pages from the index by targeting their top-level folder. From the historic question above, would I be right in saying that it is:
http://www.example.co.uk/index.php?mact=Calendar
I want to be certain before I do a directory level removal request in case it actually targets index.php instead and deindexes my whole site (or homepage at the very least).
Thanks
-
Unfortunately, "index.php?mact=Calendar" is not a folder; it's a page plus a parameter. If you tried to block it as a folder in GWT, it would most likely just not work. If it went really wrong, you'd block anything driven from index.php (including your home page).
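To make that concrete, here's a quick illustration using PHP's built-in parse_url (since the site already runs on index.php) of why that calendar URL has no folder to target:

```php
<?php
// Why "index.php?mact=Calendar" can't be removed as a directory:
// the URL's path component is just /index.php; everything after
// the "?" is a query string, not part of any folder.
$parts = parse_url('http://www.example.co.uk/index.php?mact=Calendar');
echo $parts['path'] . "\n";   // prints "/index.php"
echo $parts['query'] . "\n";  // prints "mact=Calendar"
?>
```

So a directory-level removal request here could only ever see /index.php, which is exactly the worst-case scenario.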
A couple of options:
(1) Programmatically META NOINDEX anything that calls the calendar parameters. This would have to be done selectively in the index.php header with code, so that ONLY the calendar pages were affected.
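A minimal sketch of what that conditional header code might look like in index.php — the exact parameter check is an assumption based on the URL above, so adjust it to match however the calendar module is actually invoked on your site:

```php
<?php
// Hypothetical snippet for the <head> output of index.php:
// emit a robots noindex tag ONLY when the request carries the
// calendar module parameter, so normal pages are unaffected.
if (isset($_GET['mact']) && stripos($_GET['mact'], 'Calendar') === 0) {
    echo '<meta name="robots" content="noindex,follow" />' . "\n";
}
?>
```

The "follow" directive keeps Google crawling through any links on those pages while dropping the pages themselves from the index.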
(2) Block "mact=" or "year=" with parameter handling in GWT, under "Configuration" > "URL Parameters". ONLY do this if these parameters drive the calendar and no other pages. You can basically tell Google to ignore pages with "year=" in them.
You can also block parameters in Robots.txt, but honestly, once the pages have been indexed, it doesn't work very well.
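For reference, a pattern-based robots.txt block might look like the sketch below (Google honors the * wildcard, though that's an extension to the original robots.txt standard, and this only stops future crawling — it doesn't remove what's already indexed):

```
User-agent: Googlebot
Disallow: /*mact=Calendar
```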
-
Thanks Thomas, I have uploaded a new site map to GWMT and hopefully that will cause Google to drop those disappeared pages.
Best,
Mitty
-
I would not use Google's disavow or remove links tools lightly at all.
In my opinion it would be easier to fix the problems you're talking about on the site internally than to ask Google to ignore or disavow anything. They can essentially penalize you, because you have essentially admitted you've done something wrong just by using that tool. I don't mean to scare you, and I don't think you've done anything wrong, but if I were you I would let Google know that what you have done is simply try to improve your website to the best of your abilities for the end user's experience, not to hide any malicious actions in the past.
Sorry to be so alarmed by this, but you really do want to stay on top of what you tell Google and what they perceive you're telling them.
I hope this has been of help. The reason I mentioned Treehouse, which is available in the Pro Perks at the bottom of the page, is that they teach everything you need to fix the problems you have without going through Google.
Sincerely,
Thomas
-
Thanks for the advice and the links Thomas.
I've already gotten rid of the pages from my site, and they are not malware-related, so not to worry.
My question is only concerned with webmaster tools. I can manually enter each link into the removal tool but that will take days.
I am aware that there is an option in GWMT to remove directories as well as individual URLs, i.e. if I had a site with the following pages: www.example.com/plants/tulips & www.example.com/plants/roses
I could either enter both URLs into the removal tool, or simply put in www.example.com/plants/, designate it a directory, and both pages would be removed.
My question is to confirm: if I have the following pages, which have virtually identical pathways except for the dates 2084 and 3000:
Could I simply use http://www.example.co.uk/index.php?mact=Calendar as a directory, saving me from having to write out the full pathways for both pages?
-
I just remembered where you can learn how to do this. And it's free.
Pro Perks at the bottom of the page will give you one month of free access to Treehouse. It is a mecca of videos and information on everything from code to SEO to hosting — everything you ever wanted to know. Honestly, take advantage of this tool.
-
Well, regarding the URLs you're talking about: if the calendar took up the entire page, or even part of it, it could harm other content on the page, if there is any. Is there any?
Run your website through http://sitecheck.sucuri.net/scanner/
It will tell you immediately if you have any malware running on your website. If you do, I strongly suggest purchasing Sucuri and cleaning it up. However, hopefully that's not the case and you simply need some tweaking done to your website. I unfortunately am not gifted with the knowledge of code, but I know there is an option out there that is extremely inexpensive and very high quality; I will try to find the URL right now. It's less than $100.
http://www.webdesignerdepot.com/category/code/
One of the better ones can be found by asking the guys at WebDevStudios.com; they are geniuses and will lead you in the right direction. I don't want to give you any advice that's wrong. Sincerely, Tom
-
Thanks Thomas.
It was a calendar module within my CMS, CMS Made Simple. It seems to have generated thousands of pages which all linked to each page of my site, so Webmaster Tools had listed my less-than-100-page site (or so I thought) as having over 40,000 internal links pointing to each page.
I have deactivated it and added a site map to Webmaster Tools (GWMT), and that seems to have generated thousands of errors in GWMT.
There is a list of the top 1,000 URLs, which are pretty much the calendar pages, and they are now returning 404 errors (as I have switched off the module, so they are effectively deleted), but I want to have them deindexed so as to see if there is anything else hidden in the background.
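If it helps to double-check, a throwaway script like this can confirm the calendar pages really return 404 before requesting removal — the URLs below are made-up placeholders (substitute real ones from the GWMT error list):

```php
<?php
// Spot-check that the old calendar URLs now return 404.
// These URLs are illustrative placeholders, not real paths.
$urls = [
    'http://www.example.co.uk/index.php?mact=Calendar&year=2084',
    'http://www.example.co.uk/index.php?mact=Calendar&year=3000',
];
foreach ($urls as $url) {
    $headers = @get_headers($url);           // fetch response headers
    $status  = $headers ? $headers[0] : 'no response';
    echo $url . ' => ' . $status . "\n";     // expect a 404 status line
}
?>
```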
I'm not completely sure about what you've sent me below. Are you concurring that if I add the below URL to the removal tool and select directory removal, it will just target all those 404'd pages?
-
http://www.w3schools.com/php/php_ref_filesystem.asp
http://www.w3schools.com/php/php_ref_filter.asp
&
http://www.w3schools.com/php/php_ref_calendar.asp
Tell me if you need any help removing that calendar after this, and I will give it a whirl. Sincerely, Tom
-
You are wise to be cautious. Is your site one hundred percent PHP? Do you know what the calendar is made from, meaning is it third-party software? Or is it something you built, or had someone build for you, a while back?
Try running your site through builtwith.com and it will give you the components of your website. If the calendar shows up, you can then Google a way to remove it.
If you don't feel comfortable sharing your URL here, send it to me in a private message; I'd be happy to have a look at it and give you my best opinion. I am not going to tell you that I am the king of coding, but I can pick up on these types of things sometimes, and I'd be happy to lend another set of eyes.
Sincerely,
Thomas