Webmaster Tools - Clarification of what the top directory is in a calendar URL
-
Hi all,
I had an issue where it turned out a calendar was used on my site historically (a couple of years ago), but the pages are still present, and have been crawled and indexed by Google to this day.
I want to remove them from the index now, as they really cloud my analysis, and as I have been trying to clean things up (e.g. by turning modules off), Webmaster Tools has been throwing up more and more errors due to these pages.
Below is an example of the URL of one of the pages:
The closest question I have found on the topic on SEOmoz is:
http://www.seomoz.org/q/duplicate-content-issue-6
I want to remove all these pages from the index by targeting their top-level folder. From the historic question above, would I be right in saying that it is:
http://www.example.co.uk/index.php?mact=Calendar
I want to be certain before I do a directory level removal request in case it actually targets index.php instead and deindexes my whole site (or homepage at the very least).
Thanks
-
Unfortunately, "index.php?mact=Calendar" is not a folder; it's a page plus a parameter. If you tried to block that as a folder in GWT, it would most likely just not work. If it went really wrong, you'd block anything driven by index.php (including your homepage).
A couple of options:
(1) Programmatically META NOINDEX anything that calls the calendar parameters. This would have to be done selectively in the index.php header with code, so that ONLY the calendar pages were affected.
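Option (1) could be sketched roughly like this near the top of index.php. This is only a sketch under assumptions, not CMS Made Simple's official API: it assumes every calendar URL carries a "mact" parameter beginning with "Calendar" (as in the example URL above), and that the check runs before any output is sent.

```php
<?php
// Sketch only: run near the top of index.php, before any HTML output.
// Assumption: all calendar pages (and only those) have a "mact" query
// parameter starting with "Calendar".
$isCalendarPage = isset($_GET['mact'])
    && stripos($_GET['mact'], 'Calendar') === 0;

if ($isCalendarPage) {
    // The X-Robots-Tag response header has the same effect as a
    // <meta name="robots" content="noindex"> tag in the page <head>,
    // and can be sent without editing the theme template.
    header('X-Robots-Tag: noindex, follow');
}
?>
```

Once Google recrawls the noindexed URLs, they drop out of the index on their own, without risking a directory-level removal request against index.php.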
(2) Block "mact=" or "year=" with parameter handling in GWT, under "Configuration" > "URL Parameters". ONLY do this if these parameters drive the calendar and no other pages. You can basically tell Google to ignore pages with "year=" in them.
You can also block parameters in robots.txt, but honestly, once the pages have been indexed, it doesn't work very well.
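For reference, a robots.txt rule for this kind of parameter URL might look like the sketch below (Googlebot supports the `*` wildcard in Disallow paths). Keep in mind that Disallow only prevents crawling; it does not remove URLs that are already indexed, which is why the options above are preferable here.

```
User-agent: *
Disallow: /index.php?mact=Calendar
Disallow: /*mact=Calendar
```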
-
Thanks Thomas, I have uploaded a new sitemap to GWMT and hopefully that will cause Google to ignore those removed pages.
Best,
Mitty
-
I would not use Google's disavow or link removal tools lightly at all.
In my opinion, it would be easier to fix the problems you're talking about on the site internally than to ask Google to ignore or disavow anything. Google can effectively penalize you, because by using the above tools you have essentially admitted you've done something wrong. I don't mean to scare you, and I don't think you've done anything wrong; if I were you, I would let Google know that what you have done is simply tidy up your website to the best of your abilities for the end-user's experience, not to hide any malicious actions in the past.
Sorry to be so alarmed by this, but you really do want to stay on top of what you tell Google and what they perceive you're telling them.
I hope this has been of help. The reason I mentioned Treehouse, which is available as a Pro Perk at the bottom of the page, is that they teach everything you need to fix the problems you have without involving Google.
Sincerely,
Thomas
-
Thanks for the advice and the links Thomas.
I've already gotten rid of the pages from my site, and they are not malware-inducing, so not to worry.
My question is only concerned with Webmaster Tools. I could manually enter each link into the removal tool, but that would take days.
I am aware that there is an option in GWMT to remove directories as well as individual URLs, i.e. if I had a site with the following pages: www.example.com/plants/tulips & www.example.com/plants/roses
I could either enter both URLs into the removal tool, or simply enter www.example.com/plants/ and designate it a directory, and both pages would be removed.
My question is to confirm: if I have the following pages, which have virtually identical pathways apart from the dates 2084 and 3000:
could I simply use http://www.example.co.uk/index.php?mact=Calendar as a directory, saving me from having to write out the full pathways for both pages?
-
I just remembered where you can learn how to do this. And it's free.
Pro Perks at the bottom of the page will give you one month of free access to Treehouse. It is a mecca of videos and information on everything from code to SEO to hosting. Honestly, take advantage of this tool.
-
Well, I believe the URLs you're talking about could be a problem if the calendar took up the entire page or even part of it. It could harm other content on the page, if there is any. Is there any?
Run your website through http://sitecheck.sucuri.net/scanner/
It will tell you immediately if you have any malware running on your website. If you do, I strongly suggest purchasing Sucuri and cleaning it up. However, hopefully that's not the case and you simply need some tweaking done to your website. I unfortunately am not gifted with the knowledge of code, but I know there is an option out there that is extremely inexpensive and very high-quality, for less than $100. I will try to find the URL right now.
http://www.webdesignerdepot.com/category/code/
One of the better ones can be found by asking the guys at webdevStudios.com; they are geniuses and will lead you in the right direction. I don't want to give you any advice that's wrong. Sincerely, Tom
-
Thanks Thomas.
It was a calendar module within my CMS, CMS Made Simple. It seems to have generated thousands of pages which all linked to each page of my site, so Webmaster Tools listed my less-than-100-page site (or so I thought) as having over 40,000 internal links pointing to each page.
I have deactivated it and added a sitemap to Webmaster Tools (GWMT), and that seems to have generated thousands of errors in GWMT.
There is a list of the top 1,000 URLs, which are pretty much all the calendar pages, and they are now returning 404 errors (as I have switched off the module, so they are effectively deleted), but I want to have them deindexed so as to see if there is anything else hidden in the background.
I'm not completely sure about what you've sent me below. Are you concurring that if I add the below URL to the removal tool and select directory removal, it will target all those 404'd pages?
-
http://www.w3schools.com/php/php_ref_filesystem.asp
http://www.w3schools.com/php/php_ref_filter.asp
&
http://www.w3schools.com/php/php_ref_calendar.asp
Tell me if you need any help removing that calendar after this, and I will give it a whirl. Sincerely, Tom
-
You are wise to be cautious. To be sure: is your site one hundred percent PHP? Do you know what the calendar is made from, meaning is it third-party software? Or is it something you built, or had someone build for you, a while back?
Try running your site through builtwith.com and it will give you the components of your website; if the calendar shows up, you can then Google a way to remove it.
If you don't feel comfortable sharing your URL here, send it to me in a private message; I'd be happy to have a look at it and give you my best opinion. I am not going to tell you I am the king of coding, but I can pick up on these types of things sometimes, and I'd be happy to lend another set of eyes.
Sincerely,
Thomas