Webmaster Tools - Clarification of what the top directory is in a calendar URL
-
Hi all,
I had an issue where it turned out a calendar was used on my site historically (a couple of years ago), but the pages were still present, and crawled and indexed by Google to this day.
I want to remove them from the index now, as they really cloud my analysis. And as I have been trying to clean things up, e.g. by turning modules off, Webmaster Tools has been throwing up more and more errors due to these pages.
Below is an example of the URL of one of the pages:
The closest question I have found on the topic on SEOmoz is:
http://www.seomoz.org/q/duplicate-content-issue-6
I want to remove all these pages from the index by targeting their top-level folder. From the historic question above, would I be right in saying that it is:
http://www.example.co.uk/index.php?mact=Calendar
I want to be certain before I do a directory-level removal request, in case it actually targets index.php instead and deindexes my whole site (or my homepage at the very least).
Thanks
-
Unfortunately, "index.php?mact=Calendar" is not a folder; it's a page plus a parameter. If you tried to block that as a folder in GWT, it would most likely just not work. If it went really wrong, you could block anything driven from index.php (including your homepage).
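To see why, note that everything after the "?" is a query string handed to the single script index.php; there is no directory in that URL at all. A quick illustration using PHP's built-in parse_url(), with the example.co.uk URL from the question above:

```php
<?php
// How the calendar URL actually decomposes.
$url   = 'http://www.example.co.uk/index.php?mact=Calendar';
$parts = parse_url($url);

echo $parts['path'] . "\n";   // /index.php    <- the only real "file" here
echo $parts['query'] . "\n";  // mact=Calendar <- a parameter, not a folder
```

A directory removal request matches on the path portion only, which is why it would catch index.php itself rather than "just the calendar".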
A couple of options:
(1) Programmatically META NOINDEX anything that calls the calendar parameters. This would have to be done selectively in the index.php header with code, so that ONLY the calendar pages are affected.
(2) Block "mact=" or "year=" with parameter handling in GWT, under "Configuration" > "URL Parameters". ONLY do this if these parameters drive the calendar and no other pages. You can basically tell Google to ignore pages with "year=" in them.
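A minimal sketch of option (1), assuming (as in the URLs above) that the calendar pages are exactly the ones carrying a "mact" parameter that starts with "Calendar"; verify the parameter names against your own URLs before relying on this:

```php
<?php
// Sketch only: emit a robots noindex tag for calendar pages.
// Assumption: calendar pages are identified by a "mact" query parameter
// beginning with "Calendar" (e.g. index.php?mact=Calendar...).
$isCalendarPage = isset($_GET['mact'])
    && stripos($_GET['mact'], 'Calendar') === 0;

if ($isCalendarPage) {
    // This must be output inside the <head> of the rendered page.
    echo '<meta name="robots" content="noindex, follow">' . "\n";
}
```

Using "noindex, follow" rather than "noindex, nofollow" lets Google still follow any legitimate links on those pages while dropping the pages themselves from the index.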
You can also block parameters in robots.txt, but honestly, once the pages have been indexed, it doesn't work very well.
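For reference, a robots.txt fragment for this case might look like the sketch below (Google honors the "*" wildcard, though it is not part of the original robots.txt standard; and again, this only stops future crawling rather than removing pages that are already indexed):

```
User-agent: *
Disallow: /index.php?mact=Calendar
Disallow: /*mact=Calendar
```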
-
Thanks Thomas, I have uploaded a new sitemap to GWMT and hopefully that will cause Google to ignore those now-removed pages.
Best,
Mitty
-
I would not use Google's disavow or link removal tools lightly at all.
In my opinion it would be easier to fix the problems you're talking about on the site internally than to ask Google to ignore or disavow them. They can potentially penalize you, because you have essentially admitted you've done something wrong just by using the above tool. I don't mean to scare you, and I don't think you've done anything wrong, but if I were you I would let Google know that what you have done is simply try to fix your website up to the best of your abilities for the end user's experience, not to hide any malicious actions in the past.
Sorry to be so alarmed by this, but you really do want to stay on top of what you tell Google and what they perceive you're telling them.
I hope this has been of help. The reason I recommended Treehouse, which is available via the Pro Perks at the bottom of the page, is that they teach everything you need to fix the problems you have without going through Google.
Sincerely,
Thomas
-
Thanks for the advice and the links Thomas.
I've already gotten rid of the pages from my site and they are not malware-inducing, so not to worry.
My question is only concerned with Webmaster Tools. I can manually enter each link into the removal tool, but that would take days.
I am aware that there is an option in GWMT to remove directories as well as individual URLs. For example, if I had a site with the following pages: www.example.com/plants/tulips & www.example.com/plants/roses
I could either enter both URLs into the removal tool, or simply enter www.example.com/plants/ and designate it as a directory, and both pages would be removed.
My question is to confirm: if I have the following pages, which have virtually identical pathways but for the dates 2084 and 3000:
Could I simply use http://www.example.co.uk/index.php?mact=Calendar as a directory, saving me having to write out the full pathways for both pages?
-
I just remembered where you can learn how to do this. And it's free.
Pro Perks at the bottom of the page will give you one month of free access to Treehouse. It is a mecca of videos and information on everything from code to SEO to hosting, to everything you ever wanted to know. Honestly, take advantage of this tool.
-
Well, about the URLs you're talking about: if the calendar took up the entire page, or even part of it, removing it could affect other content on those pages, if there is any. Is there any?
Run your website through http://sitecheck.sucuri.net/scanner/
It will tell you immediately if you have any malware running on your website. If you do, I strongly suggest purchasing Sucuri and cleaning it up. However, hopefully that's not the case and you simply need some tweaking done to your website. I unfortunately am not gifted with the knowledge of code, but I know there is an option out there that is extremely inexpensive and very high quality; for less than $100 they will tweak your site. I will try to find the URL right now.
http://www.webdesignerdepot.com/category/code/
One of the better ones can be found by asking the guys at webdevStudios.com; they are geniuses and they will lead you in the right direction. I don't want to give you any advice that's wrong advice. Sincerely, Tom
-
Thanks Thomas.
It was a calendar module within my CMS, CMS Made Simple. It seems to have generated thousands of pages which all linked to every page of my site, so Webmaster Tools listed my less-than-100-page site (or so I thought) as having over 40,000 internal links pointing to each page.
I have deactivated it and added a sitemap to Webmaster Tools (GWMT), and that seems to have generated thousands of errors in GWMT.
There is a list of the top 1,000 URLs, which are pretty much all the calendar pages, and they are now returning 404 errors (as I have switched off the module, so they are effectively deleted), but I want to have them deindexed so as to see if there is anything else hidden in the background.
I'm not completely sure about what you've sent me below. Are you concurring that if I add the below URL to the removal tool and select directory removal, it will target just all those 404'd pages?
-
http://www.w3schools.com/php/php_ref_filesystem.asp
http://www.w3schools.com/php/php_ref_filter.asp
&
http://www.w3schools.com/php/php_ref_calendar.asp
Tell me if you need any help removing that calendar after this, and I will give it a whirl. Sincerely, Tom
-
You are wise to be cautious. Are you sure your site is one hundred percent PHP? Do you know what the calendar is built from, meaning is it third-party software? Or is it something you built, or had someone build for you, a while back?
Try running your site through builtwith.com; it will give you the components of your website. If the calendar shows up, you can then Google a way to remove it.
If you don't feel comfortable sharing your URL here, send it to me in a private message; I'd be happy to have a look at it and give you my best opinion. I am not going to tell you that I am the king of coding, but I can pick up on these types of things sometimes, and I'd be happy to lend another set of eyes.
Sincerely,
Thomas