Questions created by footsteps
Bulk redirect, or only a few pages at a time?
Dear all,

I would very much like to have your advice about how to implement bulk 301 redirects. We have three retail websites with the same technical architecture:

- Netherlands-example.nl
- Belgium-example.be
- France-example.fr

These three websites are all bilingual:

- Netherlands-example.nl/nl and Netherlands-example.nl/fr
- Belgium-example.be/nl and Belgium-example.be/fr
- France-example.fr/nl and France-example.fr/fr

We're going to do a CMS update and therefore have to implement a large batch of 301 redirects:

Part 1: For France (France-example.fr), the URLs in the Dutch language (France-example.fr/nl) will be redirected to Belgium (Belgium-example.be/nl). This concerns about 8,000 redirects.

Part 2: For the Netherlands (Netherlands-example.nl), the URLs in the French language (Netherlands-example.nl/fr) will be redirected to Belgium (Belgium-example.be/fr). This also concerns about 8,000 redirects.

Question:
Intermediate & Advanced SEO | | footsteps
What would be the best way to implement these redirects? Fully implement part 1 first (8,000 redirects) and then, a couple of weeks or months later, fully implement part 2? Or would it be better to implement small batches of 200-500 every two weeks? I'd like to hear your opinion. Thanks in advance.

Kind regards,
Gerwin
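For what it's worth, a single pattern-based rule can often cover all 8,000 redirects at once rather than listing them individually, which makes the batching question moot. A minimal sketch for France-example.fr, assuming the site runs Apache with mod_rewrite and the Dutch paths map one-to-one onto Belgium-example.be:

```
# Hypothetical .htaccess rule on France-example.fr: permanently (301)
# redirect every /nl URL to the same path on Belgium-example.be.
RewriteEngine On
RewriteRule ^nl/(.*)$ https://Belgium-example.be/nl/$1 [R=301,L]
```

The Netherlands-example.nl/fr case would work the same way with `fr` in place of `nl`. If the old and new paths do not match one-to-one, a lookup table (e.g. an Apache RewriteMap) would be needed instead of a single pattern.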
How to prevent Google from crawling our product filter?
Hi All,

We have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter the available products. These filters are passed as HTTP GET arguments, so the number of possible filter URLs is virtually limitless. To prevent duplicate content, and an insane number of pages in the search indices, our software automatically adds noindex, nofollow and noarchive directives to these filter result pages.

However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it, but it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway.

What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help.

Kind regards,
Gerwin
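As a hedged sketch of one option: robots.txt can block crawling of the parameterised filter URLs outright. Note that a URL blocked in robots.txt is never fetched, so Google cannot see a noindex directive on that page; the two mechanisms shouldn't be relied on together for the same URLs. Assuming the filter arguments are the only query-string parameters in use on the site:

```
# Hypothetical robots.txt for www.sneakerskoopjeonline.nl: stop crawling
# of any URL that carries a query string (i.e. the filter result pages).
User-agent: *
Disallow: /*?
```

If some query-string URLs should remain crawlable, narrower patterns per filter parameter (e.g. `Disallow: /*?*color=`, with `color` as a stand-in for the real parameter names) would be needed instead.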
Intermediate & Advanced SEO | | footsteps
How long will it take for Google to forgive us?
Hi all,

Over the last few weeks we started working on several affiliate websites. All was fine: the number of pages indexed in Google increased, rankings increased, visitors increased, but after two successful weeks Google noticed what we were doing and decided to crash our party.

What we did:

- Created an affiliate website presenting shoes for ladies (15,000+ pages)
Affiliate Marketing | | footsteps
- Created an affiliate website presenting brand shoes only (25,000+ pages)
Both sites were new domains, and we linked them to each other.

The biggest boo-boos we think we have made:

- We didn't put a nofollow on the affiliate links.
- We linked the websites to each other, causing Google to spot 25K new links to a new domain.
- We stuffed the index with 40K new pages out of nothing.
- We didn't exclude the correct parameters in Webmaster Tools, so all items got indexed multiple times.

The above resulted in a drop of 95% of visitors, and three weeks down the road we have gained back 10% from our low point. All the boo-boos were corrected in the last few days, and now we are wondering: how long will it take for Google to forgive us? And are we correct in concluding we messed up big time?

Hope to hear your insight, and let this story be a warning to others attempting this endeavour.

Kind regards,
Gerwin
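As an illustration of the first fix, a nofollowed affiliate link looks like this (the tracking URL below is a made-up placeholder):

```
<!-- rel="nofollow" tells search engines not to pass link equity
     through this (hypothetical) affiliate tracking link. -->
<a href="https://affiliate-network.example/track?id=12345" rel="nofollow">Ladies' sneakers</a>
```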