Are Incorrectly Set Up URL Rewrites a Possible Cause of a Panda Penalty?
-
On a .NET site, a URL rewrite was implemented about two years ago. From a visitor's perspective it seems fine, as the URLs look clean. However, Webmaster Tools periodically reports 500 errors on URLs under /modules/categories... and /modules/products..., which are the templates and the way the original URLs were structured. While the developer made the URLs look clean, I am concerned that he may have set the rewrite up incorrectly. He acknowledged that IIS 7 on a Windows server allows URL rewrites to be configured, but the site was built another way, one that forces the URLs to change to the product name, and he has always believed that to be okay.
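For reference, my understanding is that a clean-URL setup on IIS 7 is usually handled by the URL Rewrite module in web.config, along the lines of the sketch below. The paths and the parameter name are only illustrative, not our actual rules:

```xml
<!-- Minimal sketch of an IIS URL Rewrite rule in web.config.
     The /products/ pattern, the "name" parameter, and the
     /modules/products.aspx path are hypothetical examples. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Clean product URLs" stopProcessing="true">
        <!-- Visitor requests e.g. /products/blue-widget -->
        <match url="^products/([a-z0-9-]+)/?$" />
        <!-- Internally served by the original template URL -->
        <action type="Rewrite" url="/modules/products.aspx?name={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```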
However, the site's rankings dropped significantly in July 2013, which appears to be a Panda penalty. In trying to figure out whether the rewrite setup could be a factor in why the site has suffered, I would like to hear other webmasters' opinions. We have already removed many pages, cutting roughly two thirds of the URLs Google had indexed, and are trying to understand what else the cause could be.
Also, in doing a header check, I see that the /modules/products... page returns a 301 status. I assume this is okay, but wanted to see what others have to say about it.
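From what I've read, a 301 from the old template URL to the clean URL is generally what you would want, so that only one version gets indexed. A rule like the following is roughly how that is often expressed with the URL Rewrite module (again, the path and the query-string parameter name are hypothetical, not our real setup):

```xml
<!-- Sketch only: 301 the old template URL to its clean equivalent when the
     product slug is present in the query string. Parameter names are
     hypothetical; this would sit alongside the rewrite rule above. -->
<rule name="Old template URL to clean URL" stopProcessing="true">
  <match url="^modules/products\.aspx$" />
  <conditions>
    <add input="{QUERY_STRING}" pattern="(?:^|&amp;)name=([a-z0-9-]+)" />
  </conditions>
  <action type="Redirect" url="/products/{C:1}" redirectType="Permanent" appendQueryString="false" />
</rule>
```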
When I look at the source code of a product page, I also see a reference to the /modules/products... URL.
I'm not sure whether any of this is relevant, but I wanted to mention it in case you have insight. I hope to get good feedback and direction from SEOs and technical folks.
-
I don't think you understood the question, as it didn't have much to do with links. This is about a content management system that generates ugly URLs, which a developer then rewrote to make them more user friendly. While a user browsing the site sees clean URLs, Webmaster Tools reported 500 errors, which are essentially server errors.
Those errors no longer seem to appear in the console. But I was wondering whether anyone has seen a site penalized because of a handful of 500 errors, even though the site looks fine from a user perspective. (Note: the site is fairly large and only a few 500 errors appeared.)
-
Check your backlink profile: are there unnatural links? Are there spammy sites linking to you?
Get your site working from a technical standpoint, which it sounds like you have, and then (or at the same time) check the links. I suspect you have some bad links that you are going to have to try to remove or disavow.
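If you do end up with links you can't get removed, the disavow file is just a plain text upload. A minimal sketch of the format (the domains below are placeholders, not a suggestion of which links to disavow):

```
# Sketch only: one directive per line; domains and URLs are placeholders.
# Contacted site owner on 1/1/2014, no response.
domain:spammy-directory.example
# A single bad page rather than a whole domain:
http://low-quality-links.example/page-linking-to-you.html
```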
Best of luck!
Related Questions
-
Duplicate URLs on eCommerce site caused by parameters
Hi there, We have a client with a large eCommerce site with about 1,500 duplicate URLs caused by parameters in the URLs (such as the sort parameter, where the list of products is sorted by price, age, etc.).
Example: www.example.com/cars/toyota
First duplicate URL: www.example.com/cars/toyota?sort=price-ascending
Second duplicate URL: www.example.com/cars/toyota?sort=price-descending
Third duplicate URL: www.example.com/cars/toyota?sort=age-descending
Originally we advised adding a robots.txt file to block search engines from crawling the URLs with parameters, but this hasn't been done. My question: if we add the robots.txt rules now and exclude all URLs with filters, how long will it take for Google to disregard the duplicate URLs? We could ask the developers to add canonical tags to all the duplicates, but there are about 1,500 of them... Thanks in advance for any advice!
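A minimal sketch of the robots.txt rules being described, assuming the query-string parameter really is named sort (the pattern would need to cover every filter parameter in use); note that blocking crawling hides the duplicates from the crawler but does not consolidate signals the way canonical tags do:

```
# Sketch only: keep crawlers out of sorted/filtered duplicate URLs.
# Assumes the parameter is literally named "sort"; adjust to the real names.
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
```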
Intermediate & Advanced SEO | Gabriele_Layoutweb -
Setting up the right geo-targeting/language-targeting settings without breaking SEO
Hello, great Moz Community! Gev here from BetConstruct, a leading gaming and betting software provider. Our company website performs well in the SERPs. We have 20+ dedicated pages for our 20+ software products, an events section, and different landing pages for different purposes. We also run a blog section, a press section, and more. The website's default language is EN. Four months ago we launched the /ru and /es versions of the site. I set the correct hreflang tags, redirects, etc., and generated correct sitemaps, so the translated versions started to rank normally.
Now our marketing team is requesting various changes to the website, and I would love to discuss them with you before implementing. For example, they have created a landing page at betconstruct.com/usa-home and want me to serve that page as the default page (i.e., the homepage) whenever a user visits from a US-based IP. This can be done in two different ways:
1. I can set the /usa-home page as the default in my CMS when the visitor is from the US, so the address stays betconstruct.com (without /usa-home). In this case the same URL (betconstruct.com) serves different homepage content depending on location.
2. I can check the visitor's IP and, if they are from the US, redirect them to betconstruct.com/usa-home. In this case the user can still click the logo, go to betconstruct.com, and see the original homepage.
Both cases seem risky. In the first, I am not sure what Google will think when it sees a different homepage from different IPs. In the second, I am not sure which redirect to use (301, 302, 303, etc.), and Google may conclude I don't have a homepage because it redirects to a secondary page like /usa-home.
After digging a lot, I realised my team is asking for a strange setup: they want both language targeting (/es, /ru) and country targeting (which should ideally be something like /us), but instead of creating /us they want it to replace /en for the USA only. Please let me know the best way to implement this. Should we create a separate version of the website for the USA under /us/* URLs? In that case, is it okay to have /en as a language version and /us as a country target? Which hreflang values should we use? I know this is a rare case and may be difficult to follow, but any help will be much appreciated! Thank you! Best,
Gev
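A minimal sketch of one hreflang arrangement for the situation described above, assuming /usa-home remains the US-English page and the root stays the generic English default; all paths are illustrative assumptions, not a confirmed structure:

```html
<!-- Sketch only: hreflang annotations for language plus country targeting.
     Paths are illustrative, not BetConstruct's real structure. -->
<link rel="alternate" hreflang="en" href="https://www.betconstruct.com/" />
<link rel="alternate" hreflang="en-us" href="https://www.betconstruct.com/usa-home" />
<link rel="alternate" hreflang="es" href="https://www.betconstruct.com/es/" />
<link rel="alternate" hreflang="ru" href="https://www.betconstruct.com/ru/" />
<link rel="alternate" hreflang="x-default" href="https://www.betconstruct.com/" />
```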
Intermediate & Advanced SEO | betconstruct -
How much do URLs with CAPS and URLs without CAPS existing on the same IIS site matter nowadays?
I work on a couple of ecommerce sites that are on IIS. Both sites return a 200 header status for both the CAPS and non-CAPS versions of their URLs. I suppose it would be okay if the canonicals pointed to the same version of the page, but in some cases they don't (i.e., /Home-Office canonicalizes to itself and /home-office canonicalizes to itself). I came across this article (http://www.searchdiscovery.com/blog/case-sensitive-urls-and-seo-case-matters/), which is a few years old, and I'm wondering how much of an issue this is and how I would determine whether it is or isn't?
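One common way to take the question off the table on IIS is to 301 any mixed-case URL to its lowercase form. A minimal sketch of the standard URL Rewrite lowercase rule (it should be checked against existing rewrite rules and any case-sensitive paths before use):

```xml
<!-- Sketch only: permanently redirect any URL containing uppercase
     characters to its lowercase equivalent. -->
<rule name="Lowercase URLs" stopProcessing="true">
  <match url="[A-Z]" ignoreCase="false" />
  <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
</rule>
```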
Intermediate & Advanced SEO | OfficeFurn -
E-Commerce Panda Question
I'm torn. Many of our 'niche' ecommerce products rank well; however, I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algorithm. Here is an example that applies to quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current' - sort of like a shirt sold in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given how Panda works I believe this structure could be compromising our overall Panda 'quality score' and keeping our traffic from growing. Has anyone had similar issues and found it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
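If the near-duplicate pages are condensed, the retired URLs are typically 301-redirected to the consolidated page. A minimal sketch of one way to do that in IIS with a rewrite map; the old product paths below are hypothetical placeholders, not the site's real URLs:

```xml
<!-- Sketch only: map each retired product URL to its consolidated page
     and 301-redirect it. All paths here are hypothetical examples. -->
<rewrite>
  <rewriteMaps>
    <rewriteMap name="CondensedProducts">
      <add key="/buckblock-350ma-led-driver" value="/buckblock-constant-current-led-drivers" />
      <add key="/buckblock-700ma-led-driver" value="/buckblock-constant-current-led-drivers" />
      <add key="/buckblock-1000ma-led-driver" value="/buckblock-constant-current-led-drivers" />
    </rewriteMap>
  </rewriteMaps>
  <rules>
    <rule name="Redirect condensed products" stopProcessing="true">
      <match url=".*" />
      <conditions>
        <add input="{CondensedProducts:{REQUEST_URI}}" pattern="(.+)" />
      </conditions>
      <action type="Redirect" url="{C:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```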
Intermediate & Advanced SEO | saultienut -
SEO Overly-Dynamic URL Website with thousands of URLs
Hello, I have a new client who runs a Diablo 3 database. They have created a very interesting site in which every "build" is its own URL. Every page is a list of weapons and gear for the gamer. Readers may love this, but it's a nightmare for SEO. I have pushed for a blog to help generate inbound links and traffic, but overall I feel the main feature of their site is a headache to optimize. They have thousands of pages indexed in Google, but none is really its own page. There is no strong content, no H tags, nor any real substance at all. With so little to define each page, Google sees this as a huge mess, with duplicate page titles and too many on-page links. The first thing I did was tell them to add a canonical link, which seemed to cut the errors by about 12K, leaving only 2,400... which is a nice start, but the remaining errors are still a challenge. I'm thinking about seeing whether I can either give each page its own blurb and H tags, or simply noindex the nav bar and all the links into the database. That way the site is left with only a handful of URLs, plus the blog and forum. Thoughts?
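For reference, a minimal sketch of the canonical links already added and of the noindex option being considered; the URL is a placeholder, not the client's real path:

```html
<!-- Sketch only: a thin "build" variant either points at one representative
     page as canonical, or is excluded from the index. URL is hypothetical. -->
<head>
  <!-- option already in place: consolidate signals onto a representative page -->
  <link rel="canonical" href="https://www.example.com/builds/barbarian-whirlwind" />
  <!-- option under consideration: keep the thin page out of the index but let links pass -->
  <meta name="robots" content="noindex, follow" />
</head>
```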
Intermediate & Advanced SEO | MikePatch -
Changing Site URLs
I am working on a new client that hasn't implemented any SEO previously. The site has terrible URL nomenclature, and I am wondering whether it is worth trying to change it. Will I lose rankings? What is the best URL naming structure? Here's the website: http://www.formica.com/en/home/TradeLanding.aspx. (I am only working on the North America site.) Thanks!
Intermediate & Advanced SEO | AlightAnalytics -
Is it OK to have a site that has some URLs with hyphens and other, older, legacy URLs that use underscores?
I'm working with a VERY large site that has recently been redesigned/recategorized. They kept only about 20% of the URLs from the legacy site (the URLs that had revenue tied to them), and these URLs use underscores, whereas the new URLs created for the site use hyphens. I don't think this would be an issue for Google, as long as the pages are of quality, but I wanted to get everyone's opinion. Will it hurt me to have two different sets of URLs, some using hyphens and others using underscores?
Intermediate & Advanced SEO | Business.com -
Rewriting dynamic URLs to static
We're currently working on an SEO project for http://www.gear-zone.co.uk/. After a crawl of their site, tons of duplicate content issues came up. We think this is largely down to their brand filtering system, which works like this: by clicking on a brand, the site generates a URL with the brand keywords in it. For example:
http://www.gear-zone.co.uk/3-season-synthetic-cid77.html
filtered by the brand Mammut becomes:
http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html?filter_brand=48
This was done by a previous SEO agency in order to prevent duplicate content. We suspect it has made the issue worse, though, because if you remove the dynamic string from the end of the URL, the same content is displayed as on the unfiltered page. For example,
http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html
shows the same content as:
http://www.gear-zone.co.uk/3-season-synthetic-cid77.html
Now, if we're right in thinking that Google is unlikely to crawl the dynamic filter, this would seem to be the root of the duplicate issue. If this is the case, would rewriting the dynamic URLs to static on the server side be the best fix? It's a Windows Server/ASP site. I hope that's clear! It's a pretty tricky issue and it would be good to know your thoughts. Thanks!
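A minimal sketch of what "rewriting the dynamic URLs to static on the server side" could look like in IIS terms: the friendly brand URL is rewritten internally to its dynamic equivalent, so the filtered content no longer depends on the ?filter_brand= query string being present. The brand-to-ID map below is a one-entry illustration and would need to be generated from the real data:

```xml
<!-- Sketch only: serve the brand-filtered content from the friendly URL by
     rewriting it internally to the dynamic version. The brand-to-ID mapping
     is hypothetical and would come from a rewrite map or the application. -->
<rewrite>
  <rewriteMaps>
    <rewriteMap name="BrandIds">
      <add key="Mammut" value="48" />
    </rewriteMap>
  </rewriteMaps>
  <rules>
    <rule name="Brand filter pages" stopProcessing="true">
      <match url="^3-season-synthetic-([A-Za-z]+)-cid77\.html$" />
      <conditions>
        <add input="{BrandIds:{R:1}}" pattern="(.+)" />
      </conditions>
      <action type="Rewrite" url="/3-season-synthetic-cid77.html?filter_brand={C:1}" />
    </rule>
  </rules>
</rewrite>
```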
Intermediate & Advanced SEO | neooptic