301 Re-Directs Puzzling Question on Page Returned in Search Results
-
On our website, www.BusinessBroker.net, we have 3 different versions of essentially the same page for each of our state Business for Sale pages. Back in August, we ran a test and set up 301 redirects for 5 states. For a long while after the redirects, the pages fell out of Google's search results - we used to get page 1 rankings. Just recently they started popping back up on page 1. However, I noticed that the new page meta data is not what is being picked up -- here is an example.
Keyword Searched for in Google -- "Maine Business for Sale"
Our listing shows up on Page 1 -- # 8 Result
URL returned is the correct preferred version - http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx
However, the Page Title on this returned page is still the OLD page title -
OLD TITLE -- maine Business for Sale Ads - maine Businesses for Sale & Business Brokers - Sell a Business on Business Broker
Not the title that is designated for this page -
New Title - Maine Businesses for Sale - Buy or Sell a Business in ME | BusinessBroker.net
Ditto for Meta Description.
Why is this happening?
We also have a problem with lower case showing up in the URLs rather than upper case -- what's causing this?
http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx
versus -- http://www.businessbroker.net/State/Maine-Businesses_For_Sale.aspx
Any help would be appreciated.
Thanks, MM
-
thanks - we did some more research on our end and our developer found this --
The problem with the title, description and keywords is that we updated these for just Wyoming, West Virginia, Vermont, Maine and Florida. I made the mistake of assuming the URL would always have the proper case of the state name, but as we have discovered, that was a bad assumption. The code was looking for those 5 states with the first letter capitalized, and the link from Google was not capitalized, so it defaulted to the format for the other states that we haven't changed yet. I have fixed that code so those 5 states will now display the correct title, description and keywords regardless of the case of the state in the URL. I will update the live site in the morning, so this issue will be taken care of. We will still need to discuss how best to handle the URLs that Google is getting with the incorrect case.
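The fix the developer describes can be sketched as a case-insensitive lookup: store the updated metadata under a normalized (lower-cased) key and normalize the state name parsed from the URL before looking it up. This is an illustrative sketch, not the site's actual code (which is ASP.NET); the dictionary, titles, and function name are assumptions.

```python
# Hypothetical sketch: metadata keyed by lower-cased state name, so
# "Maine", "maine", and "MAINE" in the URL all resolve to the same entry.
UPDATED_META = {
    "maine": {
        "title": "Maine Businesses for Sale - Buy or Sell a Business in ME | BusinessBroker.net",
        "description": "...",  # placeholder; real description omitted
    },
    # Wyoming, West Virginia, Vermont, and Florida entries would follow.
}

def get_meta(state_from_url):
    """Return the updated metadata for a state, regardless of URL casing."""
    return UPDATED_META.get(state_from_url.lower())

print(get_meta("Maine") == get_meta("maine"))  # True
```

With the original (case-sensitive) lookup, a lower-cased URL missed the dictionary and fell through to the old default template, which is exactly the symptom reported above.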
-
I can't say for sure what happened last time since I am not exactly sure what you did. But as long as the 301 redirects are set up correctly and Google is not having any trouble accessing and crawling them, then you shouldn't experience any major negative results over the long term.
Now that I've read your initial post again, I see that the Maine page is one of the states you tried to redirect as part of your test. However, as I posted above, the old page is not being 301 redirected to the new page, so Google may have dropped your site in the rankings because you essentially had two very similar pages competing against each other for the same terms.
-
The page that is ranking #8 in Google for me is http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx, and on that page, it has the old Title tag and it is not redirected to the version of the URL with the new Title tag.
When I visit http://www.businessbroker.net/State/Maine-Businesses_For_Sale.aspx, I am seeing the new Title tag.
Since these are two completely different pages you will need to 301 redirect the URL with the old Title tag to the new one. That should solve your problems.
-
follow up question regarding the upper and lower case question from our web developer ---
The question hasn't been how to do it. The question is what happens to all of the pages that Google has indexed with the improper case when we do this? Are we going to see the same thing we saw when we redirected the states: a big drop for six months?
Keith
-
Thanks for the response, appreciate it. I'm pretty confident we re-directed the non-preferred URLs to this preferred page --
http://www.businessbroker.net/State/Maine-Businesses_For_Sale.aspx
This page has the updated Title tag, Meta Description, etc.; however, it is not the one that shows up in the Google search result for "Maine Business for Sale".
-
I visited the page http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx and the Title tag in the HTML is "maine Business for Sale Ads - maine Businesses for Sale & Business Brokers - Sell a Business on Business Broker" so perhaps you did not publish the new versions of the Title tags?
As for your lower case/upper case issue, I went to both URLs and they both resolve to an active page. I would suggest making the URLs consistent to minimize the risk of duplicate content. First, set the designated URL in the rel="canonical" tag on each page. Then, depending on the type of server, I would suggest forcing a 301 redirect to a single version of each URL. Here is a good blog post on how to address this specific issue - http://www.seomoz.org/blog/common-technical-seo-problems-and-how-to-solve-them
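Since the site runs on ASP.NET/IIS, one way to force a single version of the URL is a rule for the IIS URL Rewrite module in web.config. This is a hedged sketch, not the site's actual configuration: the rule name and pattern are assumptions, it covers only the Maine page as an example (a real setup would use a rewrite map or one rule per state), and it assumes the URL Rewrite module is installed.

```xml
<!-- web.config sketch: 301 any casing variant of the Maine page
     to the canonical URL, without redirecting the canonical URL
     to itself. Requires the IIS URL Rewrite module. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Canonical Maine URL" stopProcessing="true">
        <!-- Match the path in any case... -->
        <match url="^state/maine-Businesses_For_Sale\.aspx$" ignoreCase="true" />
        <conditions>
          <!-- ...but skip requests already using the exact canonical case. -->
          <add input="{URL}" pattern="^/State/Maine-Businesses_For_Sale\.aspx$"
               ignoreCase="false" negate="true" />
        </conditions>
        <action type="Redirect" url="State/Maine-Businesses_For_Sale.aspx"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Combined with a rel="canonical" tag pointing at the same capitalized URL, this consolidates the lower-case version Google currently has indexed onto the preferred one.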