301 Re-Directs Puzzling Question on Page Returned in Search Results
-
On our website, www.BusinessBroker.net, we have 3 different versions of essentially the same page for each of our State Business for Sale pages. Back in August, we ran a test and set up 301 redirects for 5 states. For a long while after the redirects, the pages fell out of Google search results - we used to get Page 1 rankings. Just recently they started popping back up on Page 1. However, I noticed that the new page metadata is not what is being picked up -- here is an example.
Keyword Searched for in Google -- "Maine Business for Sale"
Our listing shows up on Page 1 -- # 8 Result
URL returned is correct preferred version: - http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx
However, the Page Title on this returned page is still the OLD page title -
OLD TITLE -- maine Business for Sale Ads - maine Businesses for Sale & Business Brokers - Sell a Business on Business Broker
Not the title that is designated for this page -
New Title - Maine Businesses for Sale - Buy or Sell a Business in ME | BusinessBroker.net
Ditto for Meta Description.
Why is this happening?
We also have a problem with lower case showing up rather than upper case -- what's causing this?
http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx
versus -- http://www.businessbroker.net/State/Maine-Businesses_For_Sale.aspx
Any help would be appreciated.
Thanks, MM
-
Thanks - we did some more research on our end and our developer found this --
The problem with the title, description and keywords is that we updated these for just Wyoming, West Virginia, Vermont, Maine and Florida. I made the mistake of assuming the URL would always have the proper case of the state name, but as we have discovered, that was a bad assumption. The code was looking for those 5 states with the first letter capitalized, and the link from Google was not capitalized, so it defaulted to the format for the other states that we haven't changed yet. I have fixed that code, so those 5 states will now display the correct title, description and keywords regardless of the case of the state in the URL. I will update the live site in the morning, so this issue will be taken care of. We will still need to discuss how best to handle the URLs that Google is getting with the incorrect case.
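The fixed code isn't shown, but the logic the developer describes -- a case-sensitive lookup that falls through to the old default template, fixed by normalizing the state name before the lookup -- can be sketched like this (Python for illustration only; the real site is ASP.NET, and the table and titles here are examples, not the site's actual data):

```python
# Sketch of the metadata bug and fix (illustrative only; the real code is
# ASP.NET, and NEW_TITLES below holds example data, not the site's full set).
NEW_TITLES = {
    "Maine": "Maine Businesses for Sale - Buy or Sell a Business in ME | BusinessBroker.net",
}

OLD_TITLE_TEMPLATE = ("{state} Business for Sale Ads - {state} Businesses for Sale"
                      " & Business Brokers - Sell a Business on Business Broker")

def title_buggy(state_from_url):
    # Case-sensitive lookup: "maine" != "Maine", so the lowercase URL
    # that Google indexed falls through to the old default template.
    return NEW_TITLES.get(state_from_url,
                          OLD_TITLE_TEMPLATE.format(state=state_from_url))

def title_fixed(state_from_url):
    # Normalize casing before the lookup, so URL case no longer matters.
    key = state_from_url.title()
    return NEW_TITLES.get(key, OLD_TITLE_TEMPLATE.format(state=key))
```

With this fix, `title_fixed("maine")` and `title_fixed("Maine")` both return the new title, which matches the behavior the developer describes.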
-
I can't say for sure what happened last time since I am not exactly sure what you did. But as long as the 301 redirects are set up correctly and Google is not having any trouble accessing and crawling them, then you shouldn't experience any major negative results over the long term.
Now that I've read your initial post again, I see that the Maine page is one of the States you tried to redirect as part of your test. However, as I posted above, the old page is not being 301 redirected to the new page, so Google may have dropped your site in the rankings since you essentially had two very similar pages competing against each other for the same terms.
-
The page that is ranking #8 in Google for me is http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx, and on that page, it has the old Title tag and it is not redirected to the version of the URL with the new Title tag.
When I visit http://www.businessbroker.net/State/Maine-Businesses_For_Sale.aspx, I am seeing the new Title tag.
Since these are two completely different pages you will need to 301 redirect the URL with the old Title tag to the new one. That should solve your problems.
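Since the site runs ASP.NET (.aspx), one way to implement that redirect - assuming IIS 7+ with the URL Rewrite module installed, which this thread doesn't confirm - is a web.config rule like the sketch below. The rule name is arbitrary, and `ignoreCase="false"` keeps the rule from also matching (and looping on) the preferred mixed-case URL:

```xml
<!-- Sketch only: 301 the indexed lowercase variant to the preferred URL. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="maine-old-to-new" stopProcessing="true">
        <match url="^state/maine-Businesses_For_Sale\.aspx$" ignoreCase="false" />
        <action type="Redirect" url="State/Maine-Businesses_For_Sale.aspx"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

You would need one rule (or a rewrite map) per page whose casing differs from the preferred version.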
-
follow up question regarding the upper and lower case question from our web developer ---
The question hasn't been how to do it. The question is: what happens to all of the pages that Google has indexed with the improper case when we do this? Are we going to see the same thing as when we redirected the states -- a big drop for 6 months?
Keith
-
Thanks for the response, appreciate it. I'm pretty confident we re-directed the non-preferred URLs to this preferred page --
http://www.businessbroker.net/State/Maine-Businesses_For_Sale.aspx
This page has the updated Title tag, Meta Description, etc.; however, it is not the one that shows up in the Google search result for "Maine Business for Sale".
-
I visited the page http://www.businessbroker.net/state/maine-Businesses_For_Sale.aspx and the Title tag in the HTML is "maine Business for Sale Ads - maine Businesses for Sale & Business Brokers - Sell a Business on Business Broker" so perhaps you did not publish the new versions of the Title tags?
As for your lower case/upper case issue, I went to both URLs and they both resolve to an active page. I would suggest making the URLs consistent to minimize the risk of duplicate content. First, I would set the designated URL in the rel="canonical" tag for each page. And depending on the type of server, I would suggest forcing the URLs to 301 redirect to a single version of the URL. Here is a good blog post on how to address this specific issue - http://www.seomoz.org/blog/common-technical-seo-problems-and-how-to-solve-them
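The "single version of the URL" idea can be sketched in Python (illustrative only -- the lookup table and helper are hypothetical, not the site's actual code): look up the preferred casing for the requested path and 301 anything that differs.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical table mapping each lowercased path to its one preferred casing.
CANONICAL_PATHS = {
    "/state/maine-businesses_for_sale.aspx": "/State/Maine-Businesses_For_Sale.aspx",
}

def canonicalize(url):
    """Return (status, url): 301 plus the preferred URL when the casing
    differs, otherwise 200 and the URL unchanged."""
    parts = urlsplit(url)
    preferred = CANONICAL_PATHS.get(parts.path.lower())
    if preferred and preferred != parts.path:
        return 301, urlunsplit(parts._replace(path=preferred))
    return 200, url
```

With redirects like this in place, plus a matching rel="canonical" tag on each page, Google can consolidate its signals on the one preferred URL instead of splitting them across case variants.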