Changing a city of operation: what's the best way to inform Google?
-
We have a business operating out of three cities, A, B and C, with A as the primary address; the business provides its services in B and C as well. The business has decided to shut shop in C and instead add D as another city. Currently the URLs look like:
www.domainname.com/A/products
www.domainname.com/B/products
www.domainname.com/C/products
Please help us understand the best way to inform Google that city C is no longer operational. Do we need to set up redirects, and if so, should they point to the home page? Or can we just remove the C city URLs via Webmaster Tools and inform Google that way?
-
Hi Sukhbir,
Currently, the best way I know of to report that a location has closed is to go to this page:
https://support.google.com/places/
Click the red 'Contact Us' button.
Go through the wizard, choosing the 'my listing has incorrect information' and then the 'this business no longer exists' options.
Link to the URL of the Google+ Local page for the closed location, and in the additional notes section, explain that this branch of the business has closed, though the others remain open.
My understanding is that this does not completely delete the location from Google's system - there is currently no way to do so - but it will prevent it from appearing for your service-related terms. It may still appear for people searching specifically for that location, but with a label stating that the business is closed. Not a perfect solution, but the best I know of that Google currently offers.
Beyond this, I would recommend that you manually remove as many third-party citations of the business as you can from directories such as YP.com, Yelp, CitySearch, etc., so that you get rid of as much data as possible supporting the existence of the closed location.
Your website should be edited to remove absolutely all references to the closed location. I'm not sure about redirecting the pages; my main goal would simply be to get rid of any references to the closed location.
Hope this helps!
-
Hi Sukhbir,
What you could do is 301-redirect the C city URLs to the closest remaining location, so at least your users will know that location C is closed. Once Google has processed the redirects, you could also add the location C URLs to your robots.txt so Google stops crawling them - just don't block them right away, since Google has to be able to crawl a URL to discover its redirect.
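As a minimal sketch of the redirect idea above (the domain paths follow the /A/, /B/, /C/ pattern from the question; the choice of B as the "closest" remaining city is an assumption for illustration):

```python
# Sketch: send every URL under the closed city's prefix to its
# counterpart under the nearest remaining city, with a 301 status.
# Prefixes are hypothetical; adjust to your own URL structure.

OLD_PREFIX = "/C/"  # closed location
NEW_PREFIX = "/B/"  # nearest remaining location (assumed)

def redirect_for(path):
    """Return (status, location) for a requested path, or None if no
    redirect applies."""
    if path.startswith(OLD_PREFIX):
        return 301, NEW_PREFIX + path[len(OLD_PREFIX):]
    return None

print(redirect_for("/C/products"))  # (301, '/B/products')
print(redirect_for("/A/products"))  # None
```

In practice you would express the same mapping in your server config (e.g. a rewrite rule), but the logic is the same: a permanent redirect from each closed-location URL to its live equivalent, not to the home page.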
Related Questions
-
Bigcommerce only allows us to have https for our store, not the other pages on our site, so we have a mix of https and http. How is this hurting us, and what's the best way to fix it?
We aren't interested in paying a thousand dollars a month just to have https when we feel it's the only selling point of that package, so we have https for our store while the rest of the site, blog and all, is http. I'm wondering if this counts as duplicate content or incurs some other unforeseen penalty due to the halfway implementation of https. If this is hurting us, what would you recommend as a solution?
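One common way to keep a mixed http/https site from looking like duplicate content is a canonical tag that always points at the preferred scheme for each section. A minimal sketch of that idea (the host, paths, and the rule that only `/store/` is https are hypothetical, mirroring the setup described in the question):

```python
# Sketch: pick one preferred scheme per section of the site and emit a
# canonical tag for it, so only one scheme's copy of each page is
# presented to Google for indexing. Section prefixes are assumptions.

HTTPS_PREFIXES = ("/store/",)  # sections served over https in this setup

def canonical_url(host, path):
    """Build the canonical URL for a path using its preferred scheme."""
    scheme = "https" if path.startswith(HTTPS_PREFIXES) else "http"
    return f"{scheme}://{host}{path}"

def canonical_tag(host, path):
    """Return the <link rel="canonical"> tag to place in the page head."""
    return f'<link rel="canonical" href="{canonical_url(host, path)}">'

print(canonical_url("example.com", "/store/widget"))  # https://example.com/store/widget
print(canonical_url("example.com", "/blog/post"))     # http://example.com/blog/post
```

This doesn't answer whether the split itself is a penalty risk, but it does remove the ambiguity of the same page resolving on both schemes.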
Technical SEO | Deacyde
When will all of Google Maps be the same again?
As many of you are aware, the Pigeon update was applied only to the new Google Maps, resulting in very different search results for Google local businesses. When you search for a business on the old Google Maps you get totally different results than on the new Google Maps, and some businesses have disappeared from the results completely. I have done my research and found out that it's because the new algo was applied only to the new Maps; it also doesn't apply to other countries. The reason I posted this topic is that I have noticed all the new Google Business listings I am verifying for my clients are being put under the old Google Maps and not the new ones. They come up fine when searching from the old Maps but not the new ones. I understand Google has not rolled out Pigeon on all data centers, but why? Will Google eventually roll out the update to the old Maps? And since Google is adding businesses to the old Google Maps, what's the point of even adding new listings?
Technical SEO | bajaseo
Should I change the URL now?
Hi all, I have a client website that got hit in the latest algorithm update. It appears that it had over 100 suspect links pointing to it. I performed the disavow procedure a few weeks ago via my Google Webmaster account, but have not yet received a message to say it's been actioned. The majority of these suspect links go to one page. I am considering changing the base category (in WordPress) to a different keyphrase and then submitting a new sitemap for indexing. This way there will be no actual link from a suspect website to a page on my website. Do you see what I mean? Will this help, do you think? Thanks in advance.
Technical SEO | BrandC
Best Way to Break Down Paginated Content?
(Sorry for my English.) I have lots of user reviews on my website and in some cases there are more than a thousand reviews for a single product/service. I am looking for the best way to break these reviews down into several sub-pages. Here are the options I thought of:
1. Break reviews down into multiple pages/URLs:
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc.
In this case, each page would be indexed by search engines.
Pros: all the reviews get indexed.
Cons: it will be harder to rank for "blue widget review" as there will be many similar pages.
2. Break reviews down into multiple pages/URLs with noindex + canonical tag:
http://www.mysite.com/blue-widget-review-page1
http://www.mysite.com/blue-widget-review-page2
etc.
In this case, each page would be set to noindex and the canonical tag would point to the first review page.
Pros: only one URL can potentially rank for "blue widget review".
Cons: sub-pages are not indexed.
3. Load all the reviews into one page and handle pagination using JavaScript.
Each page of reviews would be loaded into a different div, which would be shown or hidden using JavaScript when browsing through the pages. Could that be considered cloaking?!?
Pros: all the reviews get indexed.
Cons: large page size (KB), maybe too large for search engines?
4. Load only the first page and load sub-pages dynamically using AJAX.
Display only the first review page on initial load, then use AJAX to load additional reviews into the div. It would be similar to some blog commenting systems where you have to click on "Load more comments" to see all the comments.
Pros: fast initial loading time + faster loading of sub-pages = better user experience.
Cons: only the first review page is indexed by search engines.
My main competitor, who's achieving great rankings (no black hat of course), is using technique #3. What's your opinion?
Technical SEO | sbrault74
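As an illustration of option 2 from the question (noindex on sub-pages plus a canonical pointing at page 1), here is a minimal sketch using the hypothetical blue-widget URLs the question mentions:

```python
# Sketch of option 2: every review page after the first gets a noindex
# robots tag, and all pages carry a canonical pointing at page 1.
# The base URL is the hypothetical one from the question.

BASE = "http://www.mysite.com/blue-widget-review-page"

def head_tags(page):
    """Return the meta/link tags for a given review page number."""
    tags = []
    if page > 1:
        tags.append('<meta name="robots" content="noindex">')
    tags.append(f'<link rel="canonical" href="{BASE}1">')
    return tags

print(head_tags(1))  # canonical only
print(head_tags(2))  # noindex + canonical to page 1
```

Note the trade-off the question already identifies: with this setup only page 1 can rank, and the reviews on sub-pages are kept out of the index by design.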
Best way to manage SEO for a massive events listing website.
I run a website that tracks entertainment for the entire state of South Dakota. While I've made some fantastic strides in gaining traffic, I feel lost on how to manage all those entries in an SEO-friendly manner. I have a TON of errors showing in my crawl diagnostics and I just don't know what to do. The nature of the website is such that there are going to be duplications all over the place. I know that I can help some of this by getting my canonical links set up properly (that's coming in the next version of the site's theme), but what else should I do to make those event listings friendly for the SEs? http://www.entertainsd.com
Technical SEO | jcherland
Google Indexed URLs for Terms Have Changed Causing Huge SERP Drop
We haven't made any significant changes to our website, yet the pages Google has indexed for our critical keywords have changed, causing our rankings to drop dramatically for those terms. In some cases the changes make no sense at all. For example, one of our terms that used to be indexed to our homepage is now indexed to a dead category page that has nothing on it. For one of our biggest terms, where we were 9th, the indexed page changed to our FAQ, and as a result we now rank 44th. This is having a MAJOR impact on our business, so any help on why this sudden change happened and what we can do to combat it is greatly appreciated.
Technical SEO | EvergladesDirect
Google penalty
Does anyone have any success stories about what they did to get out of a Google penalty?
Technical SEO | phatride
Site Change of Address - best method?
When changing domains, there's the obvious anxiety about sacrificing the value of your old domain. A client recently changed domains, immediately killed the old site (did everything properly with 301s, Webmaster Tools, etc.) and lost rankings completely for weeks. It turns out the site had been 'burnt' by the previous owner, and it took a reconsideration request to Google before things recovered. It cost them rankings and cash in extra PPC spend. My question is: in order to avoid this potential hazard, what are your thoughts on submitting a change of address in Webmaster Tools but then leaving the old site live for a few weeks to see how things pan out? I have never tried it and it seems to go against the grain, but I'm interested to hear other people's experiences and how they have managed to change domains with minimal temporary damage. Thanks.
Technical SEO | RiceMedia