Changing the cities of operation: what is the best way to inform Google?
-
We have a business operating out of three cities, A, B and C, with A as the primary address; the business provides its services in B and C as well. The business has decided to shut up shop in C and instead add D as another city. Currently the URLs look like:
www.domainname.com/A/products
www.domainname.com/B/products
www.domainname.com/C/products
Please help us understand the best way to inform Google that city C is no longer operational. Do we need to set up redirects, and if so, should we redirect to the home page? Or can we just remove the city C URLs via Webmaster Tools and inform Google that way?
-
Hi Sukhbir,
Currently, the best way I know of to report that a location has closed is to go to this page:
https://support.google.com/places/
Click the red 'Contact Us' button.
Go through the wizard, choosing the 'my listing has incorrect information' and then the 'this business no longer exists' options.
Link to the URL of the Google+ Local page for the closed location, and in the additional notes section, explain that this branch of the business has closed, though the others remain open.
My understanding is that this does not completely delete the location from Google's system - there is currently no way to do so - but it will prevent it from appearing for your service-related terms. It may still appear for people searching specifically for that location, but will carry a label stating that the business is closed. Not a perfect solution, but the best I know of that Google currently offers.
Beyond this, I would recommend that you manually remove as many third-party citations of the business as you can from directories such as YP.com, Yelp, CitySearch, etc., so that you get rid of as much data as possible supporting the existence of that location.
Your website should be edited to remove absolutely all references to the closed location. I'm not sure about redirecting the pages; my main goal would simply be to get rid of any references to the closed location.
Hope this helps!
-
Hi Sukhbir,
What you could do is 301-redirect the city C URLs to the closest remaining location, so at least your users will know that location C is closed. You could then also add the city C URLs to your robots.txt to make sure Google won't crawl them anymore and, hopefully, will stop indexing them.
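As a rough sketch of the approach above - assuming an Apache server and that the city paths are literal /C/ and /B/ folders (adjust both to your actual structure) - the redirect could live in .htaccess and the crawl block in robots.txt:

```apache
# .htaccess - 301-redirect every closed-city (C) URL to the nearest open location (B)
RewriteEngine On
RewriteRule ^C/(.*)$ /B/$1 [R=301,L]
```

```
# robots.txt - stop further crawling of the closed city's folder
User-agent: *
Disallow: /C/
```

One caveat worth noting: a robots.txt Disallow prevents crawling but does not by itself remove URLs that are already indexed; the 301 redirects (or a URL removal request in Webmaster Tools) do that part of the work.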
Related Questions
-
Google not using redirect
We have a GEO-IP redirect in place for our domain, so that users are pointed to the subfolder relevant to their region, e.g. visit example.com from the UK and you will be redirected to example.com/uk. This works fine when you manually type the domain into your browser; however, if you search for the site and come to example.com, you end up at example.com. I didn't think this was too much of an issue, but our subfolders /uk and /au are not getting ranked at all in Google, even for branded keywords. I'm wondering if the fact that Google isn't picking up the redirect means that the pages aren't being indexed properly? Conversely, our US region (example.com/us) is being ranked well. Has anyone encountered a similar issue?
Technical SEO | ahyde
-
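One pattern often discussed for this kind of setup (illustrative markup, not taken from the question) is to add hreflang annotations with an x-default, so Google can map each regional subfolder itself rather than relying on the geo-redirect, which Googlebot - crawling mostly from US IP addresses - may never trigger:

```html
<!-- hreflang annotations in the <head> of each regional page (illustrative URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
<link rel="alternate" hreflang="en-au" href="https://example.com/au/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

The same set of tags would go on every regional version of the page, each pointing at all the alternates plus itself.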
What is the best way to use the canonical tag?
Hi, I have been researching this since yesterday and have looked at this subject many times before, but still cannot get my head around it. I ran a report on my site which was very useful - I used http://www.juxseo.com for my site www.in2town.co.uk - and it brought me some useful information, part of which was that I should have a canonical tag on my home page, which would improve my SEO. Now, I am using sh404sef for my friendly URLs and I am using Joomla 3.0, and when I approached the makers of sh404sef to ask about the tag, they said I would need to be careful using it as it could damage my site and my rankings. I have read lots of information but still do not have a clear understanding of it. Can anyone please explain the best way to use this, and should I be using it where I may have some sort of duplicate page? Any help in understanding this would be great.
Technical SEO | ClaireH-184886
-
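For reference, a canonical tag is a single line in the page's <head>; a self-referencing one for the home page - assuming, illustratively, that http://www.in2town.co.uk/ is the preferred version of that URL - looks like this:

```html
<!-- self-referencing canonical in the <head> of the home page -->
<link rel="canonical" href="http://www.in2town.co.uk/" />
```

It tells Google which URL to treat as the master copy when several URLs (e.g. with and without trailing slash, or with tracking parameters) serve the same content.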
Why are my changes not being updated?
Last week, I was notified of having a lot of duplicate page titles. I've recently made the changes on my website with unique content. I went back to Moz this morning, and I'm still notified of the same problems. However, when I check the back end of that specific page, I see the changes have already been made. My question is, why are my changes not being updated in Moz? Does it take a while for Moz to recognize this, or am I missing a step?
Technical SEO | ckroaster
-
Google Places Page Changes
We had a client (a dentist) hire another marketing firm (without our knowledge), and due to some Google page changes that firm made, the client's website lost a #1 ranking, was disassociated from its Places page, and was placed at result #10, below all the local results. We quickly made some changes and were able to bring them up to #2 within a few days, and restored their Google page after about a week. However, the tracking/forwarding phone number the marketing company was using still shows up on the page, despite attempts to contact Google by updating the business in Places management and submitting the phone number as incorrect while providing the correct one. And because the client fired that marketing company, the phone number will no longer be active in a few days - which, of course, is very important for a dental office. Has anyone else had problems with the speed of updating Google Places/Plus pages for businesses? What's the most efficient way to make changes like this?
Technical SEO | tvinson
-
How to optimize for different Google search centers (google.de, google.ch)?
We all use the German language and .com domains for the sites. I ranked well in google.com, but not so well in google.de or google.ch; my competitors rank much better in google.de and google.ch. I checked most of their outbound links but got little information. Do links from .de domains, or links from sites located in Germany, help rankings in a specific Google search center (google.de, google.ch)? Or are there other factors I missed? Please help.
Technical SEO | sunvary
-
Best way to get SEO-friendly URLs on a huge old website
Hi folks, hope someone may be able to help with this conundrum: a client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888, and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/ I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl. To implement all URLs would be 10x the volume of config. I am starting to wonder if there is a better way:
1. Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included?
2. Rebuild the site entirely (preferably on PHP with a decent URL structure)
3. Accept that the URLs can't be made friendly on a site this size and focus on other aspects
4. Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live
None of the options are great, as they either involve lots of work/cost or they involve keeping a site which performs well but could do so much better, with poor URLs. Any thoughts from the great minds in the SEOmoz community appreciated! Cheers, Simon
Technical SEO | SCL-SEO
-
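For context, IIRF rules follow an Apache mod_rewrite-like syntax, and a single generic pattern can cover all 300,000 URLs instead of one rule each, which keeps the config tiny. A sketch - the /products/... clean URL scheme here is hypothetical, and it assumes IIRF's behavior of matching against the full URL including the query string:

```apache
# IIRF config - generic rules instead of thousands of per-URL entries
RewriteEngine ON

# 301-redirect the old dynamic URL to a clean equivalent, e.g.
# /search/results.aspx?ida=19191&idb=56&idc=2888 -> /products/19191/56/2888
RedirectRule ^/search/results\.aspx\?ida=(\d+)&idb=(\d+)&idc=(\d+)$ /products/$1/$2/$3 [R=301]

# Internally rewrite the clean URL back to the dynamic handler
RewriteRule ^/products/(\d+)/(\d+)/(\d+)$ /search/results.aspx?ida=$1&idb=$2&idc=$3 [L]
```

Two pattern-based rules like these should load into memory trivially, which may sidestep the speed problem caused by enumerating every URL.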
Problems with Google cache
Hi, can you please advise whether the following website is corrupted in the eyes of Google? It has been written in Umbraco; I have taken over it from another developer and am confused as to why it is behaving the way it is. cache:www.tangoholidaysolutions.com When I run this, all I see is the header, the start of the main content, and then the footer. If I view the text-only version, all the content is visible. The second issue I have with this site is as follows. Main page: http://www.tangoholidaysolutions.com/holiday-lettings-spain/ This page is made up of widgets, i.e. locations, featured villas, content. However, the widgets are web pages in their own right: http://www.tangoholidaysolutions.com/holiday-lettings-spain/location-picker/ My concern is that these part-pages will affect the SEO performance of the site. In an ideal world I would have the CMS set up so these widgets are not classed as pages, but I am working on this. Thanks, Andy
Technical SEO | iprosoftware
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
1. Add the meta noindex,follow tag to each URL you want de-indexed
2. Use GWT to help remove the pages
3. Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW */beerbottles/
Or add this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question." Thanks! Dave
Technical SEO | goodnewscowboy
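For reference on the syntax point (a sketch of the conventional form, not a verdict on the workflow in the question): robots.txt directives are written with a colon after the field name, and Googlebot additionally supports the * wildcard in paths, so to block /beerbottles/ at any depth the rule could look like:

```
# robots.txt - block any path containing /beerbottles/ (the * wildcard is a
# Googlebot extension, not part of the original robots.txt standard)
User-agent: *
Disallow: */beerbottles/
```

And the meta tag for the de-indexing step, placed in each page's <head> - and left crawlable until de-indexing completes, since a robots.txt block would stop Google from ever seeing the tag:

```html
<meta name="robots" content="noindex,follow">
```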