Remove URLs from App
-
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide, each with its own dedicated URL schema. Ex/ https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview
Mistake - The app, in its entirety, was indexed by Google in July 2018, which essentially resulted in duplicate-content penalties because the unique on-page content wasn't readable to crawlers.
Partial Solution - We noindexed all app pages until we were able to implement a "pre-render" / crawler-readable HTML solution with dynamic meta data for the Overview page in each market. We are now selectively reindexing only the free "Overview" pages that have unique data (with a nofollow on links to all other pages), but we want to keep a noindex on all other pages because their data is not uniquely "readable" before subscribing. The server-side rules that enforce this selective indexing are in place and working; a simplified sketch is below.
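For reference, a minimal sketch of how that dynamic meta robots handling works in our pre-rendered pages (the component and prop names here are illustrative, not our actual code):

```
// Hypothetical sketch of per-page robots meta handling in a pre-rendered React app.
// "MarketHead" and "isFreeOverview" are illustrative names, not the real implementation.
import React from "react";
import { Helmet } from "react-helmet";

interface MarketHeadProps {
  marketName: string;      // e.g. "Santa Monica"
  isFreeOverview: boolean; // only free Overview pages carry unique, crawler-readable data
}

export function MarketHead({ marketName, isFreeOverview }: MarketHeadProps) {
  return (
    <Helmet>
      <title>{`${marketName} Vacation Rental Data`}</title>
      {/* Free Overview pages are indexable; gated pages stay noindexed until
          their data is readable without a subscription. */}
      <meta
        name="robots"
        content={isFreeOverview ? "index,follow" : "noindex,nofollow"}
      />
    </Helmet>
  );
}
```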
Question - How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex/ https://screencast.com/t/xPLR78IbOEao, which leads to a live URL such as https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue that has limited value to the user. (Note Google's cached SERPs also show an old URL structure, which we have since 301ed, because we also updated the page structure in October.) Those pages are currently noindexed and will remain so for the foreseeable future. Our sitemap and robots.txt file are up to date, but the old Search Console only offers temporary removals on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on the links to them from the Overview page so a bot can reach them, see the noindex, and drop them from the SERPs?
Thanks for your help and advice!
-
So, you basically can't 'force' Google to do anything, but there may be better ways to encourage them to remove these URLs.
The only way to force Google to remove a URL is to use the URL removal tool in Google Search Console, but this only removes a page temporarily and it's a pain to do en masse submissions. As such, not my recommendation.
One thing to keep in mind: you have loads of pages with noindex directives on, but Google is also blocked from crawling those pages via robots.txt. If Google can't crawl the URLs, how can it find the noindex directives you have given? Robots.txt should be used here eventually - but your chronological deployment is off; it's too early. You should put it on at the very, very end, once Google has 'gotten the message' and de-indexed most of the URLs (makes sense, yes?)
My steps would be:
- Noindex all these URLs with either the HTML meta robots tag or the X-Robots-Tag (HTTP header) deployment; there are multiple meta robots deployment options if editing the page code is going to be difficult (see the first sketch after this list)
- Deploy noarchive in the same way to stop Google caching the URLs, and deploy nosnippet to remove the snippets from Google's results for these pages, which will make them less valuable to Google in terms of ranking them
- For the URLs that you don't want indexed, have the page or screen visibly render content saying the page is not available right now. This one might be tricky for you, as you can't do it just for Googlebot - that would be considered cloaking under some circumstances
- On the pages which you have noindexed, serve status code 404 to Google only (if it's just a status code, it's not considered cloaking). So for the Googlebot user agent, make the HTTP response a 404 on those URLs ("not found", i.e. temporarily unavailable but may come back); see the second sketch after this list. Remember to leave the actual, physical contents of the page the same for both Googlebot and users, though
- If that doesn't work, swap out the 404 (sent only to Googlebot) for a 410 (status code: gone, not coming back) to be more aggressive. Note that it will then be harder to get Google to re-index these URLs later. Not impossible, but harder (so don't open with this)
- Once most URLs have been de-indexed and de-cached by Google, put the robots.txt rule(s) back on to stop Google crawling these URLs again
- Reverse all changes once you want the pages to rank (correct the page's contents, remove nosnippet, noarchive and noindex directives, correct the status code, lift the robots.txt rules etc)
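Here's a minimal sketch of the X-Robots-Tag deployment from the first two steps, assuming a Node/Express layer sits in front of the app (the route pattern and names are illustrative, not your actual URL rules):

```
import express from "express";

const app = express();

// Hypothetical matcher for the gated app pages (everything except the free Overview pages)
const GATED_APP_PAGE = /^\/vacation-rental-data\/app\/.+\/(revenue|rates|occupancy)$/;

// Send noindex (drop from the index), noarchive (no cached copy) and
// nosnippet (no SERP snippet) as an HTTP header, so no page-code edits are needed.
app.use((req, res, next) => {
  if (GATED_APP_PAGE.test(req.path)) {
    res.setHeader("X-Robots-Tag", "noindex, noarchive, nosnippet");
  }
  next();
});
```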
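And a sketch of the Googlebot-only status code step, under the same Express assumption; the user-agent check is simplified, the page body is rendered unchanged for everyone, and the 404-to-410 switch is something you'd only flip if de-indexing stalls:

```
// Continues the same hypothetical middleware chain as the sketch above.
app.use((req, res, next) => {
  const userAgent = req.get("user-agent") ?? "";
  if (/Googlebot/i.test(userAgent) && GATED_APP_PAGE.test(req.path)) {
    // 404 = "not found, may come back"; escalate to 410 ("gone") only as a
    // second step, since 410 makes later re-indexing harder.
    res.status(404);
  }
  next(); // downstream handlers still render the same content for bots and users (no cloaking)
});
```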
Most of this hinges on Google agreeing with and following 'directives'. These aren't hard orders, but the status code alterations in particular should be considered much harder signals.
Hope that helps