Remove URLs from App
-
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide, each with its own dedicated URL schema. Ex/ https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview
Mistake - The app, in its entirety, was indexed by Google in July 2018, which resulted in duplicate-content penalties because the unique on-page content wasn't readable.
Partial Solution - We noindexed all app pages until we were able to implement a "pre-render" / HTML-readable solution with associated dynamic metadata for the Overview page in each market. We are now selectively reindexing only the free "Overview" pages that have unique data (with a nofollow on all other page links), but want to persist a noindex on all other pages because the data is not uniquely "readable" before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.
Question - How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex/ https://screencast.com/t/xPLR78IbOEao would lead you to a live URL such as this one, which has limited value to the user: https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue (note Google's cached SERPs also show an old URL structure, which we have since 301ed, because we also updated the page structure in October). Those pages are currently and will remain noindexed for the foreseeable future. Our sitemap and robots.txt file are up-to-date, but the old Search Console only offers temporary removals on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on the links to them from the Overview page so a bot can reach them, see the noindex on them, and drop them from the SERPs?
Thanks for your help and advice!
-
So, you basically can't 'force' Google to do anything, but there may be better ways to encourage them to remove these URLs.
The only way to force Google to remove a URL is to use the URL removal tool in Google Search Console, but this only removes a page temporarily and it's a pain to do en masse submissions. As such, not my recommendation.
One thing to keep in mind: you have loads of pages with noindex directives on, but Google is also blocked from crawling those pages via robots.txt. So if Google can't crawl the URLs, how can it find the noindex directives you have given? Robots.txt should be used for this - but your chronological deployment is off; it's too early. You should put this on at the very, very end, when Google has 'gotten the message' and de-indexed most of the URLs (makes sense, yes?)
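To illustrate the point (and assuming from your example URLs that the app lives under /vacation-rental-data/app/ - adjust the path to whatever your actual robots.txt blocks), the sequencing looks roughly like this:

```
# Hypothetical current state: this rule stops Googlebot crawling the app pages
# at all, so it can never see the noindex you've put on them
User-agent: *
Disallow: /vacation-rental-data/app/

# During the de-indexing window, lift that Disallow (leave the path crawlable)
# so Googlebot can fetch each URL and read its noindex / X-Robots-Tag header.
# Re-add the Disallow only once the URLs have dropped out of the index.
```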
My steps would be:
- Noindex all these URLs, either with the HTML meta robots or the X-Robots-Tag (HTTP header) deployment (there are multiple meta robots deployments if editing the page code is going to be difficult; see the sketch after this list)
- Also deploy noarchive in the same way to stop Google caching the URLs, and nosnippet to remove the snippets from Google's results for these pages, which will make them less valuable to Google in terms of ranking them
- For the URLs that you don't want indexed, make the page or screen obviously render content that says the page is not available right now. This one might be tricky for you, as you can't do it just for Googlebot; that would be considered cloaking under some circumstances
- On the pages which you have noindexed, serve status code 404 to Google only (if it's just a status code, it's not considered cloaking). So for user agent Googlebot, make the HTTP response a 404 on those URLs (temporarily unavailable, may come back). Remember to leave the actual, physical contents of the page the same for both Googlebot and users, though
- If that doesn't work, swap the 404 (sent only to Googlebot) for a 410 (status code: gone, not coming back) to be more aggressive. Note that it will then be harder to get Google to re-index these URLs later. Not impossible, but harder (so don't open with this)
- Once most URLs have been de-indexed and de-cached by Google, put the robots.txt rule(s) back on to stop Google crawling these URLs again
- Reverse all changes once you want the pages to rank (correct the page's contents, remove the nosnippet, noarchive and noindex directives, correct the status code, lift the robots.txt rules, etc.)
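As a rough sketch of how the header directives and the Googlebot-only status code from the steps above could be wired together: this assumes an Express-style Node server in front of the React app (plausible given your pre-render setup, but adjust to your stack), and the path test plus the /overview carve-out are purely illustrative, not your real routing rules.

```typescript
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Illustrative: everything under the app path except the free Overview pages
// is a de-indexing target. Replace with whatever your real server-side rules are.
const isDeindexTarget = (path: string): boolean =>
  path.startsWith("/vacation-rental-data/app/") && !path.endsWith("/overview");

const isGooglebot = (req: Request): boolean =>
  /googlebot/i.test(req.get("user-agent") ?? "");

app.use((req: Request, res: Response, next: NextFunction) => {
  if (!isDeindexTarget(req.path)) {
    return next();
  }

  // noindex + noarchive + nosnippet via the HTTP header (X-Robots-Tag route),
  // equivalent to a <meta name="robots"> tag in the page head
  res.set("X-Robots-Tag", "noindex, noarchive, nosnippet");

  // Googlebot-only 404: same HTML body for everyone, only the status differs.
  // Swap 404 for 410 later if de-indexing is moving too slowly.
  if (isGooglebot(req)) {
    res.status(404);
  }

  return next(); // normal page rendering continues and serves identical content
});
```

The point the sketch tries to capture is that only the status code varies by user agent; the rendered content stays identical for bots and users, which is what keeps it out of cloaking territory.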
Most of this hinges on Google agreeing with and following 'directives'. These aren't hard orders, but the status code alterations in particular should be considered much harder signals.
Hope that helps