How does Google's "Temporarily remove URLs" tool in Search Console work?
-
Hi,
We have created a new sub-domain with new content that we want to highlight for users. But our old content, on a different sub-domain, ranks at the top of Google results thanks to its established reputation. How can we promote the new content and suppress the old sub-domain in the results? Many of the pages have similar title tags and other overlapping information.
We are planning to hide the old URLs via Google Search Console so that the new pages gradually pick up the traffic. How does this work?
-
Hi there
Totally agree with Logan here. I would also make sure you update your XML sitemaps to include the new subdomain URLs, and that your internal links are updated as well. If you are able to update high-value links pointing at the old subdomain so they point at the new one, that would be hugely beneficial too.
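To make the sitemap step concrete, here is a minimal sketch of generating an XML sitemap for the new subdomain using only the Python standard library. The host name and paths are placeholders, not the asker's real site, and real sitemaps often also carry optional fields like `<lastmod>`:

```python
# Minimal sketch: build a sitemap XML string listing the new subdomain's URLs.
# "new.example.com" and the paths are hypothetical placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of absolute URLs."""
    ET.register_namespace("", NS)  # serialize with the sitemap default namespace
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in urls:
        url_el = ET.SubElement(urlset, f"{{{NS}}}url")
        loc = ET.SubElement(url_el, f"{{{NS}}}loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

new_urls = [
    "https://new.example.com/",
    "https://new.example.com/products/",
]
print(build_sitemap(new_urls))
```

Once generated, the file would be uploaded to the new subdomain and submitted in that subdomain's Search Console property.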
Hope this helps! Good luck!
Patrick -
I'd recommend 301-redirecting the old version of the content to its new location on the new sub-domain. That's generally the quickest way to let search engines (and people) know you've relocated important content. Hiding URLs via Search Console's removal tool is temporary only, and it isn't intended for pointing search engines at relocated content.
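In practice the 301 rule usually lives in the web server config (Apache or nginx), but the logic is simple enough to sketch: every request to the old subdomain gets a permanent redirect to the same path on the new one. The host names below are hypothetical placeholders:

```python
# Sketch of a host-wide 301: answer every request to the old subdomain
# with a permanent redirect to the same path on the new subdomain.
# Host names are assumptions for illustration only.
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_HOST = "new.example.com"  # hypothetical new subdomain

def redirect_target(path):
    """Compute the Location header value for a given old-site path."""
    # self.path in BaseHTTPRequestHandler includes the query string,
    # so the query survives the redirect unchanged.
    return f"https://{NEW_HOST}{path}"

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # 301 = Moved Permanently
        self.send_header("Location", redirect_target(self.path))
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

The key point is the 301 status code: unlike a temporary removal, it tells crawlers the move is permanent, so ranking signals are consolidated onto the new URLs over time.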