How does Google's "Temporarily remove URLs" tool in Search Console work?
-
Hi,
We have created a new sub-domain with new content that we want to highlight for users, but our old content on a different sub-domain still tops Google's results because of its established reputation. How can we highlight the new content and suppress the old sub-domain in the results? Many pages have similar title tags and other on-page information.
We are planning to hide the old URLs using Google Search Console, so that the new pages gradually attain the traffic. How does it work?
-
Hi there
Totally agree with Logan here. I would also make sure that you update your XML sitemaps to include the new subdomain URLs, and make sure your internal links are updated as well. If you are able to update high-value links pointing at the old subdomain so they point at the new one, that would be hugely beneficial too.
Hope this helps! Good luck!
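As a quick sketch, a sitemap entry for the new subdomain might look like this (the domain and path here are placeholders, not the poster's actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on the new subdomain you want crawled -->
  <url>
    <loc>https://new.example.com/featured-page/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
</urlset>
```

You'd then verify the new subdomain as its own property in Search Console and submit this sitemap there, since subdomains are treated as separate properties.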
Patrick -
I'd recommend 301 redirecting the old version of the content to its new location on the new sub-domain. That's generally the quickest way to let search engines (and people) know you've relocated important content. Hiding URLs from Search Console is temporary only and not really intended for pointing search engines to relocated content.
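To illustrate, a host-level 301 in nginx might look like the sketch below (hypothetical subdomain names, and it assumes the old and new subdomains share the same URL paths; if paths changed, you'd need per-URL rules instead):

```nginx
# Permanently redirect every request on the old subdomain
# to the same path on the new subdomain.
server {
    listen 80;
    server_name old.example.com;
    return 301 https://new.example.com$request_uri;
}
```

A 301 passes most link equity to the new location, which is exactly what hiding URLs in Search Console does not do.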
Related Questions
-
Keyword optimisation: Google's eyes before users' eyes?
Hi all, The default and ultimate advice for ranking a page high is to write in favour of users, and thereby in favour of Google. But if we write content purely for users, it may miss the keywords, or lack the keyword density and variety needed to get into Google's eyes, and then we may appear around the 3rd page. How do we get into the top slots? I can also see some top results without even a single mention of the keyword they are ranking for. How is that possible? Thanks
Algorithm Updates | vtmoz
-
Images not getting indexed in Google Image Search ("site:hdwallpaperzones.com")
Hi, as I mentioned in the title, my website's images are not getting indexed by Google Image Search. Out of 360 images, only 5 have been indexed in 3 days. Please help me out. Thanks
Algorithm Updates | toxicpls
-
We use keywords relevant to our pages, but we are not listed on the first page of Google search
We use keywords relevant to our pages, but we are not listed on the first page of Google search, while our competitors using the same keywords are listed on the first page. How can we sort out this problem and get onto the first page of the search results?
Algorithm Updates | krisanantha
-
Webpage is ranking on google.ie / google.co.uk but not google.com?
One of our site's webpages appears in the first few pages of results on google.ie / google.co.uk but not on google.com. Is there such a thing as being penalised on a specific Google domain? Traffic is healthy despite this, but I want the page to rank well in google.com too. Any ideas?
Algorithm Updates | notnem
-
Website "penalized" 3 times by Google
I have a website that I'm working with that has had the misfortune of gaining rankings/traffic on Google, then having the rankings/traffic removed... 3 times! (Very little was changed on the site to gain or lose "favor" with Google, either.)
Notes:
- Site is a mixture of high-quality original content and duplicate content (vacation rental listings)
- When traffic crashes, we lose nearly all rankings and traffic (90+%)
- When traffic crashes, we lose all rankings sitewide, including those gained by our high-quality, unique pages
- None of the "crash" dates appear to coincide with any Panda update dates
- We are working on adding unique content to our pages with duplicate content, but it's a long process and so far doesn't seem to have made any difference
- I'm confounded why Google keeps "changing its mind" about our site
- We have an XML sitemap, and Google keeps our site indexed pretty well, even when we lose our rankings
- Due to the drastic and sitewide loss of rankings, I'm assuming we are dealing with some sort of algorithmic penalty
Timeline:
- Traffic steadily grows starting in Jan 2011
- Traffic crashes on Feb 19, 2011. We assumed it was due to a pre-Panda anti-scraper update, but don't know.
- Google sends traffic to our site on March 1, then none the next day
- On June 16th, I blocked part of the site using robots.txt (most of the section wasn't indexed anyway)
- On June 17th, Google starts ranking our site again. I thought it might be due to the robots.txt change, but I had made the change only a few hours earlier, and Google wasn't even indexing the part of the site I blocked
- Traffic/rankings crash again on July 6th. No theory why.
Site URL: http://www.floridaisbest.com
Traffic Stats: Attached
I know that we need more backlinks and less duplicate content, but I can't explain why our Google rankings are "on again, off again". I have never seen a site gain and lose all of its rankings/traffic so drastically multiple times, for no apparent reason. Any thoughts or ideas would be welcome.
Thanks!
Algorithm Updates | AdamThompson
-
Removing secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain to disallow everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you
Algorithm Updates | marketing_zoovy.com
-
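Worth noting for this situation: a robots.txt Disallow only blocks crawling; it does not remove URLs that are already indexed, and blocking crawling can even prevent Google from seeing a noindex directive on those pages. A common alternative is to serve a noindex header from the secure subdomain itself. A sketch for Apache, assuming the secure subdomain has its own virtual host and mod_headers is enabled:

```apache
# In the secure.domain.com virtual host only:
# tell search engines not to index anything served from this host.
Header set X-Robots-Tag "noindex"
```

Once the duplicated pages drop out of the index, the robots.txt block can be reinstated if desired.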
Classifieds and Google Panda
It seems Google's Panda update is targeting low-quality sites with little unique content (I know there's more to it than that). It makes sense that they would want to do this, but what about classifieds sites? They may use some scraped content as well as unique ads, and the ads may lack content since they rely on users to write them. However, they are helpful to the people who use classifieds. Because of these factors, such sites are suffering with the release of the latest Panda update. Any advice for classifieds sites on how they can combat the ranking drops?
Algorithm Updates | Sayers
-
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems stacked against this site's purpose. I need some advice on what I'm planning and what else could be done.
First, the issues:
Content Length
The site is a legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages are, by virtue of the content, thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of the people who visit this site are looking for one small piece of data. They want the definition of a term or a citation, then they return to whatever caused the query in the first place.
My strategy so far:
Noindex some pages
Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
Create more click incentives
We already started with related terms, and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.
Expand content (of course)
The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages. We still won't be able to cover them all without a heavy cut-and-paste feel.
Site redesign
We are looking to lighten up the code and boilerplate content shortly; we were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll. Ad units will be kept at 3 per page.
What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
Algorithm Updates | sprynewmedia