Duplicate content issue: staging URLs have been indexed and I need to know how to remove them from the SERPs
-
Duplicate content issue: staging URLs have been indexed by Google (many pages) and I need to know how to remove them from the SERPs.
Bing sees the staging URLs as moved permanently.
Google still shows the staging URLs (240 results), which redirect to the correct URLs. Should I be concerned about duplicate content, and should I ask Google to remove the staging URLs?
Thanks Guys
-
Thanks for helping, Malika! To clarify for other readers: blocking the pages in robots.txt after they have been indexed will actually prevent them from being removed from the index via a meta noindex tag, since Google won't be able to crawl the pages to see the noindex tag.
If the staging URLs have already been indexed (and assuming they still need to exist), here are the steps I would take:
- Add a meta noindex tag to every staging URL (a sketch of the tag is just below this list).
- If it's urgent, also submit a URL removal request in Search Console (Webmaster Tools), but this is usually not needed.
- Wait until the staging URLs drop out of the index - you can check periodically by doing site: searches in Google.
- Only after they have been deindexed, block search engines from crawling them with the robots.txt file.
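In case it helps, the noindex tag is just a one-liner in the head of each staging page - a minimal sketch, nothing site-specific about it:

```html
<!-- In the <head> of every staging page: asks crawlers to drop the page from the index -->
<meta name="robots" content="noindex, nofollow">
```

Once Google has recrawled the pages and seen this tag, they'll fall out of the index; only then is it safe to add the robots.txt block from the last step.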
-
Generally you'll want to hide your staging site from search engines and, as Malika mentioned, the best way to do this is via robots.txt.
That lets you essentially set a rule stating that no crawlers are to access anything on that domain. Beyond that, nothing else is really relevant; if crawlers can't see your site, it doesn't matter what you do with it! You don't even need to worry about 301 redirects once this is done.
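For reference, a minimal robots.txt at the root of the staging domain would look something like this (staging.example.com is just a placeholder for your staging hostname):

```
# https://staging.example.com/robots.txt - blocks all compliant crawlers from the whole staging site
User-agent: *
Disallow: /
```

The single Disallow: / rule covers every path on that hostname, so there's nothing else to list.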
Once you apply that change in robots.txt, you may still see your staging site indexed for a little while (anywhere from hours to a couple of months) but this is normal and it will drop away soon enough.
Search engines are pretty good at determining which is the real site these days anyway!
-
Thanks for your suggestions, Peter and Malika.
By the way, the staging site had its own URL.
I think I need help with the canonical stuff, as I am not really sure how to use it.
-
A quick way to remove staging URLs is to return an HTTP 410 (Gone) status for them.
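If the staging site happens to run on nginx (just an assumption - Apache and other servers have their own equivalents), a sketch of serving 410 for the whole staging hostname could look like this, with a placeholder hostname:

```nginx
# Hypothetical server block for the staging hostname
server {
    listen 80;
    server_name staging.example.com;
    return 410;   # "Gone" - tells search engines these URLs are permanently removed
}
```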
Another option is to use the Remove URLs function in Search Console: https://www.google.com/webmasters/tools/url-removal
About duplicate content - you need to look at the actual canonical tags. If the canonical on each staging URL points to the corresponding page on the normal site, then you shouldn't worry. But if the staging and normal pages point to different canonical URLs, you may run into some algorithmic filtering.
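For anyone unsure about the canonical part, a canonical tag on a staging page pointing at its live counterpart would look something like this (both URLs are placeholders):

```html
<!-- In the <head> of the staging page, pointing at the matching live page -->
<link rel="canonical" href="https://www.example.com/some-page/">
```

That tells Google to consolidate signals to the live URL, which also helps with the duplicate content concern.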
-
I am assuming that these pages don't hold any authority or backlinks at all. You can simply delete these pages (if their purpose has been served).
Or, if you still need these pages live, use the robots.txt file to mark these pages (or the whole subdomain/directory they sit in) as disallowed, and noindex them.
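One caveat: a robots.txt disallow only blocks crawling; it doesn't noindex anything by itself. To noindex a whole directory you'd also need a noindex signal, for example an X-Robots-Tag response header. A rough sketch, assuming an nginx server and a /staging/ directory (both assumptions):

```nginx
# Hypothetical: send a noindex header for everything served under /staging/
location /staging/ {
    add_header X-Robots-Tag "noindex, nofollow" always;
}
```

As noted in the first answer, crawlers have to be able to fetch the pages to see this header, so only add the disallow rule once the pages have dropped out of the index.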