Test content/pages indexed by search engines
-
During the web development stages of our Joomla CMS website, we managed to get our site indexed for totally irrelevant test pages, mainly to do with Joomla, along with other equally irrelevant test content. How damaging is this to our domain from an SEO perspective, and is there something we can do about it?
When we do a site:domain.com search, we see hundreds of test pages with test/irrelevant meta tags, etc.
-
Search engines regularly recrawl every website and will update their information based on the changes you make to your site; it is a natural part of how the web works. The "site under construction" content is not harmful, but in the future it should be blocked from indexing.
-
Thankfully, only Google has indexed the test URLs.
However, all three major engines have indexed our domain against a "Site under construction" page with untitled/incomplete tags.
Is this harmful, or will it be overwritten when we launch properly and get our site indexed?
-
When you begin developing a site, you should use a robots.txt file to block all search engine access to it. This is one of the few situations where a robots.txt file is genuinely useful.
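A development-stage robots.txt that blocks everything is just two lines. As a quick sketch, Python's standard-library robotparser can confirm the rules behave as intended (the domain here is a placeholder, not from the question):

```python
from urllib.robotparser import RobotFileParser

# A development-stage robots.txt: block every crawler from every path.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# With this file in place, no compliant crawler may fetch any URL.
for bot in ("Googlebot", "Bingbot", "*"):
    allowed = parser.can_fetch(bot, "https://www.example.com/testing/page1")
    print(f"{bot}: {'allowed' if allowed else 'blocked'}")
```

Remember that robots.txt only stops compliant crawlers from fetching pages; it does not remove pages that are already indexed, which is why the ordering described below matters.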
With respect to fixing the issue, it depends on whether the URLs will be used on the live site, how long it will be until your site launches, and whether unique URLs such as /testing were used or you are working with the same URLs that will exist on the live site.
If your site is still in testing and will remain in testing for 30+ days, you can add the noindex tag sitewide. Once all the pages have been removed from the index, you can then add the robots.txt block. Be careful not to adjust the robots.txt file before the pages have been removed, as the search engines won't be able to see the noindex tag.
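The sitewide noindex is a one-line meta tag in each page's head section (Joomla can typically apply it site-wide via the Robots setting in Global Configuration). A minimal sketch:

```html
<head>
  <!-- Ask search engines to drop this page from their index.
       Crawling must stay open (no robots.txt block yet) so the
       engines can actually see this tag. -->
  <meta name="robots" content="noindex">
</head>
```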
You did not mention which search engine indexed your pages. If you are working with Google and the URLs will not exist on the live site, you could use the Google URL Removal Tool. This is really overkill and should not be necessary, but if the site owner is worried about the test pages damaging SEO, you can take this approach. Any URL removed in this manner cannot be re-added to the index for 90 days.
Related Questions
-
How to hide our duplicate pages from SERP? Best practice to increase visibility to new pages?
Hi all, We have a total of 4 pages about the same topic and similar keywords. These pages are from our main domain and subdomains. As the pages from the subdomains are years old and have been receiving visits from the SERPs, they stick to the 1st position. But we have recently created new pages on our main domain which we expect to rank in the 1st position. I am planning to hide the subdomain pages from the SERPs using "Remove URLs" for some days to increase the visibility of the new pages from the main domain. Is this the right and best practice to proceed with? Thanks
Algorithm Updates | | vtmoz0 -
404s in Google Search Console and javascript
The end of April, we made the switch from http to https, and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he/she is now attempting to read more javascript and will occasionally try to parse out and "read" a URL in a string of javascript code where no URL is actually present. So, I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc. But they are also returning hundreds of otherwise correct URLs with a .html extension when our CMS system generates URLs with a .uts extension, like this: https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab, it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions! Liz Micik
Algorithm Updates | | LizMicik0 -
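For the .html-for-.uts 404s described above, one option that avoids creating redirects one URL at a time is a single pattern-based rule. A hypothetical sketch for Apache (assuming the site runs Apache with .htaccess enabled; the path and extensions follow the example URLs in the question and would need checking against the real URL structure):

```apache
# Hypothetical rule: 301-redirect any /thumbnail/... request that
# arrives with a .html extension to the same path with the CMS's
# real .uts extension.
RewriteEngine On
RewriteRule ^(thumbnail/.+)\.html$ /$1.uts [R=301,L]
```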
Delay between being indexed and ranking for new pages.
I've noticed with the last few pages I've built that there's a delay between them being indexed and them actually ranking. Anyone else finding that? And why is it like that? Not much of an issue as they tend to pop up after a week or so, but I am curious. Isaac.
Algorithm Updates | | isaac6630 -
Lots of duplicate titles and pages on search page
I own a painting website with a lot of searchable paintings. The "search paintings" feature creates tons of duplicate pages and titles. See here:
http://www.maleribasen.dk/soegmaleri.asp
I guess the problem is that the URL can actually be different and still return the same content. The first time you click "Search paintings", the URL is shown as above. But as soon as users begin to refine their search on the left and use the "Search" button, the top URL changes. So, depending on how the top URL looks, different results are shown. This is pretty standard in searches, but it returns tons of duplicate pages and titles. How do you guys cope with that? Is there a clever way to use rel="canonical" or some other smart way to avoid this? /Kasper
Algorithm Updates | | KasperGJ0 -
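rel="canonical" is indeed the standard tool for this situation: every URL variant of the search page can point search engines at one preferred version. A minimal sketch, using the URL from the question above (note the attribute is rel, not ref):

```html
<!-- Placed in the <head> of every search-result URL variant, this tells
     search engines which single URL should be indexed. -->
<link rel="canonical" href="http://www.maleribasen.dk/soegmaleri.asp">
```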
Future address change and local search
I have a client who targets a particular city, and up until now has had his physical location in the suburbs of that city. On April 1, his office will have the city address he has been targeting. I have spent a lot of time over the past year claiming ownership of all local directory listings and consolidating addresses, as he has moved several times in the past 5 years. I'm looking at this as an opportunity to get the official USPS address he will be using and use the exact same address for everything; there are so many different variations out there for him right now. I'm wondering if it would be OK to start promoting the new address before the April 1 move, and also when to start with the directory listings. Also, I have held off on purchasing the Yahoo directory link because of the suburban address, but am reconsidering this as of April 1 as well.
Algorithm Updates | | c2g0 -
Home page rank for keyword
Hi Mozers I have traded from my website balloon.co.uk for over 10 years. For a long while the site ranked first for the word 'balloon' across the UK on google.co.uk (first out of 41 million). Around the time Penguin launched the site began to drop and currently sits on about page 5. What's confusing is that for a search on 'balloons' ('s' on the end of balloon) it ranks 2nd in the location of Birmingham where I'm based. That's 2nd in the real search rather than a map local search. But - if I search 'balloon' from the location of Birmingham my contact page ranks 5th: http://www.balloon.co.uk/contact.htm but the home page ranks nowhere. So - it's gone from ranking 1st nationally to ranking nowhere with my contact page ranking above the home page (which is a generic word domain). Any ideas?
Algorithm Updates | | balloon.co.uk0 -
301 Redirect has removed search rankings
As per instructions from an SEO, we did a 301 redirect from our URL to a new URL (www.domain.com to the subdomain xxxx.domain.com). But the problem is we lost all the Google rankings that the previous URL had gained. How can we roll back this situation? Can we retrieve the rankings of the previous URL if we remove the 301 permanent-move redirection? The new URL does not figure in the Google search results for the keyword that used to fetch the previous URL at no. 3. Please help...
Algorithm Updates | | BizSparkSEO0 -
How do I get the expanded results in a Google search?
I notice for certain sites (e.g. mint.com) that when I search, the top result has a very detailed view with options to click through to different subsections of the site. However, for my site, even though we're consistently the top result for our branded terms, the result is still only a single line item. How do I adjust this?
Algorithm Updates | | syount1