Content From One Domain Mysteriously Indexing Under a Different Domain's URL
-
I've pulled out all the stops, and so far this looks like a genuinely technical issue with either Googlebot or our servers. I'd especially appreciate responses from anyone with technical SEO or server knowledge. First, some background:
Three websites (http://www.americanmuscle.com, m.americanmuscle.com, and http://www.extremeterrain.com), as well as all of their sub-domains, could potentially be involved. AmericanMuscle sells Mustang parts; ExtremeTerrain is Jeep-only.
Sometime recently, Google has been crawling our americanmuscle.com pages and serving them in the SERPs under an extremeterrain sub-domain, services.extremeterrain.com. You can see for yourself below.
Total # of services.extremeterrain.com pages in Google's index: http://screencast.com/t/Dvqhk1TqBtoK
When you click the cached version of these supposed pages, you see an americanmuscle page (some desktop, some mobile), none of which exist on extremeterrain.com: http://screencast.com/t/FkUgz8NGfFe
All of these links give you a 404 when clicked...
Many of the pages I've checked have been cached multiple times while still returning a 404; Googlebot has apparently re-crawled them many times, so this is not a one-time fluke.
The services. sub-domain serves both AM and XT and lives on the same server as our m.americanmuscle website, but the two answer on different ports.
services.extremeterrain is never used to feed AM data, so why Google is associating the two is a mystery to me.
The mobile americanmuscle website is set to respond only on a different port than services., and it only answers AM mobile sub-domains, not Googlebot or any other user-agent.
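For reference, here is roughly how the vhosts are meant to be separated. This is a simplified sketch, not our real config; the ports and document roots are placeholders. The last block illustrates the failure mode we suspect: without an explicit default, a server will fall through to the first configured site for any unrecognized Host header, which is exactly how one domain's pages could come back under another domain's URL.

```nginx
# Sketch only: ports and paths are placeholder assumptions.
# Each site answers solely for its own hostname.
server {
    listen 80;
    server_name services.extremeterrain.com;
    root /var/www/services;            # assumed path
}

server {
    listen 8081;                       # assumed port for the mobile site
    server_name m.americanmuscle.com;
    root /var/www/m-americanmuscle;    # assumed path
}

# Without a default_server block, nginx serves the first matching
# server block for any Host header it doesn't recognize. Refusing
# unknown hosts outright prevents cross-domain bleed.
server {
    listen 80 default_server;
    server_name _;
    return 444;                        # nginx-specific: close the connection
}
```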
Any ideas? As one could imagine this is not an ideal scenario for either website.
-
A similar thing happened to me once. In my case, the DNS settings were incorrect. Check that first.
-
I'm not sure what would be causing this. It looks like the pages did exist on the services subdomain at one time. Maybe try adding the subdomain in Webmaster Tools and using the URL removal tool to remove all of its pages. You might also want to add a robots.txt to the subdomain and disallow bots from crawling it.
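For example, a robots.txt served at the root of the services subdomain could be as simple as this (a sketch; note that Disallow stops crawling but won't by itself drop URLs that are already indexed, which is why pairing it with the removal tool matters):

```
# Served as http://services.extremeterrain.com/robots.txt
User-agent: *
Disallow: /
```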
Related Questions
-
How to get a large number of urls out of Google's Index when there are no pages to noindex tag?
Hi, I'm working with a site that has created a large group of urls (150,000) that have crept into Google's index. If these urls actually existed as pages, which they don't, I'd just noindex tag them and over time the number would drift down. The thing is, they created them through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. GoogleBot would crawl a link that looks like it's to the client's same domain and wind up on Amazon or somewhere else with some affiliate code. GoogleBot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex tag. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex tag, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S., All 150K urls seem to share the same url pattern... exmpledomain.com/item/... so /item/ is common to all of them, if that helps.
Intermediate & Advanced SEO | | 945010 -
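Since all of the cruft in the question above shares the /item/ path, one standard approach is an X-Robots-Tag response header, which applies a noindex directive without needing an HTML page to tag. A sketch for Apache with mod_setenvif and mod_headers; the /item/ pattern comes from the question, everything else is an assumption:

```apache
<IfModule mod_headers.c>
    # Flag every request whose URL falls under /item/ ...
    SetEnvIf Request_URI "^/item/" ITEM_CRUFT
    # ... and tell crawlers not to index the response.
    Header set X-Robots-Tag "noindex" env=ITEM_CRUFT
</IfModule>
```

One caveat: Googlebot has to be able to re-crawl the URLs to see the header, so the /item/ path must not also be blocked in robots.txt.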
What sort of content for 'non-niche' website?
Hey guys, had a question with regards to content production. We run a store called Yellow Octopus in Australia and we've literally got thousands of products (4,500 SKUs at last count). We've got everything from novelty mugs to kitchen accessories to gag gifts, t-shirts and tech gadgets. I've read a lot of material on creating awesome content to attract backlinks and we are ready to craft our content strategy. We've got a team in place - graphic designer, illustrator and writers - to execute that strategy. It's just a matter of formulating the strategy! Largely speaking I have an idea of the quality of content required, because I look at a lot of it. The real issue is: what type of content is right for us? Most of the articles I have read focus on niche industries, i.e. SEO, piano sales or health foods. Right off the bat I can come up with hundreds of content pieces that work around those niches. However, with such a diverse range of products I'm unsure of what our niche really is; in fact, not having a niche is almost our niche. Of course we could do gift guides like '30 Unbelievable Gifts for Foodies' (and we do make those). However, they aren't really the type of posts that are likely to attract backlinks. Is the best strategy to split the content into categories? What sort of content pieces would you suggest for a company such as ours? Many thanks in advance!
Intermediate & Advanced SEO | | TheGreatestGoat0 -
Index or not index Categories
We are using the Yoast SEO plugin. On the main menu we have only categories, each consisting of posts, plus one page. We have a category with villas, a category with villa hotels, etc. Initially we set posts to be indexed and included in the sitemap and excluded the categories, but I guess that was not correct. Would it be better to index and include the categories in the sitemap and exclude the posts, in order to avoid duplication? It somehow does not make sense to me: if the posts are excluded and the categories included, won't the categories then be empty for Google? I'm going crazy over this. Does anybody have more experience with this?
Intermediate & Advanced SEO | | Rebeca10 -
Keep older blog content indexed or no?
Our really old blog content still sees traffic, but engagement metrics aren't the best (little time on site), and as a result, traffic has gradually started to decrease. Should we de-index it?
Intermediate & Advanced SEO | | nicole.healthline0 -
One Web site many Domains
One of my clients has about 12 domains related to his one web site. All the domain names are relevant to keywords, but we are doing SEO for only one target domain name. Now he asks what to do with the rest of the domains. Please advise; expert advice is highly appreciated.
Intermediate & Advanced SEO | | innofidelity0 -
How Long Before a URL is 'Too Long'
Hello Mozzers, Two of the sites I manage are currently in the process of merging into one site, and as a result many of the URLs are changing. Nevertheless (and I've shared this with my team), I was under the impression that after a certain point, Google starts to discount the validity of URLs that are too long. With that, if I were to have a URL that was structured as follows, would that be considered 'too long' if I'm trying to get the content indexed highly within Google? Here's an example: yourdomain.com/content/content-directory/article and in some cases it can go as deep as: yourdomain.com/content/content-directory/organization/article. Albeit there is no current way for me to shorten these URLs, is there anything I can do to make sure the content residing on a similar path is still eligible to rank highly on Google? How would I go about achieving this?
Intermediate & Advanced SEO | | NiallSmith0 -
Most Painless way of getting Duff Pages out of SE's Index
Hi, I've had a few issues that have been caused by our developers on our website. Basically we have a pretty complex method of automatically generating URLs and web pages on our website, and they have stuffed up the URLs at some point and managed to get tens of thousands of duff URLs and pages indexed by the search engines. I've now got to get these pages out of the SEs' indexes as painlessly as possible, as I think they are causing a Panda penalty. All these URLs have an additional directory level in them called "home" which should not be there, so I have: www.mysite.com/home/page123 instead of the correct URL www.mysite.com/page123. All these are totally duff URLs with no links going to them, so I'm gaining nothing by 301 redirects, so I was wondering if there was a more painless, less risky way of getting them all out of the indexes (i.e., after the stuff-up by our developers in the first place, I'm wary of letting them loose on 301 redirects in case they cause another issue!) Thanks
Intermediate & Advanced SEO | | James770 -
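For the question above, since the /home/ URLs are pure cruft with no inbound links, answering them with 410 Gone is a low-risk option: search engines treat 410 as a firmer removal signal than 404, and there is no link equity that redirects would need to preserve. A one-line Apache mod_alias sketch, with the /home/ pattern taken from the question:

```apache
# Every URL under the spurious /home/ directory is permanently gone.
RedirectMatch gone ^/home/
```

On nginx, the rough equivalent would be `location ^~ /home/ { return 410; }`.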
Don't want to lose page rank, what's the best way to restructure a url other than a 301 redirect?
Currently in the process of redesigning a site. What I want to know is: what is the best way for me to restructure the URLs without them losing their value (PageRank), other than a 301 redirect?
Intermediate & Advanced SEO | | marig0