Content From One Domain Mysteriously Indexing Under a Different Domain's URL
-
I've pulled out all the stops and so far this seems like a very technical issue with either Googlebot or our servers. I highly encourage and appreciate responses from those with knowledge of technical SEO/website problems. First some background info:
Three websites, http://www.americanmuscle.com, m.americanmuscle.com, and http://www.extremeterrain.com, as well as all of their sub-domains, could potentially be involved. AmericanMuscle sells Mustang parts; ExtremeTerrain is Jeep-only.
Sometime recently, Google has been crawling our americanmuscle.com pages and serving them in the SERPs under an extremeterrain sub-domain, services.extremeterrain.com. You can see for yourself below.
Total # of services.extremeterrain.com pages in Google's index: http://screencast.com/t/Dvqhk1TqBtoK
When you click the cached version of these supposed pages, you see an americanmuscle page (some desktop, some mobile, none of which exist on extremeterrain.com): http://screencast.com/t/FkUgz8NGfFe
All of these links give you a 404 when clicked...
Many of the pages I've checked have been cached multiple times while still returning a 404, so Googlebot has apparently re-crawled them repeatedly; this is not a one-time fluke.
The services. sub-domain serves both AM and XT and lives on the same server as our m.americanmuscle website, but they answer on different ports.
services.extremeterrain is never used to feed AM data, so why Google is associating the two is a mystery to me.
The mobile americanmuscle website is set to respond only on a different port than services., and only to requests for the AM mobile sub-domains, no matter whether the visitor is Googlebot or any other user agent.
Any ideas? As one could imagine this is not an ideal scenario for either website.
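In case it helps anyone diagnose this, here is a rough, hypothetical sketch of how virtual hosts on a shared box like ours might be laid out (nginx syntax; all ports, paths, and the exact layout are illustrative guesses, not our actual config). The thing worth checking in a setup like this is the catch-all block: if services.extremeterrain.com resolves to this server but nothing matches that hostname on port 80, the default server answers, and Googlebot would receive AM content under the XT hostname.

```nginx
# Hypothetical sketch only -- ports, paths, and layout are illustrative.
server {
    listen 80 default_server;           # catch-all: ANY unmatched Host header lands here
    server_name www.americanmuscle.com;
    root /var/www/americanmuscle;
}

server {
    listen 8080;                        # mobile site answers on its own port
    server_name m.americanmuscle.com;
    root /var/www/am-mobile;
}

server {
    listen 8081;                        # services sub-domain on yet another port
    server_name services.extremeterrain.com services.americanmuscle.com;
    root /var/www/services;
}
```

Under this hypothetical layout, a request for http://services.extremeterrain.com/ on port 80 matches no server_name on that port, so the default_server block responds with AmericanMuscle pages under the ExtremeTerrain hostname, which could explain exactly the symptom described above.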
-
A similar thing happened to me once. In my case, the DNS settings were incorrect. Check that first.
-
I'm not sure what would be causing this. It looks like the pages did exist on the services subdomain at one time. Maybe try adding the subdomain in Webmaster Tools and removing all pages. You might also want to add a robots.txt to the subdomain and disallow bots from crawling it.
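If you go the robots.txt route, a minimal file served at http://services.extremeterrain.com/robots.txt that blocks all crawling of that subdomain would look like this:

```txt
User-agent: *
Disallow: /
```

One caveat: robots.txt stops future crawling but does not by itself remove URLs that are already indexed, which is why pairing it with the removal request in Webmaster Tools makes sense.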
Related Questions
-
Why is a canonicalized URL still in index?
Hi Mozers, We recently canonicalized a few thousand URLs but when I search for these pages using the site: operator I can see that they are all still in Google's index. Why is that? Is it reasonable to expect that they would be taken out of the index? Or should we only expect that they won't rank as high as the canonical URLs? Thanks!
How do the Quoras of this world index their content?
I am helping a client index lots and lots of pages, more than one million, comparable to the questions on Quora. In Quora's case, users are usually looking for the answer to one specific question and nothing else. Quora has a structure set up on the homepage to let the spiders in, but I think most of the work is done with a lot of sitemaps plus internal linking by relevancy, and nothing else. Correct? Or am I missing something?

I am going to index about a million questions and answers, just like Quora. I'm having a hard time structuring these questions without doing it purely for the search engines, because nobody cares about browsing a structure of these questions: users are interested in related and/or popular questions, so I want to structure them that way too. Every question page will be in the sitemap, but not every question will have links from other question pages. These questions are super long-tail, and the idea is that when somebody searches that exact question we can supply the answer (the page will be perfectly optimised for people searching that question). Competition is super low because it is all unique user-generated content.

I think the best approach is to put them all in sitemaps and use an internal linking algorithm so popular and related questions rank better. I could even make sure every question has at least one other page linking to it. Thoughts? Moz, do you think that when publishing one million quality Q&A pages, this strategy is enough to get them indexed and to rank for the question searches? Or do I need to design a structure around them so everything gets crawled and each question also receives at least one link from a "category" page?
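Whatever linking structure is chosen, the sitemap side is fairly mechanical: the sitemaps protocol caps each file at 50,000 URLs, so a million questions means roughly 20 child sitemaps tied together by a sitemap index. A hypothetical sketch (the domain and file names are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemaps/questions-0001.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemaps/questions-0002.xml.gz</loc>
  </sitemap>
  <!-- ...one child sitemap per 50,000 question URLs, about 20 files for 1M pages -->
</sitemapindex>
```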
Similar product descriptions but with different URLs
I had this question before and was not fully satisfied with the answer. We sell adhesives, and some of the products have the same name and description; the only thing that separates them is the width of the roll.

Our current setup is as follows: each product has its own product page with more or less the same description, for example http://siga-sverige.se/siga/fentrim-2-100/ and http://siga-sverige.se/siga/fentrim-2-150/. Those product pages are for a product called Fentrim 2, available in widths from 75 to 300 mm, so that is six different product pages with more or less the same description. The other variations of the product, besides the width, are Fentrim 20, Fentrim IS 2, and Fentrim IS 20. So this gives us:

6 x Fentrim 2 product pages with the same description, just the width changing.
6 x Fentrim 20 product pages with the same description, just the width changing.
6 x Fentrim IS 2 product pages with the same description, just the width changing.
6 x Fentrim IS 20 product pages with the same description, just the width changing.

I get that this can cause us problems in terms of duplicate content. The plan we have now is to have four product pages with variations instead, each with well-written and unique content, and to 301 redirect the old pages to them:

http://siga-sverige.se/siga/fentrim-2
http://siga-sverige.se/siga/fentrim-20
http://siga-sverige.se/siga/fentrim-IS-2
http://siga-sverige.se/siga/fentrim-IS-20

Today we gain traffic from one product page per variation, and it seems that Google has picked those out randomly (see the attached screenshot). Will we lose rank? Will this improve our positions? What are your ideas? // Jonas
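The 301s themselves can be a handful of pattern rules rather than 24 individual redirects. A hypothetical Apache .htaccess sketch, assuming the URL patterns above hold for every width variant (verify against the real URL set before deploying):

```apache
# Hypothetical sketch: collapse the width variants onto the consolidated pages.
RewriteEngine On
# e.g. /siga/fentrim-2-75 ... /siga/fentrim-2-300  ->  /siga/fentrim-2
RewriteRule ^siga/fentrim-2-\d+/?$     /siga/fentrim-2     [R=301,L]
RewriteRule ^siga/fentrim-20-\d+/?$    /siga/fentrim-20    [R=301,L]
RewriteRule ^siga/fentrim-is-2-\d+/?$  /siga/fentrim-IS-2  [R=301,L]
RewriteRule ^siga/fentrim-is-20-\d+/?$ /siga/fentrim-IS-20 [R=301,L]
```

Because each pattern is anchored with `$`, `fentrim-2-\d+` cannot accidentally swallow a `fentrim-20-...` URL, so rule order does not matter here.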
Community Discussion - What's the ROI of "pruning" content from your ecommerce site?
Happy Friday, everyone! 🙂 This week's Community Discussion comes from Monday's blog post by Everett Sizemore. Everett suggests that pruning underperforming product pages and other content from your ecommerce site can provide the greatest ROI a larger site can get in 2016. Do you agree or disagree? While the "pruning" tactic here is suggested for ecommerce and for larger sites, do you think you could implement a similar protocol on your own site with positive results? What would you change? What would you test?
Removing content from Google's Indexes
Hello Mozers. My client asked a very good question today, and I didn't know the answer, hence this question. When you submit a 'Removing content for legal reasons' report (https://support.google.com/legal/contact/lr_legalother?product=websearch), will the person(s) owning the website containing this inflammatory content receive any communication from Google?

My clients have already had the offending URL removed by a court order, which was sent to the offending company. However, the site has now been relocated, and the same content is glaring out at them (and their potential clients), with the title "Solicitors from Hell + Brand name" immediately under their SERPs entry. **I'm going to follow the advice of the forum and try to get the URL removed via Google's report system, as well as the rearguard action of increasing my clients' SERPs entries via social + content.**

However, I need to be able to tell my clients firmly what the implications of submitting a report are. They are worried that if they rock the boat, this URL (with open access for reporting of complaints) will simply get more inflammatory! By rocking the boat, I mean Google informing the owners of this "Solicitors from Hell" site that they have been reported for hosting defamatory content. I'm hoping that Google wouldn't inform such a site, and that the only indicator would be an absence of visits. Is this the case, or am I being too optimistic?
Can I, in Google's good graces, check for Googlebot to turn on/off tracking parameters in URLs?
Basically, we use a number of parameters in our URLs for event tracking, and Google could be crawling an infinite number of these URLs. I'm already using the canonical tag to point at the non-tracking versions of those URLs, but that doesn't stop the crawling. I want to know if I can do conditional 301s, or just detect the user agent as a way to know when NOT to append those parameters. I'm trying to follow Google's guidelines about allowing bots to crawl without things like session IDs, but they don't tell you HOW to do this. Thanks!
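One alternative that avoids user-agent sniffing entirely is to carry the tracking data in the URL fragment instead of the query string, since fragments are not sent to the server and crawlers generally ignore them, so every bot sees one clean URL. A hypothetical helper (the function and parameter names are made up for illustration):

```javascript
// Hypothetical sketch: put event-tracking data in the URL fragment rather
// than the query string, so crawlers only ever see the clean canonical URL.
function withTracking(baseUrl, params) {
  const pairs = Object.entries(params)
    .map(([key, value]) => encodeURIComponent(key) + "=" + encodeURIComponent(value));
  // An empty params object yields the untouched base URL.
  return pairs.length ? baseUrl + "#" + pairs.join("&") : baseUrl;
}

// Example: the crawlable URL stays clean; tracking rides in the fragment.
const url = withTracking("https://www.example.com/widgets", {
  src: "email",
  campaign: "spring",
});
// url === "https://www.example.com/widgets#src=email&campaign=spring"
```

Client-side analytics code can then read `location.hash` to recover the tracking values, while the server and the index never see parameter permutations.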
Canonical URL redirect to different domain - SEO benefits?
Hello folks, we have an SEO situation here and hope your support will help us figure it out. Let's say there are two different domains, www.subdomain.domainA.com and www.domainB.com. subdomain.domainA is what we want to promote and drive SEO traffic to, but all our content lives on domainB. So one thought we had is to duplicate domainB's content on subdomain.domainA and implement a cross-domain canonical. Questions: Will subdomain.domainA.com get indexed in search engines for the content from domainB via the canonical? Do we get the SEO benefits? Is there a better way to attain this objective? Thanks in advance.
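For what it's worth, a cross-domain canonical is not a redirect: it is just a link element on the duplicate page pointing at the version you want indexed, and search engines treat it as a hint. So if the goal is for subdomain.domainA.com to rank, the annotation would have to sit on the domainB copies, not the other way around. Roughly like this (the URLs are placeholders):

```html
<!-- On the duplicate page at https://www.domainB.com/some-article: -->
<head>
  <link rel="canonical" href="https://subdomain.domainA.com/some-article" />
</head>
```

Pointing the canonical in the opposite direction (from domainA to domainB) would tend to keep the domainA copies out of the index, which is the opposite of the stated objective.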
Need to duplicate the index for Google in a way that's correct
Usually duplicate content is quick to fix, but I find myself in a bit of a predicament: I have a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all ads, working as a portal. The smaller niched sites carry some of the same ads as the "master" site, since they are relevant there, and the "master" sites have naturally gained the index for the majority of these ads. So the main issue is how to maintain the ads on the master sites and still get the niched sites' content indexed in a way that doesn't break Google's guidelines. I can of course fix this in various ways, ranging from iframes (no index, though) to bullet listing and small adjustments to the headers and titles of the content on the niched sites, but it feels like I'm cheating if I go down that path. So the question is: has anyone else stumbled upon a similar problem? If so, how did you fix it?