Content From One Domain Mysteriously Indexing Under a Different Domain's URL
-
I've pulled out all the stops and so far this seems like a very technical issue with either Googlebot or our servers. I highly encourage and appreciate responses from those with knowledge of technical SEO/website problems. First some background info:
Three websites (http://www.americanmuscle.com, m.americanmuscle.com, and http://www.extremeterrain.com), as well as all of their sub-domains, could potentially be involved. AmericanMuscle sells Mustang parts; ExtremeTerrain is Jeep-only.
Sometime recently, Google has been crawling our americanmuscle.com pages and serving them in the SERPs under an extremeterrain sub-domain, services.extremeterrain.com. You can see for yourself below.
Total # of services.extremeterrain.com pages in Google's index: http://screencast.com/t/Dvqhk1TqBtoK
When you click the cached version of these supposed pages, you see an americanmuscle page (some desktop, some mobile), none of which exist on extremeterrain.com: http://screencast.com/t/FkUgz8NGfFe
All of these links give you a 404 when clicked...
Many of the pages I've checked have been cached multiple times while still returning a 404, so Googlebot has apparently re-crawled them repeatedly; this is not a one-time fluke.
The services. sub-domain serves both AM and XT and lives on the same server as our m.americanmuscle website, but they answer on different ports.
services.extremeterrain is never used to feed AM data, so why Google is associating the two is a mystery to me.
The mobile americanmuscle website is set to respond only on a different port than the services. sub-domain, and it only responds to AM mobile sub-domains, not to Googlebot or any other user-agent.
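To make that layout concrete, here is a simplified sketch of the kind of virtual-host setup described above (Apache-style, with placeholder ports, paths, and hostnames; our real configuration differs in the details):

```apache
# m.americanmuscle.com: answers only on its own port, only for the AM mobile hostname
<VirtualHost *:8080>
    ServerName m.americanmuscle.com
    DocumentRoot /var/www/am-mobile
</VirtualHost>

# services endpoint: shared by AM and XT, on a different port again
<VirtualHost *:8081>
    ServerName services.extremeterrain.com
    ServerAlias services.americanmuscle.com
    DocumentRoot /var/www/services
</VirtualHost>
```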
Any ideas? As one could imagine, this is not an ideal scenario for either website.
-
A similar thing happened to me once. In my case, the DNS settings were incorrect. Check that first.
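If it helps, two quick checks from any machine with dig and curl will show where that hostname points and whose content the server hands back for it (the IP below is just a placeholder for whatever the record actually resolves to):

```bash
# 1. See where the hostname actually resolves
dig +short services.extremeterrain.com

# 2. Ask that server for the extremeterrain hostname over port 80 and inspect the response
curl -sI -H "Host: services.extremeterrain.com" http://203.0.113.10/
```

If the second request comes back with AmericanMuscle content or headers, the server is answering for the extremeterrain hostname with the wrong site, which would line up with what Google is caching.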
-
I'm not sure what would be causing this. It looks like the pages did exist on the services subdomain at one time. Maybe try adding the subdomain as a property in Webmaster Tools and submitting removal requests for its pages. You might also want to add a robots.txt to the subdomain that disallows bots from crawling it.
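For the robots.txt route, something as simple as this, served at http://services.extremeterrain.com/robots.txt, would keep compliant crawlers off the whole subdomain (assuming nothing on it ever needs to be crawled):

```
User-agent: *
Disallow: /
```

Keep in mind robots.txt only blocks crawling; it doesn't remove URLs that are already indexed, which is why pairing it with the removal requests in Webmaster Tools makes sense.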
Related Questions
-
Duplicate content due to parked domains
I have a main ecommerce website with unique content and decent back links. I had a few domains parked on the main website as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website www.maindomain.com had domain1.com and domain2.com parked on it, and domain3.com was parked on www.maindomain.com/product1. This caused a lot of duplicate content issues. Twelve months back, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from Google's index. Now I realize a few of the additional domains are still indexed and causing duplicate content. My question is what other steps I can take to avoid the duplicate content for my website:
1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, domains pointing to a specific URL cannot provide a change of address.
2. Submit a remove-page request in Google Search Console. It is temporary and lasts 6 months. Even if the pages are removed from Google's index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit them to the index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google eventually drops the content from domain1.com and domain2.com because of the canonical links. This will take time for Google to update its index.
5. Point these domains elsewhere to remove the duplicate content eventually. But it will also take time for Google to update its index with the new, non-duplicate content.
Which of these options are best suited to my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere. Any feedback would be greatly appreciated.
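For reference, the kind of host-based 301 already in place on the parked domains (the mechanism options 3 and 4 lean on) would look roughly like this in Apache .htaccess form; the domain names are the placeholders above and the actual server setup may differ:

```apache
RewriteEngine On

# Any request arriving under a parked domain goes to the same path on the main domain
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^(.*)$ http://www.maindomain.com/$1 [R=301,L]

# domain3.com was parked on a single product page, so every path maps to that one URL
RewriteCond %{HTTP_HOST} ^(www\.)?domain3\.com$ [NC]
RewriteRule ^.*$ http://www.maindomain.com/product1 [R=301,L]
```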
Intermediate & Advanced SEO | ajiabs
-
Is there any SEO advantage to sharing links on Twitter using Google's URL shortener goo.gl/?
Hi, is there any advantage to using goo.gl/ to shorten a URL for Twitter instead of other shorteners? I had a thought that goo.gl/ might allow Google to track click-throughs and hence judge popularity.
Intermediate & Advanced SEO | S_Curtis
-
Can you Canonical to a URL in a different folder under the same domain?
I want to know if it's possible to add a canonical tag to a URL that points to a URL under a different folder. The content is just about the same. Here's an example (fake URLs and product, but the structure and parameters are similar to my client's website): domain.com/toy-ducks-results.aspx?color=Purple&model=Elvis and domain.com/toy-ducks-details.aspx?color=Purple&model=Elvis&style=Sparkly. Let's say that my purple Elvis ducks are really popular. Is there any harm in putting a rel=canonical on the Sparkly Elvis ducks page pointing to the purple Elvis ducks page, even though they are in two different folders, /toy-ducks-results and /toy-ducks-details? So, in effect, the preferred folder is /toy-ducks-results. Thanks in advance for any help.
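For clarity, the tag in the head of the details page would look something like this (using the made-up URLs above, with the results page as the preferred version):

```html
<!-- on /toy-ducks-details.aspx?...&style=Sparkly, pointing at the preferred results URL -->
<link rel="canonical" href="http://domain.com/toy-ducks-results.aspx?color=Purple&amp;model=Elvis" />
```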
Intermediate & Advanced SEO | EEE3
-
SEO for one web site, two domains
I have a website, www.sxxxcafe.com, and there is another domain for the same site, like xxx.com. How can I use the second domain for the same website while keeping SEO up and without losing rankings?
Intermediate & Advanced SEO | innofidelity
-
.com ranking over other ccTLDs that were created
We had an ecommerce website that used to function as the website for every other locale we had around the world. For example, the French version was Domain.com/fr_FR/, and a German version in English would be Domain.com/en_DE/. About two months ago we moved all of our larger international locales to their corresponding ccTLDs, so now we have Domain.fr and Domain.de. The problem is that we are getting hardly any organic traffic and sales on these new TLDs. I am thinking this is because they are new, but I am not positive. If you compare the traffic we used to see on the old domain versus the traffic we see on the new domains, it is a lot less. I am currently going through to make sure that all of the old pages are not up. The next thing I want to know is, for the old pages, would it be better to use a 301 redirect or a rel=canonical to the new ccTLD to avoid duplicate content and keep those old pages from outranking our new pages? Also, what are some other causes for our traffic being down so much? It just seems that there is a much bigger problem, but I don't know what it could be.
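For the 301 route, the old locale folders on Domain.com can be redirected wholesale to the new ccTLDs. A rough Apache .htaccess sketch, assuming the path structure carries over unchanged on the new domains (adjust if it doesn't):

```apache
RewriteEngine On

# Everything under the old French folder goes to the French ccTLD, keeping the rest of the path
RewriteRule ^fr_FR/(.*)$ http://domain.fr/$1 [R=301,L]

# Same idea for the English-language German locale
RewriteRule ^en_DE/(.*)$ http://domain.de/$1 [R=301,L]
```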
Intermediate & Advanced SEO | DRSearchEngOpt
-
Duplicate content on index.htm page
How do I avoid duplicate content on the index.htm page? I need to redirect the spider from the /index.htm file to the main root of http://www.manandhisvan.com.au and hence avoid duplicate content. Does anyone know a foolproof way of achieving this without me buggering up the complete site? Cheers, Freddy
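One common approach, assuming the site runs on Apache (worth testing on a copy of the site first), is a rewrite rule that 301s direct requests for /index.htm back to the root while leaving Apache's internal DirectoryIndex handling alone:

```apache
RewriteEngine On
# Only act when the visitor or bot literally asked for /index.htm in the request line,
# so the internal lookup that serves index.htm for "/" doesn't cause a redirect loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.htm[\s?]
RewriteRule ^index\.htm$ http://www.manandhisvan.com.au/ [R=301,L]
```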
Intermediate & Advanced SEO | Fatfreddy
-
Questions regarding Google's "improved handling of URLs with parameters"
Google recently posted about improved handling of URLs with parameters: http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html I have a couple of questions: Is it better to canonicalize URLs or use parameter handling? Will Google inform us if it finds a parameter issue, or should we prepare a list of parameters that should be addressed?
Intermediate & Advanced SEO | nicole.healthline
-
Link building maximums for different sub-domains?
Hi all, I'm launching a new website with a number of country-specific sub-domains and I wanted to know whether Google will count the new links at the root-domain level or treat each subdomain separately. For instance, if I built 50 links per month to each of my five proposed subdomains, would Google see it as 250 links built to one root domain (and penalise me as a result), or will they view these subdomains independently and accept 50 links as an acceptable amount per sub-domain? Thanks in advance. Ross
Intermediate & Advanced SEO | Mulith