Google Indexed Site A's Content On Site B, Site C, etc.
-
Hi All,
I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com), and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't have the default SSL domain set on the server, so you could request any page from Site A under Site B's domain and it would pull up properly.
I have since fixed that SSL issue and am now doing a 301 redirect from Sites B, C, and D to Site A for anything requested over HTTPS, since Sites B, C, and D are not using an SSL cert.
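For reference, here's a rough sketch of how I've been spot-checking the redirects with Python (the sample paths are placeholders, not my real page URLs, and certificate verification is disabled because Sites B, C, and D don't have their own certs):

```python
# Sketch: confirm that old-domain HTTPS URLs return a 301 pointing
# at the matching page on Site A. Sample paths are placeholders.
import requests
import urllib3

# Sites B/C/D present Site A's cert over HTTPS, so hostname
# verification would fail; disable it (and its warning) for this check.
urllib3.disable_warnings()

OLD_DOMAINS = [
    "https://www.fastphonerepair.com",
    "https://www.quarryhillvet.com",
    "https://www.spacasey.com",
]
SAMPLE_PATHS = ["/", "/galleries/", "/about/"]  # placeholder paths

for domain in OLD_DOMAINS:
    for path in SAMPLE_PATHS:
        url = domain + path
        # Don't follow the redirect; inspect the raw response instead.
        resp = requests.get(url, allow_redirects=False,
                            verify=False, timeout=10)
        location = resp.headers.get("Location", "")
        ok = (resp.status_code == 301 and
              location.startswith("https://www.ericreynolds.photography"))
        print(f"{url} -> {resp.status_code} {location or '(no Location)'} "
              f"{'OK' if ok else 'CHECK'}")
```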
My question is: how can I trigger Google to re-index all of the sites and remove the wrong listings from the index? I have a screenshot attached so you can see the issue more clearly.
I have resubmitted my sitemap, but I'm not seeing much of a change in the index for my site. Any help on what I could do would be great.
Thanks
Eric -
Hi Eric,
Thanks for the update.
The screenshot showing the 301 is correct - all good there.
Regarding the sitemap, sorry, I should have been clearer on this - can you exclude it from the redirects so that when Google crawls it, it doesn't get redirected and instead finds all of the URLs from the old site?
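As a quick sanity check once that's in place, something like this sketch (assuming the sitemap lives at /sitemap.xml, which may differ on your setup) should show the sitemap returning a 200 while a normal page still returns the 301:

```python
# Sketch: the sitemap should be served directly (200), while
# regular pages on the old domain should still 301 to Site A.
import requests
import urllib3

urllib3.disable_warnings()  # old domains present Site A's cert

site = "https://www.fastphonerepair.com"

sitemap = requests.get(site + "/sitemap.xml",
                       allow_redirects=False, verify=False, timeout=10)
page = requests.get(site + "/",
                    allow_redirects=False, verify=False, timeout=10)

print("sitemap:", sitemap.status_code)   # want 200 (excluded from redirects)
print("homepage:", page.status_code)     # want 301 (still redirecting)
```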
Cheers.
Paddy
-
Hi Paddy,
It's been a few days since I added the sites into Webmaster Tools, and I'm now seeing the following (attached image) on all of them. Would that be correct, or is there something else that I need to do?
Also, when I submit a sitemap for the sites with the 301 redirect, it loads the sitemap on my correct site (since it's a redirected site). I assume that would be correct, but I just wanted clarification on that.
Thanks
Eric
-
Great, thank you. I'll give it a shot and let you know how it worked.
-
Hi Eric,
I'd set up a profile for whichever version of the URLs 301s to your main site. So if the https version redirects, then use that one.
I don't think you need to submit every single URL; I'd recommend submitting a handful of the main ones (in terms of traffic or site architecture) and asking Google to also crawl all links on the page.
On the sitemap, you'd enter the URLs that have redirects in place, i.e. the URLs on your old sites. In your example, that would be Sites B, C, and D, which each need their own Search Console profile plus an XML sitemap for the pages on those sites with redirects.
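If it helps, here's a rough sketch of how a sitemap like that could be generated from a list of the old URLs (the URLs below are made-up placeholders; you'd swap in the ~100 indexed URLs on each of Sites B, C, and D):

```python
# Sketch: build a minimal XML sitemap listing the OLD (redirected)
# URLs so Google recrawls them and picks up the 301s.
old_urls = [
    "https://www.fastphonerepair.com/",
    "https://www.fastphonerepair.com/galleries/",  # placeholder
    "https://www.fastphonerepair.com/about/",      # placeholder
]

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
]
for url in old_urls:
    lines.append(f"  <url><loc>{url}</loc></url>")
lines.append("</urlset>")

with open("sitemap-fastphonerepair.xml", "w") as f:
    f.write("\n".join(lines))
```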
Cheers.
Paddy
-
Hi Paddy,
I do have access to all of those domains, so I can set them up in Search Console. Would I set up the https version in Search Console and then run the crawl?
I have about 100 URLs on each site that are wrong. It's not a huge deal for me to do it manually, but is there a faster way to have them submitted and recrawled? If I do the sitemap, would I enter the old URLs that are indexed or the new URLs that I want them to point to?
Thanks
Eric
-
Hi Eric,
Thanks for the question.
Are you able to register each of the duplicate sites with Google Search Console? If so, you could do that and then use the Fetch as Google feature, which lets you submit pages to the Google index. You could enter the URL of a page that is now redirected and ask Google to recrawl it.
You could also set up sitemaps for the duplicate sites and submit those to try and prompt Google to recrawl them.
Hope that helps!
Paddy