Google Indexed Site A's Content On Site B, Site C etc
-
Hi All,
I have an issue where the content (pages and images) of Site A (www.ericreynolds.photography) is showing up in Google under different domains: Site B (www.fastphonerepair.com), Site C (www.quarryhillvet.com) and Site D (www.spacasey.com). I believe this happened because I installed an SSL cert on Site A but didn't have the default SSL domain set on the server, so you could access Site B with any page path from Site A and it would pull up properly.
I have since fixed that SSL issue and am now doing a 301 redirect from Sites B, C and D to Site A for anything https, since Sites B, C and D are not using an SSL cert of their own.
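For anyone with a similar setup, the redirect can be done with something along these lines. This is only a rough sketch and assumes Apache with mod_rewrite handling the https requests for those domains, which may not match the actual server config:

# Sketch only: rules on the vhost answering https for Sites B, C and D
RewriteEngine On
RewriteCond %{HTTPS} on
# Any host other than Site A gets 301'd to the same path on Site A
RewriteCond %{HTTP_HOST} !^(www\.)?ericreynolds\.photography$ [NC]
RewriteRule ^/?(.*)$ https://www.ericreynolds.photography/$1 [R=301,L]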
My question is: how can I trigger Google to re-index all of the sites so the wrong listings are removed from the index? I have a screenshot attached so you can see the issue more clearly.
I have resubmitted my sitemap, but I'm not seeing much of a change in the index for my site. Any help on what I could do would be great.
Thanks
Eric
-
Hi Eric,
Thanks for the update.
The screenshot showing the 301 is correct - all good there.
Regarding the sitemap, sorry, I should have been clearer on this: can you exclude it from the redirects so that when Google crawls it, they don't get redirected and instead find all of the URLs from the old site?
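If the redirects are done in Apache, one way to carve the sitemap out would be something like the below. It's purely a sketch and assumes the sitemap lives at /sitemap.xml on each of the old domains, so adjust the filename and rules to match your actual setup:

RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{HTTP_HOST} !^(www\.)?ericreynolds\.photography$ [NC]
# Let the sitemap itself be served from the old domain instead of redirecting it
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$ [NC]
RewriteRule ^/?(.*)$ https://www.ericreynolds.photography/$1 [R=301,L]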
Cheers.
Paddy
-
Hi Paddy,
It's been a few days since I added the sites into Webmaster Tools and I'm now seeing the following (attached image) on all of them. Would that be correct, or is there something else that I need to do?
Also, when I submit a sitemap for the sites with the 301 redirect, it loads up the sitemap from my correct site (since it's a redirected site). I assume that would be correct, but I just wanted clarification on that.
Thanks
Eric
-
Great, thank you. I'll give it a shot and let you know how it worked.
-
Hi Eric,
I'd set up a profile for whichever version of the URLs 301s to your main site. So if the https version redirects, then use that one.
I don't think you need to submit every single URL; I'd recommend submitting a handful of the main ones (in terms of traffic or site architecture) and asking Google to also crawl all links on the page.
On the sitemap, you'd enter the URLs that have redirects in place, which are the ones on your old sites. In your example, this would be Sites B, C and D, which each need their own Search Console profile + XML sitemap for the pages on those sites that have redirects.
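For Site B, for example, that sitemap would just be a plain list of the old URLs that now 301 - something like this (the paths here are made up, so swap in the real ones):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old Site B URLs that now redirect to Site A; example paths only -->
  <url><loc>https://www.fastphonerepair.com/galleries/landscapes.html</loc></url>
  <url><loc>https://www.fastphonerepair.com/about.html</loc></url>
</urlset>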
Cheers.
Paddy
-
Hi Paddy,
I do have access to all of those domains, so I can set them up in Search Console. Would I set up the https version in Search Console and then run the crawl?
I have about 100 URLs on each site that are wrong. It's not a huge deal for me to do it manually, but is there a faster way to have them submitted and recrawled? If I do the sitemap, would I enter the old URLs that are indexed or the new URLs that I want them to go to?
Thanks
Eric
-
Hi Eric,
Thanks for the question.
Are you able to register each of the duplicate sites with Google Search Console? If so, you could do that and then use the Fetch as Google feature, which lets you submit pages to the Google index. So you could enter the URL of a page that is now redirected and ask Google to recrawl it.
You could also set up sitemaps for the duplicate sites and submit those to try and prompt Google to recrawl them.
Hope that helps!
Paddy
Related Questions
-
Change Google's version of Canonical link
Hi, my website has millions of URLs and some of them have duplicate versions. We did not set canonicals all these years. Now we want to implement them and fix all the technical SEO issues. I wanted to consolidate and redirect all the variations of a URL to the highest-pageview version and use that as the canonical, because all of these variations have the same content. While doing this, I found in Google Search Console that Google has already selected another variation of the URL as canonical, and not the highest-pageview version. My questions:
1. I have millions of URLs for which I have to do 301s and set canonicals. How can I find all the canonical URLs that Google has auto-selected? Search Console has a daily quota of 100 or so.
2. Is it possible to override Google's version of the canonical? Meaning, if I set a variation as canonical and it is different from what Google has already selected, will it change over time in Search Console?
3. Should I just 301 to the highest-pageview variation of the URL and not set canonicals at all? That way the canonical that Google auto-selected might get redirected to the highest-pageview variation of the URL.
Any advice or help would be greatly appreciated.
-
I'm updating content that is out of date. What is the best way to handle it if I want to keep the old content as well?
So here is the situation. I'm working on a site that offers "Best Of" Top 10 list-type content. They have a list that ranks very well but is out of date. They'd like to create a new list for 2014 but keep the old list live. Ideally the new list would replace the old list in search results. Here's what I'm thinking, but let me know if you think there's a better way to handle this:
1. Put a "View New List" banner on the old page
2. Make sure all internal links point to the new page
3. Rel=canonical tag on the old list pointing to the new list
Does this seem like a reasonable way to handle this?
-
How to get a site out of Google's Sandbox
Hi, I am working on a website that is ranking well in Bing for the domain name / exact URL search but appears nowhere in Google or Yahoo. I have done the site: search in Google and it is indexed, so I am presuming it is in the sandbox. The website was originally developed in India and I do not know whether it had some history of bad backlinks. The website itself is well optimised and I have checked all pages in Moz, getting a grade A. Webmaster Tools is not showing any manual actions. I was wondering what I could do next?
-
Google Indexed Old Backups Help!
I have the bad habit of renaming an HTML page sitting on my server before uploading a new version. I usually do this after a major change. So after the upload, on my server there would be "product.html" as well as "product050714.html". I just stumbled on the fact that Google has been indexing these backups. Can I just delete them and produce a 404?
-
'Nofollow' footer links from another site, are they 'bad' links?
Hi everyone,
one of my sites has about 1000 'nofollow' links from the footer of another of my sites. Are these in any way hurtful? Any help appreciated.
-
What's the best internal linking strategy for articles and on-site resources?
We recently added an education center to our site with articles and information about our products and industry. What is the best way to link to and from that content? There are two options I'm considering:
1. Link to articles from category and subcategory pages under a section called "related articles" and link back to these category and subcategory pages from the articles:
category page <<--------->> education center article
education center article <<---------->> subcategory page
2. Only link from the articles to the category and subcategory pages:
education center article ---------->> category page
education center article ---------->> subcategory page
Would #1 dilute the SEO value of the category and subcategory pages? I want to offer shoppers links to more information if they need it, but this may also take them away from the products. Has anyone tested this? Thanks!
-
How to get content to index faster in Google.....pubsubhubbub?
I'm curious to know what tools others are using to get their content to index faster (other than an HTML sitemap, Pingomatic, Twitter, etc.). Would installing the WordPress PubSubHubbub plugin help even though it uses Pingomatic? http://wordpress.org/extend/plugins/pubsubhubbub/
-
Remove content that is indexed?
Hi guys, I want to delete an entire folder whose content is indexed. How can I explain to Google that the content no longer exists?