Question on Google's Site: Search
-
A client currently has two domains with the same content on each. When I pull up a cached version of the site, I see that Google has cached the correct page. However, when I do a site: search in Google, I see the domain we don't want Google indexing. Is this a problem? There is no canonical tag, and I'm not sure how Google knows to cache the correct website, but it does. I'm assuming they have this set in Webmaster Tools?
Any help is much appreciated!
Thanks!
-
Okay that is what I figured. Thank you
-
Oh! I am sorry for the confusion - yes, pick the site you want indexed, and redirect the site you don't want indexed over to the site you're keeping.
To answer your question, yes, both sites are being indexed.
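The redirect recommended above would typically be a server-side 301 so the retired domain's equity consolidates onto the surviving one. A minimal sketch, assuming an Apache server with mod_rewrite enabled and the hypothetical domains short-domain.com (being retired) and long-business-name.com (being kept):

```apache
# .htaccess on the domain you do NOT want indexed
# (hypothetical names; substitute the client's real domains).
RewriteEngine On
# Permanently (301) redirect every request, preserving the path,
# so each old URL points at its counterpart on the kept domain.
RewriteCond %{HTTP_HOST} ^(www\.)?short-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.long-business-name.com/$1 [R=301,L]
```

Once the 301 is live, the unwanted domain's pages should drop out of the site: results as Google recrawls them.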
-
Hi Patrick,
The name of the business is fairly long, so they decided to purchase a shortened version of the domain, but I believe setting up a redirect would be the best solution. As for my question: if I do a site: search in Google and both websites populate the search results, does this mean that Google is indexing two versions of the website?
-
Hi there
You will want to add canonical tags on the site you don't want indexed, pointing to the site you do want indexed. Is there a particular reason you have two sites?
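A canonical tag is a single line in the head of each page on the duplicate domain, pointing at the matching URL on the preferred domain. A minimal sketch with hypothetical URLs:

```html
<!-- In the <head> of https://duplicate-domain.com/some-page/ -->
<!-- (hypothetical URLs; each page should point at its own
     counterpart on the preferred domain, not all at the homepage) -->
<link rel="canonical" href="https://www.preferred-domain.com/some-page/" />
```

Note that rel=canonical is a hint rather than a directive, which is why a 301 redirect is usually the stronger choice when one domain is simply being retired.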
Related Questions
-
Google My Business Service Area Question
Hello Moz friends, I just wanted to make sure I'm doing things correctly. On Google My Business, you're given the option to list your service area. I serve the entire state of Colorado with my internet marketing services, so I listed Colorado as my service area. But is this the wrong idea? Should I instead list the major cities and call it good? So instead of a service area of Colorado, I should put Denver, Colorado Springs, Pueblo, etc.? Thank you for your friendly help. Chris
Technical SEO | asbchris0
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi. I've just done my usual Monday morning review of clients' Google Search Console (previously Webmaster Tools) dashboards and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th. It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml

The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml

There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below: "When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."

Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
AND https://www.domain.com/page-sitemap.xml
AND https://www.domain.com/post-sitemap.xml

I take it from all the above that the HTTPS sitemap is mainly fine, and that despite the reported 0 pages indexed in GSC's sitemap section the pages are in fact indexed, as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems. What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. But how do you do that, or even check that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!? Many thanks, Dan
Technical SEO | Dan-Lawrence
-
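If stray HTTP entries have ended up in the sitemap index described above, one cleanup step is to regenerate the index so every sub-sitemap URL uses the HTTPS host, then resubmit it in Search Console. A hedged sketch of what a consistent index could look like (hypothetical domain, matching the placeholder used in the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Served at https://www.domain.com/sitemap_index.xml -->
<!-- Every <loc> uses the canonical https://www host; no
     http:// or bare-domain variants remain in the index. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.domain.com/marketing-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.domain.com/page-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.domain.com/post-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

Pairing this with a site-wide http-to-https 301 would also make the old HTTP sub-sitemap URLs resolve to their HTTPS versions rather than erroring.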
Google Seeing Way More Pages Than My Site Actually Has
For one of my sites, A-1 Scuba Diving And Snorkeling Adventures, Google is seeing way more pages than I actually have. It sees almost 550 pages, but I only have about 50 pages in my XML sitemap. I am sure this is an error on my part. Here are the search results that show all my pages. Can anyone give me some guidance on what I did wrong? Is it a canonical URL problem, a redirect problem, or something else? The site is built on WordPress. Thanks in advance for any help you can give. I just want to make sure I am delivering everything I can for the client.
Technical SEO | InfinityTechnologySolutions0
-
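One way to sanity-check the gap described above is to count the URLs the sitemap actually declares and compare that number against the site: result count. A minimal sketch using only the Python standard library (the example.com sitemap URL is a placeholder, not the poster's real site):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Namespace used by standard sitemap.org-conformant sitemaps,
# including the ones WordPress SEO plugins generate.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text: str) -> int:
    """Count the <loc> entries in a sitemap or sitemap index document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f".//{SITEMAP_NS}loc"))

def fetch(url: str) -> str:
    """Download a sitemap (placeholder URL; substitute your own)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

# Example usage:
# print(count_sitemap_urls(fetch("https://example.com/sitemap.xml")))
```

If the sitemap count is ~50 while site: shows ~550, the extra URLs are coming from crawlable paths outside the sitemap (archives, tags, parameters), which is where the canonical/redirect investigation should start.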
Redirecting .edu subdomains to our site or taking the link, what's more valuable?
We have a relationship, built through a service we offer to universities, under which we could be issued a .edu subdomain that we would redirect to our landing page relevant to that school. The other option is having a link from their website to that same page. My first question: which would be more valuable? Can you pass domain authority by redirecting a subdomain to a subdirectory in my root domain? Or would simply passing the link equity from a page in their root domain to our page pass enough value? My second question: if creating a subdomain with a redirect is much more valuable, what is the best process for this? Would we simply have their webmaster create the subdomain for us and have them put a 301 redirect to our page? Is this getting into greyer-hat territory? Thanks guys!
Technical SEO | Dom4410
-
My blog page isn't ranking in Google
Hi, I noticed that the blog page on my site isn't in Google. When I search for the full URL, http://www.asggutter.com/blog/, I instead see a page that isn't even working: asggutter.com/sitemap.xml (screenshot: http://screencast.com/t/6OVFLwL8nTL). How can I fix that? Thanks
Technical SEO | tonyklu0
-
We're no longer turning up in Google SERP for our brand search when we used to be #1 after our site update. Any ideas why?
We recently updated our website, and during the push someone mistakenly 301 redirected "www.brandx.com" to "brandx.com" instead of the other way around. Since then, our website no longer turns up for the search "brandx" on Google. We reversed the mistake a few days ago, but we're still not turning up, and we used to rank #1 in the Google SERP. Could it just be timing between crawls, and that our www site didn't make it into Google's index due to this mistake? We also submitted our new sitemap to Google a couple of days ago. As an aside, we're still showing up #1 in Bing's results, and we should still show up based on SEOmoz's SERP report. Any help would be appreciated, as I'm growing increasingly concerned.
Technical SEO | JoeLin0
-
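The intended direction of the host-name redirect above can be expressed in server config. A hedged sketch, assuming Apache with mod_rewrite and using brandx.com as the placeholder the question itself uses:

```apache
RewriteEngine On
# 301 the bare domain TO the www host (the direction the poster
# intended), not www to bare - one rule, one hop, path preserved.
RewriteCond %{HTTP_HOST} ^brandx\.com$ [NC]
RewriteRule ^(.*)$ https://www.brandx.com/$1 [R=301,L]
```

With the rule corrected, recovery is usually a matter of Google recrawling the www URLs; resubmitting the sitemap, as the poster did, is the standard way to speed that up.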
Google Duplicate Content Penalty On My Own Site?
I am certain that I have hit a Google penalty filter on my site http://www.playpokeronline.ca for my main keywords "play poker online." In google.ca I rank 670th, and I used to be on the first page, between 1 and 10, in June. On Bing I am around 9th. On my site I found the entire site duplicated as follows. Original: www.playpokeronline.ca. Duplicate: www.playpokeronline.ca/playpokeronline/. This duplicate was not intentional and seems to be a result of my hosting at GoDaddy; it exists for every page on my site and shows up in Webmaster Tools. I blocked the duplicate with robots.txt, then a few days ago dropped that and instead wrote a rel=canonical tag at the top of each page. Visitors dropped from 100 per day in August to 12-20 in the last month. Google says that if duplicate content is made to try to game the SERPs, they may filter or penalize my site. Have I triggered this penalty, or a different sort of over-optimization penalty? Will the rel=canonical tags fix this, or should I do something else? This penalty business is not my idea of a good time. Thank you, Jeb
Technical SEO | PokerCanada0