Multiple URLs treated as one?
-
I had previously asked this question, where the issue turned out to be that I didn't have all the URL variants in Google Search Console. Whoops!
So I have added four properties that are really all the same site:
- https://
- https://www
- http://
- http://www
Adding all of these has raised a few more questions:
- Can I get Google Search Console to treat these as one property (and even group them together)? Right now they are all listed separately. I know you can set a preferred site in Site Settings, but even so, they show as separate sites with separate data. Can I merge these?
- What about Moz? Should I do something similar to see traffic for each of these in Moz? It looks like we are missing a ton of info. Does Moz get this from GSC automatically?
- What about sitemaps? Can this be fixed via sitemaps, or do I need a separate sitemap for each property?
-
Hi TapGoods,
In Search Console, next to the red 'ADD A PROPERTY' button, there is a grey 'Create a set' button.
Click this button to group your properties into a set and view combined data from all of them.
You can read more about this here: https://support.google.com/webmasters/answer/6338828
As for the Moz data: once you've created your property set, give Moz access to the set instead of the single property it previously had access to.
About the sitemap: assuming the four URL variants all redirect to a single canonical URL (which they should), you only need to submit your sitemap under the property for the URL your site actually uses.
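If you want to double-check that, a quick script along these lines will show where each variant ends up. This is just a minimal sketch using Python's requests library, with example.com standing in for your actual domain:

```python
# Minimal sketch: confirm all four URL variants end up at one
# canonical URL. "example.com" is a placeholder for your own domain.
import requests

variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
    print(f"{url}\n  redirects: {hops or '(none)'}\n  final URL: {resp.url}")
```

All four should print the same final URL, ideally after a single 301 hop each.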
Cheers,
David
Related Questions
-
Submitting URLs After New Search Console
Hi Everyone, I wanted to see how people submit their URLs to Google and ensure they are all being indexed. I currently have an ecommerce site with 18,000 products. I have sitemaps set up, but I've noticed that the various product pages haven't started ranking yet. If I submit an individual URL through the new Google Search Console, I see the page ranking in a matter of minutes. Before the new Google Search Console, you could just ask Google to Fetch/Render an XML sitemap and ask it to crawl all the links. I don't see the same functionality in the new Search Console and was wondering if there are any new techniques people could share. Thanks!
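For what it's worth, here is a rough sketch of one way to spot-check a large sitemap (Python with requests; the sitemap URL is a placeholder). Note that this only verifies the URLs respond cleanly, not that Google has indexed them:

```python
# Sketch: fetch a sitemap and spot-check a random sample of its URLs.
# This checks crawlability (status codes), not actual indexation.
import random
import xml.etree.ElementTree as ET
import requests

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs in sitemap")

for url in random.sample(urls, min(20, len(urls))):
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    print(status, url)
```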
Intermediate & Advanced SEO | abiondo, Anthony
-
E-commerce duplicate URLs
Hi, I just realized that my e-commerce product pages do not differ in anything except the SKU, price, and product name. Beyond that, each page has the same sidebar and the same piece of content under it. This is why I am getting so many duplicate URL warnings through Moz Analytics. I do not have any other content to add for each product because of the nature of the products; only the price, product name, and SKU differ, and the rest is the same on every page. How can I fix this? Thanks
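To get a feel for how alike a crawler finds two such pages, a quick comparison sketch (Python standard library plus requests; the product URLs are placeholders):

```python
# Sketch: measure how similar two product pages' HTML is. A very
# high score suggests they will be flagged as duplicates.
# The URLs are placeholders for two of your product pages.
from difflib import SequenceMatcher
import requests

url_a = "https://www.example.com/products/widget-red"
url_b = "https://www.example.com/products/widget-blue"

html_a = requests.get(url_a, timeout=10).text
html_b = requests.get(url_b, timeout=10).text

ratio = SequenceMatcher(None, html_a, html_b).ratio()
print(f"Similarity: {ratio:.2%}")
```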
Intermediate & Advanced SEO | MindlessWizard
-
Why is this SERP displaying an incorrect URL for my homepage?
The full URL of a particular site's homepage is something like http://www.example.com/directory/.
The canonical and og URLs match. The root domain 301 redirects to it using the absolute path. And yet the SERP (and the cached version of the page) lists it simply as http://www.example.com/. What gives? Could the problem be found at some deeper technical level (.htaccess or DirectoryIndex, or something similar)? We fiddled with things a bit this week, and while our most recent changes appear to have been crawled (and cached), I am wondering whether I should give it some more time before I proceed as if the SERP won't ever reflect the correct URL. If so, how long? [EDIT: From the comments, see here: https://www.youtube.com/watch?v=z8QKIweOzH4#t=2838]
Intermediate & Advanced SEO | TheEspresseo
-
URL rewrites
We have a problem whereby a number of our URLs are addressable from different URLs; I'm told this is because of a quirk of developing in .NET. For example: mysite/FundComparison, mysite/Fund-comparison, mysite/fund-comparison. We asked the supplier who hosts this section of our site to do some URL rewrites so that the duplicates 301 to the correct URL. They're on IIS 6.0 and are not ready to upgrade to IIS 7.0 (my recommendation, since that would make the rewrite easy with the URL Rewrite module). They said it would take 6-8 weeks to implement a web controller to do this: "The bulk of the time for this implementation is in the build of the engine + the addition of all the possible permutations of the URL to redirect to the proper URL." This sounds absolutely insane to me. I would have thought it could be done in a matter of hours. What do people think?
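For a sense of scale, the core redirect logic really is tiny. Here is the idea sketched in Python/Flask, standing in for the .NET handler, so this is an illustration of the effort involved, not IIS code:

```python
# Sketch: 301 any case/hyphen variant of a URL to its canonical form.
# Flask stands in for the .NET handler; this illustrates the logic,
# it is not IIS 6.0 code.
from flask import Flask, redirect, request

app = Flask(__name__)

# Map each URL's normalized form (lowercase, hyphens stripped) to the
# one canonical path we actually want to serve.
CANONICAL = {"fundcomparison": "/fund-comparison"}

@app.route("/<slug>")
def page(slug):
    target = CANONICAL.get(slug.lower().replace("-", ""))
    if target and request.path != target:
        return redirect(target, code=301)  # permanent redirect
    return f"Serving {request.path}"
```

Normalizing on lookup means every permutation (FundComparison, Fund-comparison, and so on) collapses to one rule per page rather than one rule per permutation.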
Intermediate & Advanced SEO | SearchPM
-
Looking for reassurance on this one: Sitemap approach for multiple subdomains
Hi All: Just looking for a bit of "yeah, it'll be fine" reassurance on this before we go ahead and implement.
We've got a main accommodation listing website under www.* and a separate travel content site on a completely different platform at blog.* (same domain, different subdomain). We pull snippets of content from blog.* into www.* using a feed, and we have cross-links going both ways, e.g. links to find accommodation in blog articles and links to blog articles from accommodation listings. Look-and-feel wise they're fully integrated, and the blog.* site is a tab under the main nav.
What I'd like to do is get Google (and others) to view this whole thing as one site and attribute any SEO benefit of content on blog.* pages to the www.* domain. Make sense?
So, having done a bit of reading, here's what I've come up with:
- Separate sitemaps for each, both located in the root of the www site: www.example.com/sitemap-www and www.example.com/sitemap-blog
- robots.txt in the root of the www site with a single sitemap entry: Sitemap: www.example.com/sitemap-www
- robots.txt in the root of the blog site with a single sitemap entry: Sitemap: www.example.com/sitemap-blog
- Submit both sitemaps to Webmaster Tools.
Does this sound reasonable? Any better approaches? Anything I'm missing? All input appreciated!
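Once it's live, a small sanity check that each robots.txt declares the intended sitemap could look like this (a sketch; the example.com domains are placeholders for your own):

```python
# Sketch: confirm each robots.txt declares exactly the sitemap it
# should. Domains and paths mirror the plan above; all placeholders.
import requests

expected = {
    "https://www.example.com/robots.txt": "https://www.example.com/sitemap-www",
    "https://blog.example.com/robots.txt": "https://www.example.com/sitemap-blog",
}

for robots_url, sitemap in expected.items():
    body = requests.get(robots_url, timeout=10).text
    declared = [line.split(":", 1)[1].strip()
                for line in body.splitlines()
                if line.lower().startswith("sitemap:")]
    print(robots_url, "OK" if sitemap in declared else f"missing {sitemap}")
```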
Intermediate & Advanced SEO | AABAB
-
Multi domain redirect to single domain
Hello, all SEOers. Today I would like to get some ideas about handling multiple domains. I have a client who bought numerous domains to prevent abuse of their brand name and, at the same time, for future use. This client bought more than 100 domains. Some are paused, some parked, some live, and some redirected to other sites. I don't worry too much about the parked and paused domains. What worries me is that about 40 different domains currently redirect to a single domain, and meta refresh was used for the redirections. As far as I know, this can raise a red flag with Google. I asked the client to clean up the unnecessary domains, yet they want to keep them all. So now I have to figure out how to handle all the domains that redirect to the single domain. So far, I have come up with the following ideas:
1. Build a gateway page that lists my client's sites and redirect all domains to the gateway page.
2. Add a robots.txt file to each of the different domains.
3. Delete the redirects and leave them as parked domains.
Could anyone share other ideas for handling the current situation? Please, people, share your ideas with me.
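Before deciding, it may help to audit how each domain currently redirects, i.e. a proper HTTP 301 versus a meta refresh. A rough sketch (the domain list is a placeholder for the client's ~40 redirecting domains):

```python
# Sketch: report whether each domain uses an HTTP redirect or a
# meta refresh. The domain list is a placeholder.
import re
import requests

domains = ["http://brand-typo-one.example", "http://brand-typo-two.example"]

for url in domains:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302, 307, 308):
        print(url, "->", resp.headers.get("Location"), f"(HTTP {resp.status_code})")
    elif re.search(r'http-equiv=["\']?refresh', resp.text, re.I):
        print(url, "-> meta refresh (candidate for replacement with a 301)")
    else:
        print(url, "-> no redirect detected")
```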
Intermediate & Advanced SEO | Artience
-
What Is the Preferred URL Structure for Search Engines?
Here is my issue: my domain is abcdomain.com and I'm trying to rank the site for the keyword "example". All of my content is under abcdomain.com/folder/example/, and building content off of abcdomain.com/example is not an option. So I'm thinking about moving the content to abcdomain.com/online-example/ and 301ing the old pages. Of the two paths below, which will have a greater impact on my rankings for the term "example"? Current: abcdomain.com/folder/example/
Proposed: abcdomain.com/online-example/ Thoughts?
Intermediate & Advanced SEO | samp582
-
Best multi-language site strategy?
When reading about multi-language site structure, general knowledge says there are two right ways of doing it:
1. Assign one domain per region/language: www.domain.fr, www.domain.de, www.domain.co.uk, and so on. If a country has more than one language, such as Switzerland, you can create folders for those languages: www.domain.ch/fr (in French) and www.domain.ch/de (in German).
2. Have a single domain, www.domain.com, for the whole site and create folders per language/region: www.domain.com/fr, www.domain.com/uk, and so on. If a language is spoken in more than one country, you can create subfolders: www.domain.com/fr-ch (French in Switzerland) and www.domain.com/de-ch (German in Switzerland).
At first sight, it seems that option 1 is the right one. However, sites such as www.apple.com use option 2. I am unable to decide: what would you recommend? Any objective criteria?
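Whichever structure you choose, hreflang annotations do the actual mapping of language/region to URL, and every version must reference all versions, including itself. A minimal generation sketch (the URLs follow option 2 and are placeholders):

```python
# Sketch: generate hreflang <link> tags for a single-domain,
# folder-per-locale structure (option 2). URLs are placeholders.
locales = {
    "fr": "https://www.domain.com/fr/",
    "fr-ch": "https://www.domain.com/fr-ch/",
    "de-ch": "https://www.domain.com/de-ch/",
    "x-default": "https://www.domain.com/",
}

# The same full set of tags goes in the <head> of every version.
tags = "\n".join(
    f'<link rel="alternate" hreflang="{code}" href="{url}" />'
    for code, url in locales.items()
)
print(tags)
```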
Intermediate & Advanced SEO | hockerty