Multiple domains, different content, same keywords
-
What would you advise in my case?
Is it bad for Google if I have the four domains?
I don't link between them, because I don't want any association between them or a loss in rankings for the branded page.
Is it bad if I link between them, or from the non-branded domains to the branded domain?
Is it bad if I have all of them in Webmaster Tools? Right now I only have the branded one.
My Google+ page is all about the new, non-penalized domain, although Google assigned the unique "+propdental" URL to the domain it manually penalized (which doesn't make sense).
So: what are the things I should not do with my domains in order to follow and respect Google's guidelines? I want to stay white hat and not do something wrong without knowing it.
-
301-redirect the additional domains to the one you want to focus on, except the domain that carries the penalty, if you're certain it does.
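As a rough sketch of how that 301 redirect might look on an Apache server, here is a minimal .htaccess rule for one of the secondary domains. The domain names (example.net as the secondary, example.com as the primary) are placeholders, not the actual domains from this thread:

```apache
# Hypothetical .htaccess on the secondary domain (example.net).
# Permanently (301) redirects every URL to the same path on the
# primary domain (example.com), preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

You can then spot-check the redirect with something like `curl -I http://example.net/some-page` and confirm the response is `301 Moved Permanently` with a `Location:` header pointing at the primary domain.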
-
Four domains means splitting your efforts four ways: your potential links, your Domain Authority, and so on, are all divided across four sites.
If you work on just one domain, you can put all of your effort into it; that's the way to go.