1500 Domains... Where to begin? & Web Structure Question.
-
So, as the title says, I am stuck. I was recently brought on as the SEO guru for a small-to-mid-size company with the task of rebuilding their web presence. Their website is in pretty unfortunate condition, and the more research I do, the deeper I fall down the rabbit hole of chaos.
Essentially, the previous CEO was doing all the SEO work. He purchased 1,500 domains, all keyword-specific, installed WordPress on roughly 1,000 of them, and then began pumping out content. Of those 1,000, roughly 300 have about 600–2,000 characters of content that is absolute fluff. From there the linking began.
Now the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has its main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web.
My advice is to cut those links ASAP and remove the previous work. At the same time, I also don't want them to lose rank. So I guess I am asking a whole slew of questions...
- Am I right in thinking that we have to build a bridge before we burn a bridge?
- Is it worth fixing up some of those other domains to have original content to try and bolster what we already have?
- Would it be better to combine everything into one website, or to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof using different domains.
- Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, I am getting the structure of the entire link web.
Lastly, any thoughts you all have would be greatly appreciated. I realistically have minimal experience in this realm. I am a major noob. I understand SEO in theory, sorta. So I'm getting there!
-
UPDATE
I was just informed by the agency we are working with that two pages on our website have been compromised.
- Index.php
- Hello.php
Does anybody have any recommendations for malicious code identification software? Has anybody used the Browseo cloak identification tool? What is everybody's go-to tool for this sort of thing?
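While evaluating proper scanners, one rough first pass is simply to search the compromised files for patterns that commonly show up in injected PHP (obfuscated `eval` chains and the like). This is only an illustrative sketch, not a substitute for a real malware scanner — the pattern list and snippet below are invented for the example and are far from exhaustive:

```python
import re

# Patterns commonly seen in injected/obfuscated PHP. Illustrative only,
# not an exhaustive or authoritative malware signature list.
SUSPICIOUS_PATTERNS = [
    r"eval\s*\(",
    r"base64_decode\s*\(",
    r"gzinflate\s*\(",
    r"str_rot13\s*\(",
]

def flag_suspicious(source):
    """Return the patterns that match the given source text."""
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, source)]

# A made-up snippet of the kind of code this would flag:
sample = '<?php eval(base64_decode("aGVsbG8=")); ?>'
print(flag_suspicious(sample))
```

Anything this flags still needs a human (or a real scanner) to confirm — plenty of legitimate code uses `base64_decode`, too.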
-
Nice Work!
-
We have teamed up with a local agency in Los Angeles who will be assisting us with a redesign. They are also going to look through our SEO and assist, if possible, with that. I am currently working with the studio and our treatment team to come up with a website architecture that is user-friendly and effective.
As for what has been done with the link structure: I was able to narrow the 1,000-domain list down to roughly 200 that actually had content and were actually linking to us. I changed the nameservers on all of them, which should not only speed up the private server we are hosting on but also lower the number of domains on our C-block.
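For anyone tackling a similar mess, the narrowing step above is basically a set intersection: cross-reference the list of owned domains against a backlink export (from your link tool of choice) to see which purchased domains actually link back to the main site. A minimal sketch, with made-up domain names:

```python
def linking_owned_domains(owned_domains, backlink_sources):
    """Return the owned domains that also appear as backlink sources."""
    # Normalize casing/whitespace so exports from different tools compare cleanly.
    owned = {d.strip().lower() for d in owned_domains}
    sources = {d.strip().lower() for d in backlink_sources}
    return sorted(owned & sources)

# Hypothetical data for illustration:
owned = ["Keyword-Widgets.com", "best-widgets.net", "parked-unused.org"]
backlinks = ["keyword-widgets.com", "psychologytoday.com", "best-widgets.net"]
print(linking_owned_domains(owned, backlinks))
```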
I have spoken with a few members from this forum offline who I will not name unless they give me approval to give them credit.
Thanks Everybody!
-
Thank you both! You are both SEOmnipotent. See what I did there?
I might pinch that
With the links, something else you will probably need to do is submit a disavow of the ones you don't want to be associated with. Google would eventually get round to them all, but I'll bet that with a bit of digging you'll find the network links aren't the only issue.
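For reference, the disavow file Google accepts is just a plain-text list: `#` lines are comments, and a `domain:` prefix disavows an entire domain rather than a single URL. A quick sketch of generating one from a domain list (domain names below are invented):

```python
def build_disavow_file(domains, note="Inherited spam network"):
    """Build the contents of a disavow .txt: '#' comment plus 'domain:' lines."""
    lines = ["# " + note]
    # De-duplicate and sort so the file is stable and easy to diff later.
    lines += ["domain:" + d for d in sorted(set(domains))]
    return "\n".join(lines)

network = ["keyword-site-1.com", "keyword-site-2.net", "keyword-site-1.com"]
print(build_disavow_file(network))
```

The resulting text gets uploaded through Google's disavow tool; review the list carefully first, since disavowing good links can hurt as much as keeping bad ones.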
And thank you Egol
-Andy
-
HA! Yea, I think that sorta answered my question.
-
I think you need that consultant.
-
Thank you both! You are both SEOmnipotent. See what I did there?
Anyway, that's actually kinda nice... One of the methods I have been using is looking at Open Site Explorer.
So we have 145 linking root domains and 2,296 total links, with a DA of 35, which I know is nothing great. At our best we have a link from psychologytoday.com with a DA of 90, and at the other end a bunch of links from plenty of our own random websites with DAs of around 20. I have been going through each domain trying to determine where the "hives" are and which ones are just the outliers. Am I on the right track here?
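That triage step can be sketched as a simple partition of the exported link data: split (domain, DA) pairs into the few high-authority links worth protecting and the low-DA cluster to investigate. The threshold and data below are purely illustrative, not a recommendation:

```python
def split_by_authority(links, threshold=30):
    """Partition (domain, DA) pairs into keepers (DA >= threshold) and suspects."""
    keepers = [(d, da) for d, da in links if da >= threshold]
    suspects = [(d, da) for d, da in links if da < threshold]
    # Sort each bucket by descending DA so the most important entries come first.
    key = lambda pair: -pair[1]
    return sorted(keepers, key=key), sorted(suspects, key=key)

# Hypothetical export rows:
links = [("keyword-site-1.com", 20), ("psychologytoday.com", 90), ("keyword-site-2.net", 18)]
keep, suspect = split_by_authority(links)
print(keep)     # links to protect
print(suspect)  # network links to dig into
```

DA alone won't find the "hives" — shared hosting, registrant, and interlinking patterns matter too — but it is a reasonable first sort.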
-
Good info from Andy.
Trying to combine the 'fluff' content of 1,000 sites into a primary domain could seal your fate. I would advise heavily against doing this.
Yes. If you are going to recycle anything be sure that it is not duplicate content from anywhere else.
I would guess you need to start a structured takedown of the network, but you must handle this carefully.
Yes, I would be using a hatchet more than a hammer on this job. But, careful study before starting.
-
Egol has pretty much hit the nail on the head.
You have a big old mess there and you don't want to end up doing more damage than good - it is easily done!
Now the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has its main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web.
That there is a big problem - link networks, as that is basically what this is, are a huge headache, and I can only assume it is because of its poor quality that Google hasn't found it yet. However, they probably will at some point, so you want to back out of that as soon as possible.
Am I right in thinking that we have to build a bridge before we burn a bridge?
Yes... And no. If you get this bit wrong and suddenly Google becomes aware of what is going on in the background, you will bring them down on you quicker than you can take the sites offline. I am guessing that 99.9% of these sites have no reason to be there? If that is the case, I would be looking at a structured takedown / decommission.
Is it worth fixing up some of those other domains to have original content to try and bolster what we already have?
Probably not. If the content is how you describe it, Google has probably already made up their mind about the site, and trying to fix a site of poor quality can sometimes be an uphill battle because you are starting at a negative score. That said, it is difficult to advise exactly on this. You want to get out of this whole link network, though.
Would it be better to combine everything into one website, or to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof using different domains.
Trying to combine the 'fluff' content of 1,000 sites into a primary domain could seal your fate. I would advise heavily against doing this. Perhaps there are some sites with decent content on them that could be used to make up a blog post, but you would have to be careful to check whether it was indexed before placing it elsewhere.
Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, I am getting the structure of the entire link web.
I wouldn't want to advise incorrectly on this, but I would guess you need to start a structured takedown of the network, but you must handle this carefully. Either that, or distance the main site by removing the links from the network to the primary domain.
I hope this helps a little.
-Andy
-
Yeah, thank you! That was sorta where my head was going as well. Thanks again!
-
If I was in your situation, I would prepare a one- or two-page document that explains how there are hundreds of spam domains in the wild that contain links and garbage that could damage the company reputation and rankings.
The goal of this document would be to justify funds to hire an experienced consultant who can advise you on how to fix this problem and create a plan for developing the company's web assets going forward.
There is nothing wrong with an in-house SEO getting appraisals or consultations from experts or just second opinions from colleagues. I do this all of the time - at least a few times a month before I make moves or decisions. It is not a confession that you are a noob or that you need assistance to do your job. I get consulting to run my sites better. It pays.
From what you have described, the size of this problem is enormous and probably too complex to get good well-considered advice for free from a forum. It will probably get more complex after you start digging.
I am not recommending paid advice because I am looking for work. I don't do paid consultations for anyone and this job is too big to understand in the amount of time spent on a forum question. You don't want poorly considered advice. That usually costs a lot more in repairs than good advice costs from the start.
There are plenty of people who post here who are qualified to help you. I suggest searching the Q&A for problems similar to this, see who replied, who was helpful, who got lots of thumbs up for relevant questions and who has a style that you like. Don't rush to hire. Be careful. Do some research.
Good luck.