1500 Domains... Where to begin? & Web Structure Question.
-
So, as the title says, I am stuck. I have recently been brought on as the SEO guru for a small-to-mid-size company with the task of rebuilding their web presence. Their website is in pretty unfortunate condition, and the more research I do, the farther I go down the rabbit hole of chaos.
Essentially, the previous CEO was doing all of the SEO work. He purchased 1,500 domains, all keyword-specific, installed WordPress on roughly 1,000 of them, and then began pumping out content. Of those 1,000, roughly 300 have about 600-2,000 characters' worth of content that is absolute fluff. From there the linking began.
Now the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has their main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web.
My advice is to cut those links ASAP and remove the previous work. At the same time, I also don't want them to lose rank. So I guess I am asking a whole slew of questions...
- Am I right in thinking that we have to build a bridge before we burn a bridge?
- Is it worth fixing up some of those other domains to have original content to try and bolster what we already have?
- Would it be better to combine everything into one website, or to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof, each on its own domain.
- Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, I am getting the structure of the entire link web.
Lastly, any thoughts you all have would be greatly appreciated. I realistically have minimal experience in this realm. I am a major noob. I understand SEO in theory, sorta. So I'm getting there!
-
UPDATE
I was just informed by the agency we are working with that two files on our website have been compromised.
- Index.php
- Hello.php
Does anybody have any recommendations for malicious code identification software? Has anybody used the Browseo cloak identification tool? What is everybody's go-to tool for this sort of thing?
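In the meantime, here is a rough sketch (in Python) of the kind of basic signature check I understand those tools run under the hood: walk the WordPress install and flag PHP files containing patterns that commonly show up in injected code. The docroot path and the pattern list are just assumptions; real scanners use much larger, maintained signature sets.

```python
# Minimal sketch of a signature-based scan. The docroot and the pattern
# list are assumptions -- a real scanner maintains a far larger set.
import os
import re

SUSPICIOUS = [
    r"eval\s*\(\s*base64_decode\s*\(",           # classic obfuscated payload
    r"eval\s*\(\s*gzinflate\s*\(",
    r"eval\s*\(\s*str_rot13\s*\(",
    r"assert\s*\(\s*\$_(GET|POST|REQUEST)",      # executes attacker-supplied code
    r"\$_(GET|POST|REQUEST)\s*\[[^\]]*\]\s*\(",  # request data called as a function
]
PATTERN = re.compile("|".join(SUSPICIOUS), re.IGNORECASE)

def scan(root):
    """Print every .php file under root that matches a suspicious pattern."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".php"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as fh:
                    if PATTERN.search(fh.read()):
                        print("Suspicious:", path)
            except OSError:
                pass  # unreadable file; skip it

if __name__ == "__main__":
    scan("/var/www/html")  # assumed docroot -- point this at your install
```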
-
Nice Work!
-
We have teamed up with a local agency in Los Angeles who will be assisting us with a redesign. They are also going to look through our SEO and assist, if possible, with that. The studio, our treatment team, and I are currently working together on a website architecture that is user-friendly and effective.
As far as what has been done with the link structure: I was able to narrow the 1,000-domain list down to roughly 200 that actually had content and were actually linking to us. I changed the nameservers on all of them, which should not only speed up the private server we are hosting on but also lower the number of domains on our C-block.
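In case it helps anyone doing the same cleanup, here is a rough sketch of how the C-block check can be done: resolve each domain and group the results by /24 block. The input file (domains.txt, one domain per line) is just a placeholder.

```python
# Rough sketch: resolve each domain and group by /24 ("C-block") so you can
# see how many network sites still share an IP block with the main site.
import socket
from collections import defaultdict

def c_block(ip):
    """Return the /24 block a dotted-quad IPv4 address belongs to."""
    return ".".join(ip.split(".")[:3]) + ".0/24"

blocks = defaultdict(list)
with open("domains.txt") as fh:          # assumed input: one domain per line
    for line in fh:
        domain = line.strip()
        if not domain:
            continue
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue                     # domain no longer resolves
        blocks[c_block(ip)].append(domain)

for block, names in sorted(blocks.items(), key=lambda kv: -len(kv[1])):
    print(f"{block}: {len(names)} domains")
```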
I have spoken with a few members from this forum offline who I will not name unless they give me approval to give them credit.
Thanks Everybody!
-
"Thank you both! You are both SEOmnipotent. See what I did there?"
I might pinch that
With the links, something else you will probably need to do is perform a disavow of the ones you don't want to be associated with. Google would eventually get round to them all, but I'll bet that with a bit of digging you'll find the links from the network aren't the only issue.
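If it helps, here is a small sketch of turning a list of the network's domains into a disavow file in the format Google's disavow tool accepts: plain text, one "domain:" entry per line, with "#" for comments. The input file name is an assumption.

```python
# Small sketch: build a disavow.txt from a list of network domains.
# "domain:example.com" disavows every link from that domain; "#" starts a comment.
with open("network_domains.txt") as fh:                   # assumed input file
    domains = sorted({line.strip().lower() for line in fh if line.strip()})

with open("disavow.txt", "w") as out:
    out.write("# Links from the old keyword-domain network\n")
    for domain in domains:
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(domains)} entries to disavow.txt")
```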
And thank you Egol
-Andy
-
HA! Yea, I think that sorta answered my question.
-
I think you need that consultant.
-
Thank you both! You are both SEOmnipotent. See what I did there?
Anyway, that's actually kinda nice... One of the things I have been doing is looking at Open Site Explorer.
So we have 145 linking root domains, 2,296 total links, and a DA of 35, which I know is nothing great. At our best we have a link from psychologytoday.com with a DA of 90, and at the other end a bunch of links from plenty of our own random websites with a DA of around 20. I have been going through each domain trying to determine where the "hives" are and which ones are just the outliers. Am I on the right track here?
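Here is the rough kind of check I mean, sketched in Python: export the inbound links to CSV and bucket the linking pages by Domain Authority, so anything piled up in one low band (probably the network) stands out from genuine outliers like the DA 90 link. The file name and column header are assumptions and would need to match whatever the export actually contains.

```python
# Rough sketch: bucket exported inbound links by Domain Authority so the
# low-DA "hives" stand out. File name and column header are assumptions.
import csv
from collections import Counter

buckets = Counter()
with open("inbound_links.csv", newline="") as fh:
    for row in csv.DictReader(fh):
        try:
            da = int(float(row["Domain Authority"]))
        except (KeyError, ValueError):
            continue                      # row without a usable DA value
        buckets[da // 10 * 10] += 1

for low in sorted(buckets):
    print(f"DA {low}-{low + 9}: {buckets[low]} linking pages")
```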
-
Good info from Andy.
"Trying to combine the 'fluff' content of 1,000 sites into a primary domain could seal your fate. I would advise heavily against doing this."
Yes. If you are going to recycle anything, be sure that it is not duplicate content from anywhere else.
"I would guess you need to start a structured takedown of the network, and you must handle it carefully."
Yes, I would be using a hatchet more than a hammer on this job. But, careful study before starting.
-
Egol has pretty much hit the nail on the head.
You have a big old mess there and you don't want to end up doing more damage than good - it is easily done!
"Now the content is different enough that Google doesn't seem to notice that it's the SAME FREAKIN' THING on each domain, but I am very concerned. The company has their main multi-page domain, which has other links and sources of traffic, but in essence the previous owner created a micro link web."
That there is a big problem. Link networks, because that is basically what this is, are a huge headache, and I can only assume Google hasn't found it yet because the quality is so poor. However, they probably will at some point, so you want to back out of it as soon as possible.
"Am I right in thinking that we have to build a bridge before we burn a bridge?"
Yes... And no. If you get this bit wrong and suddenly Google becomes aware of what is going on in the background, you will bring them down on you quicker than you can take the sites offline. I am guessing that 99.9% of these sites have no reason to be there? If that is the case, I would be looking at a structured takedown / decommission.
"Is it worth fixing up some of those other domains to have original content to try and bolster what we already have?"
Probably not. If the content is how you describe it, Google has probably already made up its mind about those sites, and trying to fix a poor-quality site can sometimes be an uphill battle because you are starting from a negative score. That said, it is difficult to advise exactly on this. You want to get out of this whole link network though.
"Would it be better to combine everything into one website, or to have different domains represent different things? For example, Envato.com is an umbrella website with 8 separate websites operating under the same roof, each on its own domain."
Trying to combine the 'fluff' content of 1,000 sites into a primary domain could seal your fate. I would advise heavily against doing this. Perhaps there are some sites with decent content that could be reworked into a blog post, but you would have to check carefully whether that content was already indexed before placing it elsewhere.
"Where do I begin? I feel like I have started this project numerous times. I know the keywords, I know where the duplicate content is, I know the structure of the main domain, I am getting the structure of the entire link web."
I wouldn't want to advise incorrectly on this, but I would guess you need to start a structured takedown of the network, and you must handle it carefully. Either that, or distance the main site by removing the links from the network to the primary domain.
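A rough way to keep score as the takedown or link removal progresses is to re-check which of the network domains still serve a page that mentions the primary domain. The sketch below assumes a plain list of domains, and the primary domain shown is a placeholder.

```python
# Rough sketch: flag network domains whose homepage still references the
# primary domain. PRIMARY and domains.txt are placeholders -- substitute yours.
import urllib.request

PRIMARY = "www.example.com"              # hypothetical primary domain

still_linking = []
with open("domains.txt") as fh:
    for line in fh:
        domain = line.strip()
        if not domain:
            continue
        try:
            with urllib.request.urlopen(f"http://{domain}", timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
        except (OSError, ValueError):
            continue                     # already offline or unreachable
        if PRIMARY in html:
            still_linking.append(domain)

print(f"{len(still_linking)} domains still reference {PRIMARY}")
for domain in still_linking:
    print(" -", domain)
```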
I hope this helps a little.
-Andy
-
Yeah, thank you! That was sorta where my head was going as well. Thanks again!
-
If I was in your situation, I would prepare a one- or two-page document that explains how there are hundreds of spam domains in the wild that contain links and garbage that could damage the company reputation and rankings.
The goal of this document would be to justify funds to hire an experienced consultant who can advise you on how to fix this problem and create a plan for developing the company's web assets going forward.
There is nothing wrong with an in-house SEO getting appraisals or consultations from experts or just second opinions from colleagues. I do this all of the time - at least a few times a month before I make moves or decisions. It is not a confession that you are a noob or that you need assistance to do your job. I get consulting to run my sites better. It pays.
From what you have described, the size of this problem is enormous and probably too complex to get good, well-considered advice for free from a forum. It will probably get more complex after you start digging.
I am not recommending paid advice because I am looking for work. I don't do paid consultations for anyone and this job is too big to understand in the amount of time spent on a forum question. You don't want poorly considered advice. That usually costs a lot more in repairs than good advice costs from the start.
There are plenty of people who post here who are qualified to help you. I suggest searching the Q&A for problems similar to this, see who replied, who was helpful, who got lots of thumbs up for relevant questions and who has a style that you like. Don't rush to hire. Be careful. Do some research.
Good luck.