Is this organic search sketchiness worth unwinding?
-
I started working on a site and learned that the person before me had pulled a fairly sketchy maneuver, and I'm wondering whether fixing it is a net gain.
The site has pages it wanted third-party links pointing to, to boost them in search. Thing is, those pages are not easy to link to naturally.
So, the woman before me started a new blog site in the same general topic area as the first/main site. The idea was to build up even the smallest bit of authority for the new blog, without tipping Google off to shared ownership. So, the new blog has a different owner/address/registrar/host and no Google Analytics or Webmaster Tools account to share access to.
Then, as one method of building links to the new blog, she took some links that originally pointed to the main site and redirected them to the blog site.
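For concreteness, a server-side redirect of that kind would typically be a one-liner in the main site's Apache config or .htaccess. A minimal sketch with hypothetical paths and domains (the real site's rules could look quite different):

```apache
# Hypothetical example: an old main-site URL that third parties link to
# is permanently redirected to a page on the separately owned blog.
Redirect 301 /resources/old-guide https://satellite-blog.example.com/guide
```

This is why the third-party pages' on-page hrefs still show the main site's domain while the click actually lands on the blog.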
And voila! ...Totally controllable blog site with a bit of authority linking to select pages on the main site!
At this point, I could un-redirect those links that give the blog site some of its authority. I could delete the links to the main site on the blog pages.
However, on some level it may have actually helped the pages linked to on the main site.
The whole thing is so sketchy I wonder if I should reverse it.
I could also just leave it alone and not risk hurting the pages that the blog currently links to.
What do you think? Is there a serious risk to the main site in this existing setup? The main site has hundreds of other links pointing to it, a Moz Domain Authority of 43, thousands of pages of content, an 8-year history, and an Open Site Explorer Spam Score of 1. So, not a trainwreck of sketchiness apart from this issue.
To me, the weird signal for Google is that third-party sites have links that (in the on-page code) still point to the main site, but that resolve via the main site's redirects to the blog site. BTW, the blog site links out to other established sites besides the main site, so it isn't serving the main site exclusively.
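To see that weird connection concretely, you can audit where each inbound link actually resolves by following its redirect chain. A toy sketch in Python, with a hypothetical redirect map standing in for real HTTP responses (in practice you would issue HEAD requests and read each hop's Location header):

```python
# Hypothetical redirect map: on-page href -> where the server sends you.
REDIRECTS = {
    "https://mainsite.example.com/old-guide": "https://satellite-blog.example.com/guide",
    "https://mainsite.example.com/other": "https://mainsite.example.com/kept-page",
}

def resolve(url, redirects, max_hops=10):
    """Follow redirects until a URL is not redirected further,
    returning every hop along the way."""
    hops = [url]
    while url in redirects and len(hops) <= max_hops:
        url = redirects[url]
        hops.append(url)
    return hops

# A link whose on-page href points at the main site but whose chain
# ends on another domain is exactly the pattern described above.
chain = resolve("https://mainsite.example.com/old-guide", REDIRECTS)
print(" -> ".join(chain))
```

Any inbound link whose chain ends off the main site's domain is one of the rerouted links in question.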
Please let me know what you think. Thanks!
-
I agree with the two methods that both you and Gaston have pointed out.
The downside to reversing those links is that the domain authority could drop a bit, which could impact their rankings on the SERPs. If that happens, the client might think you are doing something wrong and causing their rankings to drop when, in fact, you were trying to clean up sketchy links. In my opinion, I'd keep them; they'll make your work perform better. Disavowing them could yield worse results than what their former SEO delivered, and if that happens you're stuck playing defense and deflecting blame.
Hope this helps!
-
Well, I like Gaston's answers on these boards, and at the same time I was curious whether that seemed like the consensus: leave it because there's no real risk.
-
Hi 94501! Did Gaston answer your question? If so, would you mind marking his response as a "Good Answer"?
Otherwise, how else can we help?
-
Thanks, Gaston!
Any other insights, folks?
Mike
-
Hi there,
There are two ways out here, and you've already pointed them out:
- Reverse those links
- Leave all as it is now.
On one hand, if you aren't comfortable with those links, just reverse them all.
On the other hand, you've said that the main site has a lot of links, that those 'unnatural links' won't do it any harm, and that the satellite blog has very few connections to it. I'd say there is almost no risk, so I'd leave it as it is now.
Hope it helps.
GR.