Does Google sandbox aged domains too?
-
Hello, I have a question. I recently bought a domain from a GoDaddy auction; it is 23 years old with DA 37 and PA 34.
Before bidding I checked the domain on Google with the query site:mydomain.com to see whether its pages were indexed; only the home page was. I also checked the domain on the Wayback Machine: the site was last active in 2015, and after that it sat parked for about four years.
So my question is: does Google treat this type of domain as new, or will it be sandboxed if I rebuild it and try to rank for keywords in a different niche? It's been four weeks now; I have been building links to the domain and sending it several profile links and social signals. My post is indexed on Google but isn't showing in any SERP.
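As an aside for anyone repeating the archive check described above: the Wayback Machine exposes a public availability API that returns the closest archived snapshot for a URL. A minimal sketch (the domain is the asker's placeholder; the `fetch` parameter is just an illustrative hook so the parsing can be exercised without a network call):

```python
import json
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available?url={}"

def last_snapshot(domain, fetch=None):
    """Return the timestamp of the closest archived snapshot of `domain`,
    or None if the Wayback Machine has nothing for it."""
    if fetch is None:  # default: hit the real API over the network
        fetch = lambda url: urllib.request.urlopen(url).read()
    data = json.loads(fetch(WAYBACK_API.format(domain)))
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["timestamp"] if snap else None

# Example (requires network): last_snapshot("mydomain.com")
# A timestamp starting "2015..." would match what the asker saw on archive.org.
```

A 14-digit timestamp like `20150630000000` means June 2015 was the closest capture; an empty `archived_snapshots` object means the domain was never archived.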
-
My keywords are now starting to show on Google's second and third pages. I think I should wait and see if there's more improvement. Only a few links are showing in Search Console, while Moz and Ahrefs show 300+ referring domains; I'll have to wait longer for all the referring domains to start showing in Search Console.
-
I'm not expecting to see an immediate effect. I know SEO is a game that takes time to show proper results, so I think I should wait another month or two. After that I'll decide whether to invest in another domain. What do you think of this idea?
-
The authority has probably decayed. I think it's more a case of starting over and rebuilding the authority rather than waiting and hoping for the best. I know it sucks when you have shelled out for a domain, but in my experience domain purchasing is really hit and miss: if you don't see an immediate difference, often you don't see one at all. Maybe others have different POVs, though.
-
Thanks for clearing that up. So that's why my keywords aren't showing up in Google search: the domain was parked for about five years. Roughly how long will I have to wait to see some positive improvement?
-
I would say that if the domain had been parked for an extensive duration it would probably count as fresh, especially if, once the domain were resuscitated, the content was very different from Google's last 'active' cache. They don't really want to give people free SEO authority just for buying old domains (that would make it far too easy to game Google's rankings).
They do a similar thing with 301 redirects now: they check whether the 301-receiving URL is 'similar' (probably via some form of string or content similarity) to the last active cache of the old URL, so nowadays even the mighty 301 often doesn't transfer much (or any) SEO authority. I suspect it's because the old URL (in this hypothetical redirect scenario) gained links from webmasters based on the old content. If the new content is quite different, those webmasters might not have chosen to link to it, ergo the content is expected to re-prove itself (which sounds perfectly fair to me).
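Nobody outside Google knows the exact comparison used, but the idea of "is the new page similar to the old cached copy" can be sketched with a crude token-overlap (Jaccard) score; the texts and any threshold you'd pick are purely illustrative:

```python
import re

def tokens(text):
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def jaccard(old_text, new_text):
    """Vocabulary overlap: 1.0 = identical wording, 0.0 = no shared words."""
    a, b = tokens(old_text), tokens(new_text)
    return len(a & b) / len(a | b) if (a | b) else 1.0

old = "best mystery books set in pennsylvania"
new = "cheap payday loans fast approval"
print(jaccard(old, old))  # 1.0 -- same content, redirect looks legitimate
print(jaccard(old, new))  # 0.0 -- no overlap, looks like a repurposed domain
```

Real systems would use far richer signals (shingles, embeddings, link context), but even this toy score shows how easily "parked for years, relaunched in a new niche" can be detected.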
Another thing: Google doesn't use Moz's PA and DA metrics to rank pages. They're shadow metrics, ones our industry invented to mimic PageRank, which Google doesn't show publicly and never did (unless you count the old Toolbar PageRank, but that was grossly oversimplified and has been deprecated). As such, sites sometimes have moderate PA and DA without ranking well, or at all, on Google (Moz's link index is far superior to their keyword index).
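For the curious, PageRank in its textbook form is a power iteration over the link graph; this toy version (nothing like Google's production system, which blends many other signals) shows why a page's score comes from who links to it rather than from any third-party metric:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}. Returns {page: score}, summing to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}  # teleport share
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:  # split this page's rank among its outlinks
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
# "c" is linked to by both "a" and "b", so it ends up with the highest score
```

The point for this thread: when the pages that once linked to a domain go stale or get disavowed, the real graph-based score decays, whether or not Moz's cached DA/PA figures have caught up.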
Finally, Moz's PA and DA don't take hidden signals into account: the disavows on a domain, or any penalties it might carry. When a site receives a Google penalty (or an algorithmic devaluation), Moz's tools don't get an update from Google about it.
Buying old domains is a pretty hazy business. IMO there are too many variables to make most purchases (made purely for SEO purposes) viable, worthwhile, or scalable.