Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Does Google sandbox aged domains too?
-
Hello, I have a question. I recently bought a domain from a GoDaddy auction; it is 23 years old and has DA 37 / PA 34.
Before bidding, I checked the domain on Google using the query site:mydomain.com to make sure its pages were showing; only the home page was indexed. I also checked the domain on the Web Archive: it was last active in 2015 and was then parked for a long time, about 4 years.
So my question is: does Google consider this type of domain new, or will it sandbox the domain if I try to rebuild it and rank for keywords in a different niche? I ask because for the past 4 weeks I have been building links to my domain and sending several profile and social signals to it. My post is indexed on Google but is not showing in any Google SERP result.
-
My keywords are now starting to show on the second and third pages of Google. I think I should wait to see some more improvement. Only a few links are showing in Search Console, while Moz and Ahrefs show 300+ referring domains. I'll have to wait longer until all the referring domains start to show in Search Console.
-
I am not expecting an immediate effect. I know SEO is a game that takes time to show proper results. I think I should wait more than a month or two, and after that I'll decide whether to invest in another domain. What do you think about this idea?
-
The authority has probably decayed. I think it's more a case of starting over and rebuilding the authority, rather than waiting and hoping for the best. I know, it sucks when you have shelled out on a domain. But in my experience domain purchasing is really hit and miss: if you don't see an immediate difference, often you don't see one at all. Maybe others have different POVs though.
-
Thanks for clarifying. So that is why my keywords are not showing up in Google search - because the domain was parked for about 5 years. May I ask roughly how much longer I'll have to wait to see some positive improvement?
-
I would say that if the domain had been parked for an extensive duration, it probably would count as fresh, especially if (once the domain were resuscitated) the content was very different from Google's last 'active' cache. They don't really want to give people free SEO authority just for buying old domains (that would make it way too easy to game Google's rankings).
They do a similar thing with 301 redirects now, where they check whether the 301-receiving URL is 'similar' (probably in Boolean string-similarity terms) to the last active cache of the old URL, so nowadays even the mighty 301 often doesn't transfer much (or any) SEO authority. I guess it's because the old URL (in this hypothetical redirect scenario) gained links from webmasters based upon the old content. If the new content is quite different, those webmasters may not have chosen to link to it, ergo the content is then expected to re-prove itself (sounds perfectly fair to me).
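To make the 'string similarity' idea above a little more concrete, here is a toy sketch - purely an illustration of the concept, not Google's actual method - that scores how similar the last cached text of a page is to the new text (the example strings are made up):

```python
from difflib import SequenceMatcher

def content_similarity(old_text: str, new_text: str) -> float:
    """Return a rough 0-1 similarity ratio between two blocks of page text."""
    return SequenceMatcher(None, old_text.split(), new_text.split()).ratio()

# Made-up example: the old cached page was about live sports scores,
# the rebuilt page is about something entirely different.
old_cache = "live football scores league tables and fixtures updated every minute"
new_page = "buy discount replica watches online with free worldwide shipping"

print(f"similarity: {content_similarity(old_cache, new_page):.2f}")  # prints a value close to 0.00
```

A score near zero would indicate the kind of topical mismatch described above, where the old links were earned for content that no longer exists.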
Another thing: Google don't use Moz's PA and DA metrics to rank pages. They're shadow metrics - metrics which our industry invented to mimic PageRank, which Google don't show publicly and never did (unless you count the old Toolbar PageRank, but that was grossly oversimplified and has been deprecated). As such, sites sometimes have moderate PA and DA without ranking well, or at all, on Google (Moz's link index is far superior to their keyword index).
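For context on what DA/PA are imitating: below is a minimal, purely illustrative sketch of the classic PageRank power iteration over a made-up three-page link graph. Google's production systems are vastly more sophisticated, so treat this only as a picture of the underlying idea:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with an even distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: share its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: every page links back to "home", so "home" ends up with the most rank.
toy_graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
print(pagerank(toy_graph))
```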
Finally, Moz's PA and DA don't take into account hidden signals, such as the disavows on a domain or any penalties it might have. When a site gains a Google penalty (or algorithmic devaluation), Moz's tools don't get an update from Google about it.
Buying old domains is a pretty hazy business. IMO there are too many variables to make most purchases (purely for SEO purposes) viable or worthwhile (or scalable).
Related Questions
-
Should I Report an SEO Agency to Google?
Our competitor has employed the services of a spammy SEO agency that sends spammy links to our site. Though our rankings were affected, we have taken the necessary steps. Is it possible to send evidence to Google so that they can take down the agency's site? I want to take this action so that other sites will not be affected by them again.
White Hat / Black Hat SEO | Halmblogmusic
-
Different domains on the same IP address ranking for the same keywords - is it possible?
Hello, I want to ask: if two domains which are hosted on the same server and share the same IP (as usually happens with shared hosting) try to rank for the same keywords in Google, does the shared IP affect them or not?
White Hat / Black Hat SEO | RizwanAkbar
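As a small aside on the question above, you can at least confirm whether two domains really do resolve to the same IP address with a couple of lines of Python (the domain names here are placeholders, and sites behind CDNs or with multiple A records may need a fuller DNS lookup):

```python
import socket

def shares_ip(domain_a: str, domain_b: str) -> bool:
    """Return True if both domains resolve to the same IPv4 address."""
    return socket.gethostbyname(domain_a) == socket.gethostbyname(domain_b)

print(shares_ip("example-site-one.com", "example-site-two.com"))
```
-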
Best URL structure for SEO for a Malaysian/Singapore site on a .com.au domain
Hi there. I know ideally I need a .my or .sg domain; however, I don't have time to do this in the interim, so what would be the best way to host Malaysian content on a www.domainname.com.au website?
www.domainname.com.au/en-MY
www.domainname.com.au/MY
domainname.com.au/malaysia
malaysia.domainname.com.au
my.domainname.com.au
I'm assuming this can't make the .com.au site look spammy, but thought I'd ask just to be safe? Thanks in advance! 🙂
White Hat / Black Hat SEO | IsaCleanse
-
Does a Trademark Symbol in the URL Matter to Google?
Hello community! We are planning to clean up the ™ and ® symbols in the URLs on the website. Google has indexed these pages, but for some of the trademark pages the SERP displays encoded characters in the URL instead of the symbol. What are your thoughts on a "spring cleaning" effort to remove all ™, ® and other unsafe characters from URLs? Will this impact indexed pages, rankings, etc.? Thank you!
White Hat / Black Hat SEO | b.digi
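As a side note on the question above, here is a small sketch (the paths are made up) of how a ™ character ends up percent-encoded in a crawled URL, plus a trivial old-to-clean mapping that could feed a set of 301 redirect rules:

```python
from urllib.parse import quote

old_path = "/products/widget™-pro"
print(quote(old_path))        # -> /products/widget%E2%84%A2-pro (what actually travels over the wire)

def clean_path(path: str) -> str:
    """Strip trademark/registered symbols so the URL uses only safe ASCII characters."""
    return path.replace("™", "").replace("®", "")

redirect_map = {old_path: clean_path(old_path)}
print(redirect_map)           # {'/products/widget™-pro': '/products/widget-pro'}
```
-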
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website from the homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567 Here's a sample category page: http://domain.com/city/area Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the property pages even though they are dynamic?
White Hat / Black Hat SEO | esiow2013
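One way to go beyond a Screaming Frog simulation for a question like the one above is to check the server access logs for real Googlebot requests to the dynamic product URLs. The sketch below is an assumption-heavy illustration: it presumes a standard combined (Apache/Nginx) log format and the /AB1234567-style ID pattern mentioned in the question. Also remember that user agents can be faked, so verifying the requesting IPs or using Search Console's URL Inspection tool gives firmer evidence.

```python
import re

# Matches request paths like /AB1234567 (two letters followed by seven digits).
PRODUCT_URL = re.compile(r'"GET (/[A-Z]{2}\d{7}) HTTP')

def googlebot_product_hits(log_path: str):
    """Return the dynamic product URLs that Googlebot has actually requested."""
    hits = []
    with open(log_path) as log:
        for line in log:
            if "Googlebot" in line and (match := PRODUCT_URL.search(line)):
                hits.append(match.group(1))
    return hits

# Example usage (path is a placeholder):
# print(googlebot_product_hits("/var/log/nginx/access.log"))
```
-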
Why do expired domains still work for SEO?
Hi everyone. I've been running an experiment for more than a year to see whether buying expired domains works. I know it's considered black hat, but like I said, I wanted to experiment - that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WordPress setup, fill it with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains or the sites I link to have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir
-
What's up with Google scrapping keyword metrics?
I've done a bit of reading on Google now "scrapping" the keyword metrics from Analytics. I am trying to understand why the hell they would do that. To force people to run multiple AdWords campaigns to set up different keyword scenarios? It just doesn't make sense to me. If I am a blogger or I run an ecommerce site and I get a lot of visits to a particular post through a keyword people clicked on organically, why would Google want to hide this from us? It's great data for us to carry on writing relevant content that appeals to people and therefore serves the needs of those same people. There is the idea of doing white hat SEO and focusing on getting strong links and great content, etc. But how do we know we have great content if we are not seeing what is appealing to people in terms of keywords and how they found us organically? Is Google trying to squash SEO as a profession? What do you guys think?
White Hat / Black Hat SEO | theseolab
-
Interesting case of an IP-wide Google penalty - what is the most likely cause?
Dear SEOmoz community, our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting in November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and I am now hoping the SEOmoz community can give us some further tips on what else we can try.
As quick background information:
The sites in question offer sports results data and are translated into several languages. Each market (i.e. language) has its own TLD using the central keyword, e.g. <keyword-spanish>.es, <keyword-german>.de, <keyword-us>.com.
The content is highly targeted to each market, meaning there are no duplicate content pages across the domains; all copy is translated and content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
There are some promotional one-way links to sports-betting and casino sites positioned on the pages.
The external linking structure of the pages is very keyword- and front-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we experienced dramatic ranking losses across all our properties starting in November 2010. The penalties are indisputable, given that rankings for the main keywords dropped in local Google search engines from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached (cxK29.png); the same behavior can be observed across the other domains.
Our questions are:
Is there such a thing as an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all the pages registered in Google Webmaster Tools?
What is the most likely cause of our penalty given the background information? Given that the drops started in November 2010, we doubt the Panda updates had any correlation to this issue.
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
Are there any other factors/metrics we should look at to help troubleshoot the penalties?
After all this time without resolution, should we move to two new domains and forward all content as 301s to the new pages, or are there other things we should try first?
Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro