Subdirectories, Domain & Page Crawl Depth
-
Hi,
I just bought an old domain with good backlinks and authority; it was formerly a technology product site.
I want to use this domain for my money site. The purpose of the website is to serve technology information such as WordPress tutorials, free software, and drivers.
I also set up a subdirectory on this domain, like https://maindomain.com/subdirectory/, for free software downloads such as graphics drivers (NVIDIA or AMD). What do you think of this setup? Does it make sense?
Also, I just added this domain to my Moz campaign, and the report shows my subdirectory at a crawl depth of 6. Is that acceptable for a directory, or should I move the subdirectory to my main site?
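For context, "crawl depth 6" means the page is a minimum of six clicks away from the homepage, which a breadth-first crawl measures directly. A minimal sketch with a toy link graph (the URLs and graph below are made up for illustration, not fetched from a real site):

```python
from collections import deque

def crawl_depth(start, get_links, target):
    """Breadth-first crawl from `start`; return the minimum number of
    clicks needed to reach `target`, or None if it is unreachable.
    `get_links(url)` returns the internal links found on that page."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        url, depth = queue.popleft()
        if url == target:
            return depth
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return None

# Toy link graph standing in for real fetched pages (hypothetical URLs).
site = {
    "https://maindomain.com/": ["https://maindomain.com/a"],
    "https://maindomain.com/a": ["https://maindomain.com/b"],
    "https://maindomain.com/b": ["https://maindomain.com/subdirectory/"],
    "https://maindomain.com/subdirectory/": [],
}
depth = crawl_depth("https://maindomain.com/",
                    lambda u: site.get(u, []),
                    "https://maindomain.com/subdirectory/")
print(depth)  # 3
```

A real crawler would fetch each page and extract its links instead of reading a dict, but the depth calculation is the same: fewer clicks from the homepage generally means the page gets crawled (and its link equity flows) more readily.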
Thank you; I hope someone can clear up my confusion.
Best regards, Matthew.
-
Unless the content that earned the authority is very similar to your new content, Google might nuke the site's rankings anyway. I hope you also checked the site's historical SEO performance (traffic, ranking keywords), since authority and links don't necessarily mean the site is worth a damn.
If the site had loads of good links but blocked Google from crawling it, that authority may never have translated into SEO ranking power. If the site had strong SEO authority, that may have been offset by previous Google penalties and the like, which can reduce (or even entirely nullify) the benefit of strong authority metrics.
The days of simply buying a site, putting your own content on it, and having it perform just as well are pretty much gone. Maybe you'll be lucky; I don't know.
If Google does reset your SEO authority, I don't think you'll earn it back with that kind of site. Free driver sites that push driver software (RegMechanic, Secunia PSI) are ten a penny, and the best of them are iffy at best (users often hit issues like the wrong bit-architecture of a driver being installed for their OS, e.g. a 32-bit driver installed on a system that supports 64-bit drivers).
Some providers developed very powerful scanners, but those mostly evolved into software products rather than shallow websites. It's unlikely that in 2019 a site without a solid value proposition (what unique value does it add to the web?) will take off or become popular with Google.
If your SEO authority doesn't get nuked, you might have a shot, but that will depend on how similar your new content is to the old content in mathematical terms, not human terms, e.g. string-similarity comparison stats across whole content pieces.
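Google's actual comparison isn't public, but as a rough illustration of "mathematical similarity", here's a crude word-set overlap (Jaccard) sketch — a stand-in for the idea, not whatever Google really computes, and the sample texts are invented:

```python
def jaccard_similarity(text_a, text_b):
    """Jaccard similarity of two documents' word sets:
    1.0 means identical vocabularies, 0.0 means nothing in common."""
    a = set(text_a.lower().split())
    b = set(text_b.lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical old-site vs new-site copy: barely any overlap.
old = "download graphics drivers for nvidia and amd cards"
new = "wordpress tutorials and free software guides"
print(round(jaccard_similarity(old, new), 2))  # 0.08
```

A score that low suggests the new content shares almost nothing with what earned the old links, which is exactly the situation where a repurposed domain's authority is most likely to evaporate.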
It's also somewhat redundant when most manufacturers produce software that automatically keeps all drivers up to date anyway (e.g. GeForce Experience). Those bespoke tools often do a much better job of driver installation too! One tool for all your PC updates has been a holy grail for decades; IMO it's a bit of a wild goose chase.
Also worth reading: https://moz.com/community/q/why-would-my-page-have-a-higher-pa-and-da-links-on-page-grade-still-not-rank