Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Forcing Google to Crawl a Backlink URL
-
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e., for Penguin recovery and reconsideration requests).
My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after 4 months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
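One sanity check worth doing before anything else: confirm the link is actually live and followable on the .gov page. A minimal sketch in Python (the URLs are hypothetical placeholders) that fetches the page and reports any links pointing at your site:

```python
import urllib.request
from html.parser import HTMLParser

# Hypothetical URLs for illustration only.
BACKLINK_PAGE = "https://www.example.gov/deep/buried-page.html"
OUR_SITE = "https://www.oursite-example.com/"

class LinkChecker(HTMLParser):
    """Collects every <a href> that points at our site, noting rel=nofollow."""
    def __init__(self):
        super().__init__()
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if OUR_SITE in href:
            rel = (attrs.get("rel") or "").lower()
            self.hits.append((href, "nofollow" in rel))

req = urllib.request.Request(BACKLINK_PAGE, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", errors="replace")

parser = LinkChecker()
parser.feed(html)

for href, nofollowed in parser.hits:
    print(href, "(nofollow)" if nofollowed else "(followed)")
if not parser.hits:
    print("No link to our site found on the page")
```

If the link turns out to be nofollowed or rendered by JavaScript, getting the page crawled won't help, so this check can save the whole effort.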
-
No problem!
-
Appreciate the ideas. I am considering pointing a link at it, but doing that ethically requires a little more thought and effort. Still, at this point it's probably my best option. Thanks!
-
You might try pinging the URL out, or just building a link to the page yourself.
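For the pinging option, the classic mechanism is the weblogUpdates XML-RPC ping. A rough sketch in Python, assuming Ping-O-Matic's public endpoint (the endpoint, title, and URL here are illustrative, not guaranteed):

```python
import xmlrpc.client

# Illustrative values; rpc.pingomatic.com is assumed to accept the
# standard weblogUpdates.ping(title, url) XML-RPC call.
PING_ENDPOINT = "http://rpc.pingomatic.com/"
PAGE_TITLE = "Buried .gov page"                              # hypothetical
PAGE_URL = "https://www.example.gov/deep/buried-page.html"   # hypothetical

server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
response = server.weblogUpdates.ping(PAGE_TITLE, PAGE_URL)
print(response)  # typically a struct like {'flerror': False, 'message': '...'}
```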
-
Both are good ideas. Thank you!
-
Ahhhh, that's a bummer.
Well, you could try submitting a URL from the .gov site that isn't as buried but that links to the URL you want crawled (a sketch for finding such a page follows this reply).
You could also try emailing someone who manages the website, giving them a helpful reminder that they have quality pages that aren't being indexed regularly by Google.

Good luck!
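To act on that first suggestion, you need a shallower page on the .gov site that already links to the buried one. A small breadth-first crawl can surface candidates; a sketch in Python using only the standard library (URLs are hypothetical):

```python
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Hypothetical URLs for illustration only.
START = "https://www.example.gov/"
TARGET = "https://www.example.gov/deep/buried-page.html"
MAX_PAGES = 200  # cap on discovered URLs so the crawl stays polite

class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def fetch(url):
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        return urllib.request.urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    except Exception:
        return ""

host = urlparse(START).netloc
queue = deque([(START, 0)])  # (url, depth from the homepage)
seen = {START}

while queue and len(seen) <= MAX_PAGES:
    url, depth = queue.popleft()
    parser = HrefCollector()
    parser.feed(fetch(url))
    links = [urljoin(url, h) for h in parser.hrefs]
    if any(link.split("#")[0] == TARGET for link in links):
        print(f"{url} links to the target (depth {depth})")
    for link in links:
        if urlparse(link).netloc == host and link not in seen:
            seen.add(link)
            queue.append((link, depth + 1))
```

Pages that turn up at a low depth are the best candidates to submit, since they are only a click or two from pages Google already crawls regularly.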
-
Thanks for the suggestion! But I should have mentioned in the original post that I've already submitted it twice via the Submit URL form, and the URL has yet to show up in Latest Links in Webmaster Tools.
-
You could try the URL submit tool: https://www.google.com/webmasters/tools/submit-url
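If the .gov site publishes an XML sitemap that includes the buried page, Google also accepted sitemap pings at the time of this thread (the endpoint has since been retired); a minimal sketch with a hypothetical sitemap URL:

```python
from urllib.parse import quote
import urllib.request

# Hypothetical sitemap URL. Google's sitemap ping endpoint was live when
# this thread was written; it has since been retired.
SITEMAP = "https://www.example.gov/sitemap.xml"
ping_url = "https://www.google.com/ping?sitemap=" + quote(SITEMAP, safe="")

with urllib.request.urlopen(ping_url, timeout=10) as resp:
    print(resp.status)  # 200 indicated the ping was received
```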
Related Questions
-
Does Google sandbox aged domains too?
Hello, I have a question. I recently bought a 23-year-old domain from a GoDaddy auction with DA 37 and PA 34. Before bidding, I checked the domain on Google with the query site:mydomain.com to see whether its pages were indexed; only the home page was. I also checked it on the Wayback Machine: the domain was last active in 2015 and then sat parked for about 4 years. So my question is: does Google treat this type of domain as new, or will it be sandboxed if I rebuild it and try to rank for keywords in a different niche? It has been 4 weeks now; I have been building links to the domain and sending it profile and social signals. My post is indexed on Google but is not showing in any SERP results.
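For checking a domain's history like this, the Internet Archive exposes a simple availability API; a minimal sketch in Python (the domain is a placeholder from the question):

```python
import json
import urllib.request

# Hypothetical domain; the Wayback Machine's availability API returns the
# closest archived snapshot of a URL, if any exists.
DOMAIN = "mydomain.com"
url = "https://archive.org/wayback/available?url=" + DOMAIN

with urllib.request.urlopen(url, timeout=10) as resp:
    data = json.load(resp)

snapshot = data.get("archived_snapshots", {}).get("closest")
if snapshot:
    print("Closest archived snapshot:", snapshot["timestamp"], snapshot["url"])
else:
    print("No snapshots found")
```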
White Hat / Black Hat SEO | Steven231
-
Robots.txt file in Shopify - Collection and Product Page Crawling Issue
Hi, I am working on a big eCommerce store with more than 1,000 products. We just moved platforms from WordPress to Shopify and are now seeing a noindex issue. When I checked robots.txt I found the rules below, which are confusing to me. I don't understand what these directives mean: Disallow: /collections/+ Disallow: /collections/%2B Disallow: /collections/%2b Disallow: /blogs/+ Disallow: /blogs/%2B Disallow: /blogs/%2b. My reading is that this robots.txt disallows search engines from crawling and indexing all of my product pages (collections/*+*). Is this the pattern that is affecting the indexing of product pages? Please explain how this robots.txt works in Shopify, and once a page is crawled and indexed by Google, what effect does the Disallow have? Thanks.
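Those directives are prefix matches: they appear to block only URLs whose path begins with /collections/+ or /blogs/+ (or the percent-encoded %2B/%2b variants), not ordinary product pages, and they are not the wildcard pattern collections/*+* guessed at in the question. A quick way to test specific paths against the quoted rules is Python's built-in robots.txt parser (the test paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Rules quoted from the question; the test paths below are hypothetical.
rules = """\
User-agent: *
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

for path in ["/collections/+summer+sale",   # starts with /collections/+ -> blocked
             "/collections/summer",          # ordinary collection -> allowed
             "/products/blue-widget"]:       # ordinary product page -> allowed
    print(path, "->", "blocked" if not rp.can_fetch("*", path) else "allowed")
```

If product pages are carrying noindex, the cause is more likely a robots meta tag or X-Robots-Tag header left over from the migration than these robots.txt lines.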
White Hat / Black Hat SEO | HuptechWebseo0
-
Advice needed! How to clear a website of a WordPress Spam Link Injection Google penalty?
Hi guys, I am currently working on a website that has been penalised by Google for spam link injection. The website was hacked and 17,000 hidden links were injected. All the links have been removed and the site has since been redesigned and rebuilt. That was the easy part 🙂 The problem comes when I look in Webmaster Tools: Google is showing thousands of internal spam links to the homepage and other pages within the site. These pages do not actually exist, as they were cleared along with all the other spam links, but I believe they are causing problems with the site's rankings: certain pages are not ranking on Google, and the homepage keyword rankings are fluctuating massively. I have reviewed the website's external links and these are all fine. Does anyone have experience of this and can provide any recommendations or advice for clearing the site of the Google penalty? Thanks, Duncan
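For the thousands of removed spam URLs that Webmaster Tools still reports, one common cleanup step is to return 410 Gone, which tends to get dead URLs dropped from the index faster than a plain 404. A sketch for Apache's .htaccess, where the URL pattern is purely hypothetical and must be replaced with the real footprint the injected pages shared:

```apache
# .htaccess sketch (Apache mod_rewrite); the /cheap-pills/ pattern is
# hypothetical - replace it with whatever footprint the injected URLs
# actually shared. [G] sends 410 Gone, signalling intentional removal.
RewriteEngine On
RewriteRule ^cheap-pills/ - [G,L]
```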
White Hat / Black Hat SEO | CayenneRed890
-
Thumbtack Blatantly Violating Google TOS?
Hi, We have a business registered on Thumbtack, so we receive their newsletters. I'm aware that review sites offering a "badge" or verification logo that links back to your profile is nothing new, but the email I received from Thumbtack is a fairly blatant attempt to game Google for popular keywords. I was just curious about your thoughts on this. I believe it was Overstock that did something like this and got slapped by Google pretty hard for a while. Could Thumbtack be heading down the same path? Image: http://i.imgur.com/FWPnmEP.jpg
White Hat / Black Hat SEO | kirmeliux0
-
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
I will temporarily remove a few pages from my old website and redirect them to the new domain's staging subdomain. Once I've confirmed the redirects work, I will remove the redirect rules from my .htaccess and bring the removed pages back live. Thanks in advance!
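For a temporary test like this, a 302 is usually safer than a 301, since 301s can be cached by browsers and Google long after the rule is removed. A sketch of the .htaccess rules, with hypothetical paths and the staging domain from the question:

```apache
# .htaccess sketch (Apache mod_rewrite); the paths are hypothetical.
# R=302 marks the redirect as temporary, so nothing caches it as permanent
# while the staging setup is being tested.
RewriteEngine On
RewriteRule ^old-page-1/?$ https://staging.newdomain.com/old-page-1 [R=302,L]
RewriteRule ^old-page-2/?$ https://staging.newdomain.com/old-page-2 [R=302,L]
```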
White Hat / Black Hat SEO | esiow20130
-
Website not listed in Google - Screaming Frog shows 500 error? What could the issue be?
Hey, http://www.interconnect.org.uk/ - the site seems to load fine, but for some reason it is not getting indexed. When I ran the site through Screaming Frog it returned a 500 error code, which suggests the crawler can't access the site. I'm guessing Google is having the same problem. Do you have any ideas as to why this may be and how I can rectify it? Thanks, Andrew
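A 500 for a crawler while browsers load fine often means the server rejects non-browser user agents. A quick sketch in Python that compares responses under different User-Agent strings (the Screaming Frog string is approximate):

```python
import urllib.error
import urllib.request

SITE = "http://www.interconnect.org.uk/"

# Compare how the server answers a browser-like UA vs. crawler-like UAs.
for ua in ["Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
           "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
           "Screaming Frog SEO Spider"]:
    req = urllib.request.Request(SITE, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(ua[:40], "->", resp.status)
    except urllib.error.HTTPError as e:
        print(ua[:40], "->", e.code)
```

If the crawler UAs get a 500 and the browser UA gets a 200, the fix is on the server side (firewall, security plugin, or UA filtering), not in Screaming Frog.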
White Hat / Black Hat SEO | Heehaw0
-
Google Places vs. a position-one ranking above the Places results
Hi guys, Will creating a new Google Places listing for a business have any effect on its current position-one spot for its major geo keyword (e.g., "restaurants perth")? Say the business ranks number one above all the Places listings: if it sets up a Places listing, would it lose that position and merge with all the other Places accounts, or would it keep the organic listing as well as the Places listing? I have been advised that setting up the Places account could be detrimental. If that is the case, does anyone know any ways around this issue? The business really needs a Places page for Google Maps etc. Appreciate some guidance. Thanks, BC
White Hat / Black Hat SEO | Bodie0
-
A Straight Answer to Outsourcing Backlinking, Directory Submission and Social Bookmarking
Hey SEOmoz Community! I've spent a fair amount of time reading about SEO, in books as well as online here in the SEOmoz community, yet I've still struggled to find a straight answer on whether directory submissions to non-penalized websites are acceptable. I suspect the reason I haven't found a straight YES or NO is that it isn't so straightforward, and I respect that. My dilemma is as follows: I want to raise the domain authority of a few websites that I optimize for. I've earned a bunch of excellent backlinks, but it is still a painfully slow process. My clients understandably want to see results faster, and because they have virtually no past outsourced link-building campaigns, I am starting to think I could invest some money in outsourcing directory submissions. I see more and more people talking about the latest Penguin updates and how many of these sites are now penalized. But is there any harm in submitting to directories, such as the ones on SEOmoz's spreadsheet, that aren't penalized? My concern is that these will be penalized in the future anyway; is there a chance then that my site will also be de-listed from Google? At what point does Google completely blacklist your site from its engine? Furthermore, I don't understand how Google can penalize a website to the point of de-listing it, because what would prevent competitors from sending masses of spammy backlinks at one another? What it all comes down to: at this point, are verified mass directory submissions through outsourcing still much more beneficial than detrimental to the ranking of a website? Thanks SEOmoz community, Sheldon
White Hat / Black Hat SEO | swzhai0