Forcing Google to Crawl a Backlink URL
-
I was surprised that I couldn't find much information on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e., for Penguin recovery and reconsideration requests).
My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after four months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
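One thing worth ruling out first (not raised in the thread): the .gov page may be blocked by its own robots.txt, in which case no amount of nudging will get it crawled. A minimal sketch of that check using Python's standard-library robots.txt parser; the sample robots.txt content and the example.gov URLs are hypothetical, so fetch the real file from the .gov domain before drawing conclusions:

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt: str, url: str) -> bool:
    """Return True if the given robots.txt content permits Googlebot to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# Hypothetical robots.txt content for illustration only -- check the real
# https://<the-gov-domain>/robots.txt for the actual rules.
sample_robots = """\
User-agent: *
Disallow: /internal/

User-agent: Googlebot
Disallow: /archive/
"""

print(googlebot_allowed(sample_robots, "https://example.gov/reports/page.html"))  # True
print(googlebot_allowed(sample_robots, "https://example.gov/archive/page.html"))  # False
```

If the check comes back False, the page will never show up in Latest Links regardless of submissions, and the fix has to happen on the .gov site's side.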
-
No problem!
-
Appreciate the ideas. I'm considering pointing a link at it, but doing so ethically requires a little more thought and effort. Still, at this point it's probably my best option. Thanks!
-
You might try pinging the URL out, or simply building a link to it yourself.
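For the pinging route, the classic mechanism is the `weblogUpdates.ping` XML-RPC call accepted by services such as Ping-O-Matic. A hedged sketch of building that request with Python's standard library follows; the endpoint URL is an assumption to verify, and note these services were designed for blog authors announcing their own updates, so whether one will relay a third-party .gov URL is uncertain:

```python
import xmlrpc.client

# Assumed Ping-O-Matic endpoint -- verify before relying on it.
PING_ENDPOINT = "http://rpc.pingomatic.com/"

def build_ping_payload(page_title: str, page_url: str) -> bytes:
    """Serialize a weblogUpdates.ping request body without sending it."""
    xml = xmlrpc.client.dumps((page_title, page_url),
                              methodname="weblogUpdates.ping")
    return xml.encode("utf-8")

payload = build_ping_payload("Example .gov page",
                             "https://example.gov/buried/page.html")

# To actually send the ping (a live network call, so left commented out):
# proxy = xmlrpc.client.ServerProxy(PING_ENDPOINT)
# result = proxy.weblogUpdates.ping("Example .gov page",
#                                   "https://example.gov/buried/page.html")
```

Even if the ping is accepted, it only advertises the URL to aggregators; it does not guarantee Googlebot will crawl it.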
-
Both are good ideas. Thank you!
-
Ahhhh, that's a bummer.
Well, you could try submitting a URL from the .gov site that isn't as buried but links to the URL you want crawled.
You could also try emailing someone who manages the website, giving them a helpful reminder that they have quality pages that aren't being indexed by Google.
Good luck!
-
Thanks for the suggestion! I should have mentioned in the original post that I've already submitted it twice via the Submit URL form, and the URL has yet to show up in Latest Links in Webmaster Tools.
-
You could try the URL submit tool: https://www.google.com/webmasters/tools/submit-url