Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Forcing Google to Crawl a Backlink URL
-
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e., for Penguin recovery and reconsideration requests).
My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after four months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
-
No problem!
-
Appreciate the ideas. I am considering pointing a link at it, but doing so ethically requires a little more thought and effort. Still, at this point, it's probably my best option. Thanks!
-
You might try pinging the URL out, or simply building a link to the page yourself.
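For the pinging route, one low-effort option at the time was Google's sitemap ping endpoint: publish a page on a site you control that links to the buried .gov URL, list that page in your own sitemap, and ping Google with the sitemap URL. A minimal sketch, assuming a placeholder sitemap address (and note Google later deprecated the /ping endpoint in 2023):

```python
from urllib.parse import urlencode

# Hypothetical sitemap on a site you control, which lists the page
# that links to the buried .gov URL.
SITEMAP = "https://www.example.com/sitemap.xml"

# Build the ping URL; fetching it (e.g., with curl or a browser)
# asked Google to re-fetch the sitemap and discover the new page.
ping_url = "https://www.google.com/ping?" + urlencode({"sitemap": SITEMAP})
print(ping_url)
```

This doesn't guarantee a crawl of the .gov page itself, but it can get your linking page fetched quickly, which gives Googlebot a fresh path to follow.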
-
Both are good ideas. Thank you!
-
Ahhhh, that's a bummer.
Well, you could try submitting a URL from the .gov site that isn't as buried but links to the URL you want crawled.
You could also try emailing whoever manages the website, with a helpful reminder that they have quality pages that aren't being indexed regularly by Google.
Good luck!
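One thing worth ruling out before chasing crawls: whether the buried page is blocked by the .gov site's robots.txt, which would explain Googlebot never fetching it in four months. A quick sketch using Python's standard library (the robots.txt body and URLs below are made-up illustrations; in practice you'd fetch the site's real robots.txt):

```python
from urllib import robotparser

# Made-up robots.txt body for illustration; in practice, fetch and
# inspect https://example.gov/robots.txt directly.
robots_txt = """\
User-agent: *
Disallow: /internal/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

page = "https://example.gov/reports/2013/buried-page.html"
print(rp.can_fetch("Googlebot", page))  # True -> crawling is allowed
```

If this returns False for the real page, no amount of URL submissions will help, and the email-the-webmaster route becomes the only fix.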
-
Thanks for the suggestion! But I should have mentioned in the original post that I've already submitted it twice via the Submit URL form, and the URL has yet to show up in Latest Links in Webmaster Tools.
-
You could try the URL submit tool: https://www.google.com/webmasters/tools/submit-url