When you get a new inbound link, do you submit a request to Google to reindex the new page pointing at you?
-
I'm just starting my link-building campaign in earnest, and I received my first good-quality inbound link less than an hour ago.
My initial thought was that I should go directly to Google and ask them to reindex the page that linked to me...
If I make a habit of that (getting a new link, then submitting that page directly to Google), would that signal to Google that this might not be a natural link-building campaign?
The links are from legitimate (non-paid, non-exchange) partners, which Google could probably figure out, but I'm interested to hear opinions on this.
Thanks,
-Eric
-
Eric,
I understand how it feels to get that first great link. Don't request a reindex or resubmit your sitemap. Here is a little trick...
Go into Webmaster Tools; the third or fourth item in the left-hand menu is Crawl. Click that, and the third item in the dropdown is Fetch as Google. (You are going to use it to get the page crawled, even though that is not the original intent of the tool.)
Click Fetch as Google and enter the URL of the page. If the page is your home page, leave the field blank, then hit Fetch. That should get the page crawled fairly quickly.
Note: this does not mean the link will show up under Links to Your Site within the next few days, or even weeks. I have seen it take two months for a link to show in GWMT. Hope this helps, and good luck as you go forward.
Edit: Sorry, click Fetch and Render, then click Index. I forgot about this step.
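If you ever want to script this rather than click through the UI, below is a rough sketch using Google's Indexing API. Treat it as an illustration only, not the method described above: it assumes a service account JSON key with the Indexing API enabled and owner access to the Search Console property, the file path and URL are placeholders, and Google officially restricts this API to a few content types, so it is not a guaranteed substitute for Fetch as Google.

# Rough sketch: ask Google to recrawl a single URL via the Indexing API.
# Assumptions: a service account key with the Indexing API enabled and owner
# access to the property; KEY_FILE and PAGE_URL below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/indexing"]
KEY_FILE = "service-account.json"           # hypothetical key file path
PAGE_URL = "https://www.example.com/page/"  # hypothetical URL to recrawl

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
indexing = build("indexing", "v3", credentials=credentials)

# URL_UPDATED tells Google the page has changed and should be fetched again.
response = indexing.urlNotifications().publish(
    body={"url": PAGE_URL, "type": "URL_UPDATED"}
).execute()
print(response)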
Robert
Related Questions
-
Not getting any data in Search Console
Hi there, my website is ranking well, but Search Console is not fetching any data; here is a screenshot: http://prntscr.com/d4m2tz . Why am I not getting any reports for clicks and impressions? Is there some mistake I have made? Can anybody help out? Thanks.
White Hat / Black Hat SEO | pooja.verify05
-
Competitor Inbound Links Increase from 175K to 1 million in 1 month, how?
Hi all, I was recently doing some competitive analysis on external links/DA and came across something peculiar. A competitor of ours had their external links go from 175,179 in August to 1,141,365 in September. I've attached a screenshot showing the increase (Cw5tN). The competitor's domain authority also increased from 82 to 89 in the same time span. Has anyone else come across such a large link increase in such a short period of time, while also being rewarded for it? Obviously at first glance it seemed extremely black hat and unnatural, but I would love to be proven wrong. Thanks!
White Hat / Black Hat SEO | mstpeter
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing, as new coupons are added and expired deals are removed automatically. I wish to create a sitemap, but I realised that there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need the sitemap in order to get expanded sitelinks. http://couponeasy.com/
White Hat / Black Hat SEO | shopperlocal_DM
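As a side note for anyone wanting to see what a static-pages-only sitemap looks like in practice, here is a minimal sketch that generates one; the domain and paths are placeholders, not couponeasy.com's actual static pages.

# Minimal sketch: build a sitemap.xml listing only a handful of static pages.
# BASE and STATIC_PATHS are placeholders, not the real site's URLs.
import xml.etree.ElementTree as ET

BASE = "https://www.example.com"  # placeholder domain
STATIC_PATHS = ["/", "/about", "/contact", "/faq", "/blog"]  # placeholder static pages

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for path in STATIC_PATHS:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = BASE + path

# Writes sitemap.xml with an XML declaration, ready to upload and submit.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
-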
What to do with these toxic links?
Back in July I posted here that I thought someone was doing negative SEO against us. We monitor our links on a daily basis, and a lot of toxic links came in quickly within a few days. So we were proactive and ended up disavowing those links soon after we saw them. Shortly after that our rankings started to drop and we lost a good amount of traffic, though I do not know if it's really connected, since we only disavowed those toxic links and we weren't ranking from those links anyway because they were disavowed so quickly. Now it's happening again: 20 new inbound domains linking to us from complete crap websites with crap content, and not done by us. I want to disavow them, but I am thinking that maybe the first time we disavowed the links it hurt us, and maybe disavowing now will hurt us further? I think Google should be able to filter out this crap, but who knows; too much depends on this being handled correctly. Here are some of the crappy links:
http://optibike.com/?home.php=page/loans/student-loan-without-a-cosigner-2.html
http://designsbynickthegeek.com/?index.php=finance/loans/loan-for-you-3.html
http://www.nuvivaweightloss.com/?index.php=article/loans/300-loan-today.html
http://ecommercesalesmultipliersystem.com/?home.php=board/loans/fast-loan-with-monthly-payments-2.html
They are mostly duplicate content across a network of sites. How would you guys handle this?
White Hat / Black Hat SEO | DemiGR
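For reference on the mechanics only (not a recommendation to disavow in this particular case), a disavow file is just a plain-text list uploaded through Search Console: one full URL or one domain: entry per line, with # for comments. A small hypothetical sketch of generating one:

# Sketch: write a disavow.txt in the format the disavow tool expects.
# One full URL or one "domain:" entry per line; "#" lines are comments.
# The domains below are placeholders; deciding what, if anything, to
# disavow is the hard part and is not answered here.
bad_domains = [
    "spam-network-example-1.com",
    "spam-network-example-2.com",
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Suspected spam network pages found in backlink monitoring\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
-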
Does linking older posts help?
If I ask a blogger to add anchor text to an old post of theirs that relates to my niche, does that help with backlinks? Is the quality of a backlink determined by how new the post is, or does PageRank determine everything? For example, between a new post with lower PageRank and an old post with higher PageRank, which one is better to put your link on?
White Hat / Black Hat SEO | andzon
-
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). We are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components.
This is from our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, Mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
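To make the randomized-template idea above concrete, here is a small sketch; the two template strings mirror the examples in the question, while the function name, data, and venue codes are hypothetical.

# Sketch of the engineer's proposal: a few standardized paragraph templates
# with topic/city slots, with one variant chosen per page so only a handful
# of templates need maintaining. All names and data here are hypothetical.
import random

PARAGRAPH_TEMPLATES = [
    "Our {topic} training is easy to find in the {city} area.",
    "Find your {topic} training course in {city} with ease.",
]

def intro_paragraph(topic: str, city: str, venue_code: str) -> str:
    # Seed on the venue code so a given page always renders the same variant;
    # text that changed on every request would look unstable to crawlers.
    rng = random.Random(venue_code)
    return rng.choice(PARAGRAPH_TEMPLATES).format(topic=topic, city=city)

print(intro_paragraph("SharePoint", "New York, NY", "NYC01"))
-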
What if White Hat SEO does not get results?
If company A is paying $5k a month, and some of that budget is buying links or content that might be in the gray area, but it is ranking higher than company B, which is following the "rules," paying the same, and not showing up at all, what is company B supposed to do?
White Hat / Black Hat SEO | EmarketedTeam
-
How does someone rank on page one of Google for over 150 keywords with one domain?
A local SEO is proclaiming his fantastic track record for a pool company (amongst others) in our local market: over 150 keywords on page one of Google. I checked out a few things using some Moz tools and didn't find anything that would suggest this has come from white hat strategies, tactics, or links. I'm interested in how he is doing this and whether it is white hat. Thanks, C
White Hat / Black Hat SEO | charlesgrimm