How many days for a backlink?
-
Hi,
One week ago I created a blog on WordPress and submitted the URL of my blog to Google, Bing, and Yahoo.
In that blog I put a link to my webshop (the site I'm working on SEO for), but when I checked the backlinks of my webshop (with SEOmoz tools and Yahoo Site Explorer), the link from the blog still doesn't show.
How many days does it take for a backlink to be registered?
Thanks
-
I use OSE to monitor competitors' backlinks, but it takes longer for links to show up there. The tools where links appear quickly are Google Webmaster Tools and Majestic SEO, and even those still take about a month : )
-
From my experience, Twitter has proven to be a great tool for getting new pages indexed on Google really fast, but I've never used it for backlinks. Even if you do get your backlink indexed, I think its value would be close to none (even the profile link).
-
What about Twitter?
I have a Twitter profile for my webshop that I created maybe 6 months ago, but no backlink!
Then I checked one of my competitors, and he gets a backlink from his Twitter profile page...
That is very weird.
-
I've noticed that the higher the PageRank of the page where you posted the link, the faster it is seen by the search engines. I don't know if that's true; it's just my guess.
I also recommend using Yahoo Site Explorer; it's the fastest tool on the web for showing indexed links. I've seen several backlinks show up there less than 24 hours after they were created, but Google Webmaster Tools and the tools here at SEOmoz normally take longer; Google Webmaster Tools in particular seems pretty random about when it shows new links.
-
Lots of people think they can toss up a blog and manufacture immediate backlinks for their primary domain. But if that blog has no inbound links from other domains that are indexed by the search engines, the links on the blog will pass zero value back to the primary domain. In fact, they might never be discovered by the crawl at all.
-
Linkscape is only updated once a month, so be patient. Search engines can take weeks or even months to find the link as well.
Related Questions
-
No link data in many of my clients' GSC profiles!!
Hi, I noticed today that a few of my clients' GSC profiles are devoid of link data (that they did have before). Anyone know if this is a bug with Google or some other potential issue? All the best, Dan
Technical SEO | Dan-Lawrence
The GSC "Links to Your Site" report shows: Total links: no data available. Who links the most: no data available.
-
Why does my site have so many crawl errors relating to the WordPress login/captcha page?
Going through the crawl of my site, there were around 100 medium-priority issues, such as "title element too short" and "duplicate page title," and around 80 high-priority issues relating to duplicate page content. However, every page listed with these issues was the site's WordPress login/captcha page. Does anyone know how to resolve this?
Technical SEO | ZenyaS
-
Question about breaking out content from one site onto many
We have a website and domain, well-established (since 1998), that we are considering breaking apart for business reasons. This is a content site that hosts articles from a few of our brands in portal fashion. These brands are represented in print with their own magazines, so it's important to keep their presence separate. All of the content on the site is related to a general industry, with each brand covering a unique segment of the industry. For example, think of a toy industry site that hosts content from its brands covering stuffed animals, electronics, and board games. The current thinking is to break out the content from a couple of brands to their own sites and domains. The business case for this is branding purposes. I'm of the opinion that this is a bad idea, as we would likely see a noticeable decline in search traffic across the board, which we rely on for impressions for our advertisers. If we take the appropriate steps to carefully redirect pages to the new domains, what kind of hit should we expect to take from this transition? Would it make much difference if we transitioned from 1 site to 2 vs. 1 to 4? Should this move be avoided altogether? Any advice would be appreciated.
Technical SEO | accessintel
-
Problems with too many indexed pages
A client of ours has not been able to rank very well over the last few years. They are a big brand in our country, have more than 100 offline stores, and have plenty of inbound links. Our main issue has been that they have too many indexed pages. Before we started, they had around 750,000 pages in the Google index. After a bit of work we got it down to 400,000-450,000. During our latest push we used the robots meta tag with "noindex, nofollow" on all pages we wanted out of the index, along with a canonical to the correct URL; nothing was done in robots.txt to block the crawlers from the pages we wanted out. Our aim is to get it down to roughly 5,000+ pages; they just passed 5,000 products + 100 categories. I added this about 10 days ago, but nothing has happened yet. Is there anything I can do to speed up the process of getting all those pages out of the index? The site is vita.no if you want to have a look!
Technical SEO | Inevo
-
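One side note on the noindex approach in the question above: the robots meta tag only works if crawlers can actually fetch the page, which is why it was right not to also block those URLs in robots.txt. At this page volume it's easy for a template to miss the tag, so a minimal sketch for spot-checking whether a page's HTML carries a noindex directive (the example HTML is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Directives are comma-separated, e.g. "noindex, nofollow"
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

def has_noindex(html):
    """True if the page carries a robots meta tag with a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(page))  # True
```

Run that against the fetched HTML of a sample of the 400,000 URLs you expect to drop out; if the tag is missing on any of them, that subset will never leave the index no matter how long you wait.
-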
Accidentally blocked Googlebot for 14 days
Today, after I noticed a huge drop in organic traffic to the inner pages of my site, I looked into the code and realized a bug in the last commit caused the server to show captcha pages to all Googlebot requests starting Apr 24. My site has more than 4,000,000 pages in the index. Before the last code change, Googlebot was exempt from being shown the captcha, so every inner page was crawled and indexed perfectly with no problem. The bug broke the whitelisting mechanism and treated requests from Google's IP addresses the same as regular users. This led to the captcha page being crawled when Googlebot visited thousands of my site's inner pages, which makes Google think all my inner pages are identical to each other. Google removed all the inner pages from the SERPs starting May 5th, even though many of those inner pages had good rankings. I formerly thought this was a manual or algorithmic penalty, but: 1. I did not receive a warning message in GWT. 2. The ranking for the main URL is good. I tried "Fetch as Google" in GWT and realized all Googlebot saw in the past 14 days was the same captcha page for every inner page. Now I have fixed the bug and updated the production site. I just wanted to ask: 1. How long will it take for Google to remove the "duplicate content" flag on my inner pages and show them in the SERPs again? From my experience Googlebot revisits URLs quite often, but once a URL is flagged as containing similar content it can be difficult to recover; is that correct? 2. Besides waiting for Google to update its index, what else can I do right now? Thanks in advance for your answers.
Technical SEO | Bull135
-
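For anyone rebuilding a Googlebot whitelist like the one that broke above: Google's documented way to verify Googlebot is a reverse-DNS lookup followed by a forward confirmation, rather than a hard-coded IP list that can silently go stale. A minimal sketch, assuming the IP and hostnames in the example are just illustrations:

```python
import socket

# Domains Google publishes for its crawlers' reverse-DNS names
GOOGLE_DOMAINS = (".googlebot.com", ".google.com")

def is_google_hostname(hostname):
    """Check a reverse-DNS name against Google's published crawler domains."""
    hostname = hostname.rstrip(".").lower()
    return hostname.endswith(GOOGLE_DOMAINS)

def is_verified_googlebot(ip):
    """Reverse-DNS the IP, check the domain, then forward-confirm the name
    resolves back to the same IP. Makes network calls; cache per IP in
    production so you don't do two lookups per request."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False

print(is_google_hostname("crawl-66-249-66-1.googlebot.com"))  # True
print(is_google_hostname("fake-googlebot.example.com"))       # False
```

The forward-confirmation step matters because anyone can point their reverse DNS at a name ending in googlebot.com; only Google controls the forward records for that domain.
-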
Can you redirect from a 410 server error? I see many 410s that should be redirected to an existing page.
We have 150,000 410 server errors. Many of them should be redirected to an existing URL. This is a result of a complete website redesign, including new navigation and a new web platform. I believe IT may have inadvertently marked many 404s as 410s. Can I fix this, or is a 410 error permanent? Thank you for your help.
Technical SEO | sxsoule
-
Can too many pages hurt crawling and ranking?
Hi, I work for the local yellow pages in Belgium. Over the last months we introduced a successful technique to boost SEO traffic: we created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, and so on. All signs (traffic, indexation of XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large-scale operation": can it hurt crawling and ranking if Google notices big volumes of content, even unique content? Please advise.
Technical SEO | TruvoDirectories
-
Too many 301 redirects - good or bad?
Hi, currently page A redirects to page B. I am in the process of developing a new site for the same domain, and this time page B will be redirected to page C. This is going to happen on many pages. Is this correct, or should I adopt some other strategy? Will it have an adverse effect on the speed of my site? Page A -----> Page B ------> Page C Regards, Shailendra
Technical SEO | IM_Learner
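-
The usual advice for chains like A -> B -> C is to point every old URL directly at its final destination, so each request costs one redirect instead of two. If your redirects live in a config file or database, you can flatten the map offline before deploying. A minimal sketch, using hypothetical URLs:

```python
def flatten_redirects(redirects):
    """Collapse chains like A -> B -> C so every source points straight
    at its final destination. `redirects` maps source URL -> target URL."""
    flattened = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        while target in redirects:          # follow the chain to its end
            if target in seen:              # guard against redirect loops
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

chain = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(flatten_redirects(chain))  # {'/page-a': '/page-c', '/page-b': '/page-c'}
```

Note that /page-b still gets its own redirect to /page-c; flattening only removes the intermediate hop for visitors and crawlers arriving at /page-a, it never drops a mapping.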