Adding my web link on Wikiquote: is it OK?
Related Questions
"One Page With Two Links To Same Page; We Counted The First Link" Is this true?
I read this today: http://searchengineland.com/googles-matt-cutts-one-page-two-links-page-counted-first-link-192718 and thought to myself, yep, that's what I've been reading on Moz for years (a pity Matt could not confirm it's still the case for 2014). But reading through the comments, Michael Martinez of http://www.seo-theory.com/ pointed out that Matt says "...the last time I checked, was 2009, and back then -- uh, we might, for example, only have selected one of the links from a given page."

That would imply it is not always the first link. Michael goes on to say "Back in 2008 when Rand WRONGLY claimed that Google was only counting the first link (I shared results of a test where it passed anchor text from TWO links on the same page)" and then "In practice the search engine sometimes skipped over links and took anchor text from a second or third link down the page."

For me this is significant. I know people whose "SEO experts" recommended attaching a blog to their e-commerce site and posting entries (with no real interest for readers) containing anchor-text links to their landing pages. I thought posting blog entries just for the anchor-text links was a waste of time if you are already linking to the landing page from your main navigation, since Google would see that link first. But if Michael is correct, then those anchor-text blog posts would have value. So who is right, Rand or Michael?
Technical SEO | PaddyDisplays
Transferring link juice on a page with over 150 links
I'm building a resource section that will hopefully attract a lot of external links, but the problem is that the main index page will carry a large number of links (around 150 internal links: 120 pointing to resource sub-pages plus 30 site navigation links), so it will dilute the link juice passed and possibly waste some of it. Each of those 120 sub-pages will in turn contain about 50-100 external links and 30 internal navigation links. To visualise the matter, think of this resource as a collection of hundreds of blogs categorised by domain on the index page. The question is how to build the primary page (the one with 150 links) so it passes the most link juice to the site, or do you think this is OK and I shouldn't be worried about it (I know there used to be a roughly 100-links-per-page limit)? Any ideas? Many thanks
Technical SEO | flo2
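The dilution worry in the question above can be made concrete with a toy model. This is a deliberately simplified, first-order sketch (real PageRank is iterative, damped, and does not weight every link equally); the function name and the starting equity of 1.0 are illustrative assumptions, not anything Google publishes.

```python
# Toy model: a page's passable link equity splits evenly across its outlinks.
# This ignores damping, link position, and nofollow; it is only meant to
# compare page architectures, not to estimate real PageRank values.
def equity_per_link(page_equity: float, num_links: int) -> float:
    """Share of a page's passable equity that each outlink receives."""
    if num_links <= 0:
        return 0.0
    return page_equity / num_links

# Index page with 150 links: each sub-page link gets 1/150 of the equity.
index_share = equity_per_link(1.0, 150)

# Each sub-page then splits what it received across its own ~130 links
# (100 external + 30 navigational), so a link two hops below the index
# receives roughly 1/150 * 1/130 = 1/19500 of the index page's equity.
subpage_share = equity_per_link(index_share, 130)
```

Under this model, halving the index page's link count doubles each remaining link's share, which is why grouping many sub-pages under a handful of category pages is the usual recommendation for large resource sections.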
Should we rel=nofollow these links ?
On our website, we have a section of free to low-cost tools that could help small businesses increase their productivity without spending big bucks. For example, this is the page for online collaboration tools: http://www.bdc.ca/EN/solutions/smart_tech/tech_advice/free_low_cost_applications/Pages/online_collaboration_tools.aspx None of the companies pay anything to be on these lists; we actually do quite a lot of research to choose which should be listed and which should not. Recently, one of the companies on our lists asked us to add rel=nofollow to the link to their website because they had been targeted by a manual action on Google and want their link profile to be as clean as possible (probably too clean). My question is: should we add rel=nofollow to all these links? Thanks, Jean-François Monfette
Technical SEO | jfmonfette
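For readers unfamiliar with the attribute being discussed: rel=nofollow is set per link, directly on the anchor tag. The URL below is a placeholder, not one of the listings from the question.

```html
<!-- A normal (followed) link: -->
<a href="http://example.com/tool">Example tool</a>

<!-- The same link with rel="nofollow", asking engines not to pass equity: -->
<a href="http://example.com/tool" rel="nofollow">Example tool</a>
```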
Finding Broken Back Links
Hello there. I am new here but really want to mend my broken website by myself, as I enjoy a challenge! I used to have great rankings but have moved websites a few times (same domain), and the last move was to WordPress. I now have loads of broken links in the SERPs and wondered if there is an easy way to flush them out of Google, as they are generating lots of 404 errors? There really are too many to 301 individually (I have done the main pages). Also, how do I crawl my website for internal broken links? Does SEOmoz have something, or is there an external program you would recommend? Thanks, Victoria
Technical SEO | vcasebourne
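On the internal-crawl question above, dedicated crawlers will do this at scale, but a minimal check can be scripted with Python's standard library alone. This sketch only covers the first half of the job, extracting same-host links from one page's HTML; the class name and example markup are hypothetical. In practice you would then fetch each collected URL (e.g. with urllib.request) and flag any that return 404.

```python
# Minimal same-host link extractor using only the standard library.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects absolute URLs of <a href> links on the same host as base_url."""

    def __init__(self, base_url: str):
        super().__init__()
        self.base = base_url
        self.host = urlparse(base_url).netloc
        self.internal_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base, href)  # resolve relative hrefs
        if urlparse(absolute).netloc == self.host:
            self.internal_links.append(absolute)

# Hypothetical page markup: two internal links, one external.
html = (
    '<a href="/about">About</a>'
    '<a href="http://example.com/old-page">Old</a>'
    '<a href="http://other.com/">Elsewhere</a>'
)

collector = LinkCollector("http://example.com/")
collector.feed(html)
# collector.internal_links now holds the two example.com URLs;
# request each one and treat a 404 response as a broken internal link.
```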
External Links Discrepancy
Hello folks. Apologies for my ignorance, but I'm an SEO novice… One of our competitors boasts over 300,000 external links, yet when we analysed their links via http://www.opensiteexplorer.org we can only see around 10,000 in the "Number of Domains Linking to this Page" section. Can someone please assist and point out something which I assume is painfully obvious! Cheers, Chris
Technical SEO | footyfriends
Unnatural links in webmaster tools
I received a Google Webmaster Tools notice of detected unnatural links to my site. I downloaded the latest links from Google Webmaster Tools and decided to remove the links from the last four months. I also got several links from directory sites and want to remove those too. How do I delete links from directory sites? If there is no way to delete a directory link, please let me know another option to get rid of this issue.
Technical SEO | Alick300
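On the "other option" asked about above: you generally cannot delete a link that sits on a directory site you don't control. The usual sequence is to ask each site owner to remove the link, then submit whatever remains in a disavow file through Webmaster Tools. The file is plain text, one entry per line; the domains below are placeholders.

```text
# Lines starting with # are comments.
# Disavow everything from a directory you could not get removed:
domain:spammy-directory.example
# Or disavow a single page only:
http://spammy-directory.example/listings/widgets.html
```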
Affiliate links
Is there a best practice for linking out to affiliate URLs post-Panda? I know some believe it can be a factor.
Technical SEO | PeterM22
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). When we drilled down, we noticed these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains; in December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/ The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to robots.txt and also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:

11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378

I am now thinking of cleaning up robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages? moz1.PNG moz2.PNG moz3.PNG
Technical SEO | JacoRoux
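One detail worth checking in the setup described above: disallowing the sub-domain directories in robots.txt prevents Googlebot from recrawling those URLs, so it may never see the 404s or redirects, and the stale link counts can linger. If the goal is to consolidate everything back onto www, a host-wide 301 is the usual approach. This is a hypothetical Apache snippet; the exact directives depend on the server setup.

```apache
# Hypothetical vhost for the mobile sub-domain: send every request,
# path intact, to the same path on the primary www host with a 301.
<VirtualHost *:80>
    ServerName m.jump.co.za
    RewriteEngine On
    RewriteRule ^(.*)$ http://www.jump.co.za$1 [R=301,L]
</VirtualHost>
```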