Does Google Still Pass Anchor Text for Multiple Links to the Same Page When Using a Hashtag? What About Indexation?
-
Both of these seem a little counter-intuitive to me so I want to make sure I'm on the same page.
I'm wondering if I need to add "#" fragments to my internal links when the page I'm linking to is already:
a.) in the site's navigation
b.) in the sidebar
More specifically, in your experience...do the search engines only give credit to (or mostly give credit to) the anchor text used in the navigation and ignore the anchor text used in the body of the article?
I've found (in here) a couple of folks mentioning that content after a hashtagged link isn't indexed.
Just so I understand this...
a.) if I were to use a hashtag at the end of a link as the first link in the body of a page, does that mean the rest of the article won't be indexed?
b.) if I use a table of contents at the top of a page and link to places within the document, will only the parts of the page above the table of contents be indexed/crawled?
Thanks ahead of time! I really appreciate the help.
-
Howdy Spencer!
Whoa! Lots of questions here. Let's see if we can sort this out.
There's a lot of debate around this, but most SEOs consider hashes fine for user experience and only a minor factor when it comes to influencing search results.
Here's what we know: Google indexes the first anchor text in the HTML. That's not necessarily the same thing as the first anchor on the visible page, since HTML/CSS can be arranged so that some links render above others regardless of their order in the source.
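To make that concrete, here's a hypothetical sketch (the URLs and markup are made up for illustration) where the body link comes first in the HTML source even though the navigation renders at the top of the page:

```html
<body>
  <main>
    <!-- First anchor in the HTML source order -->
    <p>Read more about our <a href="/services/">services</a>.</p>
  </main>

  <!-- Rendered at the top of the page via CSS, but second in source order -->
  <nav style="position: absolute; top: 0;">
    <a href="/services/">Services</a>
  </nav>
</body>
```

In this arrangement, the anchor text "services" in the body, not "Services" in the nav, is the first anchor Google encounters in the HTML.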
That said, folks have experimented and found ways to get additional anchors indexed, including the use of hash fragments. What we don't know is how much weight/authority these links pass. It's generally believed (and I agree) that they probably don't pass as much value as the first link to the page.
If you have a link in your navigation and another in the body text further down in the HTML, Google will index the first anchor but, in most circumstances, not the second. Does this mean Google passes no value through the second? There's a lot of debate about this (read the comments here: http://www.seomoz.org/blog/all-about-anchor-text-whiteboard-friday).
I find it best not to micro-manage your links and simply keep the following in mind: If you want a link to pass as much value and authority as possible, place it in the body of the page.
Certainly there's a case made for using named anchors (#). They're good for navigation and user experience, and we see search engines pick them up in search results, but the value gained by manipulating them for ranking purposes is likely negligible.
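For reference, a named-anchor table of contents looks like this (hypothetical section names, not from your site):

```html
<!-- Table of contents linking to fragments within the same page -->
<ul>
  <li><a href="#pricing">Pricing</a></li>
  <li><a href="#faq">FAQ</a></li>
</ul>

<!-- The fragment targets further down the page -->
<h2 id="pricing">Pricing</h2>
<p>...</p>
<h2 id="faq">FAQ</h2>
<p>...</p>
```

The fragment (everything after the #) never leaves the browser; crawlers treat both links as pointing at the same URL, which is why these anchors help users jump around the page without creating new indexable URLs.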
"I've found (in here) a couple of folks mentioning that content after a hashtagged link isn't indexed."
Hmm... I've never heard of that, and it sounds fishy. I'd love to see any research that's been done.
Hope this helps! Best of luck with your SEO.