Is this site structure going to kill link juice?
-
This is a parallax-type page, and the navigation basically points you to the homepage every time. The sub-menu on the secondary pages only ushers you down the page to various topics. This design has concerned me for some time, but I'd like another opinion.
-
Ah, I see what you mean, Alan, and I'm inclined to agree. With the JS you mentioned, there's no risk of a user (or crawler) being taken away from the page itself, so no link equity would be passed and "diluted", as it were. Thanks for posting this!
-
Thanks!
-
I differ in thinking here on one point.
I believe that links pointing to the same page decay like any other link: PageRank flows out of them just the same. Obviously it goes back to the same page, but some of it is lost every time.
What I do instead, which also looks much better, is use JavaScript to scroll to that spot:
$('html, body').animate({ scrollTop: $("#mytarget").offset().top - 70 }, 'slow');
This will scroll to your anchor, and the offset lets you position things nicely. Even if I am wrong about the PageRank, the scrolling animation looks and feels much better.
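To apply that to every in-page nav link at once, here's a hedged sketch (assumptions: jQuery is loaded, the menu links use plain href="#..." anchors, and the 70px offset clears a fixed header; "#mytarget" style IDs are whatever your page actually uses):

// Intercept clicks on in-page anchor links and animate instead of jumping
$('a[href^="#"]').on('click', function (e) {
  e.preventDefault(); // stop the instant jump so the animation can run
  var $target = $($(this).attr('href')); // e.g. the element matching "#mytarget"
  if ($target.length) {
    $('html, body').animate({ scrollTop: $target.offset().top - 70 }, 'slow');
  }
});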
-
I think this could be happening because of the way Google interprets the curly (smart) quotes in that text. I tried it with the opening quote, but not the closing quote, and it worked. Notice too that the text that's highlighted (or bolded) in the search result is everything up to the closing quote.
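If the curly quotes are the culprit, a quick way to test it (plain JavaScript, runnable in a browser console; the sample text is just an illustration) is to normalise them to straight quotes before pasting the string into a quoted Google search:

// Replace curly (smart) quotes with plain ones before building the search query
var grab = '“is packed with ground-breaking features”'; // text copied straight from the page
var normalized = grab.replace(/[\u201C\u201D]/g, '"').replace(/[\u2018\u2019]/g, "'");
console.log(normalized); // now safe for an exact-match search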
-
Yes, that is interesting. Strange.
-
Interestingly, if I search it without quotes, it comes right up just fine.
-
Thanks. That has been a big concern for me. We consolidated quite a few pages into one long page, but our organic traffic has gone up, so I guess so far, so good? We are in the process of revamping our keywords and readjusting our content to match. I'm not sure if I should bring this up or not.
-
Here is a text grab from the "products" section of that page....
SQL Sentry Performance Advisor
is packed with ground-breaking features that are
When I search Google for that text string in quotes, the only thing that comes up is a scraper site and your AdWords ad.
I converted some pages to one of these fancy-pants formats and my long-tail traffic into those pages tanked.
-
Hi there
I see what you're getting at here, but there's no great cause for concern.
So, all of those hash links (#) are internal anchors. Link juice (specifically PageRank) does not pass and is not lost with an internal anchor. It only passes through full links, like the text links you have to other pages on the homepage. It won't try to pass (and then fail) on any internal anchors, so link juice won't be "killed" by them, so to speak.
One thing you'll need to keep in mind is how people link to you externally. To ensure all of the "link juice" from an external site is passed to yours, make sure that they link to you without any internal anchors (basically, your URL without any of the #'s).
However, even then, my recent tests have shown that the majority of the strength (if not all) will pass anyway. Seems to me that Google at least treats these internal anchors very well and will recognise where to pass the "link juice" even if one is present in a URL.
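One way to see why a # can't dilute anything: the fragment is client-side only and never even reaches the server, so a crawler ultimately resolves every anchored variant to the same document. A small illustration in plain JavaScript (runnable in a browser console; the URL is hypothetical):

var withAnchor = new URL('https://example.com/page#pricing');
console.log(withAnchor.pathname); // "/page" - the fragment is not part of the path
withAnchor.hash = ''; // normalising an inbound link: drop the fragment
console.log(withAnchor.href); // "https://example.com/page" - the same document either way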
So, from a strictly SEO and internal structure point of view, I don't think there's too much to be concerned about here. If it works for you user-experience-wise, I'd say keep it for sure.
Related Questions
-
My site has 17.5m total links according to Moz (16.6m internal follow & 840k nofollow). I think I have a problem...
We are hosted by Visualsoft and it is a proprietary platform, so we don't have full control of our site. In comparison, three of our main competitors, two of which are way, way bigger than us, have 1.4m and 4.7m; another, still probably double or perhaps triple our size, is at 2.5m. Should I worry? Should I post my website URL on here? I would like to start working on canonical links on my site but am not sure where to start. Does Moz Pro have some sort of check or rating? I have no idea if even the basics mentioned in the tutorials have been done...
On-Page Optimization | Russell-Gorilla
-
Blog issue: broken link
Taking Great Photographs Underwater | May 25, 2015 | By sdwellers@aol.com | No comments yet | florida keys, key largo diving
Excuse my ignorance, I suspect this is an easy issue... but at the top of each of my blog posts I have what you see above. The "No comments yet" tab is showing as a broken link (404 error). Why? And how do I fix it? Thank you
On-Page Optimization | sdwellers
-
Cross-linking for mobile SEO
Hi everyone! I am having a hard time finding information about whether to (and how to) apply internal SEO linking to mobile versions of sites. We decided to go with dynamic serving with user-agent detection. Our desktop site has quite heavy internal SEO cross-linking. As I understand it, for mobile we should simplify and focus on usability, so get rid of unnecessary links. But I have a doubt about whether removing this part of the web structure can hurt our SEO. Do Google mobile bots look at and rank mobile versions of pages from scratch, or do they use what they know about the site and the site's structure from its desktop version?
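(A side note on the dynamic-serving setup mentioned here: the standard advice is to send a Vary: User-Agent header so crawlers and caches know the same URL returns different HTML per device. A hedged sketch, assuming an Express-style server; the renderer functions are hypothetical stubs:)

var app = require('express')(); // assumes Express is installed

function renderMobile(path) { return '<p>mobile view of ' + path + '</p>'; }   // hypothetical stub
function renderDesktop(path) { return '<p>desktop view of ' + path + '</p>'; } // hypothetical stub

app.get('*', function (req, res) {
  res.set('Vary', 'User-Agent'); // tells crawlers/caches the response depends on the user agent
  var isMobile = /Mobi/i.test(req.get('User-Agent') || '');
  res.send(isMobile ? renderMobile(req.path) : renderDesktop(req.path));
});

app.listen(3000);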
On-Page Optimization | ofertia
-
Are menu links considered spammy?
Hi, I'm wondering if having a footer menu with keyword links is considered spam? It makes sense to me to have your keywords and links to relevant pages on each page. Thanks 🙂
On-Page Optimization | Memoz
-
Spammy link for each keyword
Some people believe that having a link for each keyword and a page of content for each keyword (300+ words) can help ranking for those keywords. However, the old approach of "restaurant New York", "restaurant Buffalo", "restaurant Newark" has come to be seen as a terrible SEO practice. I don't know whether this was because it's spammy or because people usually combined it with thin content that was 95% duplicate. Which brings us to: http://hungryhouse.co.uk/ Why does such a major company have the following on the site (see the footer)? Aberdeen Takeaway Birmingham Takeaway Brighton Takeaway Bristol Takeaway Cambridge Takeaway Canterbury Takeaway Cardiff Takeaway Coventry Takeaway Edinburgh Takeaway Glasgow Takeaway Leeds Takeaway Leicester Takeaway Liverpool Takeaway London Takeaway Manchester Takeaway Newcastle Takeaway Nottingham Takeaway Sheffield Takeaway Southampton Takeaway York Takeaway Indian Takeaway Chinese Takeaway Thai Takeaway Italian Takeaway Cantonese Takeaway Pizza Delivery Sushi Takeaway Kebab Takeaway Fish and Chips Sandwiches Do they know something I don't? [unnecessary links removed by staff]
On-Page Optimization | JamesFx
-
Large Site - Advice on Subdomaining
I have a large news site with over 1 million pages (I have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining http://bit.ly/dczF5y

There are two types of content: news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP, or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually. This helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites. Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even if it is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem we haven't fixed. You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that they consistently rank above us, that have all the same content as us, and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different. So this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screw-ups because we had multiple editors who didn't see that a story was already published, and at one time a system code race condition - entirely my fault, as I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.
I haven't sent in a reconsideration request for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't. So, having exhausted all of the things I can think of, I'm down to my last three ideas.

1. Split all of the PRs off onto subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet, and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.)
3. Give up on the PRs and delete them all, losing another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us. (Or find them all and rewrite them as stories - tens of thousands of them.) That would also throw all our alliances under the bus. (I really don't want to do this.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped. My thought is that splitting them off onto subdomains will have a number of effects:

1. Take most of the syndicated content onto subdomains, so it's not on the main domain.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:

1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered.

1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them in only one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these certainly could be on the subdomain and not the main domain; b) various category index pages - agriculture, supermarkets, mining, etc. - these would have to stay on the main domain because they mix different PR providers.
5. Is this a bad idea?

I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
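A hedged sketch of the step-1 redirect described above (assumes an Express-style server and a hypothetical /pr/<provider>/<slug> URL shape; the provider-to-subdomain mapping mirrors the plan):

var app = require('express')(); // assumes Express is installed

// Map each press-release provider to its planned subdomain
var providerSubdomains = { prnewswire: 'pn', businesswire: 'bw' };

app.get('/pr/:provider/:slug', function (req, res) {
  var sub = providerSubdomains[req.params.provider];
  if (!sub) return res.sendStatus(404); // unknown provider
  // 301 (permanent) so the old URLs hand their strength to the subdomain pages
  res.redirect(301, 'https://' + sub + '.domain.com/' + req.params.slug);
});

app.listen(3000);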
On-Page Optimization | loopyal
-
Too Many On-Page Links
I recently took on a website design client and ran his website through a battery of tests using Pro to take a look at the crawl errors. One that seems to stump me is the error "Too many On-Page links" concerning his blog. (http://franksdesigns.com/wp/blog) This is the first time I've seen this error and am rather confused. The report says there are 104 links on this page. However, I'm having trouble grasping this concept or finding the 104 links. Any suggestions are greatly appreciated. Thank you for your support!
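If it helps with finding them, here's a quick count of every crawlable link on the page (plain JavaScript, pasted into the browser console while viewing the blog):

// Count anchors that actually carry an href - roughly what a crawler sees as links
console.log(document.querySelectorAll('a[href]').length);

Nav menus, sidebar widgets, category/tag lists, and footer links all count toward that total, which is usually where a surprising number like 104 comes from.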
On-Page Optimization | WebLadder