What happens if we remove all the links to internal pages from our homepage?
-
Hi Moz community,
We want to try removing all the links from our homepage to internal pages and keeping just a free trial button. Will this impact our SEO in any way? We have nearly 15 important internal pages at the 2nd and 3rd hierarchy levels. They may drop in rankings, but we want to take the risk for a few days to understand how it works. Your opinions, please!
Thanks
-
Great point Chris, this is definitely worth looking into! A "lightbox pop up thingy" may give you what you want from a functionality standpoint, without having to kill those links.
Also, I do believe "lightbox pop up thingy" is the technical term. I've heard others refer to them as a "modal window" or something silly like that =P
-
Rather than go through that, could you try something like a lightbox popup thingy? (I'm not a designer and don't know the real names for them)
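To make the suggestion concrete, here is a minimal sketch of what such a "lightbox popup" / modal window could look like, using the native HTML <dialog> element. All page names and copy below are made up for illustration; the point is that the internal links stay in the markup (and remain crawlable) while the homepage visually shows only the trial button:

```html
<!-- Hypothetical sketch: the homepage surfaces only a trial button, -->
<!-- but the internal links still exist in the HTML inside a modal. -->
<button onclick="document.getElementById('menu').showModal()">Start free trial</button>

<dialog id="menu">
  <nav>
    <a href="/features">Features</a>
    <a href="/pricing">Pricing</a>
    <a href="/case-studies">Case studies</a>
  </nav>
  <button onclick="document.getElementById('menu').close()">Close</button>
</dialog>
```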
-
The major impact will be on the internal pages, as they will lose all the page authority and ranking power distributed from the home page, which is typically the strongest page of a site. The home page will also suffer an adverse impact along with the internal pages, as this internal linking defines the architecture and hierarchy of a website. If you destroy that architecture, it will have a negative impact, unless you are a search engine (search engines have a flat structure like the one you are trying).
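To make the "distributed authority" point concrete, here is a rough sketch: a toy simplified-PageRank simulation in Python (page names are hypothetical, and this is not what Google actually computes) showing how internal pages lose score when the homepage stops linking to them:

```python
# Toy simplified PageRank: illustrates how internal pages lose score
# when the homepage stops linking to them. Purely illustrative.

def pagerank(links, pages, damping=0.85, iters=50):
    """links: page -> list of pages it links to (all non-empty here)."""
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {}
        for p in pages:
            # Sum contributions from every page that links to p.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new
    return rank

pages = ["home", "a", "b"]

# Before: homepage links to both internal pages; they link back home.
before = pagerank({"home": ["a", "b"], "a": ["home"], "b": ["home"]}, pages)

# After: homepage links removed (self-link stands in for a dangling node);
# internal pages still link back home.
after = pagerank({"home": ["home"], "a": ["home"], "b": ["home"]}, pages)

print("internal page score before:", round(before["a"], 3))
print("internal page score after: ", round(after["a"], 3))
```

In this toy model the internal pages collapse to the minimum baseline score once the homepage stops linking to them, which is the intuition behind the answer above.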
-
You might be risking both.
-
Hi Vijay Gaur and dhananjay.kumar1,
Thanks for the response. You mentioned that we may lose rankings. Do you mean homepage rankings or internal page rankings?
-
This will definitely impact your SEO negatively. You could lose rankings, but if you still want to take this risk, just go for it.
-
Although I could agree with Bryan Loconto about taking a risk, it looks like a huge one, as your website structure might not be validated very well by search engine bots. Also, it might take time to recover the rankings once lost.
PS: I am not being conservative, just a little cautious.
-
Hey vtmoz,
I am a firm proponent of experimentation and risk-taking! I say go with it; worst case, you can rule it out and revert the change. This seems like a no-brainer experiment.
Hope this helps (or motivates). Cheers!
"No amount of experimentation can ever prove me right; a single experiment can prove me wrong." - Albert Einstein
Related Questions
-
Internal pages ranking over the homepage: How to optimise to rank better at Google?
Hi, We have experienced a shift in the SERPs for more than a year, with internal pages ranking over the website homepage. Previously, website homepages used to rank for the primary keyword, like moz.com for "SEO". Now we can see that internal pages like moz.com/learn/seo/what-is-seo have been ranking for the primary keyword "SEO". Google is picking these "what is ABC" pages over the homepage. All our competitor sites are ranking with these internal pages, which are about "what is (primary keyword)". We do have the same "what is..." internal pages, but those pages are not ranking; only our homepage is ranking. Moreover, we dropped more than 15 positions after this shift in the SERPs. How do we diagnose this? Thanks
Algorithm Updates | vtmoz
-
Sitemaps for landing pages
Good morning MOZ Community, We've been doing some re-vamping recently on our primary sitemap, and it's currently being reindexed by the search engines. We have also been developing landing pages, both for SEO and SEM. Specifically for SEO, the pages are focused on specific, long-tail search terms for a number of our niche areas of focus. Should I, or do I need to be considering a separate sitemap for these? Everything I have read about sitemaps simply indicates that if a site has over 50 thousand pages or so, then you need to split a sitemap. Do I need to worry about a sitemap for landing pages? Or simply add them to our primary sitemap? Thanks in advance for your insights and advice.
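For what it's worth, if you do decide to split, the sitemap protocol's answer is a sitemap index file that references multiple child sitemaps, rather than two unrelated sitemaps. A minimal sketch (the file names below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index: one child sitemap for the main site,
     one for the SEO landing pages, both submitted under a single index. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-main.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-landing-pages.xml</loc>
  </sitemap>
</sitemapindex>
```

Splitting by page type this way is optional below the 50,000-URL limit, but it does let you track indexation of the landing pages separately in Webmaster Tools.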
Algorithm Updates | bwaller
-
Agency footer link, do we keep it ?
Hello! I was wondering if it's still a good idea to leave a do-follow link at the bottom of agency-released websites, because they obviously come from different websites with no connection to a web marketing agency. Should we keep them in the footer as no-follow? If we do so, how do we get some link juice from the different websites? It sounds a bit stupid, but one of my partners went from PR7 to PR5 recently. I guess Penguin 2.0 did not like all the links from its customers' websites. Thanks a lot!
Algorithm Updates | AymanH
-
Member's Badge as Link Building to Homepage or Internal Pages?
Providing members an embeddable badge is a well-known link building tactic. Is it better to have the badges from hundreds or even thousands of members link back to the homepage of a website, or to a lot of different inner pages? The inner pages would be the individual member's profile, which sits under a category (such as a service and organisation by location). Members' websites would generally be related to the content of the website. What are the advantages of each? 1. Do links to the homepage make it easier to rank for competitive keywords on the homepage? If the types of websites were to vary a lot, say a carpet cleaning website and a web designer website, and they all linked to the homepage, would it cause some confusion for the link profile?
Algorithm Updates | designquotes
-
I think my inbound link anchor text looks un-natural to google - How to fix?
Hi all, For a bit of background, see this question I posted recently: http://www.seomoz.org/q/lost-over-65-of-organic-visits-since-sept-please-help From the responses there, and from looking into my backlinks and my competitors', I can see an issue with the anchor text on my inbound links... nearly all keywords and very, very few brand names etc. From what I can gather (using Open Site Explorer), the page in question has: 1100 inbound links from 900 domains. These use 90 different anchor texts. 106 of these links use my brand / website name in the anchor text. These 106 links are spread over 18 domains (73 from 1 directory). About 5-10% of the links are from directories; the rest are from what I would describe as "proper websites". From my very limited knowledge of this, the issue is that my brand / website should have a far higher ratio of links using it as the anchor text than any keyword... which, as you can see from the above, is not the case... If it wasn't for that 1 directory, there would only be 33 links with my brand out of over 1000... I need to start fixing this, but was wondering how to start... Below is a list of options I could try. I have no idea if these would help or hinder; any advice you could give on the potential effects of the options below would be very helpful. Options (the below are hypothetical, I have no idea if I will be able to get it done - just thinking out loud here): Get as many as possible of the "directory" links removed. Remove keywords from 50-60% of links and replace with branding. Or try to add branding to 50-60% of the anchor texts, something like [Brand] + [keyword]. Forget about what's been done previously / changing it will not help in any way / and focus on branding in anchor text for any future link building? Thanks James
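As a rough sketch of how you might audit a branded-vs-keyword anchor ratio yourself from an exported link list (the anchors, brand name, and counts below are made up for illustration, not taken from the question):

```python
# Rough sketch: classify anchor texts from a link export (e.g. from
# Open Site Explorer) as branded vs keyword and compute the branded ratio.
# All data below is hypothetical.

def branded_ratio(anchors, brand_terms):
    """anchors: anchor text -> number of links using it."""
    branded = sum(
        count for text, count in anchors.items()
        if any(term in text.lower() for term in brand_terms)
    )
    total = sum(anchors.values())
    return branded / total if total else 0.0

# Hypothetical export: anchor text -> link count
anchors = {
    "acme widgets": 73,      # branded
    "acme": 33,              # branded
    "buy blue widgets": 500, # keyword
    "cheap widgets": 494,    # keyword
}

print(f"branded anchor ratio: {branded_ratio(anchors, ['acme']):.1%}")
```

Running this kind of tally over the full export makes it easy to see whether fixing one directory's links would meaningfully move the ratio, or whether the change needs to be spread across many domains.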
Algorithm Updates | isntworkdull
-
Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent the Google bot from indexing our site. In our htaccess file someone recently wrote the following statement:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

It's almost funny because it was a rewrite that rewrites back to itself... We found in Webmaster Tools that the site was not able to be indexed by the Google bot due to not detecting the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded. However, we have added one now, for kicks really. The robots.txt file, though, was never the problem with regard to the bot accessing the site. Rather, it was the rewrite statement above that was blocking it. We tested the site, not knowing what the deal was, so we went under Webmaster Tools, then Health, and then selected "Fetch as Google" to have the website fetched. This was our way of manually requesting the site be re-indexed so we could see what was happening. After doing so we clicked on status and it provided the following:

HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mystie.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close

<title>301 Moved Permanently</title> Moved Permanently. The document has moved here.

We changed the screwed-up rewrite mistake that found its way into the htaccess file, but now our issue is that all of our pages have been severely penalized with regard to where they are now ranking compared to just before the incident. We are essentially freaking out because we don't know the real consequences of this, and if or how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9-40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on what we anticipate with regard to the drop in traffic. This sucks because these are people's lives, but then again a business must make money, and if you sell less you have to cut the overhead, and the easiest one is payroll. I'm on a team with three other people that I work with to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects if Google doesn't restore matters could be significant. My question is: what would you guys do? Is there any way we can contact Google about such a matter? If there is, I've never seen such a thing. I'm sure the pages that are missing from the index now might make their way back in, but what will their rank look like next time, and with that type of rewrite, has it permanently affected every page site-wide, including those that are still in the index but severely affected by the incident? Would love to see things bounce back quickly, but I don't know what to expect, and neither do my counterparts. Thanks for any speculation, suggestions or insights of any kind!!!

Algorithm Updates | David_C
-
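For anyone hitting the same wall as the rewrite block in the question above: one common way to write a www redirect so it can never interfere with robots.txt is to exempt that file explicitly before the redirect runs. This is only a sketch, using the question's placeholder domain and assuming Apache mod_rewrite:

```apache
# Sketch only: placeholder domain from the question; Apache mod_rewrite assumed.
RewriteEngine On

# Serve robots.txt as-is, before any redirect can touch it.
RewriteRule ^robots\.txt$ - [L]

# Redirect the bare domain to www. The host condition means requests
# already on www.mysite.com fall through without looping.
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
```

After any change like this, fetching robots.txt and a few pages with "Fetch as Google" is a quick way to confirm the bot sees a 200 rather than a redirect chain.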
How to Link a Network of Sites w/o Penguin Penalties (header links)
I work for a network of sites that offer country-exclusive content. The content for the US will be different from that for Canada, Australia, the UK, etc.… but with the same subjects. Now, to make navigation easy, we have included in the header of every page a drop-down that has links to the other countries, like what most of you do with Facebook/Twitter buttons. Now every page on every site has the same link, with the same anchor text. Example: Penguins in Canada Penguins in Australia Penguins in the USA Because every page of every site has the same links (it's in the header), the "links containing this anchor text" ratio is through the roof in Open Site Explorer. Do you think this would be a reason for Penguin penalization? If you think this would hurt you, what would you suggest? No-follow links? Remove the links entirely and create a single page of links? Other suggestions?
Algorithm Updates | BeTheBoss
-
Has Google had problems indexing pages that use <base href=""> in the last few days?
For the last couple of days I have had the problem that Google Webmaster Tools is showing a lot more 404 errors than normal. If I go through the list I find very strange URLs that look like two paths put together. For example: http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm If I check on which page Google found that path, it shows me the following URL: http://www.domain.de/languages/languageschools/havanna/spanishcourse.htm If I check the source code of the page for the link leading to the London page, it looks like the following: [...](languages/languageschools/london/london.htm) So to me it looks like Google is ignoring the <base href="..."> and putting the path together as follows: Part 1) http://www.domain.de/languages/languageschools/havanna/ instead of the base href Part 2) languages/languageschools/london/london.htm The result is the wrong path: http://www.domain.de/languages/languageschools/havanna/languages/languageschools/london/london.htm I know finding a solution is not difficult; I can use absolute paths instead of relative ones. But: - Does anyone have the same experience? - Do you know other reasons which could cause such a problem? P.S.: I am quite sure that the CMS (Typo3) is not generating these paths randomly. I would like to be sure before we change the CMS's settings to absolute paths!
Algorithm Updates | SimCaffe
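If it helps anyone landing on the base-href question above: the usual safeguard is to make the base URL absolute, so relative links always resolve against it rather than against the current page's directory. A sketch using the placeholder domain from the question:

```html
<!-- Absolute base URL: every relative link below resolves against it,
     regardless of how deep the current page sits in the site tree. -->
<base href="http://www.domain.de/">

<!-- Resolves to http://www.domain.de/languages/languageschools/london/london.htm -->
<a href="languages/languageschools/london/london.htm">London</a>
```

With a relative or missing base href, the same link on a page under /languages/languageschools/havanna/ can resolve to the doubled path the question describes.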