Posts made by oznappies
-
RE: Canonical Tag for a 404 page
The meta redirect tag will keep the link juice on the 404 handler page rather than passing the PageRank on to the home page, which is something you may want to consider if you have a lot of 404s.
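For illustration, a delayed meta refresh looks like this (a sketch; the delay and target URL are placeholders). The 404 URL itself stays the landing page while users are still moved on:
```
<!-- On the 404 handler page: send users home after a pause, without a 301 -->
<meta http-equiv="refresh" content="5; url=http://www.example.com/" />
```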
-
RE: Info webinar Ranking Factors 2011
They usually post the PowerPoint slides as a link on the same page as the webinar. Specifically, https://seomoz.box.net/shared/static/u1brce40zteod8nkfqum.pptx for that one, and the report at http://bit.ly/rankfactors2011.
-
RE: Canonical Tag for a 404 page
You could, but you would be better off creating a search page so 404s go to www.example.com/search.aspx and users can search for the content they were actually looking for in the first place. Ideally, all your pages should have a canonical tag in the head so that trailing-slash or capitalization errors all pass juice to the correct page and do not get reported as duplicates.
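A minimal sketch, with a placeholder URL; whichever variant a request arrives as (/Products.aspx, /products.aspx/), the same tag points the engines at one version:
```
<link rel="canonical" href="http://www.example.com/products.aspx" />
```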
-
RE: Link Age as SEO factor?
From a developer's point of view: if you do not already have the new system in place, I would suggest a move to MVC rather than aspx on the dotnet platform, with a handler for the old .cfm URLs in place to map the pages at the controller level. Google will not know there has been a change and your site will perform much faster. Microsoft is tending to move away from aspx to the more structured MVC model anyway.
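A minimal routing sketch of that idea, assuming the legacy pages were .cfm URLs (ASP.NET MVC; the controller, action and route names are illustrative):
```
// Global.asax.cs - map legacy URLs onto the new controllers so existing
// backlinks keep resolving and crawlers see no change of address.
using System.Web;
using System.Web.Mvc;
using System.Web.Routing;

public class MvcApplication : HttpApplication
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // e.g. /products.cfm -> LegacyController.Show(page: "products")
        routes.MapRoute(
            "LegacyPages",
            "{page}.cfm",
            new { controller = "Legacy", action = "Show" });

        routes.MapRoute(
            "Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }

    protected void Application_Start()
    {
        RegisterRoutes(RouteTable.Routes);
    }
}
```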
-
RE: Linking from other language websites passes juice or not?
In a previous post Ryan shows a site http://www.seomoz.org/q/herbal-viagra-page-same-da-pa-as-uc-berkeley that has a very high rank, and most of its links are from Chinese sites in Mandarin. So yes, foreign sites do pass link juice, and even better if they are relevant to your field.
-
RE: What is the optimal URL Structure for Internal Pages
Since you will most likely have more than one form of personal injury, it would make more sense from a site architecture point of view to use a category/type model, i.e. personal-injury/car-accidents. There probably is not any ranking difference, except that you could have a personal-injury landing page that links to the injury types and gains link juice in its own right.
-
RE: Low bounce rate; need help troubleshooting code
When you look at the head section you can see where ga.js is injected twice.
I would say that is what she did. She should have disabled the analytics plugin. You should be able to do that from the admin page without a developer.
-
RE: Low bounce rate; need help troubleshooting code
If you visit gtmetrix.com and look at the 'timeline' you will see two JavaScript calls, one to external-tracking-min.js and one to ga.js. Both are most likely calling Analytics, as on a JS scanner I see ga.js being called twice.
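For reference, the standard asynchronous snippet loads ga.js exactly once (the UA number is a placeholder). If it is injected twice, every page view fires two hits, which artificially drives the bounce rate down:
```
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-1']);
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js once, asynchronously, from the matching protocol.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```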
-
RE: HUGE LINK! PR9 Not showing up
Google has done a fair bit of cleaning up recently, and there are several posts on the Panda and Farmer updates that caused sites with high PageRank to fall in actual rank. If the site had little content or large collections of links, it probably got hit.
Just search 'Google Panda' and read some of the recent posts for more details.
You should run the PR8 & 9 sites through Open Site Explorer to see where it considers them to rank now. You can also check the links to ensure they are not No-Follow.
-
RE: Best way to set up a site with multiple brick and mortar locations across Canada
If they are opening 3-10 physical locations then it is probably worth the extra funds to have a developer build an architecture to suit the site requirements rather than WordPress. It would then be simple to use the 'state' from the landing page to sculpt the information presented on deeper pages to reflect the local version. So, if a user follows through to the services page from the landing page, they will see content from the main services page along with any local content required.
It is also important to look at page speed in generic templated designs, to ensure the site can meet content delivery demands across a country as traffic increases when new branches open.
-
RE: Do Schema.org changes impact local SEO
You would figure that if Google, Bing & Yahoo get behind a new standard, through schema.org, for declaring what snippets of information mean, then at some stage soon they will either use those tags to home in on the parts of content most relevant to a search query or present more information to the viewer during a search.
The new section, article and heading tags of HTML5 would also seem to fit into this richer data model for a web page. There has been a lot of work standardising these tags, so it is a logical deduction that they, along with personalization factors, will become more prevalent in future.
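As an illustration, marking up a local business with schema.org microdata looks something like this (a sketch; the business details are placeholders):
```
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Plumbing</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  Phone: <span itemprop="telephone">(555) 555-0100</span>
</div>
```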
-
RE: Played Out ? about BLOG COMMENTING LINKS
We do comment on strongly targeted blogs for our industry and find that, even though the links are no-follow for search engines, they still account for about 20% of our direct traffic. If what you post is relevant and informative, the reader may click and arrive directly at your site, bypassing the whole search process.
It does not help your page rank but it can help conversion rate.
There is also a better chance that an offer from you to supply an article for their blog will be accepted if you have been a valuable contributor.
-
RE: Does google scrape links from PDF files? do these links pass link juice?
Yes it does, according to the Google tech spec http://code.google.com/apis/searchappliance/documentation/50/admin_crawl/Introduction.html
which specifically states that it follows HTML links in PDFs: 'It follows HTML links in PDF files, Word documents, and Shockwave documents'. Google's own API docs carry more weight than a comment in a forum. If they are licensing this out as an application, it would suggest the same technology is available in the main engine, as does Dunamis's comment about a listing in a PDF document being found in search results.
You can test for yourself by publishing a PDF with a link to an info page that does not show up in any other links. Include the PDF in your sitemap but not the test page, and check whether it shows in Google's index (site:yoursite.com) the next time it crawls.
There is also some insight in this interview with Matt Cutts - http://www.stonetemple.com/articles/interview-matt-cutts-012510.shtml
Eric Enge: What about PDF files?
Matt Cutts: We absolutely do process PDF files. I am not going to talk about whether links in PDF files pass PageRank. But, a good way to think about PDFs is that they are kind of like Flash in that they aren't a file format that's inherent and native to the web, but they can be very useful. In the same way that we try to find useful content within a Flash file, we try to find the useful content within a PDF file. At the same time, users don't always like being sent to a PDF. If you can make your content in a Web-Native format, such as pure HTML, that's often a little more useful to users than just a pure PDF file.
-
RE: Is there any way to change the domain in a campaign here?
Thanks guys. I hoped that, if some of the info on 'subdomain reach' was out of date, there might now be a way to add a preferred subdomain to my campaign.
The reason for the request is that Roger currently reports lots of duplicate tags on the blog which Google does not report, because they are restricted in robots.txt or with NOINDEX, NOFOLLOW, NOARCHIVE in the meta tag.
Since I do want to keep the historic data, I will just sift through the errors reported by the blog.
-
Is there any way to change the domain in a campaign here?
When I started the campaign I used site.com, but now I have an external blog at blog.site.com, which is hosted on Tumblr. This blog generates lots of errors in the campaign because the crawler insists on following no-follows. If I change the campaign to www.site.com then blog.site.com will not be part of the campaign.
The help says it will NOT track sub-domains, but it does, hence my problem. So I am not sure if the rest of the info there is out of date.
'Does the Web App track the subdomains of my campaign's domain?
A. Unfortunately we do not currently track subdomains as a part of the domain you enter. Instead, you must create another campaign slot for any subdomains you wish to track.'
Any ideas on where or how to do this?
-
RE: Link Juice / Java pop up
Hi Greg,
Every single link points to www.bestholidaynews.com only, and the only links to your page are JavaScript-created. So they are the only ones getting juice, as Alan said. I was just pointing out some tools you can use in future posts to check for external links to your site that pass link juice.
Brett.
-
RE: Link Juice / Java pop up
I looked at it thinking you were asking about the internal links on the page you left a link to. I am not sure where on the page the link to the author is, except for a contact-the-author form or their internal link.
If you use Firefox, the SEOmoz toolbar will allow you to highlight external links on a page. You can then use the Firebug plugin to follow the link to the source.
-
RE: Herbal Viagra page same DA/PA as UC Berkeley??
With site speed now showing in Webmaster Tools, I'd figure it gets a ranking boost for 'performance'.
It does have a lot of linking root domains (2,665) and inbound followed links (13,799) from sites with up to 84 DA, which it probably owns.
-
RE: Link Juice / Java pop up
Yes. Your anchor is of the form:
<a href="http://www.bestholidaynews.com/index.php?option=com_contact_enhanced&view=contact&id=1&layout=confirm&author_id=812%3Aorlando-visitors-bureau&tmpl=component&atitle=Our Top 3 Overlanding Egypt Trips&elink=http%3A%2F%2Fwww.overlandingafrica.com%2Fblog%2Fnorth-africa%2Four-top-3-overlanding-egypt-trips" class="ce-cap-modal" rel="{handler: 'iframe', size: {x:400, y:250}}">Our Top 3 Overlanding Egypt Trips</a>
So, the crawlable href points at bestholidaynews.com itself, with your URL only encoded in the elink parameter, and a handler is attached that overrides the usual click behaviour.
I would consider running the site through http://gtmetrix.com, as there are large delays loading content due to blocking issues, some missing GIFs, and other images 75%-95% larger than required.
-
RE: Does google scrape links from PDF files? do these links pass link juice?
Have a look at this article http://searchenginewatch.com/article/2067225/Google-Does-PDF-Other-Changes, which explains some of the document library search for PDF files, and Google's statement here: http://googleblog.blogspot.com/2008/10/picture-of-thousand-words.html.
-
RE: How long after making changes will position on Google be altered?
Submitting a current sitemap can help the process, at least to get the ball rolling. We tend to see a spike in crawl rate after doing this.
-
RE: Do http:// links to a http://www. site count the same to Google?
The comment posted twice, sorry!
-
RE: Do http:// links to a http://www. site count the same to Google?
There are a couple of things you should do:
-
In Google Webmaster Tools, under 'Site Configuration' > 'Settings', set your preferred domain (generally the www version).
-
On your web server, set up URL Rewrite or a 301 redirect from site.com to www.site.com (see the sketch below the list).
-
In the head section of each page, include a canonical link with the fully-qualified domain name to inform search engines there is NOT duplicate content and that all juice should pass to the www version of the page.
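A minimal sketch of the second and third items, assuming IIS with the URL Rewrite module (the domain and page are placeholders):
```
<!-- web.config: 301 site.com to www.site.com -->
<rewrite>
  <rules>
    <rule name="Redirect to www" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^site\.com$" />
      </conditions>
      <action type="Redirect" url="http://www.site.com/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```
and in the head of each page:
```
<link rel="canonical" href="http://www.site.com/page.aspx" />
```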
-
RE: Removing secure subdomain from google index
Do you need 8700 pages served on https? The protocol should transition when a page is OK to serve unsecured. Generally you would only serve pages that contain confidential information on https, and general content on http. Look at the site and ask: how many of those pages can a non-logged-in user see? If they are not protected by authorization then they do not need https, as the content is publicly viewable.
-
RE: Robots.txt disallow subdomain
I would suggest you talk to the developers as Theo suggests to exclude visitors from your test site.
-
RE: Robots.txt disallow subdomain
Do you FTP-copy one domain to the other? If this is a manual process, keeping the test domain's robots.txt out of the copy would be as simple as excluding that one file.
If you automate the copy and want the code to behave based on the base URL, you could create an HttpHandler for robots.txt that delivers a different version based on the request URL host in the HTTP request header.
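A minimal sketch of such a handler (ASP.NET; the host prefix and file names are illustrative):
```
// RobotsHandler.cs - serve a different robots.txt per host name.
using System.Web;

public class RobotsHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Deny-all rules on the test sub-domain, normal rules on live.
        string host = context.Request.Url.Host;
        string file = host.StartsWith("test.")
            ? "~/robots.disallow.txt"   // User-agent: * / Disallow: /
            : "~/robots.allow.txt";     // the live rules

        context.Response.ContentType = "text/plain";
        context.Response.WriteFile(context.Server.MapPath(file));
    }
}
```
You would then map requests for robots.txt to this handler in web.config so the physical file is never served directly.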
-
RE: Did google change their algorithm over the past week?
There is an ongoing discussion on this at http://www.seomoz.org/q/are-you-seeing-changes-in-your-sites-today-panda-2-2 with links to articles, and also here http://www.seomoz.org/q/sudden-drop-in-all-keyword-widgets-keywords-after-years-of-dominating and here http://www.seroundtable.com/google-panda-22-hits-13586.html
So I would figure, with all the chatter, that they did make more changes recently.
-
RE: For SERPS which is better www or non www
As well as the .htaccess rewrite, set your preferred domain in Google Webmaster Tools to ensure Google passes all link juice to the desired version.
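A minimal .htaccess sketch of the rewrite (the domain is a placeholder):
```
# 301 non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```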
-
RE: Duplicate content ramifications for country TLDs
When you say they want to duplicate the site entirely, do you mean the structure, or the written content translated into another language (Hindi)? If you target multiple countries you would be better having landing pages for the relevant country, with content pages localized and feeding juice to the .com to gain domain authority. This has value both for maintainability and development costs, and allows easier expansion into other markets. Many large sites, Microsoft included, switch content from site.co.in to site.com/in and deliver specific content there, with their main content via site.com/content....
If you do create three sites with the same content you may eventually get a duplicate-content penalty, where rankings drop suddenly, but you will definitely be splitting the link juice three ways. You can easily set up Google Webmaster Tools with the three domains and tell it to give value to the .com if you localize instead of duplicating.
-
RE: Footer Links for Design Shops - Do They Help or Hurt?
If your design company site is ranking well, you can get more value from good, strong testimonials on your site than from a signature line on a site your customers may never find. Take the example that you do a fantastic job on a site promoting kids' books. Someone looking for a good designer for their speed-boat site is more likely to read a testimonial on your site than to see your tagline on the book site. You gain value from human viewers being able to read what others say about your work on a diverse range of sites.
If I am hiring a designer, I do not want one that has worked on a competitor's website, so I would search Google for designers rather than look at designed-by taglines on competitors' sites.
It comes down to whether you want your two minutes of fame on every site you do, or a valuable marketing resource such as testimonials.
-
RE: Why doesnt Seomoz give daily ranking updates
It could, but it looks like it is moving to the Firefox toolbar tools soon. It's great for following the terms you are currently targeting, where subtle changes can make a difference in markets smaller than google.com.
-
RE: Why doesnt Seomoz give daily ranking updates
SEOmoz does give you a means to check rankings as often as you like: Rank Tracker in Research Tools allows you to track keywords site-wide or page-targeted, and even store a history of 50 keywords. It has the advantage that you can check how your keyword is doing in a different market, say .co.uk, by choosing Google UK.
-
RE: SEO from Godaddy How Good is it?
We host some sites on GoDaddy but find that, as far as SEO goes, it is very limited. If you are looking to rank for a long-tail keyphrase it may get you somewhere, but the SEOmoz tools are far better tuned. The on-page reports in the campaigns here give much more detail than the SEO reports on GoDaddy, and they are instant, so you can make some changes and test your on-page without waiting 12-24 hours for GoDaddy's email. I have found their 'website protection' useful for detecting quirks that pose security issues.
When you look at the wording, you get an idea of what they are offering: 'to improve search ratings'. So if you are on page 50 and they get you to page 48, they have improved your ratings. No one will find you, but your rating has improved. If you want good rank, work with a good SEO to get your site where you want it, and be patient.
-
RE: Subdirectories vs subdomains
I've asked this one on here before, and I have some test sites trialling both ways; the sub-directory gains rank from the root much faster than the sub-domain.
-
RE: Fast hosting
If you are looking for lightning-fast response but do not want to pay high prices, then an option may be some of the VMware clusters or cloud servers that scale up to deal with high-traffic times. If you have bursts of traffic during certain periods of the day or month, such as specials etc., then you pay more for resources during those periods and less when your server is idle. These can be a good solution for medium to large business requirements without paying for clustered dedicated servers. If you do a search for 'cloud hosting' you can find a range on offer.
Speed can also depend on the type of content you are serving. If you intend to stream video on demand or similar services, you would need to ensure your host has the appropriate CDN facilities to meet your needs. If you have high data requirements, make sure the data centre can deal with that level of traffic without blocking issues.
If you need content delivered fast to various international destinations, ensure the host has clusters in the delivery areas for the countries you are targeting. e.g. to target Australia you do not want a data centre in the US, as that route is congested; you would need Asia-Pacific centres, etc.
-
RE: Google: show all images indexed on a domain
What Theo suggested works fine.
-
RE: Best practices for country homepage
If you want domain.com to get the rank for domain.com/french, you can set a canonical in the head section of /french, /italian etc. to pass the juice back to the main site. I would think that anchors would be better than 302s, as these are not temporary redirects but links to language variations. I would also ensure that you have a language tag on the html element, e.g. <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="fr-fr" lang="fr-fr">.
-
RE: Javascript changing URL - Thoughts?
If you want Google etc. to treat it as www.site.com, then add <link rel="canonical" href="http://www.site.com" /> in the head of that page. Then backlinks to www.site.com and www.site.com/home will all give link juice to www.site.com. The category page is a different issue, in that there is no category page any more, just a reference on the home page. Your developer needs to rethink why he would do this sort of layout remapping, which makes it harder to get SEO value from the pages.
-
RE: Site not being Indexed that fast anymore, Is something wrong with this Robots.txt
I am not sure why you are disallowing file types. Google would not index wmv or js files etc. anyway, as it cannot parse those types for data. If you want to coax Google into indexing your site, submit a sitemap in Webmaster Tools. You could also set nofollow on the anchors for the pages you want to exclude, and keep robots.txt cleaner by including just top-level directories such as admin etc. There just seem to be a lot of directories in there that do not relate to actual pages, and Google is only concerned with renderable pages.
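Something closer to this is usually enough (a sketch; the paths and domain are placeholders):
```
User-agent: *
Disallow: /admin/
Disallow: /scripts/

Sitemap: http://www.example.com/sitemap.xml
```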
-
RE: Lost ranking once optimised a page
Any unsafe sites will be removed from the results pages. So if you are at 15 and there are 2 unsafe sites between 1 and 15, your listing will move up 2 spots to replace the pages that cannot be shown, and your rank will show as 13.
-
RE: Lost ranking once optimised a page
The canonicals do help, as they funnel the links to provide rank for the http://www.mybabyradio.com/experts-faq/conjunctivitis page. If you leave it out and watch Google Webmaster Tools, you will see duplicate content errors showing up with the 4 URLs Ryan mentioned highlighted.
-
RE: How to relate two sites Domain Authority
Since you were looking to advertise, I was thinking you were looking for referral clients. In that case, is it worth paying for advertising? Most SEO companies can get you on good-DA sites with good article submissions. It is mentioned on here a bit that links from multiple good DAs on different C-blocks help to improve your link juice. So good links on lots of well-ranked, relevant sites should help your ratings more than a single link on a DA 70 that you paid $500 for. I know Australia is a smaller market than the UK, but we have moved all our main keyphrases to page 1 or 2 in a very short time by getting good links from on-topic related sites and getting A's for all of them in on-page optimization.
-
RE: How to relate two sites Domain Authority
Strangely, it is not that straightforward when it comes to conversions. If the traffic to site B is more in the market for your sunglasses (i.e. mostly from France) compared to site A, whose traffic could be from Scotland, the conversions could be better on site B than site A.
I would say if you are only talking about $600 a year, do both.
Check with the Keyword tool and Rank Tracker to see where the two sites rank for the keywords you consider important for your site, so you can tell whether visitors would see site A or B and be able to follow your ad.
-
RE: Crawl report showing only 1 crawled page
Most of the menu system and site could work the same in jQuery. Flash is not indexable by the search engines and requires a sitemap to be generated to show the structure of the site. As Ryan says, there is only one link on the site. It is a trade-off between the ease of creating a site in Flash and having a good, SEO-friendly site that Google will rank. Even if you do sitemap your pages, if the content is contained in Flash it will not be seen.
-
RE: Duplicate Content within Website - problem?
David, as far as Google was concerned, the sub-pages fed all the juice to the product page.
No, the subpages were not indexed, as we told Google via the canonical that they all came from the same page.
How do you describe a red widget1 differently from a blue widget1? The item is the same and only one word differs in the content, so we decided to skip a physically different URL for the different colours and just use different anchors on the thumbnail images. The title and alt tags contain the specific information about the colour of the widget.
If someone searches for red widget1 and we have keyword strength in widget1, they will get to the widget1 page, where they will see the red widget1 and any other colours for that widget1.
The canonical allows you to specify the content origin. So if you have /category/widget1/red and /category/widget1/blue describing the same content, you could use /category/widget1 in the canonical ref, and both pages would give juice to the main page with no duplicate content penalty.
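As a sketch (the domain and paths are illustrative), both variant pages would carry the same tag:
```
<!-- In the head of /category/widget1/red and /category/widget1/blue -->
<link rel="canonical" href="http://www.example.com/category/widget1" />
```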
This only works if you have a small number of variants on each widget as Ryan pointed out, such as size, colour variations etc. Otherwise it is too confusing for humans to follow.
With the amount of content you are looking at, it is probably worthwhile getting a usability study done.
-
RE: Duplicate Content within Website - problem?
We had a similar issue, though not on that scale. We had product A in red, blue, green etc. In the first approach we used a URL of /category/product?id=subproduct and set id as a parameter in Google Webmaster Tools site configuration. This passed all the link juice to /category/product and ensured that all pages had the appropriate canonical for the link-juice page.
We then decided that all those page loads, just to basically show an image for each subproduct, were a pain for the customer, so we decided to show small images on the /category/product page and use a jQuery call to overlay a larger image when the customer clicked a particular product. This produced faster load times and a better customer experience.
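A minimal sketch of that overlay (the selectors, paths and markup are illustrative, assuming jQuery is already loaded):
```
<img class="thumb" src="/img/product-a-red-small.jpg"
     data-large="/img/product-a-red-large.jpg" alt="Product A in red" />
<div id="overlay" style="display:none"><img src="" alt="" /></div>

<script type="text/javascript">
  // Swap the large image in on demand instead of loading a page per colour.
  $('.thumb').click(function () {
    $('#overlay img').attr('src', $(this).data('large'));
    $('#overlay').fadeIn();
  });
</script>
```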
-
RE: Nofollow internal links
I agree with Ryan, but question the usability of 260 links on your page. Have you done a usability study to check how easily your end users can find the information they are after? It can be daunting for a robot, let alone a human, to sift through all the sub-menus on your categories. It brings to mind the Telstra and Optus sites, where it takes a significant time to find the information you are after because of the huge number of options.
I also notice that when you change currency, a prompt is displayed that 'all items in the shopping cart will be deleted', even when the cart is empty. Should you not check whether the cart is empty before displaying the message? Otherwise the prompt is redundant.
If you want to still display them but not have robots index them, late-populate them with a jQuery async call on demand as the user hovers over a menu item; see the sketch below. You would need to ensure they are linked somewhere, or on a sitemap, so the search engines can still find them.
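A minimal sketch of the on-demand load (the URL and selectors are illustrative, assuming jQuery is already loaded and the server can return a menu fragment):
```
<script type="text/javascript">
  // Fetch a sub-menu the first time the user hovers over its parent item.
  $('#nav > li').one('mouseenter', function () {
    var item = $(this);
    $.get('/menu-fragment?cat=' + item.attr('id'), function (html) {
      item.find('ul.submenu').html(html);
    });
  });
</script>
```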
-
RE: Help with Roger finding phantom links
Thanks again Ryan, you have been very helpful answering a lot of my questions.