Posts made by dohertyjf
-
RE: Google analytics advanced segments
Any update on this? If not I'm going to mark as Answered.
-
RE: What would be the SEO benefits of using Lulu ? (ebook)
Hey Ericc -
Thanks for your question! Could you expand a bit more on what you mean by the SEO benefits of using Lulu? I used to work in publishing and know some people who have published through Lulu. I believe you are correct that you can distribute on Amazon through it, as I think they use Ingram as their distributor.
Publishing through Lulu is fine, but if I were you I'd also get a copy of the ebook and put it on your own site. If your site has a decent following, try running a social campaign around it to get more traction. Labeling it as a "Free Download" often helps drive downloads as well.
So I'd recommend putting it on your site. You may be able to get a link back to your site from Lulu (I'm not sure about this), which would be a quality link since Lulu is a strong domain. If you can do this, point that link at your eBook-specific page.
John
-
RE: What is the quickest way to get OSE data for many URLs all at once?
As I just told Dan on Twitter, I built out a spreadsheet a few months ago that is linked to in this post: http://bit.ly/mc0Q9v.
You'll have to use your own free Moz API key and hack the sheet a little bit, but it uses the Moz and Twitter APIs to pull the Moz metrics (DA, PA, etc.) into a Google doc.
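If anyone wants to pull the same metrics outside of a spreadsheet, the request the sheet builds looks roughly like the Python sketch below. The credentials are placeholders, and the endpoint and the Cols bit flags are from memory of the free Mozscape URL Metrics API, so double-check them against the API docs before relying on this.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse
import urllib.request

# Placeholder credentials - swap in your own free Moz API access ID and secret key.
ACCESS_ID = "member-xxxxxxxxxx"
SECRET_KEY = "your-secret-key"

def moz_url_metrics(url):
    """Fetch Moz URL metrics (DA, PA, etc.) for a single URL."""
    expires = int(time.time()) + 300  # the signature is only valid for a few minutes
    # The signature is a base64-encoded HMAC-SHA1 of "AccessID\nExpires".
    to_sign = "%s\n%d" % (ACCESS_ID, expires)
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), to_sign.encode(), hashlib.sha1).digest()
    ).decode()

    params = urllib.parse.urlencode({
        "AccessID": ACCESS_ID,
        "Expires": expires,
        "Signature": signature,
        # Cols bit flags choose which metrics come back; these two values
        # (Page Authority + Domain Authority) are from memory - verify them.
        "Cols": 34359738368 + 68719476736,
    })
    endpoint = "http://lsapi.seomoz.com/linkscape/url-metrics/"
    request_url = endpoint + urllib.parse.quote(url, safe="") + "?" + params
    with urllib.request.urlopen(request_url) as response:
        return response.read().decode()

print(moz_url_metrics("www.seomoz.org"))
```

The spreadsheet builds the same kind of request with its own formulas, so this is just to show the shape of the call.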
Good luck Dan and anyone else who reads this!
-
RE: Can obfuscated Javascript be used for too many links on a page?
Hey Trevor -
A couple of things here. First, I would never recommend using obfuscated JavaScript on links just to hide them from the crawlers.
Also, I think the "too many links on a page" guideline is not one to follow too strictly. It's not an "error" in the Moz Pro Campaigns because it is a guideline, not a rule. Depending on your site, you can have many more than 100 links on a page and be fine. Or you can use other methods (iframes, navigation loaded via JavaScript) to keep those links available to users while keeping them out of the crawled HTML.
Just remember (as I am sure you do) that these links will not pass any link juice and you will need to use other ways to get your pages indexed and to have a good crawler-friendly architecture.
Just my two cents. I don't think you'd be cloaking, but it's starting to get a bit iffy. I'd steer clear.
-
RE: Why is my site's 'Rich Snippets' information not being displayed in SERPs?
Hey Techboy -
Assuming your site validates fine using the Rich Snippets tool, as you said, unfortunately I don't think there is much you can do. I heard Stefan Weitz from Bing talk about this, and he said it's a slow rollout because they want to get it right and they also want to see how people use it. The search engines are also giving priority to brands and well-known people (especially with rel=author markup), so the little guys are having a harder and harder time getting their semantic markup to show in the SERPs.
He even went so far as to say that we should mark up our sites now, so that when Schema.org support is rolled out more broadly we'll be ahead of the curve (and he implied that it will affect rankings positively as well).
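Since authorship keeps coming up, the basic rel=author markup is just a link attribute on the article page pointing at the author's profile. A rough sketch, with a placeholder profile URL and author name:

```html
<!-- On the article page, linking to the author's profile page or Google+ profile -->
<a href="https://plus.google.com/XXXXXXXXXXXX" rel="author">Author Name</a>
```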
Sorry I can't provide an actionable answer, but right now with the semantic markup it can be a bit of a waiting game.
-
RE: Schema.org for real estate portal
Hi -
The first three seem to make sense to me. If I were you, I would treat the property listings like a Place. Place also takes GeoCoordinates through its geo itemprop, so you can mark up the location from within that schema as well.
Depending on what kind of property it is, you could also use the Thing > Place > Residence Schema for certain parts. It's a bit more specific than Place.
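To make that concrete, here is a rough microdata sketch for a single listing using the Residence type with GeoCoordinates nested via the geo property. The values are placeholders:

```html
<div itemscope itemtype="http://schema.org/Residence">
  <h2 itemprop="name">Two-bedroom apartment near the city centre</h2>
  <p itemprop="address">123 Example Street, Springfield</p>
  <div itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="41.472" />
    <meta itemprop="longitude" content="-87.634" />
  </div>
</div>
```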
Hope this helps a bit!
-
RE: Google analytics advanced segments
Hey Nerds -
This is a good question. Have you tried this:
New Advanced Segment -> Exclude -> Landing Page -> Containing -> /backend/cookie.php?
This should filter it out when you apply the segment while viewing Traffic Sources - Sources - Search - Organic.
Let me know if this is not what you need!
-
RE: Is there a report in SEOMoz that will show me what keywords each page ranks for on my site?
You can also use tools like SEMrush to find the keywords that your site (or a competitor's site!) is ranking well for, i.e. their top keywords. The free version gives you a snapshot, and I'm sure the paid version gives you even more.
Anthony is right, though, since individual parts of your site probably rank for long-tail terms. Taking the keywords that are already driving traffic, as found in Analytics, and running them back through a rank-tracking tool is a great idea, and those are the terms you should care about as well. The sweetest thing, to me at least, is finding a term that I rank well for and that drives traffic, but that I have not optimized for at all and could easily gain more traffic on.
Good luck!
-
RE: Rel Canonical Syntax
I'm not positive about how they'll deal with it, but why take a chance? It won't be hard to change it from a single quote to a standard double quote, especially since it's on your test server.
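For reference, the usual syntax with double quotes is just this (placeholder URL):

```html
<link rel="canonical" href="http://www.example.com/your-page/" />
```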
Better safe than sorry.
-
RE: Robots.txt Syntax
Rodrigo -
Thanks, and thanks for the follow-up. To be honest with you, though... I have not seen or experienced anything about this. I tend to follow the suggested rules with code.
So my answer is "I don't know". Anyone else know?
I also agree with you on the meta tags. Robots.txt is best used for disallowing folders and such, not individual pages. For instance, I might put "Disallow: /admin" in the robots.txt file, but I would never block a category page or something to that effect there. If I wanted to remove a page like that from the index, I'd use the meta robots "noindex,follow" tag instead. Good point!
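In practice that combination looks something like this (the paths are just examples): a folder-level block in robots.txt, and a page-level meta robots tag on any individual page you want kept out of the index.

```
# robots.txt - block a whole folder from crawling
User-agent: *
Disallow: /admin
```

```html
<!-- on an individual page you want out of the index -->
<meta name="robots" content="noindex,follow" />
```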
-
RE: Robots.txt Syntax
Rodrigo -
Good question. The syntax does in fact matter, though not necessarily for SEO rankings. It matters because if you screw up your robots.txt, you can inadvertently disallow your whole site (I did it last week. Not pretty. Blog post forthcoming).
To get to your question, it is usually best to put the "Sitemap: " line at the bottom of the robots.txt, but it is not required to have it there, so far as I know.
You do not need the "Allow: /" line, because if you leave it out, Google assumes that everything can be crawled except what is listed in the "Disallow:" lines.
In your case, you are disallowing "http://www.site.com/form.htm" and everything in your cgnet_directory folder. If you want that page and everything in that folder hidden from crawlers... you have done exactly what you need to do.
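Pulled together, a robots.txt along those lines would look roughly like this (the sitemap URL is just an example, and the Allow line is left out because it isn't needed):

```
User-agent: *
Disallow: /form.htm
Disallow: /cgnet_directory/

Sitemap: http://www.site.com/sitemap.xml
```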
I'm still learning about this, so I'm open to any correction the rest of the community has.