Posts made by prima-253509
-
RE: UPDATE: Rank Tracker is NOT being retired!
I should also add that the Keyword Explorer tool is awesome and one of the best things about Moz Pro, so kudos on that tool. Incorporating the rank tracker into Keyword Explorer would make sense to me from a UX point of view (more than just the first page, change over time, etc.). Just a thought.
-
RE: UPDATE: Rank Tracker is NOT being retired!
I haven't used Rank Tracker very much in the last year, but it has historically been useful for looking up keywords outside of the core keywords we track in our campaigns. It is not just that the tool is going away; the quota for what you can track is also being reduced. We recently upgraded our subscription so that we could track more keywords, but now, in order to mimic the functionality of the Rank Tracker tool, I would have to keep some keywords free and in reserve so that campaigns could be created on an ad-hoc basis. In other words, our 750-keyword limit on campaigns is now essentially 700 if I want to keep spots open for the ad-hoc keyword research that Rank Tracker had provided (tracked over time), or 550 if I also want to keep open the 200 rankings available on the daily cap.
Campaign limits will also be hit with regard to tracking domains for a keyword phrase, as you can only add three competitor sites per campaign. It just isn't as functional for ad-hoc research as the Rank Tracker tool was.
Are quotas going to be increased on campaigns to compensate for this (keywords available / campaign spots available)?
This is disappointing as it seems like a lot of features are disappearing / being sunset while costs are staying the same. If I am missing something about quotas let me know. Thanks!
-
RE: WordPress WooCommerce Recommended SEO URL structure
Glad it was helpful!
If you are going to have a true blog, then that is enough to segment it out. Having the date in there can be helpful for comparing the hits you are getting on older posts vs. newer posts (i.e. how long your content stays relevant).
If you are going to have other types of content such as shopping guides / product comparisons / etc that are more "timeless" pieces of content then you might want to think about the kinds of articles you are going to write and create prefixes that match those types of articles.
You could definitely do product guides and product comparisons in a blog, but they can be harder to segment out if the prefix is just "blog".
Hope that helps. Cheers!
-
RE: WordPress WooCommerce Recommended SEO URL structure
One thing to keep in mind with the URLs is how you can segment them in analytics for easy data analysis. You want them to be semantic and pretty, but also easy to slice into groups so that you can see patterns in how people browse the site and which types of pages are successful.
For instance, we have the following URL structures for brands, equipment, replacement parts, and a learning center.
- brand/[brand-name]
- equipment/type/[category] - for the categorization of equipment
- equipment/brand/[product] - for easy segmentation of products
- part/type/[category]
- part/brand/[part]
- learn/[cat]
- learn/article/[article-title]
This gives us a lot of flexibility in moving products around in the menu system without messing up URLs, while still being semantic and allowing for easy segmentation in analytics. For instance, with this setup we can see if people prefer navigating by equipment catalog or by brand. It also allows us to easily pull out the learning center articles and all the visits we get to them, so we can see how eCommerce-only visits are doing.
One thing I would suggest with your blog is to have some kind of prefix that allows you to easily exclude those pages (or only include those pages) in analytics. If you simply go by year without a prefix it will be harder to segment out the data.
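To make the segmentation idea concrete, here is a rough sketch (with made-up paths; in Google Analytics you would express the same idea with regex-based content groupings or advanced segments rather than code) of how prefixes let you bucket pages:

# Rough sketch of why prefixed URL structures make analytics segmentation easy.
# The prefixes mirror the hypothetical structure above; adjust them to your own site.
def classify(path):
    rules = [
        ("/learn/", "learning center"),
        ("/equipment/type/", "equipment by category"),
        ("/equipment/brand/", "equipment by brand"),
        ("/part/", "replacement parts"),
        ("/brand/", "brand pages"),
        ("/blog/", "blog"),
    ]
    for prefix, group in rules:
        if path.startswith(prefix):
            return group
    return "other"

print(classify("/equipment/brand/some-product"))  # -> "equipment by brand"
print(classify("/learn/article/some-article"))    # -> "learning center"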
You should check out a mozinar that Moz did with Everett Sizemore that deals with a lot of these issues (he specifically talks about SEO and url structure).
Also, you probably have already seen this, but Yoast's plugin for WordPress will allow you to remedy much of the duplicate content that WordPress can create.
Cheers!
-
RE: What is the full User Agent of Rogerbot?
I know this is an insanely old question, but since I was looking this up as well and stumbled on this page, I thought I would provide some updated info in case anyone else is looking.
The user agent can no longer be found on the page that was originally listed; however, it is now at https://moz.com/help/guides/moz-procedures/what-is-rogerbot
Here is how our server reported Rogerbot in its access logs (taken from May 2013). Notice the difference in the crawler-[number] portion:
rogerbot/1.0 (http://www.seomoz.org/dp/rogerbot, rogerbot-crawler+pr1-crawler-02@seomoz.org)
rogerbot/1.0 (http://www.seomoz.org/dp/rogerbot, rogerbot-crawler+pr1-crawler-16@seomoz.org)
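If it is useful to anyone, here is a rough sketch (assuming a standard combined log format where the user agent is the last quoted field; adjust the regex if your logs differ) for pulling the Rogerbot lines out of an access log and counting hits per crawler instance:

import re
from collections import Counter

# Matches the quoted user agent at the end of a combined-format log line.
UA_PATTERN = re.compile(r'"([^"]*rogerbot[^"]*)"\s*$', re.IGNORECASE)

def rogerbot_hits(log_path):
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            match = UA_PATTERN.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for agent, hits in rogerbot_hits("access.log").most_common():
        print(hits, agent)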
-
RE: Considering Switch to old Domain - Any Bad Karma?
Hi Mememax,
Thanks for the feedback, that is what I was hoping for, but I just thought I would get some thoughts from the great community here. Thanks for weighing in!
Josh
-
Considering Switch to old Domain - Any Bad Karma?
So here is the issue. I am working with a company that used to have a branded domain. Then they split the domain into two separate keyword-rich domains and tried to change the branding to match the keyword-rich domains.
This made for a really long brand name that is difficult to actually rank for, as it consists mostly of high-traffic key terms, and it also created brand confusion because all of the social accounts still operate under the old brand name.
We are considering a new brand initiative: going back to the original brand name, as it better meets our business objectives (they still get traffic from branded searches under the old brand), and to the old branded web domain.
My question is whether there is any added risk in going back to an old domain that has been forwarded to the new domain for the past 2 years.
I know the risks and problems of a domain name change, but I am not as certain about the added complication of moving back to an old domain and essentially reversing the flow of 301s. Any thoughts?
Cheers!
-
RE: Tool for tracking actions taken on problem urls
Maybe I don't fully appreciate the power of Excel, but what I am envisioning seems to require more than Excel can provide.
Thanks for the suggestion though. I will think about it some more.
-
Tool for tracking actions taken on problem urls
I am looking for tool suggestions to help keep track of problem URLs and the actions taken on them, and to help deal with tracking and testing a large number of errors gathered from many sources.
So, what I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and feed them into a type of centralized DB that lets me see all of the actions that need to be taken on each URL, while at the same time removing duplicates, since each tool finds a significant number of the same issues.
Example Case:
SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so it terminates in a 404).
When I import the three reports into the tool I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.
I would also like to see historical information on the URL, such as whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed.
Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow at updating their issues summary, I would like to not import duplicate issues (so the tool should recognize that the URL is already in the DB and that the issue has been resolved).
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me.
Bonus bonus for any tool that is smart enough to check issues and mark them as resolved as they come in (for instance, if a URL is reported with a 403 error, the tool would check on import whether it still returns a 403; if it does, the issue would be added to the queue, and if not, it would be marked as fixed).
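To make it concrete, here is a very rough sketch (in Python, with made-up column names like "URL" and "Issue"; real exports from each tool would need their own mapping) of the merge/dedupe and re-check logic I have in mind:

import csv
import requests  # used only for re-checking status codes

def load_issues(csv_paths):
    # Merge issue exports keyed by (url, issue type) so duplicates reported
    # by multiple tools collapse into a single entry.
    issues = {}
    for path in csv_paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                key = (row["URL"], row["Issue"])
                entry = issues.setdefault(key, {"sources": set(), "status": "open"})
                entry["sources"].add(path)
    return issues

def recheck_status_issues(issues):
    # If an issue is a 4XX/5XX error (hypothetical convention: the issue text
    # starts with the status code), re-request the URL and mark the issue as
    # fixed when it no longer returns an error.
    for (url, issue), entry in issues.items():
        if issue.startswith(("4", "5")):
            try:
                code = requests.head(url, allow_redirects=True, timeout=10).status_code
            except requests.RequestException:
                continue
            if code < 400:
                entry["status"] = "fixed"
    return issues

issues = recheck_status_issues(load_issues(["moz.csv", "gwt.csv", "screaming_frog.csv"]))
for (url, issue), entry in issues.items():
    print(entry["status"], url, issue, sorted(entry["sources"]))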
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, and with the duplicates created from using multiple tools?
Thanks!
-
RE: Google Hiding Indexed Pages from SERPS?
Thanks Alan,
will see what we can do. One way or the other it has to be addressed.
-
RE: Google Hiding Indexed Pages from SERPS?
Hi Alan,
thanks for the response, I guess it is good to know that someone else has seen this issue before.
As for canonical tags, I do have them on all pages, but because there is not a way to set them as absolute URLs (our CMS only allows relative ones, so the tag takes the base path of whichever domain the page is rendered on), I can't get them to point only to the domain that they are supposed to be published on.
Cheers!
-
RE: How to add a disclaimer to a site but keep the content accessible to search robots?
That is rough,
maybe a legitimate situation for user agent sniffing (albeit fraught with danger)? If you can't rely on javascript then it would seem that any option will have significant downsides.
This may be a hare-brained suggestion, but what about appending a server parameter to all links for those who do not have a cookie set? If the user agent is Google or Bing (or any other search bot), the server could ignore that parameter and send them on their way to the correct page; if the user agent is not a search engine, they would be forced to the disclaimer page.
This would allow a user to see the initial content (which may not be allowed?) but not navigate the site. It would also allow you to present the same info to both user and agent while still making the user accept the terms.
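If it helps to picture it, here is a stripped-down sketch of just the server-side gate part of that idea (ignoring the link-parameter piece), written Flask-style purely for illustration; the bot list, cookie name, and paths are all made up, and the cloaking caveats above still apply:

from flask import Flask, redirect, request

app = Flask(__name__)

# Purely illustrative list of search engine user agent substrings.
BOT_TOKENS = ("googlebot", "bingbot", "slurp")

def is_search_bot():
    ua = request.headers.get("User-Agent", "").lower()
    return any(token in ua for token in BOT_TOKENS)

@app.before_request
def require_disclaimer():
    # Bots and visitors who have already accepted the terms pass through;
    # everyone else is sent to the disclaimer page first.
    if request.path == "/disclaimer" or is_search_bot():
        return None
    if request.cookies.get("terms_accepted") == "yes":
        return None
    return redirect("/disclaimer")

# A real /disclaimer route would render the form and set the
# "terms_accepted" cookie once the visitor agrees.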
Alternatively, serve non-cookied visitors a version of the page in which the div containing the disclaimer form expands to fill the whole viewport and is styled position:fixed, which should keep the visitor from scrolling past the div while still rendering the content below the viewport. Cookied visitors don't see a form, while non-cookied visitors get the same page content but can't scroll to it until they accept the form (mobile does weird things with position:fixed, so this again might not work, and a savvy user could get around it).
Edit: Just found this article, which looks promising. It is Google documentation on how to allow crawls on a cookied domain: https://developers.google.com/search-appliance/documentation/50/help_gsa/crawl_cookies It might solve the problem in a more elegant, safer way.
Would be interested to hear what you come up with. If you could rely on javascript then there are many ways to do it.
Cheers!
-
Google Hiding Indexed Pages from SERPS?
Trying to troubleshoot an issue with one of our websites and noticed a weird discrepancy. Our site should only have 3 pages in the index: the main landing page with a contact form and two policy pages. Yet Google reports over 1,100 pages (that part is not a mystery; I know where they are coming from: multi-site installations of popular CMSs leave much to be desired in actually separating websites).
Here is a screen shot showing the results of the site command:
http://www.diigo.com/item/image/2jing/oseh
I have set my search settings to show 100 results per page (the max number of results). Everything is fine until I get to page three, where I get the standard "In order to show you the most relevant results, we have omitted some entries very similar to the 122 already displayed." But wait a second: I clicked on page three, and now there are only two pages of results, and the number of results reported has dropped to 122.
http://www.diigo.com/item/image/2jing/r8c9
When I click on "show omitted results" I do get some more results, and the reported count jumps back up to 1,100. However, I only get three pages of results, and when I click on the last page the number of results returned changes to 205.
http://www.diigo.com/item/image/2jing/jd4h
Is this a difference between indexes? (The same thing happens when I turn Instant search back on: it shows over 1,100 results, but when I get to the last page of results it changes to 205.)
Is there any other way of getting this info? I am trying to go in and identify how these pages are being generated, but I have to know which ones are showing up in the index for that to happen. Only being able to access 1/5th of the pages indexed is not cool. Does anyone have any idea about this or experience with it?
For reference, I was going through the results with SEOmoz's excellent toolbar and exporting them to CSV (using the Mozilla plugin). I guess Google doesn't like people doing that, so maybe this is a way to protect against scraping by only showing limited results for the site: command.
Thanks!
-
RE: Disqus integration and cloaking
Thanks John,
That link was helpful, it is a similar concept but we are not using ajax. I appreciate your response.
-
RE: Channel Conversion Rates
Hi Kyle,
I hope this will be helpful in gauging your site's performance, but I have a feeling it will be hard to compare because conversion rates change so much depending on the target audience and types of users. Anyway, here it is for what it is.
I am currently involved with three sites in the eCommerce realm: two are mostly B2B and one is both B2B and B2C.
Our lowest-performing CPC conversion rate is 0.39% while our highest is 3.53% (this varies wildly depending on site and referrer: Google/Bing/etc.).
Our lowest-performing organic conversion rate is 0.89% while our highest is 4.55% (same stipulations as above).
Direct: 1.6% - 5.5%, depending on the site.
From what I have seen (and I know we can improve), your organic numbers look really good (maybe high?), while CPC might be a little low. Your direct looks really good as well, although I find it interesting that it is below your organic.
Hope that gives some gauge for you.
-
Disqus integration and cloaking
Hey everyone,
I have a fairly specific question on cloaking and whether our integration with disqus might be viewed as cloaking.
Here is the setup. We have a site that runs on Drupal, and we would like to convert the comment handling to Disqus to make things easier for our users. However, when JavaScript is disabled, the nice comment system and all of the comments from Disqus disappear. This obviously isn't good for SEO, but the user experience with Disqus is way better than with the native comment system. So here is how we are addressing the problem. With Drupal we can sync comments between the native comment system and Disqus. When a user has JavaScript enabled, the containing div for the native comment system is set to display:none, hiding the submission form and all of that content, which is instead displayed through the Disqus interface. When JavaScript is not enabled, the native comment form and the comments are available to the user.
Could this be considered cloaking by Google? I know they do not like hidden divs, but it should be almost exactly the same content being displayed to the user (depending on when the last sync was run).
Thanks for your thoughts, and if anyone is familiar with a better way to integrate Drupal and Disqus, I am all ears.
Josh
-
RE: Homepage outranked by sub pages - reason for concern?
Thanks for the response. It is nice to hear from someone else who has the same type of site and sees the same thing. Appreciate the tip and the response.
-
RE: Homepage outranked by sub pages - reason for concern?
Thanks Alan,
that helps, and you might have pointed something out there. Our site has lots of links on each page, and each page basically links to the same pages, which would keep everything pretty even. Structure is something that we are working on; I wonder if that is part of the problem.
-
Homepage outranked by sub pages - reason for concern?
Hey All,
trying to figure out how concerned I should be about this. So here is the scoop; I would appreciate your thoughts.
We have several eCommerce websites that have been affected by Panda, due to content from manufacturers and a lack of original content. We have been working hard to write our own descriptions and are seeing an increase in traffic again. We have also been writing blogs since February and are getting a lot of visits to them.
Here is the problem: our blog pages are now outranking our homepage when you type in site:domain-name.
Is this a problem? Our home page does not show up until you are 3 pages in. However, when you type just our domain name into Google as a search, it does show up in position one with sitelinks under it.
This is happening across both of our sites. Is this a cause for concern, or just natural due to our blogs being more popular than our homepage?
Thanks!
Josh
-
RE: Facebook Comments
If you mean viewing the source of the page and the actual HTML elements, that is what I did. With JavaScript turned on, all of the HTML elements show up; with it turned off, they don't, so much of it is being written via JavaScript.
Instant preview on that page from the Google SERPs does not show all of the comments, just the likes. However, the cached version of the page does show all of the comments, but it must be some sort of screen capture, because the majority of the comments do not show up when viewing the source of the cached page.
So not sure that really confirms anything. I guess to find out you might have to do a controlled test.
-
RE: Facebook Comments
Can anyone confirm that Google actually sees all of the comments handled by Disqus? I turned JavaScript off on the page and about half of the comments disappeared (the http://garyvaynerchuk.com/post/7396143247/the-twitter-at-system-do-you-understand-it example). About the only things that showed up were the "so and so likes this" and "so and so re-blogged this" entries; none of the actual comments appear to show up. I know it is not an iframe, but the content is still being written to the page via JavaScript, so I am wondering if Google and other crawlers can actually access it. Any thoughts?
-
RE: How is link juice split between navigation?
Hi Keri,
thanks for the follow-up. As for the specific question: no, I have not really found a concrete answer. Currently we have left the duplicate navigation alone and focused on more pressing updates. Sorry that I don't have more info to share.
-
What tools do you use to submit a site to local yellowpages?
Hey all,
two-part question for you.
- Do you use any tools to automatically submit websites to local yellowpages (example: http://business.intuit.com/directory/marketing/100_syndication_sites.jsp)? If so, which one and why?
- Are there any dangers to doing it this way?
It seems that this might save a lot of time and be incredibly helpful for managing your brand profile pages in a centralized location. Also, some of the tools that I am seeing incorporate brand monitoring (which I know you can do through a variety of tools). Anyway, thoughts? Comments? Tips?
-
RE: Google analytics now showing social signals
Well, the simple answer is that I made a mistake. It popped up again today, and after doing some research yesterday and this morning I realized it is being populated by an extension that I have for Chrome (I started wondering about that when it showed up today in Chrome but not in FF).
The social stats are being populated by the SEO Site Tools extension for Chrome: https://chrome.google.com/webstore/detail/diahigjngdnkdgajdbpjdeomopbpkjjc
Evidently this feature works intermittently: I have never seen it before, it only worked once yesterday and only on this page, and I have been using the extension for about a month (for some odd reason it doesn't work on any other page).
So apologies for getting you all excited. If you want to read more about the tool, SEJ did a review of it a while back: http://www.searchenginejournal.com/the-seo-tool-that-may-make-you-switch-to-google-chrome/20672/
Cheers
-
RE: Google analytics now showing social signals
Well, bummer, believe it or not, it is gone now. I attached a link with a box around where it was showing up; if I see it again I will take a screenshot of it. It was interesting because for this particular page it showed 6 likes and 1 share, and no comments, for the Facebook stats. So it distinguished between likes and shares somehow, which is something I am fuzzy on, as FB seemed to merge the two.
-
RE: Google analytics now showing social signals
Ya, it just shows up on pages where I have shares; otherwise nothing shows up.
-
Google analytics now showing social signals
Looking through Google Analytics today, I noticed that there is a section under Top Content that shows the number of Facebook likes & shares, tweets, Diggs, Delicious bookmarks, etc. Anyone else seeing this?
[staff note: see answers, this came from a Chrome extension]
-
RE: Google Analytics Benchmarking Newsletter: How does your site perform?
Hey Aaron,
good point; it is hard to always keep that in perspective. The reason I am concerned about it is that, from what I understand, Google is taking those metrics and using them in their search results. We have a landing page that is full of good content, people spend an average of over 4 minutes on it (a single page), and it has a good conversion value; however, it has a high bounce rate. We have seen the number of impressions where this page shows up decline quite substantially over the past month, and I am wondering if that is due to the high bounce rate.
So if the search engines are incorporating those stats, as much as it might not mean anything in terms of user experience (i.e. we are actually providing a good user experience), it might mean a lot for how the search engines rank you (i.e. they just see a high bounce rate).
Thanks for the response!
-
RE: Google Analytics Benchmarking Newsletter: How does your site perform?
That would be really nice to see, and would definitely be helpful. Would love to see that take place.
Fireclick's stats are very interesting; I will look at them a little more. Thanks for the tip.
-
RE: Google Analytics Benchmarking Newsletter: How does your site perform?
Hey Benjamin,
Thanks for the response, the good info, and for sharing your stats. I think your last statement about comparing bounce rates directly between industries is exactly what I was thinking. The aggregated stats from Google are great, but there is no segmentation, so they don't seem to be incredibly helpful as a benchmark. Thus the question, and hopefully we will get enough answers to get a feel for how different industries compare and how the sites that the SEOmoz community handles fare compared to the aggregate stats.
I know that we have quite a bit of work to do to get our sites optimized.
Thanks again for your response
-
Google Analytics Benchmarking Newsletter: How does your site perform?
With Google recently releasing benchmarking data, I am curious as to what you all see across the various types of website niches that you work with (eCommerce, news, blog, services, small business, etc.), and how SEO'd websites compare with this "raw" data provided by Google.
We have one medium-size (12,000 products), strictly eCommerce website that has a bounce rate of 37% and an average time on site of 5:20.
Two other medium-size eCommerce/blog sites have bounce rates of 57% and 59%, with average times on site of 2:37 and 2:30 respectively.
Finally, I manage a website for a local small business that provides business and home cleaning services. This site has a bounce rate of 45% and 1:40 average time on site.
How do your sites perform in these areas? Is it typical to see this great of a disparity between strict eCommerce websites and those sites that are both informational and transactional in nature? What about other kinds of websites?
Cheers!
-
RE: Bing Update?
There seems to be no talk, just a few webmasters reporting sharp drops (or, more favorably, sharp increases), but nothing from Bing that I have been able to find. Congrats on the increase, though; I would be curious to hear whether you see it drop off or whether it stays the same.
Thanks for the note,
Cheers!
Josh
-
RE: Bing Update?
Hi Ryan, thanks for the response and help. I was not aware of that feature, thanks for the tip.
We have three sites, all of which experienced significant drops on the 10th but are slowly regaining indexation and ranking. We have never had a fully indexed site with Bing; all are above 7,000 pages (eCommerce), with 3-4 thousand pages indexed. One went from 3,000 down to 1,500 in one day, and the other two lost about 30% of their indexed pages. It just seemed really odd, and I saw a couple of other people on Bing's blog complaining of large drops on June 10th (doubly odd). Will continue to investigate.
Cheers!
-
Bing Update?
With all of the talk about Panda V2, I am just curious as to whether Bing has also updated their algorithm in the past month. I have seen some reports of the number of indexed pages dropping in half on June 10th as reported in Bing's webmaster tools, have experienced that drop on our websites, and have seen a significant decrease in rankings due to them dropping some high-performing pages out of their index.
So, just wondering if anyone else has seen this or has any info on an algorithm update rolled out by Bing.
Cheers,
Josh
-
What is the effect of a proxy server replicating a site on SEO
I have heard of PPC companies that set up a proxy server to replicate your site so that they can use their own tracking methods for their reports. What effect, if any, does this have on a site's SEO?
-
RE: Tool for scanning the content of the canonical tag
Thanks Dr. Pete and Marcus.
I just finished reading the post. I have looked at Screaming Frog before but was hoping to find a way to do it myself; I just didn't want to plop money down on something that seemed like it should be doable using tools I already had. But the software does look good. Any thoughts on whether they will come out with a one-time purchase instead of a yearly subscription?
Cheers!
Josh
-
RE: Tool for scanning the content of the canonical tag
Great, really appreciate it! Many thumbs up
-
RE: Tool for scanning the content of the canonical tag
Hey Marcus,
thanks for the quick response. That is exactly what I would be looking for. I do have a list of URLs, and that is also simple enough to get from something like Xenu. Would love to work with you on this.
Thanks.
Josh
-
Tool for scanning the content of the canonical tag
Hey All,
question for you: what is your favorite tool/method for scanning a website for specific tags? Specifically (as my situation dictates right now), for canonical tags?
I am looking for a tool that is flexible, hopefully free, and highly customizable (for instance, letting you specify the tag to look for). I like the concept of using Google Docs with the importXML feature, but as you can only use 50 of those commands at a time it is very limiting (http://www.distilled.co.uk/blog/seo/how-to-build-agile-seo-tools-using-google-docs/).
I do have a campaign set up using the tools, which is great, but I need something that returns a response faster and can get data from more than 10,000 links. Our CMS unfortunately puts out some odd canonical tags depending on how a page is rendered, and I am trying to catch them quickly before they get indexed and cause problems. Eventually I would also like to be able to scan for other specific tags, hence the customizability concern. If we have to write a VB script to get the data into Excel, I suppose we can do that.
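To show the kind of thing I am after, here is a rough sketch in Python (using the requests and BeautifulSoup libraries, and assuming a urls.txt list exported from something like Xenu) that records each page's canonical tag so the odd ones stand out in a spreadsheet:

import csv
import requests
from bs4 import BeautifulSoup

def scan_canonicals(url_file, out_file):
    # Read a list of URLs (one per line) and write each page's canonical href.
    with open(url_file) as f, open(out_file, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["url", "canonical"])
        for url in (line.strip() for line in f if line.strip()):
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException as exc:
                writer.writerow([url, "ERROR: %s" % exc])
                continue
            tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
            writer.writerow([url, tag["href"] if tag and tag.has_attr("href") else "MISSING"])

scan_canonicals("urls.txt", "canonicals.csv")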
Cheers,
Josh
-
RE: Best Way To Host Images For Image Optimization
Hey Guillaume,
Thanks for pointing out the alt tags, can't believe I forgot to mention that (doh!).
Good points with how to address the speed issues.
Cheers,
Josh
-
RE: How is link juice split between navigation?
Hey Damien, thanks for the response. Ya, I had originally thought about nofollowing one set of links, but then found out what you just pointed out: that nofollow doesn't work that way anymore. We actually have more links than that per page (that just happens to be a round number), but what I am trying to figure out is, since about half of them are duplicates, am I really losing anything? Since they only link to about 50 unique pages, are those pages being passed the same amount of juice as they would be if they were only linked to once per page (instead of being linked to in both the main nav and the footer)?
-
RE: Best Way To Host Images For Image Optimization
I will take a stab at it and someone more knowledgeable than me can correct me if I am wrong :).
I would host the pictures on your own server and address the speed issues with caching, compression, etc. As long as the pictures are correctly optimized for the web, you can mitigate the cost in speed and gain the ability to control your image optimization plan.
Right now your links point off-site and are not descriptive of the content at all. If the images were hosted on your site, you would be referencing your own domain and could control the URL much more easily. Instead of the link you currently have, you could have a specific URL that describes the picture (www.example.com/photos/Rafael-Nadal-wimbledon-2011.jpg). Please note that I am no tennis expert, so this is just an example :). This would give you a descriptive URL that points to your site, and by optimizing the alt and title tags you tell the search engines what your image is about. When those pictures are indexed and show up in Google or Bing, it is your site that gets the traffic and credit, not a 3rd-party host.
Hope that helps. Personally, as search trends toward more visual cues, I want to make sure that any pictures of our products that are found are fully optimized and point to our site.
-
RE: How is link juice split between navigation?
Thanks for posting. I understand what chapter four says, but it doesn't seem to answer my question. My understanding is that Google only counts the first link on a page when passing link juice, although it splits link juice across all of the links on a page. So according to this understanding, only the navigation contained in the dropdowns at the top of the page will pass link juice; thus only half of the possible link juice is passed, since the links in the footer don't pass any juice (even though they are factored into how much juice each link passes). Is that a correct understanding? The example in the book does not discuss how link juice is calculated and passed when two links on one page point to the same subpage.
-
How is link juice split between navigation?
Hey All, I am trying to understand link juice as it relates to duplicate navigation.
Take, for example, a site that has a main navigation contained in dropdowns with 50 links (fully crawlable and indexable), and in the footer of said page that navigation is repeated, so you have a total of 100 links with the same anchor text and URLs. For simplicity's sake, will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "1st link rule" still apply, so that only half of the link juice is passed?
What I am getting at is: if there were only one navigation menu and the page were passing 50 link juice units, then each of the subpages would get passed 1 link juice unit, right? But if the menu is duplicated, then the possible link juice is divided by 100, so only 0.5 units are being passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit?
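Just to lay out the arithmetic behind my mental model (purely hypothetical numbers, assuming juice divides evenly across every link on the page):

juice = 50.0                # link juice units the page has to pass
unique_pages = 50           # pages linked from the main navigation

single_nav = juice / unique_pages            # 1.0 unit per subpage with one menu
per_link_dup = juice / (unique_pages * 2)    # 0.5 units per individual link with duplicated menus
both_count = per_link_dup * 2                # 1.0 unit per subpage if both duplicate links count
first_link_only = per_link_dup               # 0.5 units per subpage if only the first link counts

print(single_nav, per_link_dup, both_count, first_link_only)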
We have several sites that do this for UX reasons but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages.
Thanks for your help! Cheers.