prima-253509
@prima-253509
Job Title: Director of Marketing
Company: Prima Supply
Website Description
We focus on selling restaurant equipment to startups and established businesses.
I have grown up fascinated by technology and how it is used in everyday life. My first brush with the internet was less than stellar: I found it hard to use, ugly, and slow. But I stuck around and was excited to watch the internet come into its own, and I became quite fascinated by the emerging functionality and all of the technology behind it. So I pursued an associate's degree in web design and a BA in Liberal Studies with a focus on marketing and business. I now work as an in-house SEO analyst and social network strategist.
Favorite Thing about SEO
Learning new things
Latest posts made by prima-253509
-
RE: UPDATE: Rank Tracker is NOT being retired!
I haven't used Rank Tracker very much in the last year, but it has historically been useful for looking up keywords outside of the core keywords we are tracking in our campaigns. It is not just that the tool is going away; the quota for what you can track is also being reduced. We recently upgraded our subscription so that we could track more keywords, but now, in order to mimic the functionality of the Rank Tracker tool, I would have to keep some keywords free and in reserve so that campaigns could be created on an ad-hoc basis. In other words, our 750-keyword limit on campaigns is now essentially 700 if I want to keep spots open for the ad-hoc keyword research that Rank Tracker had provided (tracked over time), or 550 if I also wanted to keep open the 200 rankings available on the daily cap.
Campaign limits are also going to be hit with regard to tracking domains for a keyword phrase, as you can only add three competitor sites per campaign. It just isn't as functional for ad-hoc research as the Rank Tracker tool was.
Are quotas going to be increased on campaigns to compensate for this (keywords available / campaign spots available)?
This is disappointing, as it seems like a lot of features are disappearing / being sunset while costs stay the same. If I am missing something about the quotas, let me know. Thanks!
-
RE: Wordpress Woocomerce Recommended SEO URL structure
Glad it was helpful!
If you are going to have a true blog, then that prefix is enough to segment it out. Having the date in there can be helpful for comparing the hits you get on older posts vs. newer ones (i.e. how long your content stays relevant).
If you are going to have other types of content, such as shopping guides or product comparisons, that are more "timeless" pieces, then you might want to think about the kinds of articles you are going to write and create prefixes that match those types of articles.
You could definitely put product guides and product comparisons in a blog, but they are harder to segment out if the prefix is just "blog".
Hope that helps. Cheers!
-
RE: Wordpress Woocomerce Recommended SEO URL structure
One thing to keep in mind with the URLs is how you can segment them in analytics for easy data analysis. You want them to be semantic and pretty, but also easily segmented. I would encourage you to think about how you will be able to segment your URLs in analytics so that you can easily see patterns in how people browse the site and which types of pages are successful.
For instance, we use the following URL structures for brands, equipment, replacement parts, and a learning center:
- brand/[brand-name]
- equipment/type/[category] - for the categorization of equipment
- equipment/brand/[product] - for easy segmentation of products
- part/type/[category]
- part/brand/[part]
- learn/[cat]
- learn/article/[article-title]
This gives us a lot of flexibility to move products around in the menu system without messing up URLs, while still being semantic and allowing for easy segmentation in analytics. For instance, with this setup we can see whether people prefer navigating by equipment catalog or by brand. It also lets us easily pull out the learning center articles and all the visits we get to them, to see how eCommerce-only visits are doing.
One thing I would suggest with your blog is to have some kind of prefix that lets you easily exclude those pages (or include only those pages) in analytics. If you simply go by year without a prefix, it will be harder to segment out the data.
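To make the prefix idea concrete, here is a small Python sketch that classifies paths from the structure above into segments, the way a content grouping or custom report might. The segment names and patterns are my own invention for illustration, not anything analytics requires:

```python
import re

# Path patterns mirroring the URL structure listed above; the segment
# names are illustrative placeholders.
SEGMENTS = {
    "brand": re.compile(r"^/brand/[^/]+$"),
    "equipment-category": re.compile(r"^/equipment/type/[^/]+$"),
    "product": re.compile(r"^/equipment/brand/[^/]+$"),
    "part-category": re.compile(r"^/part/type/[^/]+$"),
    "part": re.compile(r"^/part/brand/[^/]+$"),
    "learn-article": re.compile(r"^/learn/article/[^/]+$"),
    "learn-category": re.compile(r"^/learn/[^/]+$"),
}

def classify(path: str) -> str:
    """Map a URL path to its content segment, or 'other' if none match."""
    for name, pattern in SEGMENTS.items():
        if pattern.match(path):
            return name
    return "other"
```

Because every content type shares a stable prefix, one pattern per segment is enough; a year-only blog path like /2013/my-post would fall through to "other", which is exactly the segmentation problem described above.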
You should check out a mozinar that Moz did with Everett Sizemore that deals with a lot of these issues (he specifically talks about SEO and URL structure).
Also, you have probably already seen this, but Yoast's plugin for WordPress will let you remedy much of the duplicate content that WordPress can create.
Cheers!
-
RE: What is the full User Agent of Rogerbot?
I know this is an insanely old question, but as I was looking this up as well and stumbled on this page, I thought I would provide some updated info in case anyone else is looking.
The user agent can no longer be found on the page listed above; however, it is documented at https://moz.com/help/guides/moz-procedures/what-is-rogerbot
Here is how our server reported Rogerbot in its access logs (taken from May 2013). Notice the difference in the crawler-[number] portion:

```
rogerbot/1.0 (http://www.seomoz.org/dp/rogerbot, rogerbot-crawler+pr1-crawler-02@seomoz.org)
rogerbot/1.0 (http://www.seomoz.org/dp/rogerbot, rogerbot-crawler+pr1-crawler-16@seomoz.org)
```

[updated link added by admin]
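If anyone wants to pull the instance number out of such log lines, a quick sketch in Python (the regex is my own assumption based on the UA format shown above, not anything Moz documents):

```python
import re

def crawler_instance(user_agent):
    """Return the crawler-[number] suffix from a Rogerbot UA, or None."""
    if "rogerbot" not in user_agent.lower():
        return None  # not Moz's crawler at all
    m = re.search(r"crawler-(\d+)", user_agent)
    return m.group(1) if m else None
```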
-
RE: Considering Switch to old Domain - Any Bad Karma?
Hi Mememax,
Thanks for the feedback; that is what I was hoping for, but I just thought I would get some thoughts from the great community here. Thanks for weighing in!
Josh
-
Considering Switch to old Domain - Any Bad Karma?
So here is the issue. I am working with a company that used to have a branded domain. They then split that domain into two separate keyword-rich domains and tried to change their branding to match the keyword-rich domains.
This made for a really long brand name that is difficult to actually rank for, as it is made up mostly of high-traffic key terms, and it also created brand confusion because all of the social accounts still operate under the old brand name.
We are considering a new brand initiative and going back to the original brand name as it better meets our business objectives (they still get traffic from branded searches under the old brand) and the old branded web domain.
My question is if there is any added risk in going back to an old domain that has been forwarded for the past 2 years to the new domain?
I know the risks and problems of a domain name change, but I am not as certain about the added complication of moving back to an old domain and essentially reversing the flow of 301s. Any thoughts?
Cheers!
-
RE: Tool for tracking actions taken on problem urls
Maybe I don't fully appreciate the power of Excel, but what I am envisioning seems to require more than Excel can provide.
Thanks for the suggestion though. I will think about it some more.
-
Tool for tracking actions taken on problem urls
I am looking for tool suggestions to help keep track of problem URLs and the actions taken on them, and to help with tracking and testing a large number of errors gathered from many sources.
What I want is to be able to export lists of URLs and their problems from my current set of tools (SEOmoz campaigns, Google WM, Bing WM, Screaming Frog) and import them into a centralized DB that shows all of the actions that need to be taken on each URL, while removing duplicates, since each tool finds a significant number of the same issues.
Example Case:
SEOmoz and Google identify URLs with duplicate title tags (example.com/url1 & example.com/url2), while Screaming Frog sees that example.com/url1 contains a link that is no longer valid (so it terminates in a 404).
When I import the three reports into the tool, I would like to see that example.com/url1 has two issues pending, a duplicated title and a broken link, without duplicating the entry that both SEOmoz and Google found.
I would also like to see historical information on the URL: whether I have written redirects to it (to fix a previous problem), or whether it used to be a broken page (i.e. a 4XX or 5XX error) and is now fixed.
Finally, I would like to not be bothered with the same issue twice. As Google is incredibly slow at updating its issues summary, I would like not to import duplicate issues (the tool should recognize that the URL is already in the DB and that the issue has been resolved).
Bonus for any tool that uses the Google and SEOmoz APIs to gather this info for me.
Bonus bonus for any tool that is smart enough to check and mark issues as resolved as they come in (for instance, if a URL has a 403 error, the tool would check on import whether it still resolves as a 403; if it does, it is added to the issue queue, and if not, it is marked as fixed).
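To make the dedupe-on-import behavior concrete, here is a toy Python model of what I have in mind. This is my own sketch, not an existing tool, and all the names are made up:

```python
from dataclasses import dataclass, field

@dataclass
class IssueTracker:
    """Toy model: issues are keyed on (url, issue_type), so the same
    problem reported by two tools is stored once, and re-importing an
    already-resolved issue is ignored rather than reopened."""
    issues: dict = field(default_factory=dict)  # (url, type) -> status

    def import_issue(self, url, issue_type):
        key = (url, issue_type)
        if key not in self.issues:
            self.issues[key] = "pending"
        # already pending or resolved: do nothing, so no duplicates

    def resolve(self, url, issue_type):
        self.issues[(url, issue_type)] = "resolved"

    def pending(self, url):
        return [t for (u, t), s in self.issues.items()
                if u == url and s == "pending"]
```

With this model, importing the example case above leaves example.com/url1 with exactly two pending issues, however many tools reported them.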
Does anything like this exist? How do you deal with tracking and fixing thousands of URLs and their problems, plus the duplicates created by using multiple tools?
Thanks!
-
RE: Google Hiding Indexed Pages from SERPS?
Thanks Alan,
We will see what we can do. One way or the other, it has to be addressed.
Best posts made by prima-253509
-
RE: UPDATE: Rank Tracker is NOT being retired!
I should also add that the Keyword Explorer tool is awesome and one of the best things about Moz Pro. So kudos with that tool. Incorporating the rank tracker into the Keyword explorer would make sense to me from a UX point of view (more than just the first page, change over time, etc). Just a thought
-
What is the effect of a proxy server replicating a site on SEO?
I have heard of PPC companies that set up a proxy server to replicate your site so that they can use their own tracking methods for their reports. What effect, if any, does this have on a site's SEO?
-
RE: How to add a disclaimer to a site but keep the content accessible to search robots?
That is rough,
maybe this is a legitimate situation for user-agent sniffing (albeit fraught with danger)? If you can't rely on JavaScript, then it seems any option will have significant downsides.
This may be a harebrained suggestion, but what about appending a server parameter to all links for visitors who do not have a cookie set? If the user agent is Google or Bing (or any other search bot), the server could ignore that parameter and send them on their way to the correct page; if the user agent is not a search engine, the visitor would be forced to the disclaimer page.
This would allow a user to see the initial content (which may not be allowed?) but not navigate the site; it would also let you present the same info to both user and bot while making the user accept the terms.
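A rough Python sketch of that routing logic. The bot names and return values are placeholders, and, again, treating bots differently from users is exactly the cloaking-adjacent risk that makes this dangerous:

```python
# Hypothetical tokens to look for in the user-agent string.
SEARCH_BOTS = ("googlebot", "bingbot", "rogerbot")

def route(user_agent, has_disclaimer_cookie):
    """Decide what a request gets: the page, or the disclaimer gate."""
    ua = user_agent.lower()
    if any(bot in ua for bot in SEARCH_BOTS):
        return "serve-page"          # search bots always get the content
    if has_disclaimer_cookie:
        return "serve-page"          # visitor has already accepted
    return "redirect-to-disclaimer"  # force acceptance of the terms first
```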
Alternatively, serve non-cookied visitors a version of the page where the div containing the disclaimer form expands to fill the whole viewport, with its style set to position: fixed, which should keep the visitor from scrolling past the div while still rendering the content below the viewport. Cookied visitors see no form, while non-cookied visitors get the same page content but can't scroll to it until they accept the form (mobile does weird things with position: fixed, so this again might not work, and a savvy user could get around it).
Edit: I just found a promising Google doc on how to allow crawls on a cookied domain: https://developers.google.com/search-appliance/documentation/50/help_gsa/crawl_cookies. It might solve the problem in a more elegant, safe way.
I would be interested to hear what you come up with. If you could rely on JavaScript, there would be many ways to do it.
Cheers!
-
Google Analytics Benchmarking Newsletter: How does your site perform?
With Google recently releasing benchmarking data, I am curious what you all see across the various types of website niches you work with (eCommerce, news, blogs, services, small business, etc.), and how SEO'd websites compare with this "raw" data provided by Google.
We have one medium-size (12,000 products) strictly eCommerce website that has a bounce rate of 37% and an average time on site of 5:20,
while two other medium-size eCommerce/blog sites have bounce rates of 57% and 59%, with average times on site of 2:37 and 2:30 respectively.
Finally, I manage a website for a local small business that provides business and home cleaning services. This site has a bounce rate of 45% and a 1:40 average time on site.
How do your sites perform in these areas? Is it typical to see this great a disparity between strictly eCommerce websites and sites that are both informational and transactional in nature? What about other kinds of websites?
Cheers!
-
RE: Channel Conversion Rates
Hi Kyle,
I hope this will be helpful in gauging your site's performance, but I have a feeling it will be hard to compare, because conversion rates change so much depending on the target audience and types of users. Anyway, here it is, for what it's worth.
I am currently involved with three sites in the eCommerce realm: two are mostly B2B, and one is both B2B and B2C.
Our lowest-performing CPC conversion rate is 0.39%, while our highest is 3.53% (it varies wildly depending on the site and referrer: Google/Bing/etc.).
Our lowest-performing organic rate is 0.89%, while our highest is 4.55% (same stipulations as above).
Direct runs 1.6% - 5.5%, depending on the site.
From what I have seen (and I know we can improve), your organic numbers look really good (maybe high?), while your CPC might be a little low. Your direct looks really good as well, although I find it interesting that it is below your organic.
Hope that gives some gauge for you.
-
RE: Google Analytics Benchmarking Newsletter: How does your site perform?
Hey Benjamin,
Thanks for the response, the good info, and for sharing your stats. I think your last statement about comparing bounce rates directly between industries is exactly what I was thinking. The aggregated stats from Google are great, but there is no segmentation, so they aren't incredibly helpful as a benchmark. Thus the question, and hopefully we will get enough answers to get a feel for how different industries compare and how the sites the SEOmoz community handles fare compared with the aggregate stats.
I know that we have quite a bit of work to do to get our sites optimized.
Thanks again for your response