You are welcome (you also aren't alone; I don't have problems often and it's great that the Moz team has such terrific service).
Posts made by josh-riley
-
RE: Why isnt my Site getting Re Crawled !!!
If you are referring to having SEOMoz crawl it, I've had the same problem. Email their help desk - they're very responsive and will get it crawled. I've also had crawls by them take up to 10 days - way too long. Just let them know and they'll take care of it.
-
RE: Seomoz logging in
I use Chrome and the same thing is happening to me, even with the "remember me" box checked.
-
RE: Can i give other accounts access
What access are you referring to - SEOMoz? (You didn't clarify, but I'll assume you mean SEOMoz; people post about all kinds of sites and programs here, so being specific can really help how fast you get a response.)
They don't have a multi-login feature yet - however, it is something they're working on. Before I joined, I used a colleague's login so I could access the tools under her account. Since that's the only current option, you can do that if you're comfortable with it.
If not, then as of March 2012, Moz still wasn't sure when a multi-login feature would be implemented.
-
RE: Why is either Rogerbot or (if it is the case) Googlebots not recognizing keyword usage in my body text?
When you did the on page analysis, did you ask it to check for "sarasota liposuction" or just "liposuction" - I've noticed that the checker tool is very specific like that.
-
RE: Google is keeping very old title tags in the SERPs for my site. How can I fix this?
And within Webmaster Tools, add a sitemap and resubmit it. That gives Google an indication that something has changed and it should recrawl. If you already have Webmaster Tools and a sitemap, then just click the "resubmit" button. They generally get to it within a few days, and it's the closest thing I know to a way to "force" a crawl.
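For anyone who hasn't made one, a bare-bones XML sitemap looks something like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-03-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products/</loc>
    <lastmod>2012-03-15</lastmod>
  </url>
</urlset>
```

Bumping the `<lastmod>` dates before you resubmit is a simple way to signal that the pages have changed.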
-
RE: Too many links - How to address without removing them?
Agree. Also, too many links isn't as big of a deal as it used to be. If Google even thinks there's a link there, it will try to crawl it or make the connection anyway.
-
RE: What is the best route SEO wise for implementing a Wordpress blog that has a domain under go daddy and hosting under a third party like Kalio Commerce?
Wordpress isn't very secure - even with the security plugins and such you can leverage. SEOMoz has some great content on subdomains vs. subfolders. What can affect your options is whether the blog is hosted on a different server than the TLD and, if so, whether domain mapping can point the blog to the subfolder/subdomain so Google sees it as associated. This is a challenge for some systems (including mine, even though in theory it should work). Subfolders are the preference.
-
RE: What is the best route SEO wise for implementing a Wordpress blog that has a domain under go daddy and hosting under a third party like Kalio Commerce?
As someone going through something similar right now, I'm going to throw out there to tread lightly. I got some great advice on setting things up, but the domain mapping with Wordpress has been a nightmare. Nothing wants to point correctly, or domain map, in the way that it should in theory.
My lesson is to test everything out first, and try the mapping and pointing before doing anything else.
-
RE: How to start more deep seo.
I agree with Ryan - you need a strategy. I think one of the biggest mistakes we can make when targeting SEO keywords is going too broad and hoping that we can capture things that quite honestly aren't the right fit. You really have to know them, their terminology and how they search. The article link Ryan shared is a great layout for how to go about a sound strategy, which will eventually answer your questions.
However, as for the global monthly searches - personally, I say ignore that. Check the "exact" phrase box and look at the local monthly searches. I want to have an idea of how small of an audience I can end up with if I select certain words. It's a very different perspective to find out a word you thought was really popular actually gets 170 searches a month and at best I can only hope to capture a percentage of that.
-
RE: Using Google key word tool to reseach key words for a site
I agree with Clancy - my goal is to find words with significant traffic while identifying what can lead to the best conversion and the least amount of bounce-backs.
Ex: a word, "example," that has 1,000 local searches a month - if I can capture 10% of those searchers (100), vs. "specific example," which has 100 searches and thus a 10% capture of only 10, I have more opportunity.
-
RE: If you only had a limited budget for tools...
I should clarify: Screaming Frog is not a monthly fee, it's annual.
-
RE: If you only had a limited budget for tools...
And throw in Screaming Frog. Depending on the site, a 500-URL crawl is free; any site with more pages is £99. It will crawl any site like a search engine would and can sometimes surface information beyond SEOMoz. (Moz crawls 10,000 pages at a time and I have more than 50,000.)
Check out Keywordspy - you can get some OK data under the free offering, and the free part of SpyFu's tool can also round out competitive information.
-
RE: Press Release- Is it worth it?
I've run into problems with news releases getting lots of inbound links and then ranking higher/better than core pages that I'd rather have ranking for similar subject matter/keywords.
-
RE: How do you incorporate a Wordpress blog onto an ecommerce website?
As someone who is currently dealing with making a WP blog look like it lives on a subdomain, I will agree with Dr. Pete that it's very tricky.
-
RE: Video Distribution Services
Question - when you say major video sites, do you mean like YouTube and Vimeo? I'm just curious why you'd want them to not be part of your own account/channel within each for better branding...which is part of the strategy.
First, what's your goal with the videos? And why would you want them to live outside of your hands (i.e., have a third party manage and post, and I assume also oversee all the titles/descriptions/tagging, etc.) when the best SEO would come from keeping it under your umbrella? While you may not think there's time to manage this on your own, what do you want to get out of it, and is that worth the trade-off of losing the ties to your ownership?
(Granted there are a lot of variables to take into consideration - like if you have embedded links in the video that can drive viewers back to your website, and ultimately what the CTA for the videos include, etc.)
-
RE: Duplicate Content Issue with
I have used disallow parameters for all kinds of things; you can set it up and test it in Webmaster tools under "crawler access" before you implement on your site to confirm it's done properly. There have been a few times I've had to tweak it to get exactly what I wanted.
I'd test with the "/" in front of it as one option - again, just in testing, to see if there's any difference in results.
Since Google already got a hold of it, it'll take a lot of time to see results but don't be discouraged since they have to re-crawl and figure it out.
Good luck
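If you want to sanity-check a disallow rule outside of Webmaster Tools, Python's standard-library robots.txt parser lets you test it locally first. A quick sketch, using a hypothetical `/print/` rule with the leading "/" in place:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rule - note the leading "/" on the path
rules = [
    "User-agent: *",
    "Disallow: /print/",
]

parser = RobotFileParser()
parser.parse(rules)

# The /print/ section is blocked; everything else is still crawlable
print(parser.can_fetch("*", "http://www.example.com/print/page.html"))   # False
print(parser.can_fetch("*", "http://www.example.com/products/widget/"))  # True
```

This only mirrors how the standard parser reads the file - Googlebot's matching can differ slightly, so the Webmaster Tools "crawler access" tester is still the final check before you go live.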
-
RE: WordPress blog hosted on GoDaddy domain mapping help
One last random question, please: have you had to purchase a dedicated IP when using shared hosting? I think we may need to, but before we make this bigger without any promise of it paying off, I thought I'd see whether you had run into this and how it turned out.
-
RE: Duplicate Content Issue with
I like using robots.txt for this kind of thing - mostly because our homegrown CMS limits our abilities. However, if these pages have already been indexed, then a disallow limits the outcome. Ideally, a disallow or noindex is done in advance so Google never gets its hands on the pages; doing it after the fact can take Google some time to figure out and put the pieces together. Can your site manage a canonical for this?
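For reference, a canonical is just one tag in the `<head>` of each duplicate page, pointing at the version you want indexed (the URL here is a placeholder):

```html
<!-- On the duplicate page (e.g., a print or tracking-parameter version) -->
<link rel="canonical" href="http://www.example.com/products/widget/" />
```

Unlike a disallow, Google still crawls the duplicate and consolidates it to the canonical URL, which is why it can work even after the pages have been indexed.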
-
RE: WordPress blog hosted on GoDaddy domain mapping help
Thanks for taking time to follow up and I appreciate you and Ben offering help. I'll work with my designer and report back
Cheers, Andrea
-
RE: WordPress blog hosted on GoDaddy domain mapping help
Dan, small world, indeed! Correct; we tried and ended up in an infinite loop with our Apache system, and my web designer has been trying a plugin or two to see if there's a way to map the domain. We seem to be hitting all kinds of random hiccups, and as none of us are network-admin savvy, I thought this may be a good place for the road less traveled. Thanks for reaching out! Andrea.
-
RE: Changing a url from .html to .com
And, if you are interested, here's a nice blog on it: http://www.seomoz.org/blog/301-redirect-or-relcanonical-which-one-should-you-use
-
RE: Changing a url from .html to .com
The advantage of the 301, although sometimes trickier to implement (my site can't handle canonicals, so I am used to everything being trickier!) is that with the 301, it tells Google that the page content has moved permanently from the old URL. Eventually then Google will un-index the old URL and it will essentially cease to exist.
If implemented properly you shouldn't be hurt as you're still telling Google where to go so they can follow.
-
RE: Changing a url from .html to .com
Any time you change a URL you need a 301 redirect. If you set it up properly and go from a .html URL to a .com URL as your primary, there shouldn't be too much fallout.
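On an Apache server, for example, the redirect can be a single line in .htaccess (the paths here are placeholders - test on one page before rolling it out site-wide):

```apache
# Permanently redirect the old .html page to the new URL
Redirect 301 /old-page.html http://www.example.com/new-page/
```

Google follows the 301, passes most of the link value along, and eventually drops the old URL from its index.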
-
RE: WordPress blog hosted on GoDaddy domain mapping help
I wish - setting up an alias like that doesn't gel with how WP handles domain mapping. It broke all our links out.
-
WordPress blog hosted on GoDaddy domain mapping help
We set up a WP blog that's hosted through GoDaddy. For various reasons, we purchased a URL to use to get through the technical build and set up and are trying to map that to a subdomain of our company website. (We can't host it on our own server, unfortunately).
My question is: for WP blogs hosted via WP you can buy a domain mapping upgrade and I'm trying to find a similar plugin that could offer the same thing that would apply to our GoDaddy hosting and point to our subdomain (GD apparently doesn't offer the domain mapping).
Anyone have any thoughts, please?
-
RE: How do you incorporate a Wordpress blog onto an ecommerce website?
I think Ben outlines some great, actionable steps - the one word of caution I'll throw out is that it's not necessarily that easy, depending on how your back end is set up.
I recently looked at something similar, and because of issues hosting WP (with PHP) on our server, we had to worry about hacking and the integrity of our shopping cart checkout system being vulnerable. So that wasn't a viable option to set it up as a sub folder and we had to look at a subdomain and pointing Apache at WP. (I don't want to bore you with all the technical vetting we went through, just suffice to say that theory and reality don't always go hand-in-hand.)
-
RE: How to check a website's architecture?
CodeAcademy also recently introduced free online coding classes that you can check out - it may be a good way to apply the learnings from your readings.
It takes practice, dedication, and repetition to learn coding. You have to train yourself to properly apply the code. HTML and CSS are the best places to start; then move on to PHP.
-
RE: How do you determine the level of an SEO
I think link building can straddle between beginner and intermediate, depending on your site/content and how you need to go about getting links.
Given that Google, for example, is constantly making changes to its algorithm and to how much weight it gives traditional SEO signals like anchor text, links, and keyword density, it's a sign of a more capable person if they're reading up on the changes and staying fluid in adapting to them to benefit their site.
(And I agree that anyone who is stuck on page-only optimization shouldn't be qualified to be an SEO, however there's still a lot of people who don't know much about the field who treat that narrow focus as if it's all there is to know. )
-
RE: How do you determine the level of an SEO
There's going to be a ton of possible answers to this, but my two cents: depth of technical skill set and global knowledge really set experienced pros apart. So many newbies are focused on the basics - on-page optimization and basic HTML coding - that when you come across people who can identify and diagnose true technical issues and work across sites at a global level, you can tell the difference in skill set.
I mean, look at some of the questions that get asked in these forums - great indicators of how complex this can all become.
-
RE: 500 errors and impact on google rankings
I agree with Alan.
As for the sitemap, I'm not entirely sure what you mean by "goes down on a regular basis" - as in the number of pages indexed by Google? If some of those URLs are part of the 500-error list, then it makes sense that the number of pages Google indexed would go down: if Google can't crawl a page, it can't index it, no matter whether it's listed in the sitemap.
-
RE: Why has my OSE csv report been finalizing for 3 days?
I've had some problems lately with things taking excessively long, too. Generally nothing should take that long - I had one campaign report crawling for 10 days. It's so individualized that I suggest contacting the help team. They can hit whatever trigger, or check with engineering if needed, to get it squared away.
-
RE: What site do you admire/like for its SEO - technical, content, whatever - and why?
Thank you for taking the time to share - I'm going to check out that site.
-
What site do you admire/like for its SEO - technical, content, whatever - and why?
I am gathering examples of great SEO'd sites and would appreciate your examples. The rationale can be anything - great SEO structure, great linking, solid content - you think stands out.
Thank you!
-
RE: How does my blog help in SEO
Relevant anchor text back to your site is a good thing, as you want people to be able to make the connection and get the SEO benefit. Yes, blogs are presented as something "you need," but keep in mind there's a lot of hype with this stuff. How you do it is more important than just doing it - as Alan mentioned, you need a blog people will want to link to for it to really benefit.
-
RE: Duplicate Content Question
Keep in mind that many, many niche sites have this same problem. I mean, there's only so many ways to write compelling content on a very technical subject, right? Images, studies, videos, articles, quotes, news releases - all sources for a way to freshen up what you are "borrowing" from the larger organization's site.
-
RE: Why do rankings show differentley when checked from different computers
Use the SEOMoz rank tracking tool (http://ranktracker.seomoz.org/) for objective - non personalized or influenced - ranking results.
-
RE: Pages not indexed by Google
Well, there are a lot of ways to look at this, but that change wouldn't result in more pages indexed - the two issues are totally separate.
If the goal is to get more pages indexed, then a sitemap (either XML or even a text list) uploaded to your server for Google to find can help. At the least, it makes sure Google is finding and indexing the pages you want found. Your Google Webmaster Tools account (assuming you have one) will also give you some data.
For example, we used to have 100K+ pages; many weren't quality content I wanted to rank - like a PDF of a catalog ranking above the product page. So I reduced the number of pages indexed so Google would have better, higher-quality content to serve to searchers.
Using Xenu or Screaming Frog is another good way to help uncover pages. Those tools crawl your site like Google would; then you can download the file and not only see all the URLs found, but also whether they return a 301/404/200, etc. And Screaming Frog can crawl your site and output an XML sitemap for you (it's an easier way to make one).
I prefer SF, and at about $150 US for a license it's well worth it.
As for why - well, if you have a lot of pages, Google doesn't always find them. That's where a site map can help (it directs Google what to crawl). Otherwise, there could be technical issues to a bunch of pages and they aren't properly linked up or something and that could be causing the issue.
-
RE: 404 Page/Content Duplicates & its "Warning"
Duplicate content is always a problem if the wrong page is being served up for a search term (i.e., a 404 page vs. an active page). It's bad for bounce rate and conversions, and search engines eventually drop 404 pages from their index.
So, as far as SEO effectiveness goes, if people aren't getting served the content you want them to, yes, it's a problem.
-
RE: Pages not indexed by Google
'Nofollow' isn't the same as a 'noindex' directive. Nofollow just tells the search engine it "should not influence the link target's ranking in the search engine's index." 'Noindex' is where you tell the crawler not to index the page; you can remove it if at some future point you want the page indexed.
So, in theory, what you did wouldn't have anything to do with how many pages are indexed on your site anyway.
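To illustrate the difference with hypothetical snippets:

```html
<!-- nofollow lives on a link: "don't pass ranking credit to this target" -->
<a href="http://www.example.com/some-page/" rel="nofollow">a link</a>

<!-- noindex lives in the page's <head>: "keep this whole page out of the index" -->
<meta name="robots" content="noindex">
```

The first affects only that one link's target; the second affects the page it sits on.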
-
RE: Redirecting broken incoming links
This is a big, big problem for me - lots of partial links that need redirects. Since redirects pass on some (though not quite all) link juice, it's worth doing manually if you have the time - they all add up.
-
RE: Press Release and Duplicate Content
I have an issue with some of our press releases ranking above the product page or for targeted page for a certain keyword because of the inbound links and such a press release can gather. There can be some risk of dup content, depending on how the PR is written in comparison to the pages it relates to. However, I have found dup to be less of a problem and keyword cannibalization to be a bigger one.
I have to go back in and update anchor text, links, etc. in the press releases to push some of the "juice" through to the product page we want to rank for those terms - definitely messy to undo and work with.
-
RE: We were ranked no8, and now we are no2 but...
That's always a possibility and one of the drawbacks of personalization for SEO. However, if third-party checkers show you going up (from #8 to #2), then there's also a better chance of good visibility via personalization, too.
Higher ranking only means visibility - people are more likely to click because they can find you more easily.
-
RE: We were ranked no8, and now we are no2 but...
It could also have to do with Google determining that your page is a high-quality one, so it now gets more visibility. Or the number of inbound links has increased, which also influences rank. Personalization could be at play, depending on how you discovered the ranking change, so if you want to confirm, check it with SEOMoz's Rank Tracker to get a non-personalized result.
-
RE: Results Google in different browsers
I agree with IringW. Personalization is the most common factor for fluctuation like that. You can also use SEOMoz's Rank Tracker as a third-party tool to give an accurate result. I've never had this problem solely based on browser used to check.
-
RE: Could this URL issue be affecting our rankings?
URLs - what a headache! We have a terrible URL structure because of the way we have to pull data, so this is something I've checked into, too. Now, there are lots of differing opinions on this, but I'll share what someone from Google said last week at SMX West: they just want you to know about bad links; they don't penalize you for them.
I'm not saying that's the end-all-be-all answer, but she knows there's a perception that bad links can 'ding' you, when the reality (according to her) is that Google drops 404 pages from its index because it doesn't want to serve up bad pages. With lots of bad pages you have less linking ability and fewer pages able to rank, so you can lose online visibility. There's a difference between losing visibility because your overall content offering is reduced by bad links, and those pages never having existed in the first place.
There's a good chance there's something else going on - one of the things I adore about this forum is that people here have crazy skills, and I've watched them uncover an issue the original poster didn't even know they had.
-
RE: 403, 301, 302, 404 errors & possible google penalty
I agree - try Screaming Frog. I use it - and prefer it to Xenu - and it's free for up to 500 URLs, or £99 for a license with no limit. Even with a slower computer it can run in the background (leave it on overnight).
-
RE: 403, 301, 302, 404 errors & possible google penalty
I can't quite follow - output from what?
Are you asking someone else to run Xenu/Screaming Frog on this URL (http://www.nlpca.com/DCweb/Interesting_NLP_Sites.html, where the #null anchor occurs multiple times) and give you feedback on what to fix (beyond the obvious that any 400- and 500-level errors should be tackled first)?