Weird. KW Difficulty should be functioning as of yesterday afternoon. Can you try again and if you encounter an issue, send it to help@seomoz.org with your account info?
Best posts made by randfish
-
RE: SEOMoz Software
-
RE: DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
Always happy to help, and especially to provide transparency. Thanks for the kind response.
-
RE: Infinite Pagerank Dilemma
The math doesn't work out here either, because each page is only assigned an extremely tiny amount of PR initially, and the damping factor will make it such that you're only passing some lesser quantity to the linked page. That, combined with the fact that PageRank's iterations end after a certain number of runs (at one point, I think it was ~10 iterations), and you'll see that looping PR like you've shown doesn't work out.
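To make this concrete, here's a toy simulation (my own illustrative sketch, not Google's actual algorithm or constants) of two pages that link only to each other:

```python
# Toy PageRank sketch: two pages, A and B, linking only to each other.
# DAMPING and ITERATIONS are illustrative values, not Google's real ones.
DAMPING = 0.85
ITERATIONS = 10  # early PageRank reportedly stopped after a fixed number of runs

# Each page starts with a tiny share of total PR (1/N for N pages).
pr = {"A": 0.5, "B": 0.5}

for _ in range(ITERATIONS):
    # A page passes only DAMPING times its PR to the page it links to;
    # the remaining (1 - DAMPING) is spread evenly across all pages.
    new_a = (1 - DAMPING) / 2 + DAMPING * pr["B"]
    new_b = (1 - DAMPING) / 2 + DAMPING * pr["A"]
    pr = {"A": new_a, "B": new_b}

print(pr)  # stays at 0.5 each - the loop never accumulates extra PageRank
```

Because every hop is damped and the iterations are capped, a link loop only redistributes the same fixed budget of PageRank rather than amplifying it.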
-
RE: Building my Portfolio of 70 MicroNiche Sites (10-15 Pages of Content)
Hi Sebastian - unfortunately, I'm not sure there's much help I can give you on your question (I'll explain why below), but I did want to pop in to say that I'm worried about the goals you're attempting to achieve and how they conflict with what Google and its users are seeking from websites.
The build + flip model of creating an exact match domain (which have been trending down in Google's results and no longer provide much benefit), combined with hosting many sites and simply publishing articles without regard for quality, a broader mission, or the end-user's goals and experience, is dangerous. Years ago, those types of systems worked, and today they can still work occasionally and temporarily, but given the amount of effort required, I'm certain you could find other ways to accomplish far greater financial returns and give the web something truly excellent - something Google wants to rank and people want to visit, rather than a gallery of article-heavy built-to-flip sites.
The last point I'll add is that the links you might find to sites like these are not going to be high-quality, editorially given endorsements from trusted sources. They're far more likely to be the kind of links that get you into trouble. And, generally speaking, the Moz Q+A community doesn't focus on those and may not have great recommendations for how to acquire them.
That said, we welcome all kinds, and I wish you luck whatever you choose.
-
RE: SEOMoz Software
Hi Kyle - yes, we had to change to individually fetching the top 10, then 11-20, then 21-30, etc. This could have changed ranking positions in the 11+ range more significantly, though we haven't seen widespread reports of this. I would note that results past 20 are usually far more volatile than those on the first page.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Hi Gary - yes! That's what the thread I posted above includes - details on what the differences between this new index and tool are vs. the old one. The bullet points basically cover it.
-
RE: Should we re-use our old re-directed domain for a new wesbite ?
That sounds like a poor decision in my opinion and experience. An "old" domain isn't worth anything more than a "new" domain - age is not a factor in a website's rankings or any other marketing-related element. It is the case that many old websites outperform many new ones by virtue of having built content, links, shares, mentions, branding, and marketing of all kinds for years. But if you compared two websites with the same metrics on those fronts, 6 months vs. 6 years is meaningless.
If it's the case that the old website is more brandable, or the client wants to use it because she/he prefers the name and wants to operate the business under that moniker, then it's understandable. But, if the client believes that age alone makes the domain more valuable/worthwhile/easier-to-rank, I'd challenge that assumption.
Best of luck!
-
RE: Is there any data available on how deeplinking contribute to improve domain authority?
In order to truly test this at scale, you'd need a very large number of sites and a huge swath of testable, organic linking sites - probably not feasible.
However, I can tell you that IMO, it doesn't matter much. If the links are organic and editorial, Google probably doesn't care whether they point to deep pages or the homepage. Some sites get a lot of press, which means they get links to their homepage. Other sites have lots of links to their content, which means they get deep links. I honestly doubt Google cares much which it is - what matters more is the quality and editorial nature of the linking sources.
-
RE: Does SEOMOZ planing to restructure OSE metrics?
Hi Jason - we do have some continual improvement plans for OSE and for our metrics. Obviously, the Just Discovered links feature was one of the most recent and valuable ones. We also do refactor and recalculate Page Authority/Domain Authority on a regular basis to make them better correlated with changes to Google.
On the longer-term horizon, we're planning a webspam metric, and we may be using more data from social and fresh web to improve PA/DA.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks! Look forward to your thoughts on how this new version works for you/your business.
-
RE: Old URLs Appearing in SERPs
Hi Rosemary - can you share some examples of the URLs and the queries that bring them up in search results? If so, we can likely do a diagnosis of what might be going on with Google and why the pages aren't correctly showing the redirected-to URLs.
-
RE: Is there a good website builder that can gain links?
Hi Scott - it depends. You can use Google Search Console's preferred version (https://support.google.com/webmasters/answer/44231?hl=en) to help them choose between www vs. non, but if there are other parameters or versions of the page, you really want a canonical tag or 301.
Given the limitations it sounds like GoDaddy is giving you around this stuff, I'd probably suggest moving to a different CMS/host. Better safe than sorry later.
-
RE: Open Site Explorer missing links
Hi Jonathan - thanks for the feedback. We're constantly working to tweak and improve Linkscape's index, and in this latest round we tried to balance out some of what we'd seen last index with deep pages on large sites crowding out domain diversity and a wide breadth.
Of course, our goal is to have both over time, but we've got a lot of work to do to get to Google's scale. We suspect they keep between 100-150 billion pages in their main index, and we've got 51 billion in our latest update, so we know there's missing stuff.
On the other hand, Linkscape and OSE should still be very useful for comparative analysis against competitors or other sites in your field, and for discovering high value link targets and important sites/pages.
If there's specific stuff you think we're missing, please don't hesitate to send to help at seomoz dot org.
Thanks!
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks dMa - very glad to hear it. If there are things you think are missing or that can be improved upon, please let us know. Still in beta, obviously, and there are a number of new features and data improvements coming, but we're hoping to regain the lead in the link tool space (I'm with you, it's been way too long).
-
RE: Old URLs Appearing in SERPs
In my experience, the best way to absolutely get rid of them is to use the 410 permanently gone status code, then resubmit them for indexation (possibly via an XML sitemap submission, and you can also use Google's crawl testing tool in Search Console to double-check). That said, even with 410, Google can take their time.
The other option is to recreate the pages so they return a 200 status and use the meta robots noindex tag on each page to specifically exclude them. The temporary block in Google Search Console can work, too, but it's temporary, and I can't say whether it will actually affect how long the redirected pages appear in the index via the site: command.
All that said, if the pages only show via a site: command, there's almost no chance anyone will see them.
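As a rough way to triage old URLs, here's a sketch mapping a URL's current status code to the tactics above (the function and the mapping are my own summary of this thread, not any official tooling):

```python
# Map an old URL's current status code to the de-indexation tactic
# discussed above. Illustrative only.
def deindex_action(status_code: int) -> str:
    if status_code == 410:
        return "gone - strongest removal signal; resubmit via an XML sitemap"
    if status_code in (301, 302):
        return "redirected - Google may keep showing the old URL for a while"
    if status_code == 200:
        return "live - add a meta robots noindex tag to exclude it"
    return "unexpected - check the response manually"

print(deindex_action(410))
```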
-
RE: Tips for attributing specific rises in rank to increases in traffic
I have to agree with the general sentiment of both Travis & EGOL. While it is intellectually interesting to discover the answers to questions like the ones you've asked, it serves almost no practical purpose. I've thought about it often, wondering whether we could get better at predicting how movement in SERPs affects traffic and conversions, but the real answer is that the variability is too high to make accurate predictions even with a lot of data (you'd need data across literally millions of SERPs to have a true model, and even then, it would likely require machine learning + inputs we don't have access to).
The simple models are "good enough" - i.e. if rankings go up, traffic related to those keywords will, too. Keywords that drive particularly valuable traffic, where you currently rank poorly, and whose SERPs are less competitive should be the focus of marketing activities.
Someday, I hope to build software that can get better at this (and we're working on some of the more simplistic models with ranges today at Moz), but it's likely not a great use of time to try and get insanely detailed on this.
-
RE: Spam Score
Hi Kingalan - first off, I'd recommend checking out http://moz.com/blog/understanding-and-applying-mozs-spam-score-metric-whiteboard-friday which will give you a pretty good overview of what Spam Score is and how it works.
I wouldn't worry about firing two flags - Moz triggers a few, and many good sites do as well. If they're things you want to fix anyway, go for it, but Spam Score flags aren't about saying "this is necessarily bad" or "this definitely needs fixing." It's merely identifying features that, when added together, show correlations with sites we saw Google penalize/ban.
As far as your links go - that distribution seems fine to me, too. If you want, you could look at the highest flag count links and if you believe they're problematic after manually reviewing, go ahead and give them the boot (via disavow or getting the site to remove them). The flag count is merely to help you order your manual review - it should never replace the process of actually looking at those links and determining which should be kept/removed.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks Adam!
Re: 1) There is a toggle in the panel above the link lists to filter out common syndication and shortening URLs. Maybe try checking that? If it's not catching everything, let us know what (here or via a ticket to help@moz.com) and we can try adding those to the filtration.
Re: 2) Yeah - we have the "linking domains" view in the tool but adding a "one link per domain" or "up to X link per domain" in the links view is certainly something we can consider, too.
-
RE: Seriously? where is my first question?! is it deleted?
Hi Chowi - this is a thorny issue, and unfortunately, the answer to your follow-up regarding redirecting the bots is "maybe."
What I mean is - maybe the engines will be fine with it because the content matches exactly, but there are cases we've observed where they've been upset or penalized the pages/site. This blog post (http://www.seomoz.org/blog/white-hat-cloaking-it-exists-its-permitted-its-useful) does a good job of showing cases where this type of cloaking happens and Google is fine with it, but as you can see from Matt Cutts' response in the comments, they really don't like most sites to do this.
If I were in your shoes, I'd look for a way to make that initial page as search-friendly as possible, but if you're fairly sure that the engines will see your content precisely the same and are willing to adopt some risk from that perspective, you could do the redirect. As I said, many big sites do get away with this behavior.
-
RE: How to handle Friendly URLs together with internal filters search?
Hi JoaoCJ - In cases like these, I don't usually sweat the URL length too much. It's OK to go over a bit - our recommendations come from correlation analysis and testing. Observing Google's rankings, pages with fewer parameters (ideally zero) tend to outperform pages with more, and shorter URLs tend to outperform longer ones. That said, it's not a hard and fast rule, more a sloping line.
As far as the filters go, I might consider using rel=canonical unless you're sure you want those pages separately indexed. If that's the case (you DO want them indexed), perhaps consider using static URLs - even something like a number in the URL could work, e.g. /123/. For the pagination, Google's also got the rel=prev/next tags that I'd suggest employing.
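If you want to audit your filter URLs by parameter count, a quick sketch using Python's standard library (the URLs below are made-up examples):

```python
from urllib.parse import urlparse, parse_qsl

def param_count(url: str) -> int:
    """Count the query-string parameters in a URL."""
    return len(parse_qsl(urlparse(url).query))

# Hypothetical examples: a static filter page vs. a parameterized one.
urls = [
    "https://example.com/shoes/123/",
    "https://example.com/shoes?color=red&size=9&sort=price",
]
for u in urls:
    # Fewer parameters has tended to correlate with better performance.
    print(param_count(u), u)
```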
Wish you all the best!
-
RE: Can I add another user to my SEOMoz Pro campaign?
Tragically and frustratingly, it's been pushed back again. Right now, it's supposed to be delivered by the end of September, but I'm honestly not putting a lot of faith in the promises I hear after so many delays. Apparently, this project is massively harder, more complex, and more time-consuming than anyone at Moz imagined or predicted.
You have my sincere apologies and my genuine frustration alongside yours. I'm very disappointed we couldn't deliver this by now.
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
Thanks Jack! I've been pushing the team to get a pie chart of anchor text (there's a lot of folks at Moz who think pie charts are evil and bad UI, but I like them), so I appreciate the +1 for that feature.
-
RE: How to trace back a link from a 404 error?
In the PRO Web App, the CSV export of your crawl includes the list of sources for all 404s/500s/etc. that can help. You'll also find these in Google Webmaster Tools for your site.
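If you'd rather script it, here's a minimal sketch of pulling the 404 sources out of a crawl CSV export (the column names are assumptions - check the headers in your actual export):

```python
import csv
import io

# Stand-in for the exported crawl CSV; real exports will differ.
sample_csv = io.StringIO(
    "URL,Status Code,Referrer\n"
    "https://example.com/gone,404,https://example.com/blog/post-1\n"
    "https://example.com/,200,\n"
)

# Collect (broken URL, page that links to it) pairs for every 404.
broken = [
    (row["URL"], row["Referrer"])
    for row in csv.DictReader(sample_csv)
    if row["Status Code"] == "404"
]
print(broken)
```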
-
RE: How to handle Friendly URLs together with internal filters search?
Not too big a problem to have a slightly longer title. Just be aware that how titles display in SERPs can affect CTR, which can affect rankings. You can use https://moz.com/blog/new-title-tag-guidelines-preview-tool to get a good view of that.
-
RE: Spam Score shows No Contact Info even though I have a Contact Page
Hi mztobias - I think we just got that flat out wrong. Not sure why our crawler missed your contact page, but clearly it did. Hopefully in the next index, that will be rectified. I don't have the ability to manually edit the score/notation, but once we recrawl the site and update our index, it should be fixed.
Sorry about that!
-
RE: The New Link Explorer (which will replace Open Site Explorer) is Now in Beta
I don't think there's much value there. We haven't seen the c-block metrics correlate any better than linking domains of late (not surprising given that c-blocks aren't as popular a way to hide link networks anymore), so we removed them from the new product.
-
RE: SEO for One Page Websites
Suppose that's not wholly unreasonable. Just make sure to have some good links (that are natural and non-spammy-looking) pointing back to the main domain.
-
RE: Comparing New vs. Old Keyword Difficulty Scores
I have to agree with Russ. I think the old KW Difficulty model was making that keyword look stronger than it is, and the new one, while maybe a bit low, is more accurate. I'd also suggest, as Pete did, that using any keyword in isolation is unwise. Compare scores for similar terms you might target in the same sector, against the same sorts of competitors, and use that relative data -- KW Difficulty, and indeed PA, DA, and keyword volume, are all far more useful as comparative metrics than absolute ones.
-
RE: Getting started with Social media promotion
It really depends on your goals from here. Are you looking to boost traffic from social sites? If so, you're going to want to learn what your audience wants and start creating it, sharing it and participating in conversations on these platforms to earn their trust and engagement.
If your goal is to provide customer service and reputation tracking/management, you can set up alerts through services like Google Alerts (www.google.com/alerts) and many other services (this Quora thread is quite helpful on tools that can assist on that front: http://www.quora.com/Are-there-any-free-tools-available-to-track-Twitter-mentions-and-ideally-sentiment)
If you're seeking to boost SEO rankings using social media, you'll need to build up an engaged contingent of friends/followers who will share/like/tweet your content.
Some good resources include:
- http://www.seomoz.org/blog/social-media-marketing-facebook-twitter-arent-enough
- http://www.seomoz.org/blog/comparing-seo-social-media-as-marketing-channels
- http://www.seomoz.org/blog/a-visual-tour-through-the-basics-of-social-media-marketing
Best of luck!
-
RE: What makes a "perfectly optimized page" in 2013?
Hi James - I've been meaning to write an updated version of that post. I've got an email in my inbox with the task on my to-do list and will do my best to get to it soon. Sorry for the delay!
In the meantime, the comments above are very kind, but also accurate. It's still a pretty solid guide to on-page optimization.
-
RE: Rand's blogging graphics
Donnie's spot on. A decade ago, I used to build Flash websites and became familiar and fast with the software. Thus, I still use it to create diagrams, flowcharts, wireframes and most of the graphics you see in my posts. It's certainly not professional quality, but it's quick and effective for simple blog illustrations.
That said, I probably wouldn't recommend Flash as an actual graphics tool to anyone. Most designers I know cringe at the thought of using an animation program to make simple 2D diagrams.
-
RE: Purchased an expiring domain, Now the pagerank has gone.
Umit - I just wanted to chime in and note that Ryan's correct on the guidelines and focus of the community here at Moz.
Your question is OK, but it does push on the boundaries of our community. While we'd love to have you as a member and you've clearly got some great ideas and experience to share, we don't generally support or try to provide advice/help to those attempting to manipulate search engines with black/gray hat tactics.
-
RE: Is there anyway for redirected links to still provide SEO value?
Hi Spencer - I think there's some awkward phrasing combined with the challenge of parsing the true meaning/intent of your question on this one. I'll do my best to answer what I think you're asking.
A shortened link, by default, does not lose its ability to pass link juice, PageRank, trust metrics, anchor text signals or anything else an engine might associate with a link. If it did, all these years, our TinyURL links (which existed long before any social stuff) and all those 301 redirects (which are essentially how shortened URLs function) would have failed. Clearly, they didn't, nor do bit.ly, j.mp, t.co, etc. type links today.
If you're asking if, by placing a shortened URL on a normal webpage and linking to it, the target of the 301 redirect loses out compared to a direct link, the answer is no. If you're asking whether nofollowed links in Twitter tweets or profiles that contain shortened URLs (or that exist elsewhere in the social web and may not be followed or even crawlable by engines) lose value, the answer is "it depends," but also "probably."
All that said, at one point in time, a Google representative did note that 301 redirects and rel=canonical tags do lose a small amount of the PageRank they pass to another page compared to a non-redirect/canonical. We're of the strong opinion this is between 1-10% of the PageRank value, though we also suspect that other link signals, many of which are often more important than PageRank nowadays, are unaffected. This is my opinion only, and we can't know for sure whether Google still puts this slight dampening on redirects/canonicals.
Hope that helps!
-
RE: Rand's presentation from the AMA webinar on March 27th, 2012
Hi Miranda - I'm not sure whether the AMA made a recording of the webinar available. I checked out their site and couldn't find it, so it's possible they may not be making it public (or didn't record). You could try emailing the folks over there - alibb@ama.org was my contact.
Best of luck!
-
RE: Hit hard by Panda 3.3 and Penguin. What to do?
I think this is exactly what Google hoped would happen with the Penguin update - SEOs and marketers who invested in gray/black hat links would have such an utterly horrific time trying to dig out that it would scare a broad swath of the industry into more white hat territory. Whether that's actually working is arguable, but it was certainly a goal of the update.
If you are ready to make the move over to the old domain, I wouldn't stop you. However, if you've built up some valuable brand equity, visitor loyalty and marketing prowess outside of SEO on this site, there's a few other possibilities:
- Work hard on UX and UI. Google hates penalizing beautiful sites that visitors love, and if you do get a manual review, this can help.
- Make the content truly exceptional, too. Ensure that there's nothing that feels like artificial/manipulative/done-just-for-rankings stuff on the site. Again, this makes it more likely that any reconsideration request will work
- Send out as many requests for link removal as possible and include the lists of where/how you acquired links and how you've tried to remove them in your reconsideration request
- Hope and pray
This process might not get you back in, but it could work. Google's requiring a "good faith" effort and some proof of said effort, but there's a possibility your site might get by. For the future, I'd strongly recommend sticking to entirely editorially given/earned links.
Wish you luck!
-
RE: SEO w/o Social Webinair experiment - how can RTs of a URL of a google search possibly affect the position of one of the search results?
Hi Chris - totally understand your question. The key is that the brand name is included in the search query and the test (in that particular case) was less about using the social networks for rankings, but to see if search volume itself and CTR could influence rankings (which it appeared to do).
You can read more about the experiment on search volume and CTR here: http://www.seomoz.org/blog/experiment-google-rankings-w-search-volume
And more on the experiment to influence rankings with Google+ and Twitter here: http://www.seomoz.org/blog/do-tweets-still-effect-rankings
Those should help clarify.
Cheers!
Rand
-
RE: Google Authorship and the "Fishkin" Outburst! Sorry Rand ;)
Great discussion here already, and I agree with what's been posted - content marketing and content strategy continue to be incredibly valuable for SEO and for many other marketing channels. The shift away from author pics is no reason to change course.
On a sidenote, I thought Ammon Johns' reply to my tweet was a very smart one: https://twitter.com/Ammon_Johns/status/486854967165480960 I should have considered that before sending my tweet (though I do wish Google would just be transparent about this stuff - it would help us to build a lot more trust and less suspicion of them).
-
RE: Hit hard by Panda 3.3 and Penguin. What to do?
I've not seen penalties transfer via the 301 very often (in fact, I've only heard stories of it but never seen it confirmed with a public example). I'd probably do the 301 - as you said, it's not a great experience otherwise for visitors who bookmarked or get referred to the old domain.
If you're really nervous, you could create a message that shows up on the site and refers visitors to the new location, but that's a lot of extra work, and requires that extra click, which isn't great for UX.
I suppose if you're sure Google is going to pass the penalty along, you could use the 301 but block the old site with robots.txt so Google can't crawl the redirects. That way, Google wouldn't actually see the site being moved over, and the redirect would serve purely for UX rather than SEO.
-
RE: Does a link in facebook count as a backlink?
A) Generally yes, but hard to say for certain. There are correlations suggesting that maybe some of them do pass influence in some way.
B) Probably, but again, likely based on engagement, i.e. if a social link gets very little engagement, it probably won't do much to influence search rankings, but if it gets a lot, it often seems to have at least some positive impact.
-
RE: How to improve PA of Shortened URLs
Either we haven't crawled any links that point to #2, or the links we've seen to it don't pass any link equity (e.g. they're nofollowed or on pages with meta robots=nofollow, etc).
-
RE: How to improve PA of Shortened URLs
Hi Monu - shortened URLs generally aren't going to accrue much PA (or much link equity), because many (most) folks who link to them won't link to the shortener but to the URL it resolves to.
I'd also say that there's almost no circumstance I can imagine where it's actually useful or desirable to have a high PA score (or high ranking ability) for the shortened URL. You want the URL that actually resolves -- the one Google will show in its listings -- to get all the links. Shorteners could go out of business or stop redirecting properly or change from 301s to something else to track clicks, and then you'd lose that link equity to the final target. Thus, always better to have it go to the resolved URL.