Questions created by seo_plus
-
Hotfrog and BOTW Analysis
Hi, just curious whether people see a benefit from BOTW / HotFrog as offered here in the Moz Local product. I was always under the impression that anything offered here would be "of quality"; however, I just did a bit of background checking and both of these seem to mark their links nofollow, and I'm guessing some of them have been that way for a long time. Are there other benefits that accrue? BOTW is not cheap, and while HotFrog is cheap, I feel a need to be somewhat wary of these options - but I SO want to trust the Moz brand! Moz is pretty well known for excellent analysis - so... has anyone done any recent analysis on this? Is everything going away?? Please tell me it ain't so! Thanks!!
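For anyone who wants to repeat that background check, here is a rough sketch (the listing URL and target domain are placeholders, and it assumes the standard requests and BeautifulSoup libraries) of how one could confirm whether a directory page marks its outbound links nofollow:

```python
# Rough sketch: check whether a directory listing marks outbound links rel="nofollow".
# The listing URL and target domain below are placeholders, not real profile pages.
import requests
from bs4 import BeautifulSoup

def outbound_link_rels(listing_url, target_domain):
    """Return (href, rel) for every link on listing_url that points at target_domain."""
    html = requests.get(listing_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    results = []
    for a in soup.find_all("a", href=True):
        if target_domain in a["href"]:
            results.append((a["href"], a.get("rel", [])))
    return results

for href, rel in outbound_link_rels("https://www.example-directory.com/listing/acme", "acme.com"):
    print(href, "nofollow" if "nofollow" in rel else "followed")
```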
Local Website Optimization | | seo_plus1 -
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657 - by the way, I'd love to get Rand or Eric or other Mozzers aka TAGFEE'ers to weigh in here if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages - all hand coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!) and implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60 - 90 days, assuming 500 products and 100 categories, that yields at least 50,000 pages - and with other aspects of the faceted search, it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering it, and we've discovered thousands of similar URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation - we can see it both on our DEV site and out in the wild (in Google's supplemental index). (There's a quick sketch below for sizing this up from the DEV crawl export.)
5 - Since there is no "duplicate content penalty" and there never was - are there other risks caused by infinite page generation? Like burning up a theoretical "crawl budget," having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might dilute link juice as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mitigated by internal linking, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
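Regarding point 4, a minimal sketch (assuming the DEV crawl has been exported from Screaming Frog as a CSV with an "Address" column - adjust the file and column names to whatever your export actually uses) that groups crawled URLs by their path with query strings stripped, so you can see how many faceted variants collapse onto the same underlying page:

```python
# Minimal sketch: estimate how many faceted/parameterized variants map onto one base URL.
# Assumes a Screaming Frog CSV export with an "Address" column; adjust names as needed.
import csv
from collections import Counter
from urllib.parse import urlsplit

def base_url(url):
    """Strip query string and fragment so faceted variants group together."""
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}{parts.path}"

variants = Counter()
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        variants[base_url(row["Address"])] += 1

# Pages with the most duplicate-looking variants are the canonical/noindex priorities.
for url, count in variants.most_common(20):
    if count > 1:
        print(f"{count:6d} variants -> {url}")
```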
Algorithm Updates | | seo_plus0 -
Moz Crawler failing with HTTPS redirect?
Is there a way to get the Moz Crawl Test to work with HTTPS? I just got back this error: 902: Network errors prevented crawler from contacting server for page. The site is set up with a standard 301 to redirect HTTP to HTTPS - or at least I certainly hope it is! Rex Swain's HTTP Header Checker tool shows a standard 301. Anyone else experiencing this error? BTW - this is both a specific question and an opportunity for open discussion... Thanks!
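For what it's worth, a tiny sketch (Python with the requests library; the domain is a placeholder) of double-checking the redirect the same way a header checker does, so the server side can be ruled out before blaming the crawler:

```python
# Tiny sketch: confirm the http -> https redirect returns a clean single-hop 301.
# "example.com" is a placeholder; swap in the real domain.
import requests

resp = requests.get("http://example.com/", allow_redirects=True, timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", resp.status_code, resp.url)

# A single 301 hop from http to https with a 200 at the end is what you want;
# chains, 302s, or loops are worth fixing before re-running the crawl test.
```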
Feature Requests | | seo_plus0 -
Will massive "sculpting" make a difference?
I'm working with a very popular blog that is also associated with related products we manufacture and sell ourselves. The blog is about 99% blog content and about 1% product content. If suddenly some 99% of a 5,000-page blog is changed to have the blog pages noindexed, will the link juice now be more concentrated on the remaining 1%? Also, be aware that this blog has lots of high-quality backlinks from everyday recognizable magazines, newspapers and blogs. Of course, this is a highly competitive marketplace, so I'm trying to leave no stone unturned here in working out the kinks. In the old days we sort of called this "PageRank sculpting," and the idea was to focus the link juice on certain pages and defocus it on other pages. It made sense to block certain pages that were not indexable, but Google supposedly dinged that tactic years ago, and today people say this approach also helps because it conserves the crawl budget. Might this make a difference these days?? Keep in mind that the 4,950 remaining pages are still followed, and all backlinks remain in place. Will the site start ranking better for the keywords on the 50 indexed (product) pages?
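If it helps frame the discussion, here is a rough sketch (the URLs are placeholders for your own sitemap or crawl list; assumes requests and BeautifulSoup) of auditing which pages actually carry a noindex directive after the change, so the "99% noindexed, links still followed" setup is verifiable rather than assumed:

```python
# Rough sketch: report each page's robots meta directives (e.g. "noindex, follow").
# The URLs below are placeholders for your own sitemap or crawl list.
import requests
from bs4 import BeautifulSoup

def robots_directives(url):
    """Return the content of the robots meta tag, or '' if the page has none."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "robots"})
    return (tag.get("content") or "").lower() if tag else ""

pages = ["https://example.com/blog/post-1", "https://example.com/products/widget"]
for url in pages:
    directives = robots_directives(url)
    print(url, "->", directives or "indexable (no robots meta)")
```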
Link Building | | seo_plus1 -
Backlink Redirection as Backlink Building Strategy?
Just checking in - I'm working on a site with tons of broken backlinks from high-authority sites. For instance, I've discovered that some 90% of their backlinks are broken, and these are from highly recognizable, name-brand magazines, newspapers, blogs and the like. Right now, the site has a Domain Authority of 48 (better than most in the industry, from what I am learning), yet as the site has been around for years and has gone through 5 redesigns, there is an absolute ton of solid inbound backlinks that are getting 404s. Using Screaming Frog (list mode), I've also learned there are a ton of 301s that turn out to be redirecting to 404 pages, so that also starts to add up. I always knew this was a problem / opportunity and I've always considered it a high priority to fix (301) broken links of this sort to improve ranking (you know, using htaccess or WordPress redirection tools) - and to avoid multiple redirects wherever possible. In fact, I consider it a basic all-win, no-lose strategy. I always assumed this was the case, and I also assume it will continue to be so. However, as a professional, I always want to double-check my assumptions every now and then... Is this still considered a solid strategy? Are there any issues one should look out for?
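In case it's useful for anyone tackling the same cleanup, a minimal sketch (Python requests; the URL list is a stand-in for the landing URLs exported from your backlink tool) that flags targets which 404 outright or 301 their way into a 404, along with the hop count so multi-redirect chains stand out:

```python
# Minimal sketch: audit backlink target URLs for 404s and redirect chains that dead-end.
# "targets" is a stand-in for the landing URLs exported from your backlink tool.
import requests

targets = [
    "https://example.com/old-press-page",
    "https://example.com/moved-product",
]

for url in targets:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        hops = len(resp.history)
        if resp.status_code >= 400:
            print(f"BROKEN  {url} -> {resp.status_code} after {hops} redirect(s)")
        elif hops > 1:
            print(f"CHAIN   {url} -> {resp.url} via {hops} redirects")
        else:
            print(f"OK      {url} -> {resp.status_code}")
    except requests.RequestException as exc:
        print(f"ERROR   {url} -> {exc}")
```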
Technical SEO | | seo_plus0 -
GWMT / Search Analytics vs. Open Site Explorer
Just had the experience of using OSE data to show what we call "linkrot" to a client - only to find that GWMT / Search Analytics shows no such thing. Fortunately the client is an old friend and no face was lost, but it was dicey there for a bit, as I have come to rely on and reference OSE again and again and again. OSE showed Domain Authority dropping by about 1/3 in the last 12 months, presumably due to old links getting broken, linking sites changing their architecture, etc. And of course, ranking is tanking, as you would expect. But Google shows many more (and much more spammy-looking!) backlinks. Has anyone had any experience benchmarking the two data sets of backlinks against each other? Dr. Pete? Does one update more frequently than the other? Do you trust one more than the other?? If so, why?? Thanks!
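One way to benchmark the two is simply to diff the exports. Below is a rough sketch (the file names and column names are assumptions based on typical exports - adjust them to whatever yours actually contain) that compares linking domains between a Search Console links export and an OSE inbound-links export:

```python
# Rough sketch: compare linking domains reported by GWMT / Search Analytics vs. OSE.
# File names and column names are assumptions; adjust to match your actual exports.
import csv
from urllib.parse import urlsplit

def normalize(netloc):
    netloc = netloc.lower()
    return netloc[4:] if netloc.startswith("www.") else netloc

def linking_domains(path, url_column):
    domains = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            value = row[url_column].strip()
            if not value.startswith(("http://", "https://")):
                value = "http://" + value
            domains.add(normalize(urlsplit(value).netloc))
    return domains

gwmt = linking_domains("gwmt_links.csv", "Domain")
ose = linking_domains("ose_inbound_links.csv", "URL")

print("Only in GWMT:", len(gwmt - ose))
print("Only in OSE: ", len(ose - gwmt))
print("In both:     ", len(gwmt & ose))
```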
Moz Pro | | seo_plus0 -
Free Media Site / High Traffic / Low Engagement / Strategies and Questions
Hi, imagine a site "mediapalooza dot com" where the only thing you do there is view free media. Yet Google Analytics is showing the average view of a media page is about a minute, whereas the average length of the media is 20 - 90 minutes. And imagine that most of this media is "classic" and generally not available elsewhere. Note also that the site ranks terribly in Google, despite decent Domain Authority (high 30's), Page Authority in the mid 40's, a great site, and an otherwise quite active international user base with page views in the tens of thousands per month. Is it possible that GA is not tracking engagement (time on site) correctly? Even accounting for the imperfect way GA measures time on page (it needs a subsequent hit, such as the next pageview, to close out the timer, so the last page of a session records nothing), our stats are truly abysmal - in the tenths of a percentage point of the time we think the pages are actually being used. If so, will getting engagement tracking to more accurately measure time on specific pages and on the site signal Google that this site is actually more important than its current ranking indicates? There's lots of discussion about "dwell time" as it relates to ranking, and I'm postulating that if we can show Google that we have extremely good engagement instead of the super-low stats we are reporting now, we might get a boost in ranking. Am I crazy? Has anyone got any data that proves or disproves this theory? As I write this out, I detect many issues - let's have a discussion on what else might be happening here. We already know that low engagement = low ranking. Will fixing GA to show true engagement have any noticeable impact on ranking? Can't wait to see what the MOZZERS think of this!
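On the measurement side, the usual remedy is a "heartbeat" event fired every so often while the media is actually playing, so GA has subsequent hits to measure time against. In practice that would fire from the player's JavaScript on an interval, but here is a rough sketch of the Universal Analytics Measurement Protocol hit such a heartbeat boils down to (the tracking ID, client ID, and media label are placeholders):

```python
# Rough sketch: the Measurement Protocol "heartbeat" event a media player could send
# every N seconds of playback, so GA has interaction hits to measure time against.
# The tracking ID, client ID, and media label are placeholders.
import requests

def send_heartbeat(tracking_id, client_id, media_title, seconds_watched):
    payload = {
        "v": "1",               # Measurement Protocol version
        "tid": tracking_id,     # UA-XXXXXXXX-Y property ID
        "cid": client_id,       # anonymous client ID
        "t": "event",           # hit type
        "ec": "media",          # event category
        "ea": "heartbeat",      # event action
        "el": media_title,      # event label
        "ev": str(seconds_watched),
    }
    requests.post("https://www.google-analytics.com/collect", data=payload, timeout=5)

send_heartbeat("UA-00000000-1", "555", "classic-documentary-01", 30)
```

With heartbeats in place, session duration and time on page start reflecting actual viewing time instead of defaulting to near zero on exit pages.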
Reporting & Analytics | | seo_plus0 -
Keyword Difficulty Tool
Is there a way to use the Keyword Difficulty Tool and include my own URL in the process, so that I can see (and show my client) how things look competitively across all these nice dimensions? All is well if my client's site is in the top 10 - but if it isn't, how can I get the same set of metrics for a specific URL as it pertains to a specific keyword? Do I somehow remember it used to do this? Or am I imagining things? I can't seem to get it to work this way. Thanks,
Moz Pro | | seo_plus0 -
OSE Advanced for specific directory only?
Please, how do I set up Open Site Explorer Advanced to look for backlinks to files in ONLY a certain directory? So if the domain is 1234.com, I don't care about links to 1234.com - OSE can find those very neatly! What I do care about is ONLY links to 1234.com/profiles, and I can't seem to figure out a way to do this. The /profiles directory has thousands of profiles in it - and we think we have hundreds of thousands of backlinks - so ideally, I'd like to use regex or DOS-like filtering to look at only those which start with 9 or 8 or whatever.
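If the tool itself won't narrow it down, one workaround is exporting the inbound-links CSV and filtering the target column yourself. A quick sketch (the file name and "Target URL" / "Source URL" column names are assumptions - match them to your actual export) using exactly the kind of regex described:

```python
# Quick sketch: filter an exported backlink CSV down to links whose target is under /profiles/,
# keeping only profile slugs that start with 8 or 9.
# The file name and column names are assumptions; match them to your actual export.
import csv
import re

pattern = re.compile(r"^https?://(www\.)?1234\.com/profiles/[89]", re.IGNORECASE)

with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        target = row["Target URL"].strip()
        if pattern.match(target):
            print(row.get("Source URL", ""), "->", target)
```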
Moz Pro | | seo_plus0