Questions created by Jen_Floyd
How much do I have to differentiate syndicated content, exactly?
We have about 15-20 articles we'll repurpose on a partner domain (think: media outlet). To avoid duplicate content suspicion, how much, exactly, do we need to differentiate the content on the second domain? Yes, this assumes we can't obtain a canonical for whatever reason. I've found some good advice here, but am looking for some quantification. Something like: "A sentence/paragraph of introduction at the top of the piece, plus a link back to the original at the end of said introduction, ought to do it." Any help is appreciated. Thanks! Tim
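Since the whole question hinges on not having a canonical, one quick sanity check is whether the partner page declares a cross-domain canonical at all. A rough Python sketch (the partner URL is a placeholder, not a real page):

```python
# Rough sketch: check whether a syndicated page declares a <link rel="canonical">.
# The URL below is a placeholder standing in for a real partner article.
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    """Return the href of the page's <link rel="canonical">, or None if absent."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None

print(get_canonical("https://partner.example.com/syndicated-article"))
```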
Content Development | Jen_Floyd
When will true multi-user log-in be available in Moz Pro? How about an archive?
Our team is growing and we are running into each other's research on the tool! Will our users ever be able to use their own instance of Moz? Also, how about a historical archiving function for research? It would be nice to record past research as similar requests come to us in the future. Wish-list stuff - thx.
Product Support | Jen_Floyd
JavaScript Issue? Google not indexing a microsite
We have a microsite that was created on our domain but is not linked to from anywhere except within some JavaScript elements on pages on our site. The link is in one jQuery slide panel. The microsite is not being indexed at all - when I do site:(microsite name) on Google, it doesn't return anything. I think it's because the link only exists in a JavaScript element, but my client assures me that if I submit it to Google for crawling the problem will be solved. Maybe so, but my point is that if you just create a simple HTML link from at least one of our site pages, it will get indexed no problem. The microsite has been up for months and it's still not being indexed - another, newer microsite that's been up for a few weeks and has simple links to it from our pages is indexing fine. I have submitted the URL for crawling, but had to use the google.com/webmasters/tools/submit-url/ method as I don't have access to the top-level domain WMT account. P.S. When we put the microsite URL into the SEOBook spider-test tool, it returns lots of lovely information - but that just tells me the page is findable, that it does exist, right? That doesn't mean Google's going to necessarily index it, as I am surmising... Moz hasn't found it in the 5 months the microsite has been up and running. What's going on here?
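One way to test the theory is to fetch the referring page's raw, un-rendered HTML and see whether the microsite URL appears in any plain <a href> - if it only exists inside script, a crawler that doesn't execute JavaScript would never see it. A rough Python sketch, with both URLs as placeholders:

```python
# Rough sketch: does the microsite URL appear as a plain <a href> in the
# raw (un-rendered) HTML of the referring page? Links that exist only after
# JavaScript runs will not show up in this source.
# Both URLs are placeholders.
import requests
from bs4 import BeautifulSoup

referring_page = "https://www.example.com/page-with-jquery-panel"
microsite_url = "https://microsite.example.com/"

html = requests.get(referring_page, timeout=10).text
hrefs = [a.get("href", "") for a in BeautifulSoup(html, "html.parser").find_all("a")]

if any(microsite_url in href for href in hrefs):
    print("Found a plain HTML link to the microsite.")
else:
    print("No plain HTML link found; the reference likely only exists in script.")
```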
Intermediate & Advanced SEO | Jen_Floyd
Do you say Browser Title or Page Title?
I have seen much more use of "Page Title" of late - in fact, hardly any of "Browser Title", except by some folks who are using a very old CMS. We need to go with one at my workplace to avoid confusion. My vote is Page Title. Thoughts? Thanks, Tim
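Whichever label wins, both terms refer to the same thing: the <title> element in the document head, which browsers show in the tab and search engines typically use for the result headline. A trivial Python sketch that pulls it out (placeholder URL):

```python
# Trivial sketch: "page title" and "browser title" both mean the <title>
# element in the document <head>. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://www.example.com/", timeout=10).text, "html.parser")
print(soup.title.string if soup.title else "No <title> element found")
```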
On-Page Optimization | Jen_Floyd
Any SEO value in gTLD redirect?
So, my client is thinking of purchasing several new-gTLD domains whose second-level keywords are important to us. Stuff like this - we don't want .popsicles itself, just domains with the second-level keyword, which cost anywhere from $20-30 right now: grape.popsicles, cherry.popsicles, rocket.popsicles, companyname.popsicles. The thinking is that it's best to be defensive: don't let a competitor get the gTLD domain with our name in it (agreed), and don't let them capitalize on a keyword-rich gTLD (hmm). They wonder if that would help that gTLD page rank well - and sort of work in lieu of AdWords for pages that are not ranking well. I don't think this will work. A redirected page shouldn't rank better than the page it redirects to... unless Google gave it points for an exact match in the URL. Do you think they will - does Google grade any part of a URL that redirects? Viewing this video from Matt Cutts, I surmise that a gTLD domain would be ranked like any other page - if its content, inbound links, etc. support a high DA, well, OK then, you get graded like every other domain. In the case of a redirect, the page would not be indexed as a standalone, so that is a moot point, right? So, any competitor buying one of these gTLD domains with hopes of ranking well against us would have to build up PageRank in that new domain... and for our purposes I see that being hugely difficult for anyone - even us. Still, a defensive purchase of some of these might not be a bad idea, since it's a fairly low-cost investment. Other thoughts?
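If the defensive purchases do go ahead, it's worth verifying that each domain issues a permanent (301) redirect to the intended page rather than a temporary one. A minimal Python sketch, using the hypothetical cherry.popsicles from above as a stand-in for a real purchase:

```python
# Minimal sketch: confirm a defensively purchased domain 301-redirects to the
# intended landing page. "cherry.popsicles" is the hypothetical domain from
# the question, standing in for a real purchase.
import requests

resp = requests.get("http://cherry.popsicles/", allow_redirects=False, timeout=10)
print(resp.status_code, resp.headers.get("Location"))
# Expect a 301 with the Location header pointing at the relevant product page;
# a 302 here would mean the redirect is being treated as temporary.
```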
Intermediate & Advanced SEO | Jen_Floyd
Purchase second-level gTLDs?
So, I've been asked if it makes SEO sense for our company to grab a bunch of second-level domains on a new gTLD (which we were earlier calling gTLD subdomains, incorrectly) so that we can capitalize on redirecting them to our relevant pages that might not be ranking as well (if Google treats them like EMDs). For instance, buy something analogous to red.shoes, blue.shoes, purple.shoes, and so on, and then redirect them to our relevant pages for that product. Someone owns the .shoes gTLD but is happy to sell us second-level domains like red.shoes for $20-30. The question is, if we scoop up 100 or so of these relevant to our product, will it matter? I guess it depends on how Google is going to treat them. Anyone know?
Intermediate & Advanced SEO | Jen_Floyd
Need a keyword tool for the whole company
I alone cannot do keyword research on all of the online content that my company produces. There are about 10 publishing support teams who could do this research themselves, once trained, but I don't know which tool to suggest they use, since they can't all use my Moz login. Right now, those who do any research at all are using Google Trends. Wrong answer, but of course they are used to Trends for their social campaigns. Has your company dealt with this situation? I've looked at a few free keyword tools... each seems to have its pluses and minuses. What would you recommend - either as a free tool or some other workaround?
Keyword Research | Jen_Floyd
Instead of a 301, my client uses a 302 to a custom 404
I've found about 900 instances of decommissioned pages being redirected via 302 to a custom 404 page, even when there's a comparable page elsewhere on the site or on a new subdomain. My recommendation would be to always do a 301 from the legacy page to the new page, but since there are so many instances of this 302->404 pattern, it seems to be standard operating procedure for the dev team. Given that at least one of these pages has links coming from 48 root domains, wouldn't it obviously be much better to 301 redirect it to pass along that equity? I don't get why the developers are doing this, and I have to build a strong case about what they're losing with this 302->404 protocol. I'd love to hear your thoughts on why the dev team has settled on this solution, in addition to what suffers as a result. I think I know, but would love some more expert input.
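To make the case concrete, a small audit script can print each legacy URL's redirect chain, so the 302-to-404 hops stand out next to a clean 301-to-200. A rough Python sketch (the URL list is a placeholder for the ~900 decommissioned pages):

```python
# Rough sketch: report the redirect chain for each legacy URL, so
# "302 -> 404" patterns stand out next to proper "301 -> 200" redirects.
# The list below is a placeholder for the real set of decommissioned pages.
import requests

legacy_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in legacy_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [r.status_code for r in resp.history] + [resp.status_code]
    print(url, "->", " -> ".join(str(code) for code in chain))
    # e.g. "302 -> 404" drops the equity on the floor; "301 -> 200" passes it on.
```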
Technical SEO | Jen_Floyd
Scribd embed links - bad idea?
My client's site in question has a ton of outstanding, constantly updated, highly detailed articles. The site owner also has a branded collection of nearly all of them on Scribd. I guess I can live with that, because duplicate content isn't an issue and the PDFs there link back to the site and to another domain of ours. Plus, it gets a lot of eyeballs on our newish brand and content, and we can run reports on users. But we also have Scribd social share buttons on each article on our site that (among other things) allow a user to grab a direct link to the content on Scribd or an embed link for their blog or whatever. So, two questions, really. Foremost, shouldn't we get rid of that embed option on our pages? I mean, isn't it stealing from our backlink potential? I can't imagine juice would somehow pass back to us through a Scribd-hosted doc or embed, but I haven't found info affirming or contradicting that. And secondly, isn't a Scribd collection a bit analogous to posting videos on YouTube and hoping your page will ultimately benefit from it via clickthroughs, etc.? At this year's MozCon I heard a strong argument against that. Thanks!
Branding | Jen_Floyd