Posts made by BeanstalkIM
-
RE: Moving to new site. Should I take old blog posts with me?
If it's not an urgent issue ... install analytics now and collect data for a month or two. You'll also want to set up Search Console. Basically, if the pages have incoming links or traffic you'll obviously want to move them, but if not and they're low quality they should probably be left behind.
-
RE: Duplication in Meta Titles
Something in the title you sent triggered a thought and after checking I realized you're dealing with a .co.uk domain. I have found the .co.uk Google to be far more tolerant of heavy keyword use and even link spam so you're probably in a battle with folks who are indeed keyword stuffing or worse and finding yourself having to do the same just to keep up.
It's a bit of a slippery slope but I'll admit that even some recent work I did in the UK required a slightly more heavy-handed approach to SEO than I'd typically take. So while I wouldn't recommend it in the US, the title you're suggesting will probably work well in the UK.
Cheers!
Dave
-
RE: Where do I place my blog
Already answered repeatedly but piping in just to lend even more support to the consensus.
-
RE: Duplication in Meta Titles
Good question. I'm not sure exactly how I'd write it as it would depend on how your products are arranged. If you have pages for each of the different cartridges so you can target terms such as LC980BK independently then I'd probably go with something like:
Buy Brother DCP-197C Ink Cartridges from Domain.com
Of course the structure would differ by printer type as this model doesn't have any searches for phrases including "inkjet", so I'd skip including that. I also like the word "Buy" because if I'm looking up cartridges that's what I want to do.
Obviously testing is key though.
Hope that helps and best of luck ...
Dave
-
RE: Best Plugins for Rich Snippets
Hey Brian.
Obviously each scenario is a bit different but in general you'll simply want to enter as much information as you have for each field. In the case of reviews it's pretty easy.
Schema Type - Review
Name - Name of the product being reviewed
Website - URL of the product being reviewed
Description - A description of the product being reviewed
Item Name - What you're calling it (i.e. Review of XYZ)
Item Review - A summary of the review
Rating - Your rating
Author - You
Published Date - The date of the review
Hope that helps!
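If it helps to see where those fields end up, here's a minimal sketch of the kind of schema.org Review markup a plugin like that produces (shown as JSON-LD for readability - a given plugin may output microdata instead, and every value below is a placeholder):
<code>
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "Product",
    "name": "XYZ Widget",
    "url": "http://www.example.com/xyz-widget/",
    "description": "A description of the product being reviewed"
  },
  "name": "Review of XYZ",
  "reviewBody": "A summary of the review",
  "reviewRating": { "@type": "Rating", "ratingValue": "4" },
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2014-09-15"
}
</script>
</code>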
-
RE: Blank Cart Pages Showing as Duplicate, HELP
I agree with Kashif. Another option would be to block the cart in the robots.txt, or canonical all cart pages back to the cart root (/cart/, not your homepage) if the root of the cart is an actual page you want crawled.
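For the robots route, a minimal sketch (assuming the cart lives under /cart/ - adjust to match your setup). Keep in mind this blocks the cart root as well, so only use it if you don't need /cart/ itself crawled:
<code>
User-agent: *
Disallow: /cart/
</code>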
-
RE: Duplication in Meta Titles
It's always important to remember that every scenario is a bit different and that the title tag rules outlined by Moz are meant to be guidelines as opposed to gospel. There are times when the title may have to extend past the recommended character count though it's also important to remember that Google uses pixels as opposed to characters so some titles can have more characters than others.
Dr. Pete (from Moz) created a great tool at https://moz.com/blog/new-title-tag-guidelines-preview-tool that you can use to test titles. Whenever I hit one that needs to extend past the visible cutoff I like to use the tool to make sure that at least my marketing message is presented properly.
I'd personally focus on clicks and that might be something you want to dig in to. While your rankings went down, it may be worth checking whether your clickthroughs for a query to that page went up as a percentage of what would be expected given your position. If they did, then you'll want to look at ways to boost your rankings with the given titles so when you do recover, you recover with a higher clickthrough rate. If not, then you may want to run some limited tests on some products to see what happens if you extend the titles back to what they were. I mention this to ensure there wasn't a coincidence issue occurring, where your rankings dropped due to an update that was poorly timed with the title changes. It would be wise to look through your analytics, find the time of the drop and compare that with the algo change history page (also kept up-to-date by Dr. Pete ... busy guy) to help safeguard against reacting to the wrong issue.
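To put rough numbers on that (purely illustrative): if pages in your position for similar queries typically see something like a 5% clickthrough and yours is pulling 8% since the title change, the new titles are out-earning their slot and are probably worth protecting while you work the rankings back up.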
I personally don't have a big issue with some limited keyword duplication provided it still reads right. I don't like your competitor's title but then ... I don't have to - only the searcher does.
-
RE: Amount of Internal Links?
The unofficial best practice is 100 as the upper limit but in the end I'd try to think of it more from the context of how the weight passes between pages and how you want the engines and users to prioritize your content.
I actually wrote an article on this exact subject some time ago on Search Engine Watch where I show the math and how site structures pass weight differently. That's probably a better answer than a flat number without reason, so if you're interested you can see how the weight flows with diagrams at http://searchenginewatch.com/sew/how-to/2179376/internal-linking-promote-keyword-clusters.
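To give a taste of the math (the classic simplified PageRank view, ignoring damping and other factors): a page divides its distributable weight across its followed links, so a page with 100 links passes roughly 1/100th of that weight through each one, while trimming to 50 links roughly doubles what each remaining link carries. That's why structure matters more than any flat number.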
Hope that helps and let me know if you have any questions.
-
RE: To consolidate or not to consolidate?
Great great question.
From my experience you'll be looking at a roughly 2 to 8 week hit in rankings. I know that's a wide margin, but it depends on more than just the single redirection of each city domain's homepage to its city page on the global domain; it also includes the content and of course the internal links. You'll want to ensure that you move any content that's attracting traffic, social signals or links, and also redirect the pages with links or traffic to the appropriate page on the new domain.
Past that it's really just a matter of waiting and trying not to chew your nails off.
I wish you luck, that's a hard call to make but I think you're making the right one.
Regards.
Dave
-
RE: Http canonical .htaccess
A 200 is perfect (it just means "OK"). I double-checked and your canonicals are pointing to the http version, which is also what you want if you're not doing a 301. Some tools will still show a duplication even with the canonicals but rest assured ... Google knows what's what.
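For anyone else reading along, the tag in question looks like this on each page (example.com being a placeholder):
<code>
<link rel="canonical" href="http://www.example.com/page/" />
</code>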
Best of luck to you as you proceed forward and glad it worked out.
Regards.
Dave
-
RE: Http canonical .htaccess
When I visit https://www.kimwist.com/ it's spinning into a loop and doesn't display the site. You can check it at http://www.redirect-checker.org/ and see what I'm talking about. It looks like it's 301ing to itself. I'm wondering if you've set up a redirection to HTTPS that also fires when the URL is already HTTPS, in which case you'd want to switch it to point to HTTP.
-
RE: Best Plugins for Rich Snippets
I haven't tried All-In-One (though I now plan to) but I have found success with Schema Creator by Raven.
Totally agree with Daniel on Breadcrumb NavXT.
-
RE: Http canonical .htaccess
Sorry about that code. Ordinarily I'd have had a chance to test it but didn't have a test site ready for that type of testing.
Thinking about it further, an actual redirect won't work due to the challenge/response on the certificate not being installed. If you load the site in a new browser you'll see what I'm talking about.
If the site were mine (or a client's of course) I'd recommend getting a certificate set up and redirecting the whole thing to HTTPS now. If you don't want to do that, the certificate will at least let the HTTPS site load directly (i.e. without the insecure warning), which will allow it to be redirected properly.
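For reference, once the certificate is in place, the htaccess redirect to HTTPS would look something like this (same caveat as before - I haven't been able to test this on your setup, so test it on yours):
<code>
RewriteEngine On
# Anything arriving over plain HTTP gets sent to the HTTPS version with a 301
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</code>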
That said ... you do have the canonical tags on the pages properly, so a duplicate content warning will be a false positive: the content is indeed duplicated, but Google knows that the HTTPS is the primary source as far as I can tell. Your site is going into a redirect loop on HTTPS so I can't double check, but it appears to be set up correctly from the way it's displaying in the source. Happy to double check after the loop is corrected.
Dave
-
RE: Http canonical .htaccess
You wouldn't be setting up the canonicals in the htaccess file, so that may be where you're hitting a roadblock. If you saw references to the htaccess file, that was likely in reference to setting up redirects from https to http to eliminate the error. What tech is your site built on? With that I'll probably be able to point you in the right direction.
That said, if you do just want to set up the redirection via htaccess you should be able to use the following code:
<code>
RewriteEngine On
# If the request came in over HTTPS, send it to the HTTP version with a 301
RewriteCond %{HTTPS} on
RewriteRule (.*) http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</code>
I need to note that I haven't had a chance to test it but it does look right. I'm far more familiar with the code to go the other way. The initial source of the code (credit where it's due) is at http://stackoverflow.com/questions/8371/how-do-you-redirect-https-to-http.
Hope that helps.
Dave
-
RE: Site hacked in Jan. Redeveloped new site. Still not ranking. Should we change domain?
Hi galwaygirl.
I'm going to be PMing you my thoughts as I've taken a bit of a peek at your backlink profile and my advice is specific to your scenario; it might be taken out of context and applied incorrectly if anyone else read it.
Dave
-
RE: Site hacked in Jan. Redeveloped new site. Still not ranking. Should we change domain?
Which manual action notice did you get? I'm assuming from what you noted that it's the user-generated spam notice but thought it better to check before replying. The only other one I can guess might apply was the unnatural links penalty. If it was the unnatural link penalty then you'll have to wait for the next Penguin update but if you can confirm the penalty I'm happy to look into it further.
Regards.
Dave
-
RE: 301 redirect file question
Hi Alyssa, Dave here.
I know you may not want to give out your URL publicly, but if you can send me your URL as a private message I'm happy to look into it specifically as I'd hate to give the wrong answer. I'll answer here in this thread so it can be useful to others, but I'll keep the URL to myself.
Dave
-
RE: 301 redirect file question
I don't know the specific scenario you're talking about obviously but I'd recommend first checking under:
system > configuration > web > url options
Magento shouldn't be generating a 302 on every page, but I'll be the first to say that Magento also isn't my strong suit. I don't like the sound of it however, as a 302 can bleed out your PageRank.
From what I'm hearing, I'd recommend this for the 301s you're setting up now: point them to the final target. Even if Magento was doing a 301 I'd still recommend this. Think of it like you're asking me for directions and I know both the final destination and another guy who does too. Would you rather I just told you where to go or pointed at the other guy?
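In htaccess terms that just means writing each rule against the final URL (the paths and domain here are made up for illustration):
<code>
# Good: one hop, straight to the final destination
Redirect 301 /old-page/ http://www.example.com/final-page/
# Avoid: pointing at a URL that itself 301s somewhere else
</code>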
That said, hopefully someone can come up with a more specific Magento answer for you.
-
RE: Does relative CTR affect organic SEO efforts?
You're bringing up a subject that's much disputed in the SEO community, as I'm sure you well know. I note this so you'll take my answer as educated opinion rather than fact. Before we get into the exact question you're asking, it's worth noting a recent hangout in which John Mueller answered a different but related question. In the hangout he was asked whether Google uses onsite user actions as a factor. His answer was "no", and he noted that he doesn't believe they know what people are doing in the context of filling out a form or making a purchase and that it's not used as a ranking factor. Conversion and goal tracking aside, I choose to believe him, as factoring this in would force Google to compare apples and oranges, with sites that use Google Analytics sending different signals than those that do not. The reason I mention this is that while he answered that question (and I believe he did so honestly), there's a bigger point in there that ties to your question. So let's get to that ...
Does Google Use Clickthrough Rates?
To answer this we have to answer two things ...
1 - can clickthrough rates provide a positive signal of a site's likelihood of addressing the searcher's need, and
2 - can this create false positives?
The answer to both of these questions is "yes". If a site appears in the search results and is clicked by a user, that is a signal that the site likely matches the user's need. If I searched for "blue widgets" and a site with a title like "Exclusively Red Widgets | OnlyRedWidgets.com" appeared in the results, then it would likely have a low clickthrough rate and that can be used as a signal. The signal shouldn't count as an overall quality-of-site signal, just a signal based on that specific query, as the site might be a great supplier of red widgets. Now, this leads us to the second question: can it give false positives?
Let's imagine that red widget site used the title "Red & Blue Widgets Galore" but still only sold red widgets. This is where some function to address the second question would be necessary, and that is tracking the user's site behavior. Since John has said they don't do that via Analytics or, by extension, Chrome etc. (and again - I believe him), we have to look to Google themselves. We can see in the SERP URLs themselves that Google is tracking which sites get clicks. Past that, they also know (just like you or I do) when that user is back at Google. So while they may not be tracking the user's behavior on a specific site for the SERPs, it's certainly possible (dare I say "probable") that they are tracking the time from the click to the site to the searcher's next appearance at Google. Whether said searcher has remained on your site or simply read a blog post there and followed a link to a different article is irrelevant; they have found what they wanted.
What I see is four core scenarios:
1 - the user clicks a link in the SERPs and returns quickly to Google and selects another site under the same query. This would be a negative relevancy signal.
2 - the user clicks a link in the SERPs and after a good deal of time returns to Google and selects another site under the same query. This would indicate a positive experience where the user was simply seeking additional information or options.
3 - the user clicks a link in the SERPs and after either a short or long period of time returns to Google and adjusts their query to a related but different one. This would indicate the user needs to refine their search to find what they want and send neither a positive nor negative signal.
4 - the user clicks a link in the SERPs and after either a short or long period of time returns to Google and adjusts their query to a completely unrelated one. This would indicate the user found what they were looking for and has moved on to another task, sending a positive signal.
So to your question, I believe the answer is a conditional "yes". Clickthroughs can send a positive signal to Google, however that requires that the user found what they wanted in order to boost the relevancy for that phrase.
The real perk here is this: whether you believe this explanation or not (and again - this is opinion), the actions you need to take are the same. Regardless of whether clickthroughs or even onsite time are a ranking signal, the purpose of your site is to attract clicks and satisfy users. One has to love those types of scenarios.
-
RE: I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2. Is this something I should just ignore, or should I create the author/admin/page2 and then 301 redirect?
I'd take a slightly different approach to solving your issue, though blocking the pages will work. My only concern with doing that is that any weight to those pages will evaporate from your site as opposed to being passed back internally. You won't find the pages in Wordpress as they're auto-generated, and I'm guessing there's only one author, which is why the author archive would be the same as the general archive.
Assuming you're using Yoast, you can remedy the issue by simply going to the Titles & Metas area, selecting the "Archives" tab and checking the box next to "Add noindex, follow to the author archives". This will allow the PageRank to pass but the page won't be indexed as duplicate content. There are other types of pages in the same area you can do the same thing for.
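If I remember right, all that checkbox does under the hood is drop a standard robots meta tag into the head of the author archive pages, something like:
<code>
<meta name="robots" content="noindex,follow" />
</code>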
As an aside, you should change your username. From the example you've given I'm assuming you've left the user as "admin". Since that's the default for Wordpress, it makes it easier for attackers to brute-force their way in as they already have the username. You can change it via phpMyAdmin, but if you're not comfortable in there you can simply create a different user with Admin privileges and delete the old "admin", making sure to attribute all posts and pages to the new user.
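If you go the phpMyAdmin route, the change itself is a one-liner (this assumes the default wp_ table prefix, and "newadminname" is whatever you're switching to):
<code>
-- Rename the default "admin" account (assumes the default wp_ table prefix)
UPDATE wp_users SET user_login = 'newadminname' WHERE user_login = 'admin';
</code>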
I shouldn't have to say this but just in case something goes wrong BE SURE TO BACK UP YOUR DATABASE!
-
RE: Domain Redirect and SSL Cert
I assumed I'd be able to deal with it easier too.
-
RE: Robots.txt
Via Search Console, try "Fetch as Google" and, assuming that works without errors, use the submit function. You'll know very quickly whether you've got technical issues, and the submit will get the page into the index fast.
-
RE: Domain Redirect and SSL Cert
If your current site is secure and you want to redirect to the new one then yes, you'll need to keep your SSL cert. If you don't you'll get a certificate error and the redirection will hit that snag and perhaps worse, the link weight will hit the same one. Someone linking to the non-https page will redirect but the https will generate the issue. Annoying I know ... now you get to pay for two certificates.
If anyone else knows a way around this that I haven't thought of I'd love to hear it, but I had to deal with this a few months back and the only solution was to set up the old domain with its own hosting and certificate and deal with the redirections at that level.
-
RE: Parked Domains
What I mean by one-to-one redirects is redirecting each page on one domain (the ones with incoming traffic or links) to the equivalent page on the other domain. They are standard redirects.
Ex: fullcompanyname.com/category/product1 to compname.com/cat/product1
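In the htaccess of the old domain that would look something like this, one line per page you're preserving (using the made-up paths from the example above):
<code>
Redirect 301 /category/product1 http://compname.com/cat/product1
Redirect 301 /category/product2 http://compname.com/cat/product2
</code>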
Hope that helps.
-
RE: Parked Domains
I have to agree with Oleg on going with one-to-one redirects of any pages with links or incoming traffic. Most hosting plans will let you have multiple domains, so set up the one you're "not using" and just one-to-one them in the htaccess file. Remember - it doesn't need to be every page on the site, just the pages with links or entry traffic.
Speaking of links, when you're choosing the domain to use as the primary I'd be sure to very strongly consider which one has the best link profile. A bit is lost in a 301 so better not to lose that.
-
RE: Is there anyway to find the Keyword from which a site getting the Traffic??
If they're all in a folder (like /blog/ or /articles/) then I'd just go with a site: query (ex - site:competitorsite.com/blog/), which will give you a list of all the pages in that folder Google knows about. If they have tags etc. you can also remove those with a negative (i.e. site:examplesite.com/blog/ -site:examplesite.com/blog/tags/).
You can also look for common text just on the posts ("written by" for example) and query:
"written by" site:examplesite.com
Cheers!
-
RE: Is there anyway to find the Keyword from which a site getting the Traffic??
If it's your site head over to Webmaster Tools. The query data will tell you where your clicks are coming from.
If it's a competitor I've found SpyFu to be the best tool for that one. Not perfect, but decent.
And what kind of posts are you talking about? Post on their site? Guest blogs?
-
RE: How long before I can use a redirected domain without taking back link juice?
I assume the old domain has a penalty and thus the concern with the redirection (not judging, just noting the premise of the answer). While in these events I am hesitant to connect the dots at all, going back to my affiliate marketing days (when I had a much more cowboyish approach to SEO) I would have done the following:
1 - Put up a one page splash page on the old site.
2 - Disallow the site in the robots.txt file
3 - Put the noindex,nofollow on the splash page
4 - Use a meta refresh on the splash page directed to the new site
5 - if it was a link issue, add a disavow file on both domains for the links to the old domain
The splash page should read something like, "This site has been moved to a new domain. If you are not redirected in x seconds (x being however long you've selected for the refresh) please click here." (where "click here" is a nofollowed link to your new domain).
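Pulled together, the splash page would look something like this (newdomain.com and the 10-second delay are placeholders):
<code>
<meta name="robots" content="noindex,nofollow">
<meta http-equiv="refresh" content="10;url=http://www.newdomain.com/">
<p>This site has been moved to a new domain. If you are not redirected
in 10 seconds please <a href="http://www.newdomain.com/" rel="nofollow">click here</a>.</p>
</code>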
This all said, anytime you link two domains there is always the chance that Google will connect the dots now or in the future, so there is an element of risk. You have clearly shown that you don't want weight passing, so that's a perk, but it all depends on risk tolerance.
I'll be interested to hear any additional thoughts or techniques. I haven't done anything like this in many many years.
Good luck!!!
-
RE: Help me decide between 2 domains! Please!
Well, this is an opinion question but I'd go with PlayHazel.com. Again though, this one is just opinion and I'm just thinking, "Which would I be more prone to click?"
-
RE: Methods for estimating competitor website traffic from natural search
SpyFu gives some pretty solid estimates. It's based on generic click percentages so it's not perfect by any means, but if you can't afford Hitwise (also with its limitations) it's a decent alternative.
-
RE: Is it stil a rule that Google will only index pages up to three tiers deep? Or has this changed?
Google prioritizes by the importance content plays on your site (i.e. how prominent it is in your navigation and hierarchy), but given time ... they crawl as much of your site as possible.
So the short answer is no from a crawling standpoint but from a ranking standpoint ... it's a serious consideration. Of course, if you link to all your pages just to push them up then it's a nightmare for visitors and you dilute the PageRank flow for the key pages so it's a balancing act.
-
RE: Question about New Client with Manual Actions / Partial Matches in GWT
Manual actions aren't update based, they happen on an ongoing basis when someone at Google looks at the site and decides it's in violation of their guidelines.
Good luck!
-
RE: Time lag between algorithm changes and results?
My pleasure and keep up the great work.
-
RE: Time lag between algorithm changes and results?
One can never write off an algo change of course (especially given that there were 3 Panda updates from late August through September), but if so it was probably a case of you having good content (congrats) and solid links from domains that held their value through the updates, and being rewarded during this period via the decline of others.
Or one could simply say, "We have good content and links from good sites so our rankings improved."
-
RE: Time lag between algorithm changes and results?
It seems to me that this is more just decent SEO getting rewarded over time. Knowing when your site jumped up would help though.
-
RE: Rel=publisher
Good point. I've taken the approach that if I use it well, it'll work well, and until it's replaced and no longer working - it'll continue to.
But as with all things SEO ... one always needs to pay attention to what's going on around you.
-
RE: Rel=publisher
I've always used it as a "this person is responsible for this content" signal, which helps clarify things. Yes, having an author deemed an authority helps of course, but on the other side, they'll never be an authority if they don't start somewhere, and your own blog is a good place to start.
Re: taking inspiration from the work of others, as long as that person or work is getting credit for it (i.e. stat sources noted, etc.) then there's nothing wrong with it. That author wrote that piece so they get the credit. The author of the stats (for example) will get the credit for their work on their own site (re: authorship).
This is of course opinion as content ownership has a whack of grey areas but if you know your sector, what you can and can't ethically do, and properly credit your sources in the post - you shouldn't have any problems.
-
RE: Rel=publisher
Authorship and publisher are different things. The author tag tells Google who wrote a piece; the publisher tag tells Google who owns the site, and it only needs to be put on the homepage (as far as I know ... anyone have any different feedback?)
You can read about the difference at http://www.websitemagazine.com/content/blogs/posts/archive/2013/02/05/the-difference-between-rel-author-amp-rel-publisher.aspx in more detail but in short ... they don't do the same thing and I don't think the publisher tag is going to do what you want it to.
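For reference, both are just simple markup; the publisher tag (on your homepage) points at the brand's Google+ page and the author markup points at the writer's profile (both Google+ URLs below are placeholders):
<code>
<!-- On the homepage: who owns the site -->
<link rel="publisher" href="https://plus.google.com/+YourBrandPage/" />
<!-- In a post: who wrote it -->
<a href="https://plus.google.com/+AuthorName/" rel="author">Author Name</a>
</code>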
Hope that helps!
-
RE: Increase in Not Found Errors
I'd have to see more detail to know what the 404 issue is, but let me jump in and say that a big problem with the way you've done it is 301ing every page on the domain to the homepage of the new site. This is saying, "Hey Google, we're not worried about sending the visitors to a page that contains information they want." I can't say for sure of course (darn Google and their tight lips) but I'd bet dollars to donuts that they're devaluing all your links, assuming that nothing is relevant since they can't infer relevancy from the way the site-wide redirects are being handled.
The 301'd domain also has a "less than ideal" domains-to-backlinks ratio, so that's likely stacking on top.
Can't answer the question here but thought I'd comment on what might be causing some of your problems.
-
RE: Footer link back to developers domain
There are definitely sites that are (to quote George Orwell) "more equal than others".
Wordpress would be hooped if those types of links were catalysts to a penalty ... at least in a level world. I don't think anyone could say with 100% certainty that it won't bite you in the butt, so nofollow is probably the route to go.
-
RE: Footer link back to developers domain
As an SEO I never do it, but that's more down to the question: do I want to say, "Hey Google, here's a site that's been SEO'd"? From the perspective of a web designer simply taking credit for their work, I've only seen it cause issues recently if the links are followed and anchor-heavy. From my experience, branded links seem to work fine, but mix it up.
That said, who knows what's coming. If you want to play it safe - nofollow is the way to go. But then, you probably knew that.
-
RE: Does a sub-domain benefit a domain...
Back in 2007 Matt Cutts announced at Pubcon that sub-domains and folders were to be treated the same. Of course, that didn't exactly happen as it was supposed to, but the principle is sound. If what you're worried about is how weight passes, then it's pretty much a level field and you're left to choose the one that makes the most sense for your backend. I personally tend to lean on sub-directories whenever possible for predictability and simplicity.
I will note that I have quite recently seen penalties against a sub-domain on a large-scale site affect the rankings on the domain as a whole so there's definitely crossover in domain authority and trust.
The quickest answer to your question is, yes the domain will benefit from the sub-domain. But it's probably still easier to use a folder.
-
RE: Duplicate content - big brand players
While the exact definition of "stronger" generally falls on pure authority and strength, it is skewing more and more toward perceived user intent. What I mean by this is that if Google infers the user is looking to purchase a product, they would be more prone to rank Amazon; however, if they believe the user is simply looking for product information, they would rank the Samsung site.
In a lot of generic queries it's difficult for Google to determine intent, at least early in the search cycle where they may not have a pre-defined idea of what you're trying to do. In that case they have to lean on pure strength. That said, as they're getting better and better at personalizing results, one could conclude (and I personally do) that if they see you searching phone models and generally clicking on online stores, they would infer that even on the next model-specific generic search you'd prefer a shopping experience over the Samsung site and its product-based information.
-
RE: Should I add PDF manuals to my product pages?
As an irrelevant aside - love that avatar David.
-
RE: Should I add PDF manuals to my product pages?
I agree with Mike here.
While technically the canonical might do what you want (kind of), this isn't what it's intended for. Another side to that coin: if you funnel the strength from the PDF to the product page but the product page doesn't have the content the PDF was ranking for, then you still won't get the rankings on the product page and, on top of that, you'll lose them on the PDFs.
-
RE: Duplicate content - big brand players
In a case like this (with major brand sites as those listed) you won't be dealing with a penalty insomuch as you'll be dealing with a case of content being ignored. For example, if a local shop has the same content as Amazon, Amazon will generally win.
I can't possibly answer the question better than Matt Cutts himself, so here he is:
http://www.youtube.com/watch?v=LgbOibxkEQw#t=77
You'll notice that the focus of what he's saying is on unique content and giving Google a reason to rank you. This is a case like the one noted in my comments about strong outranking weak; if a site is up against a strong site, Google is going to rank the one they believe is the most trustworthy, and that will generally be the biggest brand.
Now, I cannot for the life of me find it, but I know that Google has recently commented elsewhere that they may look to user intent. If a user or their query indicates a preference for buying online then they would rank Amazon, and if they tend to imply a local purchase then a local site would rank with exactly the same content. I know that's from recent comments from Google but I cannot remember where I saw it. Sorry - not a great source.
Either way however, the stronger site will win if the content all goes up at the same time, and the stronger site may even beat the original source if Google decides it's more reliable or fits a searcher's intent better.
-
RE: Duplicate content - big brand players
I used to do a lot of affiliate marketing but it's been a while so take this with a grain of salt though I do work with sites in a similar situation ...
The key here is original source. If you get picked up as the initial source of the content then you're OK; if you don't, then your resellers will outrank you and you'll be paying them for traffic they earned with your content. The same can be said if they are significantly stronger than you.
In your shoes I'd write 2 versions of the descriptions, one for you and one for resellers. Don't let them take yours; let them take the content that's meant for resellers. Then you'll always have original content. It'll be double the work of course, but that's better than writing copy for every site.
To be fair - you may want to mention that they should consider writing their own copy but if they don't want to they can take the copy you're offering to give away.