Questions created by TomNYC
Link cloaking in 2015. Is it a bad idea now?
Hi everyone, I run a travel-related website and work with various affiliate partners. We have thousands of pages of well-written, helpful content, and many of those pages link off to one of our affiliates for booking purposes.

Years ago I followed the prevailing wisdom and cloaked those links: I bounced them through a folder that is blocked in robots.txt and then redirected them on to the affiliate, essentially following Yoast's approach: https://yoast.com/cloak-affiliate-links/

These days, though, that setup feels spammy and manipulative. Doesn't Google warn against manipulating links and redirecting users? Could I simply add "nofollow" to these links and drop the whole redirect charade? Could cloaking actually work against us?

Thoughts? Thanks.
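For reference, here's a simplified sketch of the current setup (the framework, folder name, and partner URL are stand-ins for illustration, not our actual configuration):

```python
# A simplified sketch of the cloaked-redirect setup described above.
# The folder name, framework, and partner URL are illustrative only.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Affiliate destinations keyed by a short slug.
AFFILIATE_LINKS = {
    "booking-partner": "https://www.example-affiliate.com/?aid=12345",
}

# robots.txt contains:
#   User-agent: *
#   Disallow: /out/
# so crawlers are asked to stay out of the redirect folder entirely.

@app.route("/out/<slug>")
def affiliate_redirect(slug):
    """Bounce the visitor through the blocked folder, then on to the affiliate."""
    target = AFFILIATE_LINKS.get(slug)
    if target is None:
        abort(404)
    return redirect(target, code=302)

# The simpler alternative I'm considering would skip /out/ entirely and just link as:
#   <a href="https://www.example-affiliate.com/?aid=12345" rel="nofollow">Book now</a>
```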
Intermediate & Advanced SEO | TomNYC | 0
Are pop-unders bad for SEO?
Hi all, I run a travel site that specializes in hotel bookings. We're working with a third-party advertiser to launch a pop-under unit when someone searches for hotels on our site. (The unit is of the "also try your search on these competing sites" variety.)

I'm worried, however, that this might affect our SEO, especially in light of this page on Google's support site: https://support.google.com/webmasters/answer/2721313?hl=en

Would Google even see these pop-unders? Are pop-unders treated the same as pop-overs? And if so, would Google see them as unwanted and treat them as a nuisance? Could that lead to negative SEO consequences?

Any thoughts would be appreciated. Thanks! Tom
Intermediate & Advanced SEO | TomNYC | 0
Redirecting to a new domain... a second time
Hi all, I help run a website for a history-themed podcast, and we just moved it to its second domain in 7 years. Our SEO was very good up until last week, and I'm wondering if I botched the way I redirected the domains. Here's the sequence:

Originally the site was hosted at "first.com", where it acquired inbound links. We then moved hosting to Blogger and redirected the site to "second.blogspot.com" (so, 1 --> 2). It stayed there for about 7 years and got lots of traffic.

Two weeks ago we moved off Blogger and onto WordPress, so we 301 redirected everything to "third.com" (so, 1 --> 2 --> 3). The redirects worked, and when we Google individual posts, we now see them indexed at the new URLs.

My question is about the 1 --> 2 redirect. There are still lots of links pointing to "first.com", so last week I went into my GoDaddy settings and changed that first redirect so that first.com now points directly to third.com (giving us 1 --> 3 and 2 --> 3). Was that the right move? The drop in Google traffic this past week makes me think I may have broken something. Should we have kept the 1 --> 2 --> 3 chain instead?

Thanks for any insights on this! Tom
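In case it helps, here's the quick sanity check I've been running (the domains below are placeholders) to confirm how each old URL resolves now, hop by hop:

```python
# Follow each old URL and print every redirect hop, to confirm first.com now
# 301s straight to third.com in one step rather than chaining through Blogger.
import requests

OLD_URLS = [
    "http://first.example.com/some-old-post/",
    "http://second-example.blogspot.com/some-old-post.html",
]

for url in OLD_URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.url for r in response.history] + [response.url]
    codes = [r.status_code for r in response.history] + [response.status_code]
    print(url)
    for hop, code in zip(hops, codes):
        print(f"  {code}  {hop}")
```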
Intermediate & Advanced SEO | TomNYC | 1
Should we change the publish date in WordPress when updating a post?
Hi everyone, We're going through some of the old posts on our WordPress blog and updating them with new information, new links, and photos. My question: if we update a post significantly, should we also change the "published" date to today?

If we only fix a few typos or a dead link, we don't touch the date. But if we've done some real work on a post, we'd like to update the published date to bring it back to the top of our blog feed and draw new attention to it. I'm a little nervous, though, that Google could see this as spammy, since it's not technically a new post and the URL already exists in Google's index of our site.

Here's an example of a post that was published several years ago and then updated a few weeks ago with new information (and a new date stamp): http://www.eurocheapo.com/blog/barcelona-tip-five-cheap-eats-under-e6.html

Any thoughts on this? Thanks, Tom
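Part of what I'm weighing is that the publish date and the last-updated date are really two separate signals. Here's a rough sketch (the dates and headline are made up, and this is just one way a post could expose both) of structured data that leaves the original publish date alone and only bumps the modified date:

```python
# Sketch of schema.org Article data that keeps the original publish date but
# signals the revision via dateModified. All values are placeholders.
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Barcelona tip: Five cheap eats under 6 euros",
    "datePublished": "2010-05-12",   # original publish date, left untouched
    "dateModified": "2013-09-03",    # bumped whenever we substantially revise
}

# This would be rendered into the page head inside a
# <script type="application/ld+json"> block.
print(json.dumps(article_schema, indent=2))
```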
Technical SEO | TomNYC | 0
Do image "lightbox" photo gallery links on a page count as links and dilute PageRank?
Hi everyone, My site has about 1,000 hotel listing pages, each of which uses a lightbox photo gallery that displays 10-50 photos when you click on it. In the code, each photo is wrapped in an "a href", since the images rotate when you click on them.

Going through my Moz analytics, I see that Moz counts these photos as internal links (they point to an image on the site) and suggests I reduce the number of links on these pages. I also just watched Matt Cutts' new video, in which he says to disregard the old "100 links max on a page" rule, yet also notes that each link divides your PageRank.

Do you think this applies to links in an image gallery? We could switch to another viewer that doesn't use "a href" if this is really an issue. Is it worth the bother? Thanks.
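For context, here's the kind of quick audit I've been running (the URL is a placeholder, and the "anchor wraps an image" check is just a guess at our own markup) to see how much of the page's link count actually comes from the gallery:

```python
# Rough audit: how many <a href> tags are on a hotel page, and how many of
# them are gallery anchors wrapping an <img>?
import requests
from bs4 import BeautifulSoup

url = "http://www.example.com/paris/hotel/some-hotel.html"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

all_anchors = soup.find_all("a", href=True)
gallery_anchors = [a for a in all_anchors if a.find("img") is not None]

print(f"Total <a href> tags on page: {len(all_anchors)}")
print(f"Anchors wrapping gallery images: {len(gallery_anchors)}")
```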
Intermediate & Advanced SEO | TomNYC | 0
Canonical use when dynamically placing items on "all products" page
Hi all, We're trying to get our canonical situation straightened out. One section of our site has 100 product pages (in our case, a city with hotels we've reviewed), and we have a single "all products" page, "all.html," that lists them all.

Because 100 listings is a lot for a user to take in at once, we plan to initially show only 50 on "all.html." When the user scrolls to the bottom, we use AJAX to append another 50 (these are pulled from a separate page, "more.html," and inserted into "all.html"). So from the front end, as you scroll down, you see "all.html" with all 100 listings.

We also have other listings pages that are sorted and filtered subsets of this list with little or no unique content, so we want to add a canonical tag to those pages.

My questions: Should their canonical point to "all.html"? Would spiders get confused, since they only see half the listings on all.html? Is it risky to dynamically load content onto a page that's used as a canonical target? Or is this a non-issue?

Thanks, Tom
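Here's the sort of spot check I'd run once the canonicals are in place (the URLs below are placeholders for our sorted/filtered listing pages), just to verify that every subset page declares all.html as its canonical:

```python
# Verify that each filtered/sorted listing page points its rel="canonical"
# at the main all.html page. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

FILTERED_PAGES = [
    "http://www.example.com/paris/hotels/all.html?sort=price",
    "http://www.example.com/paris/hotels/all.html?filter=breakfast",
]
EXPECTED_CANONICAL = "http://www.example.com/paris/hotels/all.html"

for url in FILTERED_PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag["href"] if tag else None
    status = "OK" if canonical == EXPECTED_CANONICAL else "CHECK"
    print(f"{status}  {url} -> {canonical}")
```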
Intermediate & Advanced SEO | TomNYC | 0
Would you "nofollow" links from a column on HuffingtonPost?
Hi all, I've read a lot of posts about guest posting being dead, but what if you have a regular column on a well-regarded site? Should you stop? Nofollow the links?

We have a regular column on the Huffington Post, and each piece has historically included at least one link back to our site. Yes, early on (like last year) we used optimized anchor text in those links, and then eased off on that. Regardless, the links have always been relevant to the topic covered, and the topic is always in our niche (budget travel in Europe).

I saw Matt Cutts' recent video in which he recommends using the "nofollow" attribute on guest posts when linking to your own site, and he specifically mentions HuffPo. So I'm prepared to go back through my old posts and nofollow those links, but I wanted a sanity check from the fine folks at SEOmoz first. Would you go back and nofollow them?

Many thanks!
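For what it's worth, here's a rough sketch of how I'd retrofit the old columns if I go that route (the HTML below is a stand-in for an exported post body, not the real thing):

```python
# Add rel="nofollow" to any link in a post body that points at our own domain.
from bs4 import BeautifulSoup

OUR_DOMAIN = "eurocheapo.com"
post_html = '<p>For more tips, see <a href="http://www.eurocheapo.com/paris/">our Paris guide</a>.</p>'

soup = BeautifulSoup(post_html, "html.parser")
for a in soup.find_all("a", href=True):
    if OUR_DOMAIN in a["href"]:
        rels = a.get("rel", [])
        if "nofollow" not in rels:
            a["rel"] = rels + ["nofollow"]

print(soup)  # the link back to our site now carries rel="nofollow"
```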
Algorithm Updates | TomNYC | 0
Panda Recovery: Is a reconsideration request necessary?
Hi everyone, I run a 12-year-old travel site that primarily publishes hotel reviews and blog posts about ways to save when traveling in Europe. We have a domain authority of 65 and lots of high-quality links from major news sites (NYT, USA Today, NPR, etc.). For many years (roughly 10), we ranked well for competitive searches like "cheap hotels in Paris."

Things started falling two years ago (April 2011). At first I assumed it was normal algorithmic change and that our pages were simply being devalued (and perhaps they were), so we continued to bulk up our reviews and other key pages, only to watch things keep sliding. About a month ago I lined up our inbound search traffic from Google Analytics against SEOmoz's timeline of Google updates. It turns out that every time a Panda update rolled out (starting with the second one, in April 2011), our traffic tumbled. Other updates (Penguin, etc.) didn't seem to make a difference.

But why would the content we invest so much in take a hit from Panda? It isn't "thin." Thin content did exist elsewhere on the site, though: we had a flights section with 40,000 pages cranked out of our database with virtually no unique content. We launched that section in 2008, and it had never seemed to be an issue (it was mostly ignored), but now, I believe, it was working against us. My understanding is that thin content anywhere on a site can drag down the rankings of the entire site. In summary: we had 40,000 thin flights pages, 2,500 blog posts (rich content), and about 2,500 hotel-related pages (rich, well-researched "expert" content).

So, two weeks ago we dropped almost the entire flights section. We kept about 400 of the 40,000 pages, the ones with researched, unique, and well-written information, and we 410'd the rest. Following the advice of many others on these boards, we put the "thin" flights pages in their own sitemap so we could watch their indexed count fall in Webmaster Tools. And we watched (with some eagerness and trepidation) as the error count shot up. Google has found about half of them at this point.

Last week I submitted a reconsideration request to Google's webspam team. I wasn't sure it was necessary, since the whole point of dropping and 410'ing the pages was to fix the problem on our end and let that work its way through the SERPs, but I thought it was worth sending a note explaining the actions we had taken, just in case.

Today I received a response. It includes: "We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team. Of course, there may be other issues with your site that affect your site's ranking. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages..."

And thus, I'm a bit confused. Is the "no manual action" finding a bad sign for my site? Or does it simply mean we weren't under a manual penalty, while Panda (which isn't a manual action) may still have pushed our rankings down? Could 410'ing 40,000 thin pages actually raise red flags? And finally, how long do these issues usually take to clear up?

Pardon the very long question, and thanks for any insights. I really appreciate the advice offered in these forums.
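For reference, here's the spot check I've been running while the 410s work their way through the index (the URLs below are placeholders): retired flights pages should return 410, and the roughly 400 rewritten pages we kept should still return 200.

```python
# Spot-check status codes: retired thin pages -> 410, kept pages -> 200.
import requests

RETIRED = ["http://www.example.com/flights/xyz-to-abc.html"]
KEPT = ["http://www.example.com/flights/paris-budget-airlines.html"]

for url in RETIRED + KEPT:
    code = requests.get(url, allow_redirects=False, timeout=10).status_code
    expected = 410 if url in RETIRED else 200
    flag = "OK" if code == expected else "CHECK"
    print(f"{flag}  {code}  {url}")
```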
White Hat / Black Hat SEO | TomNYC | 0
Can white text over images hurt your SEO?
Hi everyone, I run a travel website with about 30 pre-search city landing pages. In a redesign last year we added large "hero" images to the top of these pages and placed our H1 headlines on top of them in white. The result is attractive, but I'm wondering whether Google could read this as "white text on a white page," which is an obvious no-no, especially if it looks like we're trying to hide text.

Here's an example: http://www.eurocheapo.com/paris/ (H1: "Expert reviews of cheap hotels in Paris")

I should add that our rankings for these city pages have dropped (for "cheap hotels in X"), but that could obviously be related to other issues.

Any advice would be appreciated. Many thanks! Tom
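For what it's worth, here's the quick check I ran on the example page above, just to confirm the H1 is served as real text and isn't hidden with inline styles. (This is only a partial check; it won't catch anything applied via external CSS.)

```python
# Confirm the H1 exists in the served HTML and isn't hidden with inline styles.
import requests
from bs4 import BeautifulSoup

url = "http://www.eurocheapo.com/paris/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

h1 = soup.find("h1")
if h1 is None:
    print("No H1 found in the served HTML")
else:
    style = (h1.get("style") or "").replace(" ", "").lower()
    hidden = "display:none" in style or "visibility:hidden" in style
    print(f"H1 text: {h1.get_text(strip=True)}")
    print(f"Inline style hides it: {hidden}")
```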
Web Design | TomNYC | 0
Why are inbound links not showing?
I run the site http://www.eurocheapo.com and am finding that many inbound links are not showing up in Open Site Explorer or on the toolbar. For example, take this hotel review: http://www.eurocheapo.com/paris/hotel/hotel-esmeralda.html

OSE shows only 2 links (from 1 domain), which can't be right; the page has dozens of inbound links from many different domains (see links:http://www.eurocheapo.com/paris/hotel/hotel-esmeralda.html). I see this all over my site. Pages that we link between are also showing no internal links, which is easy to disprove.

Was there a problem with this crawl, or is the problem in our code? Many thanks for your help, Tom
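To rule out the "problem in our code" side, here's the quick check I ran on the example page above to confirm our internal links are plain, crawlable anchors with real hrefs rather than JavaScript handlers:

```python
# Count anchors on the page and how many carry ordinary, crawlable hrefs.
import requests
from bs4 import BeautifulSoup

url = "http://www.eurocheapo.com/paris/hotel/hotel-esmeralda.html"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

anchors = soup.find_all("a")
crawlable = [a for a in anchors if a.get("href", "").startswith(("http", "/"))]
print(f"Total anchors: {len(anchors)}")
print(f"Anchors with crawlable hrefs: {len(crawlable)}")
```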
Moz Pro | TomNYC | 0