Questions created by FPD_NYC
In Search Console, why is the XML sitemap "issue" count 5x higher than the URL submission count?
Google Search Console is reporting 5,193 sitemap "issues" - URLs present in the XML sitemap that are blocked by robots.txt. However, there are only 1,222 total URLs submitted in the XML sitemap, and I only found 83 instances of URLs that fit their example description. Why is the "issues" count so high? Does it compound over time as Google re-crawls the sitemap?
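One way to sanity-check the report is to parse the live robots.txt yourself and count how many submitted sitemap URLs it actually blocks. Here is a minimal sketch using Python's standard library (the rules and URLs below are hypothetical placeholders - substitute the site's real robots.txt and sitemap contents):

```python
from urllib import robotparser

# Hypothetical robots.txt rules -- substitute the site's real file.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Hypothetical list of URLs submitted in the XML sitemap.
sitemap_urls = [
    "https://example.com/page-1",
    "https://example.com/private/page-2",
    "https://example.com/page-3",
]

# Count sitemap URLs that robots.txt blocks for all user agents.
blocked = [url for url in sitemap_urls if not rp.can_fetch("*", url)]
print(len(blocked))  # 1
```

If your own count stays near 83 while Search Console reports thousands, the gap may reflect Search Console aggregating repeated crawl attempts or older sitemap submissions rather than the current sitemap contents - which would explain a count that grows over time.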
Intermediate & Advanced SEO | FPD_NYC
How to handle broken images on an old site post migration?
I am working with a client who migrated their site prior to starting their SEO work with us. In a crawl of broken backlinks, I found some old image files with links. Ideally, I would like to redirect to an appropriate image, but I have no way of knowing what the image was because the page it was on is now dead. Does anyone have a way to identify and handle broken image files from a site that has already been migrated?
Intermediate & Advanced SEO | FPD_NYC
International XML Sitemaps - Standalone, or Integrate into Existing XML Sitemap?
Hi there, We understand that hreflang tagging can be incorporated into an existing XML sitemap. That said, is there any inherent issue with having two sitemaps - your regular XML sitemap plus an international XML sitemap that lists many of the same URLs as your original XML sitemap? For example, one of our clients has an XML sitemap file they don't want to have to edit, but we want to implement international hreflang XML sitemaps for them. Can we add an "English" XML sitemap with the proper hreflang tagging even though this new sitemap contains many of the same URLs as the existing XML sitemap file? Thank you!
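For reference, an entry in a standalone hreflang sitemap looks something like this (URLs and locales below are hypothetical). Each URL block lists itself plus every alternate version via xhtml:link elements:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/widgets</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/widgets"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/widgets"/>
  </url>
</urlset>
```

Note the extra xhtml namespace declaration on the urlset element - that is the main structural difference from a plain XML sitemap.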
Intermediate & Advanced SEO | FPD_NYC
How would you handle these pages? Should they be indexed?
If a site has about 100 pages offering specific discounts for employees at various companies, for example... mysite.com/discounts/target mysite.com/discounts/kohls mysite.com/discounts/jcpenney and all these pages are nearly 100% duplicates, how would you handle them? My recommendation to my client was to use noindex, follow. These pages tend to receive backlinks from the actual companies receiving the discounts, so obviously they are valuable from a linking standpoint. But say the content is nearly identical between each page; should they be indexed? Is there any value for someone at Kohl's, for example, to be able to find this landing page in the search results? Here is a live example of what I am talking about: https://www.google.com/search?num=100&safe=active&rlz=1C1WPZB_enUS735US735&q=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&oq=site%3Ahttps%3A%2F%2Fpoi8.petinsurance.com%2Fbenefits%2F&gs_l=serp.3...7812.8453.0.8643.6.6.0.0.0.0.174.646.3j3.6.0....0...1c.1.64.serp..0.5.586...0j35i39k1j0i131k1j0i67k1j0i131i67k1j0i131i46k1j46i131k1j0i20k1j0i10i3k1.RyIhsU0Yz4E
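For what it's worth, if you do go the noindex, follow route, the tag on each discount page would look something like this ("follow" is already the default, but stating it explicitly makes the intent clear - the page stays out of the index while its links continue to pass value):

```html
<head>
  <meta name="robots" content="noindex, follow">
</head>
```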
Intermediate & Advanced SEO | FPD_NYC
What to do about real backlinks spiraling out of control and affecting domain trust flow
I have a site with a few hundred thousand backlinks, and many of these are legitimate, but the source site of a backlink often has low authority and is broken, causing a spiral of bad backlinks from pagination, comment replies, etc. For example, one site may have 4 legitimate backlinks with a spiral of 400+ bad backlinks, and this is happening across dozens of domains. My site is more authoritative than these broken sources and regularly receives highly authoritative backlinks. Given that, would it be best to disavow these spiraling low-authority domains, attempt to contact the webmasters and ask them to add a nofollow, or pursue some other solution?
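If you decide to disavow at the domain level, the file you upload to Google's disavow tool is plain text, one directive per line, with optional # comments (the domains below are hypothetical placeholders):

```text
# Broken sites generating spiraling pagination/comment backlinks
domain:low-authority-example-1.com
domain:low-authority-example-2.net
```

Using the domain: prefix covers every URL on that site at once, which fits the "4 legitimate links, 400+ spiraling links" pattern better than disavowing individual URLs - though note it would also discard the legitimate links from that domain.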
Intermediate & Advanced SEO | FPD_NYC
Cross-domain / subdomain tracking in GA?
Hi there, My client has a site, website.com, and a booking engine, booking.website.com. They are currently tracking the main site and the booking subdomain as two separate properties in the same GA account. The issue is we can't see where users on the subdomain property originated; it's all being counted as referral traffic. My understanding is that we need to set up subdomain tracking using Google Tag Manager in order for GA to pass user data between the two subdomains. This is fine, except for one passage I am reading in Google's guide to cross-domain tracking: "Subdomains: If you have updated your tracking code to analytics.js, then no additional configuration is required to track subdomains. You can use cross domain tracking to collect data from a primary domain, like www.example.com, and a subdomain, like www.subdomain.example.com, in a single Analytics account property." That last line makes it sound like we should be using cross-domain tracking for this purpose. Are we correct in setting up subdomain tracking and NOT cross-domain tracking to be able to track users across subdomains on the same domain?
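For context, the usual analytics.js subdomain setup is a single property with the cookie domain set to 'auto', so the same client ID is shared between website.com and booking.website.com. A sketch (the tracking ID is a placeholder):

```javascript
// analytics.js snippet -- 'auto' sets the _ga cookie on the
// highest-level domain (website.com), so it is shared with
// booking.website.com and the session is not broken into a referral.
ga('create', 'UA-XXXXXX-Y', 'auto');
ga('send', 'pageview');
```

With both hostnames reporting into one property, adding a hostname dimension (or a hostname-based view filter) lets you separate main-site and booking-engine traffic without losing the original source attribution.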
Reporting & Analytics | FPD_NYC
Spam signals from old company site are hurting new company site, but we can't undo the redirect.
My client was forced to change its domain name last year (long story). We were largely able to regain our organic rankings via 301 redirects. Recently, the rankings for the new domain have begun to plummet. Nothing specific took place on the new site that could have caused a ranking decline. However, when we analyze links to the OLD site, we are seeing a lot of link spam being built to that old domain over recent weeks and months. We have no idea where these are coming from, but they appear to be negatively impacting our new site. We cannot dismantle the redirects, as the old site has hundreds, if not thousands, of quality links pointing to it, and many customers are accustomed to going to that home page. So those redirects need to stay in place. We have already disavowed all the spam we have found via the old site's Search Console, and we continue to do so as we find new spam links. But what are we supposed to do about this spam negatively impacting our new site? FYI, we have not received any messages in Search Console.
White Hat / Black Hat SEO | FPD_NYC
I think Google Analytics is mis-reporting organic landing pages.
I have multiple clients whose Google Analytics accounts are showing me that some of the top-performing organic landing pages (in terms of highest conversion rates) look like this: /cart.php /quote /checkout.php /finishorder.php /login.php In some cases, these pages are blocked by robots.txt. In other cases, they are not indexed in Google at all. These pages are clearly part of the conversion process. A couple of them are links sent out when a cart is abandoned, etc. Is it possible visitors actually came in organically but then re-entered via one of these links, and that is what Google is calling the organic landing page? How is it possible that these pages would be the top-performing landing pages for organic visitors?
Intermediate & Advanced SEO | FPD_NYC
Is this a black-hat strategy? If so, what category does this fall under?
I am working with a major beauty client who owns an exact-match domain name related to their product that brings in a ton of traffic. They offer great content on this website that is inherently valuable. The catch is that the call-to-action brings users back to the main company site (a different URL). So if they want to "buy the product" or "learn more," they are taken to a different domain (the main company domain). There are 47 links to the main site on the EMD site. There are some slight mentions of the main brand on the EMD site, but they're hardly noticeable; it mostly appears to be a standalone site not affiliated with a major brand. My gut tells me this is black hat, but I can't find a fitting description of this strategy online, or an explanation of why they shouldn't be doing this. Is this considered a doorway page / doorway site? Is this considered a link scheme? What would you call this strategy? Or is this actually not black hat at all?
White Hat / Black Hat SEO | FPD_NYC
Why is Google Ranking the Umbrella Category Page when Searching for Sub-Categories Within that Umbrella Category?
I have an e-commerce client who sells shoes. There is a main page for "Kids" shoes, and then right under it on the top-navigation bar there is a link to "Boys Shoes" and "Girls Shoes." All 3 of these links are on the same level - 1 click off the home page. (And linked to from every page on the website via the top nav bar). All 3 are perfectly optimized for their targeted term. However, when you search for "boys shoes" or "girls shoes" + the brand, the "Kids" page is the one that shows up in the #1 position. There are sitelinks beneath the listing pointing to "Girls" and "Boys." All the other results in Google are resellers of the "brand + girls" or "brand + boys" shoes. So our listing is the only one that's "brand + kids shoes." Our "boys" shoes page and "girls" shoes page don't even rank on the 1st page for "brand + boys shoes" or "brand + girls shoes." The only real difference is that "kids shoes" contains both girls and boys shoes on the page, and then "boys" obviously contains boys' shoes only, "girls" contains girls' shoes only. So in that sense there is more content on the "kids" page. So my question is - WHY is the kids page outranking the boys/girls page? How can we make the boys/girls pages be the ones that show up when people specifically search for boys/girls shoes?
Intermediate & Advanced SEO | FPD_NYC
Our client's site was owned by former employee who took over the site. What should be done? Is there a way to preserve all the SEO work?
A client had a member of the team leave on bad terms. This wasn't something that was conveyed to us at all, but it recently came up when the distraught former employee took control of the domain and locked everyone out. At first this was assumed to be a hack, but eventually it was revealed that one of the company's founders, who unhappily left the team, owned the domain all along and is now holding it hostage. Here's the breakdown:
- Every page aside from the homepage is now gone and serving a 404 response code
- The site is out of our control
- The former employee is asking for a $1 million ransom to sell the domain back
- The homepage is a "countdown clock" that isn't actively counting down, but claims that something exciting is happening in 3 days and lists a contact email
The question is how we can save the client's traffic through all this turmoil. One option is buying a similar domain, starting from square one, and hoping we can later redirect the old site's pages after getting the domain back. Or maybe we have a legal claim here that we aren't seeing, even though the individual is now the owner of the site. Perhaps there's a way to redirect the now-defunct pages to a new site somehow? Any ideas are greatly appreciated.
Technical SEO | FPD_NYC
Trying to pinpoint why 1 keyword moved down 100 positions in 2 weeks. Help me speculate?
Hi there, One of my client's sites, a very large and successful ecommerce website with strong SEO performance, has seen a significant drop in rankings in the past 2 weeks. The rankings have begun to stabilize somewhat today, except one particular keyword with a search volume of 74k has gone from position 1 to 100. Here is what has taken place in those 2 weeks, sitewide:
- I revised and improved the title tags and meta descriptions to make them more user-friendly and include more optimized terms, following all of Google's best practices, as always. Google still appears to be indexing these changes (has anyone seen an initial drop in rankings while this takes place?)
- The site has seen a very significant increase in 404 errors due to one feature of the site breaking. We got a message about it in Webmaster Tools, and this appears to coincide with when overall rankings dropped. The development team is working quickly to get this resolved.
- As of today, I am seeing the highest page-load time of any day in 2015.
With regard to the particular page/keyword in question:
- The keyword is no longer an exact match at the beginning of the title tag, but rather broken up throughout the title tag so the whole title sounds better for users. Have you found that this type of change is sufficient for a keyword to drop ~100 positions? (Either way, I have asked the client to revise the title to start with the exact-match keyword once again.)
- Google indexed the page 2 days ago but is still displaying the old title tag in search results.
- I have not found any instances of internal or external links to this page being removed.
With all this information, does anyone see anything that could have reasonably caused such a huge tank in rankings? Is this a blip in time? Is there anything I am not considering? Should I just be patient?
Intermediate & Advanced SEO | FPD_NYC
Keyword Targeting - How to Properly Target Two Similar Terms?
Hi all, So I have a question about "best practices" when you have two unique but highly similar keywords to target. Let's use the examples of "raincoats for women," which gets 9,900 searches a month, and "rain jackets for women," which gets 4,400. I am in the process of selecting keywords for my client's "keyword portfolio" and need a strategy for when I'm faced with two similar keywords that use different terminology. I'm well aware that there should only be one page for "women's raincoats," but there is no doubt in my mind that Google will give preferential treatment to whichever version of the keyword (raincoats/rain jackets) I include in the title tag, meta description, content, etc. I know that the modern philosophy is that Google is sophisticated enough to understand that the two words are essentially synonymous. That said, would you A) only pick "raincoats for women" for your client's keyword portfolio and focus exclusively on that term in your optimizations, B) pick both terms and try to strike an even balance between the two in your optimizations, or C) pick both terms, only optimize for "raincoats for women," and hope that "rain jackets for women" gets some peripheral benefit via Google's understanding of synonyms? Thanks!
Algorithm Updates | FPD_NYC
Can someone help me understand why this page is ranking so well?
Hi everyone, EDIT: I'm going to link to the actual page, please remove if there are any issues with confidentiality. Here is the page: https://www.legalzoom.com/knowledge/llc/topic/advantages-and-disadvantages-overview It's ranking #2 on Google for "LLC" This page is a couple months old and is substantially heavy in content, but not much more so than all the dozens of other pages online that are competing with it. This is a highly competitive industry and this particular domain is an extremely huge player in this industry. This new page is suddenly ranking #2 for an extremely competitive head term, arguably the most important/high volume keyword being targeted by the entire site. The page is outranking the home page, as well as the service page that exactly targets the query - the one that you would think would be the ranking page for this head term. However, this new page is somewhat of a spin-off with some additional related content about the subject, some videos, resources, a lot of internal links, etc. The first word of the title tag exactly matches the head term. I did observe that almost no other pages on the site have the exact keyword as the first word of the title tag, but that couldn't be sufficient to bring it up so high in the ranks, could it? Another bizarre thing that is happening is that Google is ignoring the Title Tag in the actual HTML (which is a specific question that is accurate to the content on the page), and re-assigning a title tag that basically looks like this: "Head Term | Brand." Why would it do this on this page? Doesn't it usually prefer more descriptive title tags? There are no external links coming up on Moz or Majestic pointing to this page. It has just a couple social shares. It's not being linked to from the home page or top nav bar on the main site. 
Can anyone explain how this particular page would outrank the main service page targeting this keyword, as well as other highly authoritative, older pages online targeting the same keyword? Thanks for your help!
Intermediate & Advanced SEO | FPD_NYC
301 Redirect / Canonical loop on home page?
Hi there, My client just launched a new site, and the CMS requires that the home page live in a subfolder - clientsite.com/store. Currently there is a redirect in place such that clientsite.com -> clientsite.com/store. However, I want clientsite.com to be the canonical version of the URL. What should I do in this case, given that there is now a loop between the redirected page and the canonical page?
Intermediate & Advanced SEO | FPD_NYC
Huge e-commerce site migration - what to do with product pages?
My very large e-commerce client is about to undergo a site migration in which every product page URL will be changing. I am already planning my 301 redirect process for the top ~1,000 pages on the site (categories, products, and more) but this will not account for the more than 1,000 products on the site. The client specified that they don't want to implement much more than 1,000 redirects so as to avoid impacting site performance. What is the best way to handle these pages without causing hundreds of 404 errors on site migration day? Thanks!
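One common way to stay under a redirect budget like this is to replace thousands of one-to-one rules with a handful of pattern-based rules, assuming the old product URLs share a predictable structure. Here is a sketch of the idea in Python (the URL patterns below are hypothetical placeholders - substitute the site's real old and new structures):

```python
import re
from typing import Optional

# Hypothetical old-URL pattern: /product/<id>-<slug>.html
OLD_PRODUCT = re.compile(r"^/product/(\d+)-([a-z0-9-]+)\.html$")

def redirect_target(old_path: str) -> Optional[str]:
    """Map an old product path to its new URL, or None if no rule matches."""
    match = OLD_PRODUCT.match(old_path)
    if match:
        # Hypothetical new structure keeps only the slug.
        return f"/products/{match.group(2)}/"
    return None

print(redirect_target("/product/123-red-sneaker.html"))  # /products/red-sneaker/
```

On Apache or nginx the same idea becomes a single RedirectMatch or rewrite rule, so one pattern can cover every product page without adding thousands of individual entries to the server config.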
Intermediate & Advanced SEO | FPD_NYC
Subcategories within "New Arrivals" section - duplicate content?
Hi there, My client runs an e-commerce store selling shoes that features a section called "New Arrivals" with subcategories, such as "shoes," "wedges," "boots," "sandals," etc. There are already main subcategories on the site that target these terms. These are specifically pages for "New Arrivals - Boots," etc. The shoes listed on each new arrivals subcategory page are also listed in the main subcategory page. Given that there is not really any search volume for "Brand + new arrivals in boots," but lots of search volume for "Brand + boots," what is the proper way to handle these new arrivals subcategory pages? Should each subcategory have a rel=canonical tag pointing to the main subcategory? Should they be de-indexed? Should I keep them all indexed but try to make the content as unique as possible? Thank you!
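If you go the canonical route, each new-arrivals subcategory page would point at its main subcategory equivalent, e.g. (URLs below are hypothetical):

```html
<!-- On /new-arrivals/boots, pointing to the main boots category -->
<link rel="canonical" href="https://www.example.com/boots">
```

This keeps the new-arrivals pages usable for visitors while consolidating ranking signals on the main subcategory pages that actually have search volume.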
Intermediate & Advanced SEO | FPD_NYC
De-indexing product "quick view" pages
Hi there, The e-commerce website I am working on seems to index all of its "quick view" pages (which normally occur as iframes on the category page) as their own unique pages, creating thousands of duplicate pages / overly dynamic URLs. Each indexed "quick view" page has the following URL structure: www.mydomain.com/catalog/includes/inc_productquickview.jsp?prodId=89514&catgId=cat140142&KeepThis=true&TB_iframe=true&height=475&width=700 where the only things that change are the product ID and category number. Would using "Disallow" in robots.txt be the best way to de-index all of these URLs? If so, could someone help me structure the disallow statement? Would it be: Disallow: /catalog/includes/inc_productquickview.jsp?prodId=* Thanks for your help.
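For what it's worth, robots.txt matching is prefix-based, so a trailing wildcard isn't needed; blocking everything under the quick-view script path could look like this:

```text
User-agent: *
Disallow: /catalog/includes/inc_productquickview.jsp
```

One caveat: Disallow prevents crawling but does not guarantee de-indexing - URLs already in the index can linger (and blocked pages can't be recrawled to see a noindex tag), so a noindex-first, disallow-later sequence is often suggested for cases like this.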
Intermediate & Advanced SEO | FPD_NYC
To index or de-index internal search results pages?
Hi there. My client uses a CMS/e-commerce platform that automatically indexes every single internal search results page on search engines. This was supposedly built as an "SEO-friendly" feature, in the sense that it creates hundreds of new indexed pages to send to search engines, reflecting the various terminology used by existing visitors of the site. In many cases, these pages have proven to outperform our optimized static pages, but there are multiple issues with them:
- The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or copy on the page.
- The query typed in by the site visitor always becomes part of the title tag / meta description on Google. If a customer's internal search query contains less-than-ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, causing lots of ugly terminology floating around on Google that we can't affect.
I am scared to do a blanket de-indexation of all /search/ results pages because we would lose the majority of our rankings and traffic in the short term while trying to improve the ranks of our optimized static pages. The ideal is to move our static pages up in Google's index and, when their performance is strong enough, to de-index all of the internal search results pages - but for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | FPD_NYC
What is the difference between "Organic Traffic" and the "Non-Paid Search Traffic" default segment in Google Analytics?
These two filtering options ("organic traffic" in the left sidebar and "non-paid search traffic" in the advanced segments) give me slightly different numbers. Any idea why this would be the case?
Reporting & Analytics | FPD_NYC