Questions created by M_D_Golden_Peak
Why is Volume Data Vastly Different in Keyword Explorer Lists than Manual Search?
I have created a number of lists in the new Keyword Explorer tool, and finally had a chance to put them into my company documents. I noticed a drop in volume for some keywords, which I assumed (rightly, it turns out) was incorrect data. For example, in my "list" I have the keyword mixed media. When I export the CSV, it shows a volume range of 0-10. If I type that same keyword into Explorer, I get a range of 4.3k-6.5k for mixed media, which is where I would expect it to be. (https://moz.com/explorer/overview?locale=en-US&q=mixed+media)

This happened to at least a dozen keywords in my list, and most of them show a difference of that magnitude. Any idea what is going on here?
Moz Bar | M_D_Golden_Peak
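A quick way to isolate the affected rows: read the exported CSV and flag every keyword whose exported volume range tops out suspiciously low, then spot-check those by hand in Explorer. A minimal sketch, assuming a filename and column names ("Keyword", "Min Volume", "Max Volume") that would need to match the real export:

```python
# Flag keywords whose exported volume range looks suspiciously low so they can
# be re-checked manually in Keyword Explorer. Column names are assumptions.
import csv

SUSPICIOUS_MAX = 10  # flag rows whose upper volume bound is at or below this

def flag_low_volume(path, threshold=SUSPICIOUS_MAX):
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                max_vol = int(row["Max Volume"])  # assumed column name
            except (KeyError, ValueError):
                continue  # skip rows without a parseable volume
            if max_vol <= threshold:
                flagged.append((row["Keyword"], row.get("Min Volume"), max_vol))
    return flagged

if __name__ == "__main__":
    for kw, lo, hi in flag_low_volume("keyword_list_export.csv"):  # assumed filename
        print(f"{kw}: exported range {lo}-{hi} -- re-check manually in Explorer")
```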
Error in Duplicate Content Being Reported - Pages Aren't Actually Duplicates
The recent crawl of one of our sites revealed a high number of duplicate content issues. However, when I viewed the report for pages with duplicate content, I noticed that almost all of them are not duplicates. For example, these two pages are marked as dupes:
https://www.writersstore.com/publishers/hollywood-creative-directory
https://www.writersstore.com/authors/g-miki-hayden
These pages are thin as far as content goes, but they are definitely not duplicates. Any recommendations, or ways to adjust the settings, so that these false positives aren't clogging up our site crawl report?
Moz Bar | M_D_Golden_Peak
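One way to back up the "these aren't duplicates" claim is to measure how much body text the two flagged URLs actually share. A rough sketch using word shingles and a Jaccard score; this is an independent check, not how Moz's crawler scores duplication:

```python
# Compare the visible text of two pages with 5-word shingles and a Jaccard
# overlap score. A high overlap suggests a real duplicate; a low one suggests
# a false positive (or simply very thin pages).
import re
import requests

def shingles(url, size=5):
    html = requests.get(url, timeout=15).text
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                  flags=re.S | re.I)                      # drop script/style blocks
    words = re.sub(r"<[^>]+>", " ", text).lower().split()  # strip remaining tags
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = shingles("https://www.writersstore.com/publishers/hollywood-creative-directory")
page_b = shingles("https://www.writersstore.com/authors/g-miki-hayden")
print(f"Shingle overlap: {jaccard(page_a, page_b):.1%}")  # high % = real duplicate risk
```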
Help... To Optimize Category Page or Not?
My question is about whether to optimize a category page or not, but it's a rather odd situation. Here's a bit of background to start. When we relaunched our site about six months ago, we created primary, secondary, and tertiary categories. A user could reach all three levels by clicking through the site. Then we decided that instead of linking to the tertiary categories, we'd turn them into filters which can be applied at the secondary level. Thus, there is no longer a direct link to the third-level categories on the site. An important side note: I did check and confirm they are still included in the sitemap file.

My initial thought was to forget any further optimization of those third-level categories, but as it turns out we still have rankings for some of them. Now the question: because some of these pages are ranking and are found in the sitemap, should I include them in my SEO plan to build up and optimize, or, because they are no longer linked to directly on the site, will they eventually fizzle out (meaning I shouldn't do anything further)?

This is such a unique situation that I am really looking for some insight from the community. Thanks!
Intermediate & Advanced SEO | M_D_Golden_Peak
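For the "no longer linked internally" concern above, a rough orphan check can compare the sitemap against the links found on the sitemap's own pages. A sketch for a small sitemap or a sampled subset; the sitemap URL is a placeholder:

```python
# List sitemap URLs that no crawled page links to (likely orphans). For this
# sketch, only the pages listed in the sitemap itself are crawled for links.
import re
import xml.etree.ElementTree as ET
from urllib.parse import urljoin
import requests

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder -- use the real sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    xml = requests.get(sitemap_url, timeout=15).content
    return [loc.text.strip() for loc in ET.fromstring(xml).findall(".//sm:loc", NS)]

def internal_links(page_url):
    html = requests.get(page_url, timeout=15).text
    return {urljoin(page_url, href) for href in re.findall(r'href="([^"#?]+)', html)}

urls = sitemap_urls(SITEMAP)
linked = set()
for url in urls:
    linked |= internal_links(url)

orphans = [u for u in urls if u not in linked]
print("Sitemap URLs with no internal link pointing to them:")
print("\n".join(orphans))
```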
How to Remove /feed URLs from Google's Index
Hey everyone, I have an issue with RSS /feed URLs being indexed by Google for some of our Wordpress sites. Have a look at this Google query, and click to show omitted search results. You'll see we have 500+ /feed URLs indexed by Google for our many category pages, etc. Here is one example URL: http://www.howdesign.com/design-creativity/fonts-typography/letterforms/attachment/gilhelveticatrade/feed/. Based on the content/code of the XML page, it looks like Wordpress is generating these: <generator>http://wordpress.org/?v=3.5.2</generator>

Any idea how to get them out of Google's index without 301 redirecting them? We need the Wordpress-generated RSS feeds to work for various uses. My first two thoughts: we could work with our Development team to see if we can get a "noindex" meta robots tag on the pages, but they are dynamically generated pages, so I'm not sure that will be possible. Or perhaps we can add a "feed" parameter to the GWT "URL Parameters" section, but I don't want to prevent Google from crawling these again. I figure I need Google to crawl them, see some code that says to drop the pages from the index, and THEN stop crawling them. I don't think the "Remove URL" feature in GWT will work, since that tool only removes URLs from the search results, not the actual Google index.

FWIW, this site is using the Yoast plugin. We set every page type to "noindex" except for the homepage, Posts, Pages, and Categories. We have other sites on Yoast that do not have any /feed URLs indexed by Google at all. Side note: the /robots.txt file was previously blocking crawling of the /feed URLs on this site, which is why you'll see that note in the Google SERPs when you click on the query link given in the first paragraph.
Technical SEO | M_D_Golden_Peak
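Whatever mechanism the dev team lands on (an X-Robots-Tag response header is the usual route for dynamically generated feeds, since there's no HTML head to put a meta tag in), it's easy to verify from the outside. A sketch that checks a sample of feed URLs for either the header or a meta robots directive; the single URL listed is the example from the question:

```python
# Check whether feed URLs are served with a noindex directive, either as an
# X-Robots-Tag response header or as a <meta name="robots"> tag in the body.
import re
import requests

FEED_URLS = [
    "http://www.howdesign.com/design-creativity/fonts-typography/letterforms/attachment/gilhelveticatrade/feed/",
]

def noindex_status(url):
    resp = requests.get(url, timeout=15, allow_redirects=True)
    header = resp.headers.get("X-Robots-Tag", "")
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)',
                     resp.text, flags=re.I)
    return {
        "status": resp.status_code,
        "x_robots_tag": header or None,
        "meta_robots": meta.group(1) if meta else None,
    }

for url in FEED_URLS:
    info = noindex_status(url)
    ok = "noindex" in (info["x_robots_tag"] or "") or "noindex" in (info["meta_robots"] or "")
    print(f"{'OK' if ok else 'MISSING noindex'}: {url} -> {info}")
```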
How to Handle Sketchy Inbound Links to Forum Profile Pages
Hey everyone, we recently discovered that one of our craft-related websites has a bunch of spam profiles with very sketchy backlink profiles. I just discovered this by looking at the Top Pages report in OpenSiteExplorer.org for our site, and noticed that a good chunk of our top pages are viagra/levitra/etc.-type forum profile pages with loads of backlinks from sketchy websites (porn sites, sketchy link farms, etc.). So, some spambot has been building profiles on our site and then building backlinks to those profiles.

Now, my question: we can delete all these profiles, but how should we handle all of these sketchy inbound links? If all of the spam forum profile pages produce true 404 error pages when we delete them, will that evaporate the link equity? Or could we still get penalized by Google? Do we need to use the Link Disavow tool? Also note that these forum profile pages were all set to "noindex,nofollow" months ago; I'm not sure how that affects things.

This is going to be a time waster for me, but I want to ensure that we don't get penalized. Thanks for your advice!
White Hat / Black Hat SEO | M_D_Golden_Peak
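If the Link Disavow tool does come into play, the file Google accepts is plain text: "#" comment lines plus one "domain:" entry (or full URL) per line. A sketch that builds such a file from a CSV export of the spammy linking domains; the "Root Domain" column name is an assumption about the export:

```python
# Build a disavow.txt from a CSV of spammy referring domains. Each output line
# is either a "#" comment or a "domain:example.com" entry.
import csv

def build_disavow(csv_path, out_path="disavow.txt"):
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = row.get("Root Domain", "").strip()  # assumed column name
            if domain:
                domains.add(domain)
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("# Spam domains linking to deleted forum profile pages\n")
        for domain in sorted(domains):
            out.write(f"domain:{domain}\n")
    return out_path

build_disavow("spammy_linking_domains.csv")  # assumed export filename
```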
Using Webmaster Tools to Redirect Domain to Specific Page on Another Domain
Hey Everyone, we redirected an entire domain to a specific URL on another domain (not the homepage). We used a 301 Redirect, but I'm also wondering if I should use the Google Webmaster Tools "Change of Address" section to redirect. There is no option to redirect the old domain to the specific URL on the new domain within the "Change of Address" section. Thoughts?
Intermediate & Advanced SEO | M_D_Golden_Peak
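Independent of the Change of Address question, it's worth confirming the 301s behave as intended: each old-domain URL should answer with a single 301 whose Location header points at the chosen target. A sketch with placeholder domains:

```python
# Verify that old-domain URLs return a 301 pointing at the specific new URL.
import requests

OLD_URLS = [
    "http://old-domain.example/",                 # placeholder old domain
    "http://old-domain.example/some-popular-page/",
]
TARGET = "https://new-domain.example/specific-landing-page/"  # the chosen URL

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=False, timeout=15)
    location = resp.headers.get("Location", "(none)")
    status = "OK" if resp.status_code == 301 and location == TARGET else "CHECK"
    print(f"{status}: {url} -> {resp.status_code} {location}")
```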
Do 404 Pages from Broken Links Still Pass Link Equity?
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 error page, is link equity/domain authority lost?

We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the low-traffic pages themselves; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google, especially for pages with no links pointing to them. A perfect example is Amazon: thousands of pages with no external links that rank #1 in Google for their product name.

Anyone have a clear answer? Thanks!
Intermediate & Advanced SEO | M_D_Golden_Peak
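One way to prioritize which legacy redirects to keep is to see where each old URL resolves today: a redirect that still lands on a live page is doing work, while one that already ends in a 404 isn't passing anything worth preserving. A sketch with placeholder URLs:

```python
# Follow each legacy URL's redirect chain and report the final URL, status
# code, and number of hops, to help decide which 301s are worth keeping.
import requests

LEGACY_URLS = [
    "https://www.example.com/old-product-page",        # placeholder legacy URLs
    "https://www.example.com/discontinued-category",
]

for url in LEGACY_URLS:
    try:
        resp = requests.get(url, timeout=15, allow_redirects=True)
        hops = len(resp.history)
        print(f"{url} -> {resp.url} ({resp.status_code}, {hops} redirect hop(s))")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```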
Internal Duplicate Images
Question: if we had hundreds of images duplicated on a site, but at different URLs (with "-1" tacked onto the end), is that a Panda issue? Will we get penalized? I know duplicate content (web pages) pretty well, but duplicate files? That I'm unsure of.
Technical SEO | M_D_Golden_Peak
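Before worrying about Panda, it may help to confirm the "-1" copies really are byte-for-byte duplicates rather than resized variants. A sketch that hashes image files and groups identical ones; the uploads path is a placeholder:

```python
# Group image files by SHA-256 digest; any group with more than one path is a
# set of exact duplicates.
import hashlib
from collections import defaultdict
from pathlib import Path

def hash_file(path, chunk_size=1 << 16):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicate_images(root):
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.suffix.lower() in {".jpg", ".jpeg", ".png", ".gif"}:
            groups[hash_file(path)].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

for digest, paths in find_duplicate_images("./wp-content/uploads").items():  # placeholder path
    print(f"{digest[:12]}... duplicated {len(paths)}x:")
    for p in paths:
        print(f"  {p}")
```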
Why Does Ebay Allow Internal Search Result Pages to be Indexed?
Click this Google query: https://www.google.com/search?q=les+paul+studio

Notice how Google has a rich snippet for Ebay saying there are 229 results for Ebay's internal search result page: http://screencast.com/t/SLpopIvhl69z. Notice also how Sam Ash's internal search result page ranks on page 1 of Google.

I've always followed the best practice of setting internal search result pages to "noindex." Previously, our company's many Magento eCommerce stores had their internal search result pages set to "index," and Google indexed over 20,000 internal search result URLs for every single site. I advised that we change these to "noindex," and impressions from Search Queries (reported in Google Webmaster Tools) shot up on 7/24 with the Panda update on that date. Traffic didn't necessarily shoot up, but it appeared that Google liked that we got rid of all this thin/duplicate content and gave us more rankings (deeper than page 1, however). Even Dr. Pete advises noindexing internal search results here: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world

So, why is Google rewarding Ebay and Sam Ash with page 1 rankings for their internal search result pages? Is it their domain authority that lets them get away with it? Could it be that noindexing internal search result pages is NOT best practice? Is the game different for eCommerce sites? Very curious what my fellow professionals think.

Thanks,
Dan
Intermediate & Advanced SEO | M_D_Golden_Peak
Does "Noindex" lead to Loss of Link Equity?
Our company has two websites with about 8,000 duplicate articles between them. Yep, 8,000 articles were posted on both sites over the past few years. This is the definition of cross-domain duplicate content.

Plan A is to set all of the articles to "noindex,follow" on the site that we care less about (site B). We are not redirecting, since we want to keep the content on that site for on-site traffic to discover. If we do set them to "noindex," my concern is that we'll lose massive amounts of link equity acquired over time, and thus lose domain authority and overall site rankability. Does Google treat pages changed to "noindex" the same as 404 pages? If so, then I imagine we would lose massive link equity.

Plan B is to just wait it out, since we're migrating site B to site A in 6-9 months, and hope that our more important site (site A) doesn't get a Panda penalty in the meantime.

Thoughts on the better plan?
Intermediate & Advanced SEO | M_D_Golden_Peak
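Whichever plan wins, scoping the overlap helps. A rough sketch that pulls both sites' sitemaps and matches article slugs, on the assumption that duplicated articles share the same slug; the sitemap URLs are placeholders:

```python
# Find articles that appear on both sites by comparing the last path segment
# (slug) of every URL in each site's sitemap.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def slugs(sitemap_url):
    xml = requests.get(sitemap_url, timeout=15).content
    out = {}
    for loc in ET.fromstring(xml).findall(".//sm:loc", NS):
        url = loc.text.strip()
        slug = urlparse(url).path.rstrip("/").rsplit("/", 1)[-1]
        if slug:
            out[slug] = url
    return out

site_a = slugs("https://site-a.example/post-sitemap.xml")  # placeholder
site_b = slugs("https://site-b.example/post-sitemap.xml")  # placeholder

overlap = sorted(set(site_a) & set(site_b))
print(f"{len(overlap)} articles appear on both sites, e.g.:")
for slug in overlap[:20]:
    print(f"  {site_a[slug]}  <->  {site_b[slug]}")
```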
How would you handle 12,000 "tag" pages on Wordpress site?
We have a Wordpress site where /tag/ pages were not set to "noindex," and they are now driving 25% of the site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. And we can't just do nothing, or a Panda update will come by and ding us for duplicate content one day (I'm surprised it hasn't already). What would you do?
Intermediate & Advanced SEO | M_D_Golden_Peak
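One common middle path is to keep only the tag pages that earn meaningful organic traffic indexed (and improve those), and noindex the long tail. A sketch that splits a landing-page traffic export into those two buckets; the filename and column names ("Landing Page", "Sessions") are assumptions about the analytics export:

```python
# Split /tag/ landing pages into "keep indexed" and "safe to noindex" buckets
# based on a sessions threshold from an analytics export.
import csv

TRAFFIC_THRESHOLD = 50  # yearly organic sessions; pick whatever cutoff fits

def split_tag_pages(csv_path, threshold=TRAFFIC_THRESHOLD):
    keep, noindex = [], []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get("Landing Page", "")          # assumed column name
            if "/tag/" not in url:
                continue
            sessions = int(row.get("Sessions", "0").replace(",", "") or 0)
            (keep if sessions >= threshold else noindex).append((url, sessions))
    return keep, noindex

keep, noindex = split_tag_pages("organic_landing_pages.csv")  # assumed filename
print(f"Keep indexed (and improve): {len(keep)} tag pages")
print(f"Safe to noindex: {len(noindex)} tag pages")
```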