Penguin Update Seems To Benefit Wikipedia Etc
-
I was updating product info on my site, which was apparently hammered by Penguin. As I updated, I was Googling the products, and I noticed that for every single product I carry, Wikipedia held the #1 position in the search results.
Is anyone else noticing this? I previously held the #1 position for 2 of my products, but I've been knocked down to 60+...
-
Yeah, no kidding! I just noticed that for a search term where I previously ranked #1, yourdomain.com now ranks above me, even though the term has NOTHING to do with yourdomain.com, which is little more than a parked page. Why that would show up on page one of Google while my site gets kicked to #62, I have no idea!
-
Pretty much the same here! Now we know which company all the Wikipedia donations are going to...
Related Questions
-
What would you say is hurting this site, Penguin or Panda?
Would you say this is both Penguin and Panda, and that no penalty has ever been lifted? What would be your general recommendations for this site?
White Hat / Black Hat SEO | BobGW
-
Is the Google Penguin penalty automated or manual?
Hi, I have seen that some of our competitors are missing from the top SERPs and seem to have been penalised, according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool for checking penalties, or are there better tools available? Are these penalties the result of the recent Penguin update? If so, is it an automated or a manual penalty from Google? I don't think all of these sites used black-hat techniques and got penalised; the new Penguin update might have flagged their backlinks, causing the penalty. Even we have dropped over the last 2 weeks. What's the solution for this? How effective is a link audit? Thanks, Satish
White Hat / Black Hat SEO | vtmoz
-
Keyword in alt tag and future G Updates
Hello, I notice that it is common practice to put the page's keywords directly into an image's alt attribute. I don't see how this helps the user, especially a user relying on a screen reader. Do you think future Google updates will penalize pages whose alt text is just the page's keywords rather than a helpful phrase? What do you recommend putting in alt text in light of future Google updates?
White Hat / Black Hat SEO | BobGW
-
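For what it's worth, the contrast the alt-text question describes can be sketched with a hypothetical product image (the file name and product are invented for illustration):

```html
<!-- Keyword-stuffed alt text: repeats the page's target keywords, helps no one -->
<img src="/images/mower-2000.jpg" alt="lawnmowers cheap lawnmowers buy lawnmowers best lawnmowers">

<!-- Descriptive alt text: tells a screen-reader user what the image actually shows -->
<img src="/images/mower-2000.jpg" alt="Red self-propelled petrol lawnmower on a freshly cut lawn">
```

The second form serves accessibility first; any keyword it contains is there because it genuinely describes the image.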
Am I getting backlink benefits from sites I design and host?
I own and host over 300 domains for as many businesses. They all link back to my site from every page, but SEOmoz shows only about a hundred of them, and so do other SEO tools. Why is that?
White Hat / Black Hat SEO | nooptee
-
How do I prepare my store for Penguin 2.0?
Looking for advice from someone who has been keeping up on the updates from Matt Cutts and other sources on what to work on for my web store to avoid getting hit hard by the upcoming Penguin update. Practical advice on what to clean up on the site would be extremely useful. I watched a Matt Cutts preview video yesterday, and I'm very curious about the part saying that Google will show preference to those who are "experts in their fields." What makes you qualified for this? We are in the wicker furniture industry and have been a local brick-and-mortar store since 1982. We started our website around 1998, so I would consider that experience possibly part of the equation. On the other hand, I know everyone would love to say they are the experts in their niche, so it would be nice to know what needs to be done to achieve this. Thank you in advance for any help, Mark Grabowski, Wicker Paradise
White Hat / Black Hat SEO | wickerparadise
-
Preparing for Penguin: Remove, Disavow, or Change to Branded Anchor Text?
For a client who has 80 root domains pointing to his domain, 10 of which are sitewide backlinks from PR4+ sites, all paid for and all with the same main-keyword anchor text: should I advise him to remove the links, disavow the links, disavow then remove, or just change the 10 sitewide links to branded anchor text? Another option is to keep just one link (preferably editorial) from each site. The only reason not to pull them off right away is that the client could not sustain his business through a drop in sales, and these are by far his 10 strongest root domains. Eventually, when he has enough good backlinks, these are all coming off. There was a huge drop in sales for this site last fall, but it recovered almost completely after we fixed keyword stuffing and added ecommerce content. I'm looking to protect his sales and also prepare for this year's updates.
White Hat / Black Hat SEO | BobGW
-
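For reference, if the disavow route is taken, Google's disavow file is a plain-text list with one URL or `domain:` entry per line and `#` comments; a minimal sketch for the sitewide paid links described above (all domains here are placeholders, not the client's actual links) might look like:

```text
# Sitewide paid links with identical main-keyword anchor text
domain:paid-links-example1.com
domain:paid-links-example2.com
# A single questionable page rather than a whole domain
http://www.example-directory.com/widgets/links.html
```

The file is then submitted through Google's Disavow Links tool; disavowing tells Google to ignore the links but does not physically remove them from the linking sites.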
Competitor sites link to a considerable number of irrelevant/nonsense sites that seem to score high on domain authority
According to my recent SEOmoz link analysis, my competitors' sites link to a considerable number of irrelevant or nonsense sites that nonetheless score high on domain authority, e.g. a wedding site linking to a transportation attorney's website. Another competitor site has around 2 million links overall, most of which are seemingly questionable index sites or forums where registration is unattainable. I recently created a 301 redirect, and my external links have yet to be updated to my new domain name in SEOmoz. Yet, comparing my previous domain authority with those of the said competitor sites, the delta is relatively marginal: my SEOmoz rank is 21, whereas the ranks of the two competitor sites are 30 and 33 respectively. The problem, however, is securing good SERPs for the most relevant terms on Google. My Google PageRank was 3 prior to the 301 redirect. I worked quite intensively to earn that PageRank, only to discover that it had no effect at all on the SERPs. I therefore took a calculated risk in changing to a domain name that transliterates from non-Latin characters. As the site's age is marginal, my educated guess is that the PR should rebound within 4 weeks; however, I would like to know whether there is a way to transfer the PageRank to the new domain. Does anyone have any insight on how to handle this issue?
White Hat / Black Hat SEO | eranariel
-
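As an aside on the 301 itself: the usual way to pass authority from an old domain is a server-side 301 that maps every old URL to its equivalent new URL, not just the homepage. A minimal Apache `.htaccess` sketch, assuming an Apache server with mod_rewrite enabled (the domain names are placeholders):

```apacheconf
# Permanently redirect every request on the old domain
# to the same path on the new domain, preserving the URL structure.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

A path-preserving redirect like this lets each old page pass its signals to its direct replacement, rather than pooling everything on the new homepage.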
Tricky Decision to make regarding duplicate content (that seems to be working!)
I have a really tricky decision to make concerning one of our clients. Their site to date was developed by someone else. They have a successful eCommerce website, and the strength of their search engine performance lies in their product category pages. In their case, a product category is an audience niche: gender and age. In this hypothetical example, my client sells lawnmowers:
http://www.example.com/lawnmowers/men/age-34
http://www.example.com/lawnmowers/men/age-33
http://www.example.com/lawnmowers/women/age-25
http://www.example.com/lawnmowers/women/age-3
For all searches pertaining to lawnmowers plus the buyer's gender and age (of which there are a lot for the 'real' store), these pages come up number one for every combination they have a page for. The issue is the specific product pages, which take the following form:
http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower
The same product, with the same content (save a reference to the gender and age on the page), can also be found at the other gender/age combinations the product is targeted at. For instance:
http://www.example.com/lawnmowers/women/age-34/fancy-blue-lawnmower
http://www.example.com/lawnmowers/men/age-33/fancy-blue-lawnmower
http://www.example.com/lawnmowers/women/age-32/fancy-blue-lawnmower
So: duplicate content. As they are currently doing so well, I am agonising over this. I dislike seeing the same content on multiple URLs, and though it wasn't a malicious effort on the previous developer's part, I think it a little dangerous in terms of SEO. On the other hand, if I change it, I'll shrink the website and severely reduce the number of pages that are contextually relevant to the gender/age category pages. In short, I don't want to sabotage the performance of the category pages by cutting off all their relevant on-site content. My options as I see them are:
1. Stick with the duplicate content model, but add some unique content to each gender/age page. This will differentiate the product category page content a little.
2. Move products to single, distinct URLs. While this could boost individual product SEO performance, that isn't an objective, and it carries the risks I describe above.
What are your thoughts? Many thanks, Tom
White Hat / Black Hat SEO | SoundinTheory
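One common middle ground, offered here as an assumption rather than anything the question proposes, is to keep all the gender/age URLs but point the duplicates at one preferred version with a canonical link element; the URLs below reuse the question's hypothetical lawnmower example, and the choice of preferred URL is arbitrary:

```html
<!-- Placed in the <head> of
     http://www.example.com/lawnmowers/women/age-34/fancy-blue-lawnmower
     and each other duplicate URL, declaring the preferred version: -->
<link rel="canonical" href="http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower">
```

This consolidates the duplicate product pages for search engines without removing the on-site content that supports the category pages, though it trades away any chance of the duplicate URLs ranking independently.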