Google Panda 4.0 Update - Good for Small Businesses?
-
Hi guys,
We recently did a post on the Google Panda 4.0 release - check it out here.
Have you seen any notable changes in rankings for your website? Do you think that this update will benefit small businesses/websites?
Looking forward to your comments.
-
One of our sites previously had issues with bad links. We went through the disavow procedure months ago, but saw no change in rank until yesterday - up about 28 spots, breaking the #50 barrier, all since the update.
-
Yeah, I always thought it was funny when Google's snippet tester would show it was clearly working, and then you see the link on a SERP, and guess what? No snippet, no authorship lol.
Best of luck, looks like things are stabilizing a bit for you.
-
Have you seen this? Ouch. http://blog.searchmetrics.com/us/2014/05/21/panda-update-4-0-winners-and-losers-google-usa/
-
Oh yes - the site I mentioned that has recovered massively is also now displaying the product and review schema that wasn't previously showing, even though the rich snippets testing tool said it was all fine. I hadn't considered that the two could be linked. Duh, thanks.
-
Thanks for your input, guys. We have noticed in many communities that most webmasters/SEOs have seen improvements in their rankings from the update so far.
eBay has seen a drop in rankings. I wonder if this has something to do with their study last year showing that paid search doesn't work for brand terms :| haha
-
After looking hard at rankings after every algo update, we always see strange results while Google is still "rolling it out". I wouldn't take much to heart while they get the algo in place. It would be better to let things stabilize, then go back and see what rankings changed. I think the week of the update is the worst time to analyze what is going on.
From Matt Cutts' tweet:
https://twitter.com/mattcutts/status/468891756982185985 - "Google is rolling out our Panda 4.0 update starting today." "Rolling out" and "starting today" are the things that stand out to me from that statement, meaning "it's not done yet" lol.
I will say that we have seen our rankings improve, with multiple links now showing on the front page for our main keywords. Where authorship was not displaying before, it now is, along with our review schema.
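For anyone wondering what the product and review markup discussed in this thread typically looks like, here is a minimal, generic sketch that serializes schema.org Product/Review data as JSON-LD from Python. This is not the poster's actual markup - the format (JSON-LD rather than microdata) and every name and value below are placeholder assumptions.

```python
import json

# Generic schema.org Product markup with an aggregate rating and one review.
# All values are placeholders for illustration only.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "27",
    },
    "review": [
        {
            "@type": "Review",
            "author": {"@type": "Person", "name": "Jane Doe"},
            "reviewRating": {"@type": "Rating", "ratingValue": "5"},
            "reviewBody": "Placeholder review text.",
        }
    ],
}

# The resulting JSON is what would sit inside a
# <script type="application/ld+json"> tag on the product page.
print(json.dumps(product_markup, indent=2))
```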
-
We've seen one of our sites jump from the low 40s to 11 overnight after months of being low.
We're UK-based as well, more a directory-style site than e-commerce.
-
Great article, and very timely. Thanks for sharing. I have checked some of our sites and at the moment I cannot see any changes or major drops. We seem to be doing it right so far.
It would be good to review the traffic in a week. Please get in touch with us at @FingoMarketing.
-
We're in the UK and 2 of our e-commerce sites that had been suffering have seen improvements this morning. One of them in particular has improved massively in rankings for many, many queries.
Also noticed that some queries are dropping forum threads as results.
-
Not so far, no.
In fact, from our early analysis, small biz is hurting. A search for "Melbourne pizza" shows 4-5 news articles and only the local block of businesses.
Other searches show more directories than ever and a lot more Gumtree/Craigslist.
Related Questions
-
What about a noindex backlink in the eyes of Google?
I have a question: I created a backlink on a site as part of my SEO work, but when I checked a couple of days later the page had not been indexed, so I looked at the site's robots.txt file. It shows: User-agent: Mediapartners-Google / Disallow: / User-agent: * / Disallow: So, does the backlink still count for anything, or is this setup only there to keep pages from being indexed in Google? To put it simply: "Does this sort of backlink building support my site's SEO activity or not?", given that it is a noindexed site.
Local Website Optimization | | Salman425520 -
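A quick way to sanity-check what that robots.txt actually permits is Python's built-in urllib.robotparser; the sketch below uses a hypothetical example.com URL in place of the real site. Note that two empty Disallow: lines, as quoted above, block nothing - keeping a page out of Google's index is done with a noindex meta tag or X-Robots-Tag header, not robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical URLs for illustration - swap in the real site and page.
robots_url = "https://example.com/robots.txt"
page_url = "https://example.com/page-with-my-backlink/"

rp = RobotFileParser()
rp.set_url(robots_url)
rp.read()  # fetch and parse the live robots.txt

for agent in ("Googlebot", "Mediapartners-Google", "*"):
    verdict = "allowed" if rp.can_fetch(agent, page_url) else "blocked"
    print(f"{agent}: {verdict} to crawl {page_url}")
```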
How to Get 1st Page Google Rankings for a Local Company?
Hi guys, I own a London removal company - Mega Removals - and want to achieve first-page rankings on Google UK for keywords like "removals London", "removal company London" and "house removals London", but have had no success so far. I need professional advice on how to do it. Should I hire an SEO or focus on content? I would be very grateful for your help.
Local Website Optimization | | nanton1 -
How accurate are Google keyword estimates for local search volume?
We've all used the Google AdWords Keyword Tool, and if you're like me you use it to analyze data for a particular region. Does anyone know how accurate this data is? For example, I'd like to know how often people in Savannah, Georgia search for the word "forklift". I figure that Google can give me two kinds of data when I ask how many people in Savannah search for "forklift". (1) They might actually give me rough data for how many people in the region searched for the term "forklift" over the last 12 months, then divide by 12 to give me a monthly average. (2) Or they might use data on a much broader region and then adjust for Savannah's population size. In other words, they might say: in the US, people searched for "forklift" an average of 1,000,000 times a month. The US has a population of 300,000,000. Savannah has a population of about 250,000. 250,000 / 300,000,000 is about 0.00083, and 1,000,000 times 0.00083 is roughly 833. So "forklift" is searched in Savannah an average of roughly 833 times a month. Option 1 is obviously much more accurate. I suspect that option 2 is the model Google is actually using. Does anyone know with reasonable certainty which it is? Thanks, Adam
Local Website Optimization | | aj6130 -
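For what it's worth, here is the question's "option 2" written out as a quick Python sketch. The figures are the question's own illustrative numbers - the 1,000,000 national volume for "forklift" is hypothetical, not real AdWords data - and with them the population-proportional estimate works out to roughly 833 searches a month.

```python
# "Option 2" from the question above: scale a national search volume
# by the local share of population. All figures are illustrative.
us_monthly_searches = 1_000_000
us_population = 300_000_000
savannah_population = 250_000

population_share = savannah_population / us_population            # ~0.00083
estimated_local_searches = us_monthly_searches * population_share

print(f"Population share: {population_share:.5f}")
print(f"Estimated Savannah searches/month: {estimated_local_searches:.0f}")  # ~833
```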
Subdomain vs. Separate Domain for SEO & Google AdWords
We have a client who carries 4 product lines from different manufacturers under a singular domain name (www.companyname.com), and last fall, one of their manufacturers indicated that they needed to separate out one of those product lines from the rest, so we redesigned and relaunched as two separate sites - www.companyname.com and www.companynameseparateproduct.com (a newly-purchased domain). Since that time, their manufacturer has reneged on their requirement to separate the product lines, but the client has been running both sites separately since they launched at the beginning of December 2016. Since that time, they have cannibalized their content strategy (effective February 2017) and hacked apart their PPC budget from both sites (effective April 2017), and are upset that their organic and paid traffic has correspondingly dropped from the original domain, and that the new domain hasn't continued to grow at the rate they would like it to (we did warn them, and they made the decision to move forward with the changes anyway). This past week, they decided to hire an in-house marketing manager, who is insisting that we move the newer domain (www.companynameseparateproduct.com) to become a subdomain on their original site (separateproduct.companyname.com). Our team has argued that making this change back 6 months into the life of the new site will hurt their SEO (especially if we have to 301 redirect all of the old content back again, without any new content regularly being added), which was corroborated by this article. We'd also have to kill the separate AdWords account and the quality score associated with the ads in that account to move them back. We're currently looking for any extra insight or literature that we might be able to find that helps explain this to the client better - even if it is a little technical. (We're also open to finding out if this method of thinking is incorrect if things have changed!)
Local Website Optimization | | mkbeesto0 -
How to correctly move a subdomain to a subfolder (Google Webmaster Tools)?
Hello, This is my first post in here 🙂 I just wondered what the correct way is to move a subdomain to a subfolder? I've moved it and redone the sitemap so that the main website includes the subfolder, as they are part of one big website now (it was something like a blog on a subdomain). The subdomain now does correct 301 redirects. I've submitted the new sitemap to Google and asked Google to re-fetch the whole domain (so the subfolder should be re-fetched too, as it's part of the main nav). The areas I'm in doubt about: I can tell Google that the domain got moved, however it is moved to one that is already approved in the same account, but is in a subfolder, so should I do this? Or should I simply somehow erase it in Webmaster Tools? The blog was launched about a month ago and it isn't perfectly optimized yet; it wasn't on Google SERPs pretty much at all, excluding googling it directly, and there is pretty much zero traffic from Google - almost all of it is either direct or referral, mostly social. Thanks, Pavel
Local Website Optimization | | PavelGro920 -
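On the 301 question above, a quick way to confirm the old subdomain URLs really do redirect to the new subfolder is to request them without following redirects and inspect the response. The sketch below uses the third-party requests library and hypothetical blog.example.com / www.example.com/blog/ addresses in place of the real site.

```python
import requests  # third-party: pip install requests

# Hypothetical URLs standing in for the old subdomain pages.
old_urls = [
    "https://blog.example.com/",
    "https://blog.example.com/some-post/",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "(none)")
    print(f"{url} -> {resp.status_code} {location}")
    # A correct move shows status 301 and a Location header pointing at
    # the matching https://www.example.com/blog/... path.
```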
Virtual Offices & Google Search
United Kingdom. We have a client who works from home and wants a virtual office so his clients do not know where he lives. Can a virtual office address be used on his business website pages and contact pages, in title tags and descriptions, as well as in Google Places? The virtual office is manned at all times and phone calls will be directed to the client; the virtual office company says it is effectively a registered business address. Looking forward to any helpful responses.
Local Website Optimization | | ChristinaRadisic0 -
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics. Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, for example restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch-preference locations (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We are using major aggregators to distribute our local citations for our branch offices, and we make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on newly developing branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, placement of keywords, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, caller location, etc., and we are testing CallRail to start monitoring the landing pages and keywords that generate our leads.
Parts that I want to change: Some of the local sites have over 100 pages targeted at 'water damage + city' - aka what Moz would call 'doorway pages'. These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but they only add about 100 words about the city location - this is the only unique variable. We pump out about 10 new local pages a month per site - so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is only tested based on ranking! We have a tool that monitors ranking on all domains, but this does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen then you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time), then checking ranking every day for about a week to determine whether the experiment was a success or not.
What I need: An internal duplicate content analyzer - to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages' - I know, based on dozens of chats here in the community and in Moz blogs, that we can only truly create quality content for 5-10 pages.
Meaning we need to narrow down which locations are most important to us and beef them up. Creating blog content for non-'power' locations. Developing a new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. Digging deeper into call metrics and their sources. Now I am at a roadblock because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires testing two pages that are the same except for one variable; we'd either noindex these or canonicalize them, and neither is in favor of testing ranking for the same term.
Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | | MilestoneSEO_LA1 -
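On the "internal duplicate content analyzer" request in the question above: a crude but illustrative approach is to shingle each page's body copy into word n-grams and measure how much two pages overlap. The sketch below is a minimal Python version under stated assumptions - the two page texts are placeholders, and a real run would first fetch the live pages and strip their HTML.

```python
import re

def shingles(text, size=5):
    """Lowercased word 5-grams ('shingles') from a page's body copy."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard_similarity(text_a, text_b):
    """Fraction of shingles two pages share (0.0 = distinct, 1.0 = identical)."""
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Placeholder copy standing in for two 'water damage + city' pages.
page_los_angeles = "Our certified water damage restoration team in Los Angeles responds fast"
page_pasadena = "Our certified water damage restoration team in Pasadena responds fast"

score = jaccard_similarity(page_los_angeles, page_pasadena)
print(f"Shingle overlap: {score:.1%}")
# Pages of 600-1000 words that differ only by ~100 city-specific words
# will typically score very high here - which is the evidence being asked for.
```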
Notify Google of correction?
We discovered duplicate content issues because of errors in domain forwarding. The forwards were masked, so Google's crawler treated everything as duplicate content. It's fixed now - any suggestions on how to notify Google, or should we just wait it out?
Local Website Optimization | | FredRoven0