Google Panda 4.0 Update - Good for Small Businesses?
-
Hi guys,
We recently did a post on the Google Panda 4.0 release. Check it out here.
Have you seen any notable changes in rankings for your website? Do you think that this update will benefit small businesses/websites?
Looking forward to your comments.
-
One of our sites previously had issues with bad links. We went through the disavow procedure months ago, but saw no change in rank until yesterday: up about 28 spots and through the #50 barrier, all since the update.
-
Yeah, I always thought it was funny when Google's snippet tester would show everything clearly working, then you'd see the link on a SERP and, guess what? No snippet, no authorship, lol.
Best of luck, looks like things are stabilizing a bit for you.
-
Have you seen this? Ouch. http://blog.searchmetrics.com/us/2014/05/21/panda-update-4-0-winners-and-losers-google-usa/
-
Oh yes - the site I mentioned that recovered massively is now also displaying the product and review schema that wasn't showing previously, even though rich snippets testing said it was all fine. I hadn't considered that the two could be linked. Duh, thanks.
-
Thanks for your input, guys. We have noticed across many communities that most webmasters/SEOs have seen improvements in their rankings from the update so far.
eBay has seen a drop in rankings. I wonder if this has something to do with their study last year showing that paid search doesn't work for brand terms :|. haha
-
Having looked hard at rankings after every algo update, we always see strange results while Google is still "rolling it out". I wouldn't take much to heart while they get the algo in place. It's better to let things stabilize, then go back and see which rankings changed. I think the week of the update is the worst time to analyze what is going on.
From Matt Cutts on Twitter (https://twitter.com/mattcutts/status/468891756982185985): "Google is rolling out our Panda 4.0 update starting today." "Rolling out" and "starting today" are the phrases that stand out to me in that statement, meaning "it's not done yet" lol
I will say that we have seen our rankings improve, with multiple links now showing on the front page for our main keywords. Where authorship wasn't displaying before, it now is, combined with our schema reviews.
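For anyone who wants to double-check that schema is actually present in the HTML Google is being served (rather than trusting the rich snippets tester alone), something like the following can help. This is just a minimal sketch using only the Python standard library; the sample page and the "Example Widget" product are made up for illustration:

```python
import json
import re

def extract_json_ld(html):
    """Pull JSON-LD structured-data blocks out of raw HTML."""
    pattern = re.compile(
        r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        re.DOTALL | re.IGNORECASE,
    )
    blocks = []
    for match in pattern.findall(html):
        try:
            blocks.append(json.loads(match))
        except ValueError:
            pass  # skip malformed JSON-LD rather than crashing
    return blocks

# Made-up sample HTML standing in for a fetched product page
sample = '''
<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product",
 "name": "Example Widget",
 "aggregateRating": {"@type": "AggregateRating",
                     "ratingValue": "4.5", "reviewCount": "27"}}
</script>
</head><body>...</body></html>
'''

types = [block.get("@type") for block in extract_json_ld(sample)]
print(types)  # → ['Product']
```

Run it against the HTML you fetch from your own pages; if the product/review types don't come back here but the tester says you're fine, the markup may not be reaching the live page at all.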
-
We've seen one of our sites jump from the low 40s to 11 overnight after months of sitting low.
We're UK based as well, more a directory-style site than e-commerce.
-
Great article, and very timely. Thanks for sharing. I have checked some of our sites and at the moment I can't see any changes or major drops. We seem to be doing it right so far.
It would be good to review the traffic in a week. Please get in touch with us at @FingoMarketing
-
We're in the UK, and 2 of our e-commerce sites that had been suffering saw improvements this morning. One in particular has improved massively in rankings for many, many queries.
I've also noticed that some queries are dropping forum threads as results.
-
Not so far, no.
In fact, from our early analysis, small businesses are hurting. A search for "Melbourne pizza" shows 4-5 news articles and only the local block of businesses.
Other searches show more directories than ever and a lot more Gumtree/Craigslist.