Google's Search Algorithm update to 'Local Snack Pack'
-
Hi there - has anyone else noticed a big shift in the Google Local 'snack pack' in the past 48 hours?
We have noticed a big change in our clients' results - specifically today.
Has anyone else noticed any changes or perhaps data on possible changes?
I am aware of this update: https://www.seroundtable.com/big-google-search-algorithm-ranking-update-29953.html but perhaps there has been another update since.
Any input would be much appreciated!
Phil.
-
Hi, I know this question is a bit old, but there has not been any real major shake-up in the local algorithm. I believe there was a local algo glitch in August, but that was fixed within a few days.
Related Questions
-
Strategy for Local Site
Hi everyone, I am working at a media agency in Los Angeles and we need to rank this site locally to get leads: Arts by Yaseen, https://artsbyyaseen.agency/. We are strictly going for white-hat SEO. Can the experts here give me a strategy to rank this site in the top results? Thank you
Link Building | jack4920 -
404
We find 404 pages on our company website using the Screaming Frog SEO tool. It's important to fix these 404 errors; a web designer can normally help.
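Not part of the original question, but as a rough illustration, here is a minimal Python sketch of one way to re-check a list of URLs for 404s outside of Screaming Frog. The URLs shown are made-up placeholders, not real pages from the site.

```python
# A minimal sketch: re-check candidate URLs for 404s.
# The URLs are placeholders; swap in the list exported from the crawl.
import urllib.error
import urllib.request

urls = [
    "https://www.example.com/old-page",
    "https://www.example.com/contact",
]

for url in urls:
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            print(url, resp.status)   # redirects are followed automatically
    except urllib.error.HTTPError as err:
        print(url, err.code)          # 404s and other HTTP errors land here
```

Anything that still returns a 404 can then be redirected to the closest relevant live page or restored.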
Web Design | sarahwalsh0 -
New business / content marketing
Hi all SEO experts. If a website is brand new (published in the last 3 months, with a new domain name and website design), there doesn't seem to be much guidance on this at all from the various SEO websites. We rebranded recently and moved to a new domain after entering a new business partnership. So our question is: would you publish new blog posts / content marketing less frequently because the company website is brand new? Would SEOs decrease the frequency of blog posts because the website is new? Or does it not matter, and you would still post every week as you would if the website had been live for a long time? In a nutshell, what we are wondering is: is the “Google Sandbox” still in use?
Local SEO | Ryan070 -
Brand reputation - how to improve?
Our brand has a relatively bad reputation locally, and I was wondering how Moz can help to improve this.
Local SEO | LendonMarketing0 -
National services provider and localized SEO (no physical stores)
Doing work for a telecom provider who operates in over 25 states. They are not trying to drive traffic to their brick-and-mortar stores. They want their marketing website to show products/services/pricing dynamically when a user enters their zip code. Previously, we could not show this until the shopper was already in the purchase flow that began with their serviceable address. They want to move these location-based details further forward in the shopping experience. They would likely have a "default" zip and set of services/pricing displaying until a user changes their location. My question is: how does Google treat local SEO on a site where all location-targeted content is dynamic? Will the website suffer in localized search when a shopper in, say, Colorado searches for internet providers? Is it better to have distinct landing pages for each territory with services/pricing?
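Purely as an illustration of the "distinct landing pages for each territory" option, a minimal sketch might look like the following. Flask, the route paths, and the PLANS data are all assumptions for the example, not the provider's actual stack; pricing and serviceability would really come from their own systems.

```python
# Illustrative sketch: one crawlable landing page per territory, plus a
# default page shown until the shopper's location is known.
from flask import Flask, abort, render_template_string

app = Flask(__name__)

# Hypothetical per-territory offers.
PLANS = {
    "colorado": {"product": "Fiber 500", "price": "$49/mo"},
    "texas": {"product": "Fiber 300", "price": "$39/mo"},
}

PAGE = "<h1>Internet plans in {{ region }}</h1><p>{{ product }} - {{ price }}</p>"

@app.route("/internet-service/")
def default_page():
    # Default offer shown until the shopper picks or enters a location.
    return render_template_string(PAGE, region="your area",
                                  product="Fiber 300", price="from $39/mo")

@app.route("/internet-service/<region>/")
def region_page(region):
    # Each territory gets a stable, indexable URL with its own content.
    plan = PLANS.get(region.lower())
    if plan is None:
        abort(404)
    return render_template_string(PAGE, region=region.title(), **plan)

if __name__ == "__main__":
    app.run()
```

The idea is simply that each territory's offer lives at its own stable, crawlable URL, while the default page covers shoppers whose location is not yet known.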
Local SEO | sprydigital0 -
Have we been penalised?
Hey Community, we need help! Have we been penalised, or is there some technical SEO issue that is stopping our service pages from being properly read? Website: www.digitalnext.com.au. In July 2021, we suffered a huge drop in coverage for both short and long-tail keywords. We thought that this could have been because of the link spam, core web vitals or core update around that time period. SEMrush: https://gyazo.com/d85bd2541abd7c5ed2e33edecc62854c GSC: https://gyazo.com/c1d689aff3506d5d4194848e625af6ec There is no manual action within GSC, and we have historically ranked on page 1 for super-competitive keywords. After waiting some time thinking it was an error, we took the following actions: launched a new website; rewrote all page content (except blog posts); ensured each page passes Core Web Vitals; submitted a backlink detox; removed a website that was spoofing our old one; and introduced a strong pillar-and-cluster internal link structure. After 3 months of the new website, none of our core terms has come back and we are struggling for visibility. We still rank for some super-long-tail keywords, but this is the lowest amount of visibility we have had in over 5 years. Every time we launch a blog post it ranks for competitive keywords, yet the old keywords are still completely missing. It almost feels like any URLs that used to rank for core terms are being penalised. So, I am wondering whether this is a penalisation (and by which algorithm), or whether there is something wrong with the structure of our service pages that stops them ranking. Look forward to hearing from you.
Steven
Technical SEO | StevenLord -
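As a side note on the "ensured each page passes Core Web Vitals" step in the question above, here is a rough sketch (not from the thread) for spot-checking field data through Google's public PageSpeed Insights API. The page URL is taken from the question as an example, and the response structure is summarised loosely, so treat it as an assumption rather than a spec.

```python
# Rough sketch: query the PageSpeed Insights API and print field-data
# (CrUX) metric categories, where available.
import json
import urllib.parse
import urllib.request

page = "https://www.digitalnext.com.au/"
api = ("https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
       + urllib.parse.urlencode({"url": page, "strategy": "mobile"}))

with urllib.request.urlopen(api) as resp:
    data = json.load(resp)

# Field data, when present, sits under loadingExperience.metrics and each
# metric carries a FAST / AVERAGE / SLOW category.
metrics = data.get("loadingExperience", {}).get("metrics", {})
for name, detail in metrics.items():
    print(name, detail.get("category"))
```

It is only a spot check; the Core Web Vitals report in GSC remains the fuller source.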
Google Maps marker inconsistency
We just discovered that depending on the address format you enter into Google, you may come across incorrectly placed marker locations on Google Maps. Is this because our Google Places address format is not consistent with Google Maps' format? If so, when I go into Google Places to update the address format, am I going to have to go through the citation process all over again?
Algorithm Updates | SSFCU0 -
Removing secure subdomain from Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain that disallows everything; however, these duplicated secure pages remain in the index:
User-agent: *
Disallow: /
My question is: should I request Google to remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you,
Algorithm Updates | marketing_zoovy.com
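As a hedged diagnostic sketch (not something from the original thread), the snippet below checks two things that matter in the question above: whether the subdomain's robots.txt really blocks crawling, and whether its pages also send a noindex signal. "secure.domain.com" is the placeholder host used in the question.

```python
# Hedged diagnostic sketch for the secure-subdomain question.
import urllib.request
import urllib.robotparser

HOST = "https://secure.domain.com"  # placeholder host from the question

# 1) Confirm the subdomain-specific robots.txt really disallows everything.
rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{HOST}/robots.txt")
rp.read()
print("Googlebot may fetch /:", rp.can_fetch("Googlebot", f"{HOST}/"))

# 2) Check whether a sample page also sends an X-Robots-Tag: noindex header.
req = urllib.request.Request(f"{HOST}/", method="HEAD")
with urllib.request.urlopen(req) as resp:
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))
```

The general point is that robots.txt only controls crawling; URLs that are already indexed usually need a noindex signal (header or meta tag) or a removal request before they drop out.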