Significant "Average Position" dips in Search Console each time I post on Google My Business
-
Hi everyone,
Several weeks ago I noticed that each Wednesday my site's Average Position in Search Console dipped significantly.
I immediately identified that this was the day my colleague published and built backlinks to a blog post, so we spent the next few weeks testing and monitoring everything we did.
We discovered that it was ONLY when we created a Google My Business post that the Average Position dipped, and on the 1st July we tested it one more time. The results were the same (please see attached image).
I am 100% confident that Google My Business is the cause of the issue, but can't identify why. The image I upload belongs to me, the text isn't spammy or stuffed with keywords, the Learn More links to my own website, and I never receive any warnings from Google about the content.
I would love to hear the community's thoughts on this and how I can stop the issue from continuing. I should note that my Google My Business insights are generally positive, i.e. no dips in search results, etc.
My URL is https://www.photographybymatthewjames.com/
Thanks in advance
Matthew
-
No worries. You often get these weekly dips when the data is viewed globally (no country filter) - no idea why. However, when you add a country filter it all flattens out. If the question is answered, or you're happy with the responses, please mark it accordingly. Hope that helps.
-
Actually, no. Those big dips each Wednesday don't appear when I select Denmark. So I guess this makes much more sense, given that I am only targeting a local market such as Denmark.
Thanks for making this clear...
-
Ta - but do you still get the big dip when the Denmark filter is applied?
-
Thanks again.
No, the country filter wasn't set, so it was showing an average position for all countries.
The site is optimised as well as possible, as far as I am aware.
-
Apologies, my question was unclear. When you extracted your position data from Search Console, did you limit it to Denmark? There is a filter in the navigation that lets you limit the report to a country. Can you let me know whether the positional changes we are discussing come from that country-filtered data set?
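If it's easier, you can also pull that country-filtered data programmatically rather than through the UI. Here's a rough sketch, not a definitive implementation - it assumes Python with the google-api-python-client and google-auth packages, a service account JSON key that has been added as a user on the property (the filename is hypothetical), and placeholder dates:

```python
# Rough sketch: pull average position per query from Search Console,
# limited to Denmark, for a few days around a GMB post.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical key file
)
service = build("searchconsole", "v1", credentials=creds)

request = {
    "startDate": "2020-06-29",  # placeholder dates - adjust to your post day
    "endDate": "2020-07-02",
    "dimensions": ["date", "query"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "country",
            "operator": "equals",
            "expression": "dnk",  # ISO 3166-1 alpha-3 code for Denmark
        }]
    }],
    "rowLimit": 1000,
}

response = service.searchanalytics().query(
    # for a domain property, use the "sc-domain:example.com" form instead
    siteUrl="https://www.photographybymatthewjames.com/", body=request
).execute()

for row in response.get("rows", []):
    date, query = row["keys"]
    print(date, query, round(row["position"], 1))
```

That way you can be certain the positional changes you are looking at come from the Denmark-only data set.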
Zero (0) is worse than 196: 0 means you're not ranking at all, whereas 196 means that is where you are ranking - position 196.
Those are very low rankings. Is the site optimised from a title tag, meta description, and H1 perspective?
Check out the URL structure component here:
https://moz.com/learn/seo/on-page-factors
Also see the excerpt below (and the rough audit sketch after the list):
An Ideally Optimized Web Page
An ideal web page should do all of the following:
- Be hyper-relevant to a specific topic (usually a product or single object)
- Include subject in title tag
- Include subject in URL
- Include subject in image alt text
- Specify subject several times throughout text content
- Provide unique content about a given subject
- Link back to its category page
- Link back to its subcategory page (If applicable)
- Link back to its homepage (normally accomplished with an image link showing the website logo on the top left of a page)
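As a rough way to sanity-check a few of those items, a hypothetical audit script might look something like this - it assumes Python with the requests and beautifulsoup4 packages installed, and the subject keyword is just an example, not a recommendation:

```python
# Hypothetical sketch: check one URL against a few of the on-page items above.
# Not exhaustive - it only covers title, URL, meta description, H1, image alt
# text and how often the subject appears in the body copy.
import requests
from bs4 import BeautifulSoup


def audit_page(url: str, subject: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text() if soup.title else ""
    meta_desc = soup.find("meta", attrs={"name": "description"})
    h1 = soup.find("h1")
    images = soup.find_all("img")
    body_text = soup.get_text(" ", strip=True).lower()
    subject = subject.lower()

    return {
        "subject_in_title": subject in title.lower(),
        "subject_in_url": subject.replace(" ", "-") in url.lower(),
        "has_meta_description": bool(meta_desc and meta_desc.get("content")),
        "subject_in_h1": bool(h1 and subject in h1.get_text().lower()),
        "images_missing_alt": sum(1 for img in images if not img.get("alt")),
        "subject_mentions_in_body": body_text.count(subject),
    }


if __name__ == "__main__":
    # example subject only - swap in whatever the page is actually targeting
    print(audit_page("https://www.photographybymatthewjames.com/", "event photographer"))
```

It won't tell you anything about relevance or uniqueness, but it catches the mechanical items quickly.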
Hope that helps.
-
Hi, and thanks for the fast response.
Yes, I have specifically been targeting Denmark for the past 3-4 years (I set this up in the old Webmaster Tools and it still appears the same). So in short, I am targeting one country.
As for queries – yes, there are significant drops for certain customer queries. The left-hand column with the highest number shows the average position on the day I posted on Google My Business vs. the day before I posted, i.e. "event photographer" dipped from an average position of 0 to 196.
Does this make things clearer in any way?
| Query | Avg. position (day of GMB post) | Avg. position (day before) | Difference |
|---|---|---|---|
| event photographer | 196 | 0 | 196 |
| event photography | 193 | 0 | 193 |
| sports photographer | 188.3 | 0 | 188.3 |
| sports photography | 182 | 0 | 182 |
| james harrison | 101 | 0 | 101 |
| photoshoot københavn | 98 | 0 | 98 |
| copenhagen photos | 97 | 0 | 97 |
| danish courses in copenhagen | 96 | 0 | 96 |
| dhl stafet københavn | 92 | 0 | 92 |
| cph business | 91.5 | 0 | 91.5 |
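(In case it is useful to anyone, the comparison above can be reproduced from two daily query exports with something like the rough sketch below - the CSV filenames and column names are made up, so adjust them to match your own exports.)

```python
# Rough sketch: compare per-query average position between two daily
# Search Console query exports. Filenames and column names are hypothetical.
import pandas as pd

day_before = pd.read_csv("queries_day_before.csv")    # columns: Query, Position
post_day = pd.read_csv("queries_gmb_post_day.csv")    # columns: Query, Position

merged = day_before.merge(
    post_day, on="Query", how="outer", suffixes=("_before", "_post_day")
)
# A query missing from one export simply didn't appear that day;
# treat that as position 0, matching the table above.
merged = merged.fillna({"Position_before": 0, "Position_post_day": 0})
merged["Difference"] = merged["Position_post_day"] - merged["Position_before"]

print(merged.sort_values("Difference", ascending=False).head(10))
```
-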
Hi
On Search Console - have you limited that to a country, or is it "countryless"? If it is countryless, dips like that are not unusual.
Can you clarify? Ideally, limit it to the country you are targeting.
Also, on top of the overall position dips, are there any corresponding dips for major customer queries you are tracking?
Regards