Search visibility degrading gradually
-
We have several websites with the same structure, released in different countries. Each website contains information about spam callers in the country it was released for.
Now I have the problem that I am seeing a slow degradation of search traffic in the US. The UK website, on the other hand, is doing quite well and actually improving.
According to Moz, our mobile search visibility dropped significantly in the last week, and at the moment I am not able to pin this down. Can anyone please give me a hint as to what data is best to analyze to find the source of this problem? TIA
Best
Thomas -
Thanks to Donald's post, we were able to improve the value we get out of moz.com. As our keywords are phone numbers and the "best" numbers change a lot, we now update our keywords on a weekly basis. This gives us a much more relevant visibility value.
-
Yes, that is correct.
-
Donald,
Thanks for your reply. When it comes to keywords, we are a kind of strange animal: our keywords are phone numbers. As I said, we are a directory of spam numbers, and users who search for us generally find us by phone number. So if I understand you correctly, search visibility looks at the keywords I gave to Moz and determines the visibility based on those keywords, right?
-
If you add keywords to your profile that do not rank, this will bring down search visibility. Search visibility is simply a percentage based on where your tracked keywords rank.
If you remove all keywords and leave only one that ranks at position 1 on page 1, you will have 100% search visibility. If you then add four more keywords that don't rank at all, you will have 20% search visibility.
If these keywords fluctuate up and down, search visibility will fluctuate with them. So you must determine whether the loss is due to adding new keywords that do not rank, or whether the keywords you have been tracking have dropped from their former positions.
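The arithmetic above can be sketched as a toy calculation. This is a simplified illustration, not Moz's actual scoring formula (which weights SERP positions on a click-through-style curve rather than all-or-nothing):

```python
# Toy search-visibility score: each tracked keyword either "counts"
# (ranks at or above max_rank) or doesn't, and the score is the
# percentage of tracked keywords that count.
# NOTE: a simplified sketch, not Moz's real weighting formula.

def visibility(positions, max_rank=1):
    """positions: list of ranks per tracked keyword (None = not ranking)."""
    if not positions:
        return 0.0
    ranking = sum(1 for p in positions if p is not None and p <= max_rank)
    return 100.0 * ranking / len(positions)

# One tracked keyword ranking #1 -> 100% visibility.
print(visibility([1]))                              # 100.0
# Add four keywords that don't rank: the same ranking keyword is now
# only one fifth of the tracked set -> 20% visibility.
print(visibility([1, None, None, None, None]))      # 20.0
```

This also shows why adding many new, not-yet-ranking keywords (such as freshly rotated phone numbers) can drag the reported visibility down even when nothing about the site changed.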
Thanks,
Don Silvernail
Related Questions
-
Schema markup in tag manager for multiple locations not registering in tester tool or search console
Hi Moz community, I implemented schema markup for companies with multiple branches. We set up an Organization with multiple points of contact, a LocalBusiness tag for each branch, and tags for specific products/services, all through Google Tag Manager. I managed to fix the product markup with a small update to the code I found in a Google forum, but have been unable to revive our LocalBusiness markup. The schema markup tags are active, but when I run the Google Structured Data Testing Tool it doesn't find any schema tags. We are seeing some of the tags show up in Search Console, but not all of them. Has anyone else had this problem and found a solution? Or do you have any recommendations on how to mark up an organization with multiple branches? Should we have one overall Organization tag and a separate one for each branch, or is there another way of presenting each branch? Appreciate any insight!
Technical SEO | Alexanders
-
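For reference, one common JSON-LD pattern for an organization with multiple branches links each LocalBusiness to a parent Organization via schema.org's `subOrganization`/`parentOrganization` properties. This is a generic sketch with hypothetical names and addresses, not the poster's actual markup:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "subOrganization": [
    {
      "@type": "LocalBusiness",
      "name": "Example Company - Downtown Branch",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield"
      },
      "parentOrganization": {
        "@type": "Organization",
        "name": "Example Company"
      }
    }
  ]
}
```

Note also that testing tools which read only the raw HTML may not see markup injected via Tag Manager, since it is added by JavaScript after page load.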
Google Search Console says 'sitemap is blocked by robots'?
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:
User-Agent: *
Disallow:
Sitemap: http://www.website.com/sitemap_index.xml
It's a WordPress site with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix it?
Technical SEO | Extima-Christian
-
How to remove my CDN subdomains from Google search results?
A few months ago I moved all my WordPress images into a subdomain. After I purchased a CDN service, I moved those images back to my root domain. I added "User-agent: * / Disallow: /" to my CDN domain's robots.txt. But now, when I perform a site: search on Google, I see that my CDN subdomains are indexed. I think this will create a duplicate content issue, and I have already been hit by Penguin. How do I remove these search results from Google? Should I add my CDN domain to Webmaster Tools and submit a URL removal request? The problem is, cdn.mydomain.com shows the same content as www.mydomain.com. My blog: http://goo.gl/58Utt Site search result: http://goo.gl/ElNwc
Technical SEO | Godad
-
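One caveat worth knowing here: a robots.txt Disallow only stops crawling; it does not remove URLs that are already indexed. A common complement is to serve an `X-Robots-Tag: noindex` header from the CDN host. A hedged sketch for an Apache-style server (configuration details will vary by CDN provider, and some CDNs do not allow custom headers at all):

```
# .htaccess on the CDN host (hypothetical): mark every response noindex
# so already-indexed URLs are dropped from Google's index over time.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

Note that for the noindex header to be seen, the URLs must remain crawlable; blocking them in robots.txt at the same time prevents Google from ever fetching the header.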
How to search HTML source for an entire website
Is there a way for me to do a "view source" for an entire website without having to right-click every page and select "view source" for each of them?
Technical SEO | SmartWebPros
-
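One scriptable approach is to read the page URLs from the site's XML sitemap and save each page's raw HTML to disk. This is a minimal sketch assuming the site publishes a standard sitemap; the URLs and filenames are illustrative (a recursive `wget` or a dedicated crawler would work too):

```python
# Bulk "view source": pull page URLs from an XML sitemap, then fetch
# and save each page's raw HTML, instead of right-clicking every page.
import urllib.request
import xml.etree.ElementTree as ET

def sitemap_urls(sitemap_xml: str) -> list:
    """Extract the <loc> entries from sitemap XML."""
    root = ET.fromstring(sitemap_xml)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

def save_sources(urls):
    """Fetch each URL and write its raw HTML to a numbered file."""
    for i, url in enumerate(urls):
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        with open("source_%d.html" % i, "w", encoding="utf-8") as f:
            f.write(html)

example = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
</urlset>"""
print(sitemap_urls(example))  # ['http://example.com/', 'http://example.com/about']
```

Running `save_sources(sitemap_urls(...))` against a real sitemap downloads every listed page's source in one pass.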
Is this tabbed implementation of SEO copy correct (i.e., good for getting indexed, and in an OK spot in the HTML as viewed by search bots)?
We are trying to switch to a tabbed version of our team/product pages at SeatGeek.com, where all tabs (only two right now) are viewed as one document by the search engines. I am pretty sure we have this working for the most part, but would love some quick feedback from you all, as I have never worked with this approach before and these pages are some of our most important.
Resources:
http://www.ericpender.com/blog/tabs-and-seo
http://www.google.com/support/forum/p/Webmasters/thread?tid=03fdefb488a16343&hl=en
http://searchengineland.com/is-hiding-content-with-display-none-legitimate-seo-13643
Sample in use: http://www.seomoz.org/article/search-ranking-factors
Old version: http://screencast.com/t/BWn0OgZsXt http://seatgeek.com/boston-celtics-tickets/
New version with tabs: http://screencast.com/t/VW6QzDaGt http://screencast.com/t/RPvYv8sT2 http://seatgeek.com/miami-heat-tickets/
Notes: The content is not displayed stacked in the browser when JavaScript is turned off, but it is in the source code. The content shows up in the text version of Google's cache of the new page. In our implementation the JS currently ends the event before the default behavior of appending #about to the URL string; this can be changed, should it be? Related to this, the developer made it so that typing http://seatgeek.com/miami-heat-tickets/#about directly into the browser does not go to the tab with the copy, which I imagine could be considered spammy from a human-review perspective (this wasn't intentional). This portion of the code is below the truncated view of the Fetch as Googlebot output, so we didn't have that resource. Are there any issues with hidden text, and is this too far down in the HTML? Any/all feedback appreciated. I know our copy is old; we are in the process of updating it for this season.
Technical SEO | chadburgess
-
How to stop a search bot from crawling through a submit button
On our website http://www.thefutureminders.com/, we have a form with three pull-downs for month, day, and year. This is creating duplicate pages during indexing. How do we tell the search bot to index the page but not crawl through the submit button? Thanks, Naren
Technical SEO | NarenBansal
-
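Google supports the `*` wildcard in robots.txt Disallow rules, so one common approach to this kind of problem is to block the parameterized result URLs while leaving the page itself crawlable. A generic sketch; the parameter names below are hypothetical and would need to match the site's actual query string:

```
User-agent: *
# Block crawling of date-picker result URLs (hypothetical parameters)
Disallow: /*?month=
Disallow: /*?day=
Disallow: /*?year=
```

Alternatives include adding `rel="canonical"` on the result pages pointing at the main page, or configuring URL parameter handling in Search Console.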
Why do I see dramatic differences in impressions between Google Webmaster Tools and Google Insights for Search?
Has anyone else noticed discrepancies between these tools? Take keyword A and keyword B. I've literally seen situations where A has 3 or 4 times the traffic of B in Google Webmaster Tools, but half the traffic of B in Google Insights for Search. What might be the reason for this discrepancy?
Technical SEO | ir-seo-account