Why did my site disappear from the top 50?
-
Hello,
I am having some problems with my site www.kondomanija.si.
It was ranked on the first page for my main keyword, kondomi (on www.google.si, Slovenia), but now it is not even in the top 10 pages. This has happened before: the site drops out of the top 10 pages, comes back for a short time a couple of months later, and then drops out again.
I think the site has a weak link profile... Could this be the reason?
Does anybody know what is going on?
-
The problem is not that they outrank me, but that my site disappears from the search results.
No, we did not get spam links from any sex sites; we do, however, have a lot of links from our other sites (on the same C block).
The traditional spammy links were not used...
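For context, two sites are on the same "C block" when the first three octets of their server IP addresses match (e.g. 192.0.2.x), which is one signal search engines can use to spot interlinked site networks. A quick way to check any pair of domains, assuming both hostnames resolve publicly (the domain names below are placeholders, not the sites from this thread):

```python
import socket

def c_block(hostname: str) -> str:
    """Return the first three octets (the 'C block') of a hostname's IPv4 address."""
    ip = socket.gethostbyname(hostname)
    return ".".join(ip.split(".")[:3])

# Placeholder domains -- substitute the sites you want to compare.
site_a = "example.com"
site_b = "example.org"

if c_block(site_a) == c_block(site_b):
    print(f"{site_a} and {site_b} share a C block")
else:
    print(f"{site_a} and {site_b} are on different C blocks")
```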
-
Who are your competitors? It's hard to tell how strong or weak your link profile is until you compare it to the people outranking you.
Also, I ran your site through Open Site Explorer, and it's difficult to evaluate for two other reasons. 1. I don't speak your language, but there's nothing we can do about that. 2. Your site looks like a condom sales site, so it's in the sex industry, and a lot of your backlinks are also in the sex industry. Since the sex industry has so many spammers, it's hard to tell if those are quality links or spam links. Were most of your links earned through quality content related to your industry, or were they added the traditional spammy way most sex-related links get added? - Ruben
Related Questions
-
Google adding the main site name to the title tags of pages in subfolders: how to handle?
Hi community, ours is a WP-hosted website. We have set a site title that is appended as a suffix to every page title across the website; for example, "Moz SEO" is the default suffix for pages like "Local SEO - Moz SEO". We have given different page-title suffixes to our subfolder pages, such as the blog and help guides. For the blog we use "Moz blog" as the title-tag suffix, which was working fine. But Google has suddenly started appending the main website's title to pages in subfolders as well. Example blog title: "How to rank better - Moz blog - Moz SEO". Here "Moz SEO" has been added, which is not required. How do we handle this? Thanks
Algorithm Updates | vtmoz0 -
One of our most-visited pages (the login page) is missing the primary keyword: does this cause a ranking drop for our homepage on the same keyword?
Hi all, I removed the "primary keyword" from our login page, which is the most-visited page on our website, to avoid targeting keywords on unrelated pages. I then noticed that our homepage ranking dropped for the same "primary keyword". Visitors land on the login page directly, without searching for the "primary keyword", so how does removing it from such a page drop our ranking? Thanks
Algorithm Updates | vtmoz0 -
Confused about PageSpeed Insights vs Site Load for SEO Benefit?
I was comparing sites with a friend of mine, and I have a higher PageSpeed Insights score for mobile and desktop than he does, but his Google Analytics reports a faster page load speed than mine. So, assuming all else is equal (quality of content, links, etc.), is it better to have a site with a higher PageSpeed score or a faster site load? To me, the latter makes more sense, but if that's true, what's the point of PageSpeed Insights? Thanks for your help! I appreciate it. Ruben
Algorithm Updates | KempRugeLawGroup0 -
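Regarding the PageSpeed question above: the two numbers come from different measurements (a lab audit vs. analytics timing), so they are not directly comparable. Below is a minimal sketch of pulling the Lighthouse performance score from the PageSpeed Insights v5 API and timing a raw HTML fetch as a crude comparison; the response field path is an assumption to verify against the current API docs, and an API key may be required beyond light usage:

```python
import time
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_performance_score(url: str, strategy: str = "mobile") -> float:
    """Fetch the Lighthouse performance score (0-1) from the PageSpeed Insights API."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    # Assumed response shape: lighthouseResult -> categories -> performance -> score
    return data["lighthouseResult"]["categories"]["performance"]["score"]

def rough_fetch_time(url: str) -> float:
    """Time a single HTML download -- a crude proxy, not a full page-load measurement."""
    start = time.perf_counter()
    requests.get(url, timeout=30)
    return time.perf_counter() - start

site = "https://www.example.com"  # placeholder URL
print("PSI performance score:", psi_performance_score(site))
print("Raw HTML fetch time (s):", round(rough_fetch_time(site), 2))
```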
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! As a disclaimer, I have been following a number of discussions, articles, posts, etc., trying to find a solution to this problem but have yet to find anything conclusive, so I am reaching out to the community for help.

Before I get into the questions, some background: I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total; about seven of them share an identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites, which holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us.

Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and from written content shared across all of our sites, and we decided to use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (as if it had never been "filtered"); however, the other two sites remain "under the thumb" of Google. Now for some questions: Why would only three of our sites be affected by this "filter"/Panda if many of them share the same content? Is it a coincidence that it was an exact 30-day "filter"? Why has only one site recovered?
Algorithm Updates | WEB-IRS1 -
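On the rel="canonical" fix mentioned in the question above: each duplicate product or category URL declares one preferred URL in its <head>, and that element is what consolidates the duplicates. A minimal sketch of generating the tag from a duplicate-to-canonical mapping (the URLs are placeholders, not the poster's actual sites):

```python
# Map each duplicate product URL to the single URL that should rank.
# These URLs are placeholders, not the poster's actual sites.
CANONICAL_MAP = {
    "https://niche-site-a.example/product/blue-widget": "https://main-site.example/product/blue-widget",
    "https://niche-site-b.example/product/blue-widget": "https://main-site.example/product/blue-widget",
}

def canonical_link_tag(page_url: str) -> str:
    """Return the <link rel="canonical"> element to place in the page's <head>."""
    target = CANONICAL_MAP.get(page_url, page_url)  # self-canonical when no duplicate is mapped
    return f'<link rel="canonical" href="{target}">'

print(canonical_link_tag("https://niche-site-a.example/product/blue-widget"))
```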
Big site SEO: maintain HTML sitemaps, or scrap them in the era of XML?
We have dynamically updated XML sitemaps which we feed to Google et al. Our XML sitemap is updated constantly and takes minimal hands-on management to maintain. However, we still have an HTML version (which we link to from our homepage), a legacy from the pre-XML days. As this HTML version is static, we're finding it contains a lot of broken links and is not of much use to anyone. So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
Algorithm Updates | linklater0 -
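For the sitemap question above: an XML sitemap is simply a list of <url> entries in the sitemaps.org namespace, so it can be regenerated from the same URL source whenever the site changes, which is exactly why a static HTML version tends to rot. A minimal sketch of generating one (the URL list is a placeholder; in practice it would come from the CMS or database):

```python
import datetime
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, out_path="sitemap.xml"):
    """Write a minimal sitemaps.org-compliant sitemap for the given URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    today = datetime.date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs for illustration.
build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/product/blue-widget",
])
```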
Dedicated IP Address on my forum site www.astigtayo.com?
Hello and good day! Does having a dedicated IP address for my site affect my search engine ranking? https://www.astigtayo.com
Algorithm Updates | ificallyoumine0 -
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I posted this on one other forum as well, so I hope that is not against the rules here; I'm just trying to get ideas from the pros at both sources. Hope this is received well. Now for the question...

"Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions, so if anyone has experienced something similar or has any recommendations, I would love to hear them.

The site is very large and utilizes faceted navigation to help visitors sift through results. I have had rel=canonical implemented for many months now so that each page URL created by the faceted-navigation filters points back to the main category page. However, I still get these messages from Google every month or so saying that they found too many pages on the site. My main concern is obviously wasting crawler time on all these pages when I am already doing what Google asks in these cases: telling them to ignore the filtered URLs and find the content on page X. So at this point I am thinking about handling these with the robots.txt file, but I wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
Algorithm Updates | PeteGregory0 -
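Regarding the faceted-navigation question above: before shipping robots.txt rules for the filter URLs, it is worth verifying which URLs they would actually block. A small sanity check using Python's standard-library robots.txt parser follows; the category path and parameter names are placeholders, not the poster's real setup, and note that this stdlib parser only does prefix matching (it does not understand the * wildcards Google's own parser supports):

```python
import urllib.robotparser

# Draft robots.txt rules for two faceted-navigation filter parameters (placeholders).
ROBOTS_TXT = """\
User-agent: *
Disallow: /category/widgets?color=
Disallow: /category/widgets?size=
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

test_urls = [
    "https://www.example.com/category/widgets",              # plain category page
    "https://www.example.com/category/widgets?color=blue",   # faceted filter URL
    "https://www.example.com/category/widgets?size=xl",      # faceted filter URL
]

for url in test_urls:
    verdict = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict}: {url}")
```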
Tuesday, July 12th: we suddenly lost all our top Google rankings. Traffic cut in half. Ideas?
The attached screenshot shows it all. The Panda update hit us hard: we lost half our traffic. Three months later, a Panda tweak gave us the traffic back. Now, this past Tuesday, we lost half our traffic again, along with ALL of our top-ranking keywords/phrases on Google (our keywords are holding rank fine on all other search engines). Did they tweak their algorithm again? What are we doing wrong? eartheasy.com
Algorithm Updates | aran0880