Posts made by KristinaKledzik
-
RE: Home page suddenly dropped from index!!
Woot! So glad to see it wasn't a penalty!
-
RE: Home page suddenly dropped from index!!
I would be incredibly surprised if internal links to the homepage caused the issue. Google expects you to have a bunch of internal links to the homepage.
What you're going to need to do now is a thorough review of all of the external links pointing to your homepage. I would do this with a tool - I recommend Kerboo, although I'm sure there are others that could do the same thing. Otherwise, you can review all of the links yourself, looking for spam indications (steps outlined in this handy Moz article).
Either way, make sure that you pull your list of links from Ahrefs or Majestic. Ideally both, and merge the lists. Moz doesn't crawl nearly as many links.
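If you do pull from both, merging and deduping the two exports is a two-minute script. Here's a minimal sketch in Python, assuming both exports are CSVs - the file and column names are hypothetical, so check the header row of your actual exports:

```python
import csv

def load_links(path, url_column):
    """Read one backlink export and return the set of linking URLs."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column].strip() for row in csv.DictReader(f) if row.get(url_column)}

# Column names are hypothetical -- check the header row of your own exports.
ahrefs = load_links("ahrefs_backlinks.csv", "Referring Page URL")
majestic = load_links("majestic_backlinks.csv", "SourceURL")

merged = sorted(ahrefs | majestic)  # union, deduplicated
print(f"{len(ahrefs)} from Ahrefs, {len(majestic)} from Majestic, {len(merged)} unique")

with open("merged_backlinks.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["link"])
    writer.writerows([url] for url in merged)
```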
Since you haven't gotten a manual penalty warning, you're going to have to take down as many of the spammy links you find as you can and disavow the rest. For speed, I'd recommend that you immediately upload a list of spammy links with Google's disavow tool, then start asking for actual removals.
Keep in mind that you're probably going to disavow links that were helping rankings, so expect that your homepage won't come back ranking as well for nonbranded search terms as it used to. You'll probably want to start out uploading a very conservative set of URLs to the disavow tool, wait a couple of days to see if that fixes the problem, upload a bigger set, check, etc.
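For reference, the disavow file is just plain text: one URL or one domain:example.com entry per line, with # for comments. Here's a minimal sketch of generating both passes from your reviewed list (the URLs are made up):

```python
from urllib.parse import urlparse

# Reviewed list of spammy linking URLs (made-up examples).
spammy_urls = [
    "http://spam-directory.example/links/page1.html",
    "http://article-farm.example/your-keyword/",
]

# Pass 1: individual URLs only -- the conservative upload.
with open("disavow-pass-1.txt", "w", encoding="utf-8") as f:
    f.write("# Pass 1: individual spammy URLs\n")
    for url in sorted(set(spammy_urls)):
        f.write(url + "\n")

# Pass 2: whole domains -- upload this only if rankings don't recover.
with open("disavow-pass-2.txt", "w", encoding="utf-8") as f:
    f.write("# Pass 2: domain-level entries\n")
    for domain in sorted({urlparse(u).netloc for u in spammy_urls}):
        f.write("domain:" + domain + "\n")
```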
Good luck!
-
RE: Home page suddenly dropped from index!!
Any luck so far? Usually it only takes a few hours for Google to crawl new pages after you submit them in GSC, in my experience.
-
RE: Home page suddenly dropped from index!!
Okay, I ran some tests, and can't see anything that could've gone wrong. That does make it seem like a penalty, but given that this coincided with setting up Optimizely, let's go down that path first.
While your team is taking down the test - have you checked Moz to see if its crawler sees anything that could be causing an issue? I set up a Moz crawl to look into it, but it'll take a few days.
-
RE: Home page suddenly dropped from index!!
Hm, I've done a lot with Optimizely in the past, and it's never caused an SEO problem, but it's completely possible something went wrong. Since that's your first inkling, have you tried pausing that test and removing the Optimizely code from the homepage? Then you can determine whether or not it's an Optimizely problem.
Another thing you can do is use the Fetch as Google feature in GSC. Does GSC say it can fetch the page properly?
If it says it can, try searching for "site:www.yourcompanysite.com". This will show if Google's got your URL in its index. If nothing comes up, it's not there; if it comes up, Google's decided not to rank it for some reason.
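One more sanity check you can script yourself before assuming a penalty: make sure nothing on the homepage is telling Google not to index it. A rough sketch - the URL is a placeholder, and the "noindex" text search is deliberately crude (it can match page copy, so treat a hit as a prompt to inspect the actual meta tag):

```python
import requests

url = "http://www.yourcompanysite.com/"  # placeholder -- your homepage

resp = requests.get(url, headers={
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
})

print("Status code:", resp.status_code)  # anything but 200 is a red flag
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(not set)"))

# Crude but effective first pass: look for 'noindex' anywhere in the HTML.
if "noindex" in resp.text.lower():
    print("Found 'noindex' in the HTML -- check the <meta name=\"robots\"> tag")
```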
After those steps, get back to us so we can figure out where to go from there!
Good luck,
Kristina
-
RE: Why my website disappears for the keywords ranked, then reappears and so on?
Hi Paul,
This question is marked as "answered," so there aren't many SEOs reading it, unless they have the same problem as you! Please post a new question, so we can get the right people to answer.
Best,
Kristina
-
RE: Mass Removal Request from Google Index
Hi Ioannis,
What about the first suggestion? Can you create a page linking to all of the pages that you'd like to remove, then have Google crawl that page?
Best,
Kristina
-
RE: Mass Removal Request from Google Index
Hi Ioannis,
You're in quite a bind here, without a good URL structure! I don't think there's any one perfect option, but I think all of these will work:
- Create a page on your site that links to every article you would like to delete, keeping those articles 404/410ed. Then use the Fetch as Google tool and ask Google to crawl the page plus all of its direct links. This will get Google to quickly crawl all of those pages, see that they're gone, and remove them from its index. Keep in mind that if you just use a 404, Google may keep the page around for a bit to make sure you didn't just mess up. As Eric said, a 410 is more of a sure thing.
- Create an XML sitemap of those deleted articles, and have Google crawl it (there's a quick sketch of this after the list). Yes, this will create errors in GSC, but errors in GSC mean that Google is concerned you've made a mistake, not that it's necessarily penalizing you. Just mark those guys as fixed and take the sitemap down once Google's crawled it.
- 410 these pages, remove all internal links to them (use a tool like Screaming Frog to make sure you didn't miss any links!), and remove them from your sitemap. That'll distance you from that old, crappy content, and Google will slowly realize that it's been removed as it checks in on its old pages. This is probably the least satisfying option, but it's an option that'll get the job done eventually.
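To make the second option concrete, here's a minimal sketch of generating that one-off sitemap, assuming you can export the deleted URLs to a list (the URLs and filename are placeholders):

```python
from xml.sax.saxutils import escape

# The URLs you've 410ed (placeholders) -- e.g. read from a file or a DB dump.
deleted_urls = [
    "http://www.example.com/articles/old-article-1",
    "http://www.example.com/articles/old-article-2",
]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in deleted_urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("deleted-pages-sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```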
Hope this helps! Let us know what you decide to do.
Best,
Kristina
-
RE: Hundreds of 404 errors are showing up for pages that never existed
There have been a few people at Moz with similar problems with GSC. People always throw a few ideas around: maybe Google is creating URLs to try to find pages that it can't reach through crawling links alone? Maybe another site was trying to hack your site by creating URLs they were hoping would trigger certain content on your site (a laughable idea now, but I remember my college professor showing us a site that put cost parameters in the URL during checkout)?
However they got there, Eric and Chris gave you some good ways to make sure that you're not still in trouble (if you ever were).
Hope this helps!
-
RE: How to check if the page is indexable for SEs?
I understand the difference between what you're doing and what Google shows; I guess I'm just not sure when I'd want to know that something could technically be indexed, but isn't.
I guess I'm not your target market! Good luck with your tool.
-
RE: How to check if the page is indexable for SEs?
Ah, gotcha. Personally, I use Google itself to find out if something is indexable: if it's my own site, I use Fetch as Google and the robots.txt Tester; if it's another site, I search for "site:[URL]" to see if Google's indexed it.
I think this tool could be really good if you keep it as an icon that glows or something if you've accidentally deindexed the page. Then it's helping you proactively.
Hope this helps!
Kristina
-
RE: How to check if the page is indexable for SEs?
You're probably already doing this, but make sure that all of your tests are using the Googlebot user agent! That could cause different results, especially with the robots.txt check.
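If it helps, Python's standard library can run that robots.txt check with whatever user agent you like. A quick sketch with a placeholder site, showing why testing as Googlebot rather than * matters:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.example.com/robots.txt")  # placeholder site
rp.read()

url = "http://www.example.com/some-page/"
# The same URL can be blocked for one user agent and allowed for another,
# so always test with Googlebot's token, not just '*'.
for agent in ("Googlebot", "*"):
    print(agent, "->", "allowed" if rp.can_fetch(agent, url) else "blocked")
```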
A sense check: what is your plugin going to offer over Google Search Console's Fetch as Google and robots.txt Tester?
-
RE: Structured Data on mobile and desktop version of a page
Hi Jochen,
SUPER interesting find, thanks for pointing this out!
To me, this looks like Google understands that these two pages are the same page, except for different devices, and is using information on the desktop page to make their search results more robust for mobile.
You can see the connection by looking for Google's cache of your mobile page. The best way to do this is to search in Google for "cache:[URL]". If you search for "cache:http://m.avogel.ch/de/ihre-gesundheit/erkaeltung/alles_ueber_erkaeltungen.php", Google will send you to the desktop version of the page.
Here's my theory: Google has one index for both desktop and smartphone users, so it combines data and gives the user the best result possible. Google's doing more and more to improve its search results even without SEO intervention, so I'm not too surprised, but I can't seem to find this in any SEO articles out there.
In answer to your question: I recommend that you continue to keep your mobile and desktop sites similar enough that Google is pulling from both. In the past, some SEOs would build sites differently for mobile users, but I've never seen any UX studies that show that's a better approach. Given that Google strongly recommends that you use responsive web design, it's certainly not Google's recommended approach.
I hope this helps! I'm not sure if you posted because you were worried about something - this seems like good news to me!
Kristina
-
RE: Have an eBook. What is best practice for SEO?
Hi Laura,
It sounds like your ebook is assisting in the SEO of your website, since individual chapters are ranking. You can see how much of a page (or PDF) Google can read by searching for cache:[URL]. Here's Google's cache of chapter 8, which you shared.
But you're on the right track: turning these chapters into HTML pages will make them easier for Google to crawl, and you'll probably get more traffic out of it. Here's one way you could handle this:
- Keep http://re-timer.com/the-product/how-to-sleep-better/ as it is to encourage sign up
- Create a page for each chapter of the book, with the same content. Make sure to canonicalize the PDF chapters to their HTML counterparts (one way to do that is sketched below).
- Link to those chapters somewhere else on the site, so they don't discourage people who land on the PDF download page from signing up.
That way, you get the benefits of the content individually, but keep the landing page.
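On the canonicalization point above: a PDF can't carry a <link rel="canonical"> tag, so the canonical has to be sent as an HTTP Link header. How you set that depends entirely on your server; purely as an illustration, here's what it might look like if the PDFs were served through a small Flask app (the routes and file paths are hypothetical - on Apache you'd use a Header directive instead):

```python
from flask import Flask, send_file

app = Flask(__name__)

# Hypothetical paths -- adjust to your actual PDF locations and chapter URLs.
@app.route("/downloads/chapter-8.pdf")
def chapter_8_pdf():
    response = send_file("files/chapter-8.pdf", mimetype="application/pdf")
    # Point Google at the HTML version of the same chapter.
    response.headers["Link"] = '<http://re-timer.com/how-to-sleep-better/chapter-8/>; rel="canonical"'
    return response
```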
Hope this helps!
Kristina
-
RE: EDU Links to my Site Never Show in Webmaster Tools
Hmm, a main menu link is usually great. Have you double-checked to make sure those links are in HTML, so Google can read them?
-
RE: EDU Links to my Site Never Show in Webmaster Tools
Hi there,
There are a lot of components to your question, so I'm going to answer each as thoroughly as I can. Apologies if I say something that you already know (which will probably be a lot of this, since I'm trying to be thorough).
Why aren't my links showing up in GWT/Search Console?
There are some things to note about GWT/Search Console:
- Links only show up for the exact domain that you're looking at in GWT: that means that you should have 4 profiles for one site: http://www.site.com, http://site.com, https://www.site.com, https://site.com
- Links take a while to show up, so if these are new links, you might just have to wait (although it doesn't seem like this is the case for you)
- I and many other SEOs are fairly certain that GWT is much, much dumber than actual Google. I've had links from pages displaying ads show up in GWT as if they're actual links, and I'm sure Google's ranking algorithm does not see them that way. If links don't show up in GWT, check in other ways to make sure they're being counted in Google's ranking algorithm.
Which links are counting for Google's ranking algorithm?
If you read through Matt's post, he shows you how you can use tools to find which links should be passing value to your site. If you only have one or two links that you want to check, I like to follow these steps to really see what Google sees (there's a rough script version of the link check after the list):
- Search for the page that should be linking to your site. If it's not indexed, end of story: the link doesn't count, or it's hurting you.
- Open the cached version of the page. You can do this by clicking on the upside-down triangle to the right of the green URL in search results and clicking "Cached". You can also do this by searching for "cache:[url]". If you use a browser with a URL bar that doubles as a search bar, this can be really quick - just add "cache:" in front of the URL of the page you're already on.
- At the top of the cached version of the page, you'll see a Text Only option. Click on that. Search for your link. You might need to view the source for this, if you don't know the anchor text that's used. If you can find the link, there's a good chance that Google can, too.
- Double check that the link isn't nofollowed, and is a correct, direct link.
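The script version of that check might look something like this - it fetches the linking page directly rather than Google's cache, so it's only a rough proxy for what Google sees (URLs are placeholders, and it assumes the requests and beautifulsoup4 packages):

```python
import requests
from bs4 import BeautifulSoup

linking_page = "http://www.example.edu/resources/"  # placeholder
your_domain = "yoursite.com"                        # placeholder

html = requests.get(linking_page, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Find every anchor pointing at your domain and report whether it's followed.
for a in soup.find_all("a", href=True):
    if your_domain in a["href"]:
        rel = a.get("rel") or []
        followed = "nofollow" not in rel
        print(a["href"], "| anchor:", a.get_text(strip=True), "| followed:", followed)
```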
Why aren't I ranking when I have better inbound links?
Inbound links, while incredibly important, are not the be-all end-all to SEO. If you're not ranking as well as you think you should be, check these things:
- _Is my page as relevant to the search query as the page ahead of me?_ Relevancy can have a huge impact, especially when competing sites have a similar number of backlinks. I would determine relevancy by: is the search query in my page title as well as my competitor's? Do I use the keyword on my page as often as my competitor? Do I use the keyword on my site as often as my competitor? Do I have the keyword in anchor text in links to my page as often as my competitor? (There's a quick script for the on-page comparison after this list.)
- _Do I have as many links to the ranking page as my competitor?_ Google definitely looks at the strength of the entire domain for rankings, but links to the relevant page are much more powerful.
- _Do I have a domain that's as strong as my competitor's?_ To the opposite point, if you have 10 links to your page, and your competitor has 5, but they have a DA of 60 and you're DA 50, they may still win.
- _Has this competitor been around longer / been ranking for this term longer than I have?_ Historical ranking definitely plays a factor in SEO. That doesn't mean it can't be overcome, but expect it to take more links than your competitor has to outrank them. It can sometimes take twice as many links as the #1 result to claim that spot.
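For the on-page relevancy comparison in the first bullet, a rough counting script is enough to get a read - something like this sketch (URLs and the keyword are placeholders; raw keyword counts are a blunt instrument, so use them for comparison, not as a target):

```python
import re
import requests
from bs4 import BeautifulSoup

def keyword_count(url, keyword):
    """Fetch a page, strip the markup, and count keyword occurrences."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    text = soup.get_text(" ").lower()
    title = (soup.title.string or "") if soup.title else ""
    hits = len(re.findall(re.escape(keyword.lower()), text))
    return hits, keyword.lower() in title.lower()

# Placeholder URLs and query -- swap in your page, your competitor's, and the term.
for url in ("http://www.yoursite.com/page/", "http://www.competitor.com/page/"):
    hits, in_title = keyword_count(url, "blue widgets")
    print(url, "| on-page count:", hits, "| in title:", in_title)
```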
I know this is probably more info than you needed, but I'm not sure why you feel that you're not ranking where you should be, so I wanted to share my methods so you can tell me where you have questions, or where you disagree.
Let me know your thoughts!
Kristina
-
RE: EDU Links to my Site Never Show in Webmaster Tools
Yeah, the 301ing could be creating an issue. Google Webmaster Tools really isn't good at following links through redirect paths. After hearing that, and reading everything that Matt researched, it's probably worth reaching out to your .edu partners and asking them to update their links to the proper URL for your site (including if it's https now!).
Also, forgive me if I just missed your answer, but have you set up an https profile of your site in Google Webmaster Tools? Google absolutely distinguishes between the two - my site upgraded about a year ago, and I watched traffic drop in one and jump in the other. I keep both now to monitor inbound links that are to either. It's worthwhile creating a profile with and without www as well.
-
RE: Duplicate content on recruitment website
Hi Issa,
You're right, duplicate content and bad usability could be triggering the slow-rolling Panda 4.2, but I'd dig in a little more (apologies if you already did this research):
- You mentioned 200 pages are potentially duplicates; how many are on the site in total? If you have thousands of pages indexed, 200 duplicates probably aren't going to cause a Panda penalty.
- How similar are these postings? Just the page title? Or is the entire page extremely similar in content? (To answer this: if you made a keyword cloud for these similar job descriptions, would they show roughly the same mapping? There's a quick sketch of this check after the list.)
- If it's just the page title that's similar, make sure to set the pages apart by including the name of the hiring company (which I assume makes the different positions unique) towards the beginning of the page title.
- If the entire page is similar, then add more content to make the pages more unique, like a blurb about the hiring company, how long the job has been up, how many applicants the job has (if available), etc.
- Either way, make sure you don't have any old jobs that still have live pages! If possible, I'd redirect them to a similar job posting.
- Like John asked, did your traffic drop dramatically one day, or has it been tapering off? If it's tapering off, I'd guess it's not Panda.
- And, last, which pages lost traffic and rankings? Which keywords dropped in rankings? You may be able to tell how you were penalized by which keywords were most affected.
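On the keyword cloud question above: you can approximate it by comparing term frequencies of two postings. A quick sketch - paste in the visible text of two job descriptions (the snippets below are placeholders); a cosine similarity near 1.0 means the pages look like near-duplicates to a crawler:

```python
import math
import re
from collections import Counter

def keyword_cloud(text):
    """Rough term-frequency 'cloud' of a job description."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two term-frequency Counters."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Placeholder text -- paste in the visible copy of two real postings.
posting_a = "Registered nurse needed for night shifts in central London..."
posting_b = "Registered nurse needed for day shifts in central Manchester..."

score = cosine_similarity(keyword_cloud(posting_a), keyword_cloud(posting_b))
print(f"Similarity: {score:.2f}")  # near 1.0 means near-duplicate content
```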
Hope this helps,
Kristina
-