Posts made by PenaltyHammer
-
RE: Passing link juice via javascript?
Yes, but the article is about a year old, so it may not be that accurate anymore.
-
RE: Passing link juice via javascript?
I think the same, but I'd need some more proof, like A/B tests or something.
-
Passing link juice via javascript?
Hello
A client has a website with JavaScript-generated content. All the links there (from the main page to deeper pages) are JS-generated. In the code there are only scripts and other basic markup, but no plain text links (no <a href="..."> tags).
The question is: do those JS links carry the same "SEO power" as typical HTML href links? For example, majestic.com can't scan the website properly and can't show SEO metrics for its pages. I know Google crawls them (links and pages), but are they as good as typical links?
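To illustrate what I mean (a simplified, made-up example, not the client's actual code), here is the difference between a normal crawlable link and the kind of JS-generated link the site uses:

<!-- normal HTML link: crawlers see the href and can pass link equity -->
<a href="/some-deeper-page">Some deeper page</a>

<!-- JS-only "link": no href in the markup, only a click handler -->
<span class="nav-item" onclick="window.location='/some-deeper-page'">Some deeper page</span>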
Regards,
-
RE: Indexed pages
OK, so check with a site: search on a domain that has under 1,000 pages and go to the last results page. You'll see that the number there is different (in almost all cases).
-
RE: Indexed pages
Hi
The most accurate number comes from Screaming Frog (the free version if you have fewer than 500 pages, or the paid version if more).
Google indexes what it wants, and only what it considers good enough to show in its index. If some pages are similar, have quality issues, are blocked by robots.txt, etc., then it won't show them all. By the way, don't assume the number in GSC or in the Google index is correct; check it manually, because it can report 468 when in fact there are only 200.
Moz can include "historical" pages that no longer exist, and it doesn't care about quality issues.
The truth is in Screaming Frog; it gives the most accurate number. If you crawl with the Googlebot user agent, that number is the maximum that can appear in the Google index. If you crawl with the Screaming Frog user agent and robots.txt ignored, you'll see a bigger number (but Google won't show those pages because of the blocks).
If you want to check what's actually indexed, use a tool like ScrapeBox. First collect all the URLs (maybe skipping images if you don't care about them), then run an index check in ScrapeBox. Whatever isn't indexed may have some issues.
-
RE: Redirecting homepage to internal page (2nd Tier page)
Hi Satish
Hell no, even if the main page has nice SEO metrics. Like Clever said, it'll confuse users (there'll be no homepage). If you want to pass more link juice/SEO metrics/PA/DA etc. to that page, it's better to add a link on the main page with the keyword you want to rank for as the anchor text. Something like: "more about KEYWORD here!" where KEYWORD is the keyword (or phrase) you want to rank for.
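A minimal sketch of what I mean (the URL and keyword are placeholders), placed somewhere in the homepage content:

<!-- on the homepage, pointing to the 2nd tier page you want to rank -->
<a href="/your-2nd-tier-page">more about KEYWORD here!</a>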
The rest is the same as always: content, links and so on for the page you're talking about.
-
RE: Hacked website - Dealing with 301 redirects and a large .htaccess file
So the robots.txt part could come at the end, but in my case it worked fine this way too.
-
RE: Hacked website - Dealing with 301 redirects and a large .htaccess file
Hi
I just finished a similar job.
What you should do:
- collect all the bad "pages" and the links pointing to them
- find a pattern, like some common directory
- return 410 for them (whole directories, I believe?), not 404 (rough sketch after this list)
- set robots.txt to disallow those directories
- push all the pages and links for reindexing
- remove them from the Google index
- done (you need to wait some time)
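A rough sketch of the 410 and robots.txt parts, assuming the bad URLs share a directory pattern (/hacked-dir/ is just a placeholder for whatever pattern you actually find):

# .htaccess - return 410 Gone for everything under the injected directory
RedirectMatch gone ^/hacked-dir/

# robots.txt - block crawling of the same directory
User-agent: *
Disallow: /hacked-dir/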
The important thing is to get rid of all the bad links pointing to those pages. If you do that, there'll be no issues. However, this could be ongoing negative SEO. If you need help with that, PM me.
Krzysztof
-
RE: Homepage not indexed - seems to defy explanation
Hey Marcus. You just need some links from a high-authority website like Moz :) People say you're indexed, so case closed, job done :)
-
RE: Homepage not indexed - seems to defy explanation
Unfortunately you're not Amazon, so maybe you have to try harder ;)
Or force the main page to be indexed with some software or an indexer website, then wait a while.
I was also thinking about negative SEO targeting your main page, but so far I can't see any symptoms.
-
RE: Homepage not indexed - seems to defy explanation
Hi Marcus
The only thing I think could be the issue is the number of words on the main page. Mostly I see images and words from menus and links, not main content. Digging deeper can help (an SEO audit).
This could also be Penguin, but to know the answer a full link analysis is needed. After a quick glance I see some unnatural links, but not in large numbers. Maybe they have footprints that aren't visible at first glance (same IP, same C-class, same content around the link, etc.).
-
RE: Website rankings drop significantly after moving to new hosting provider
Not as important a reason as the rest I mentioned. Correct everything and it will be fine.
-
RE: Website rankings drop significantly after moving to new hosting provider
Hi Sandi Matic.
Indeed, the server location is in the US and that matters.
On the other hand, I see both versions of your website in the Google search results: with www and without www. I also see this at the end of the Google results:
"In order to show you the most relevant results, we have omitted some entries very similar to the 89 already displayed.
If you like, you can repeat the search with the omitted results included."
So this means you have some amount of similar content.
In the code I can also see the site's language declaration, but your website is dedicated to New Zealand, right? Change this (and all other occurrences) to en-NZ. Setting hreflang can help too.
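A minimal sketch of both, assuming a single-language site targeting New Zealand (the URL is a placeholder):

<html lang="en-NZ">
<head>
  <!-- hreflang self-reference for the NZ version; add one line per language/region version you have -->
  <link rel="alternate" hreflang="en-nz" href="https://www.example.co.nz/" />
</head>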
Also, you have a small number of linking domains (weak, or weakened after the new Penguin 4.0). Link building is worth doing.
Other errors: meta descriptions that are part of the main content, doubled H1 headings, etc. To check everything, a page-by-page SEO audit is needed. Even if you had one before, you now have to fix those on-page things to get better SERPs/traffic.