Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Best posts made by AlanMosley
-
RE: Google SERP count 100
Make a query, go to the bottom of the page, click on Advanced Search, and select 100 results per page.
-
RE: Some questions on Canonical tag AND 301 redirect
Shane is correct in his advice.
Q1
You don't need the canonical. If you did not have a 301 redirect, then the canonical should go on the old page, pointing to the new one; but as Shane said, you don't need it when you have a 301 in place.
Q2
I would canonical all of p1 to p5 to http://www.example.com/new-widget-category, as I wonder whether the change of products in the grid is enough to make the pages unique. If you have sorting, it just gets messier.
Your product pages will have this info for each product anyhow.
I would try to make the category page relevant for the category.
Rather than use rel=canonical, I would use rel=next and rel=prev:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744
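To illustrate what those link elements look like across a paginated series, here is a minimal Python sketch that generates them. The category URL is the example one from the answer; `build_pagination_links`, the `?page=` parameter format, and the helper's shape are my own assumptions, not from the original post.

```python
# Hypothetical helper: emit the <link rel="prev"/"next"> tags Google
# describes for page N of a paginated category (first page has no
# ?page= parameter, matching a common URL scheme - an assumption here).

def build_pagination_links(base_url: str, page: int, last_page: int) -> list:
    links = []
    if page > 1:
        # page 2 points back to the clean category URL
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        links.append(f'<link rel="prev" href="{prev_url}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return links

# Tags for page 3 of 5 in the series:
for tag in build_pagination_links("http://www.example.com/new-widget-category", 3, 5):
    print(tag)
```

The first and last pages get only one tag each (`next` or `prev` respectively), which is how the series endpoints are signalled.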
-
RE: Why crawl error "title missing or empty" when there is already "title and meta description" in place?
I just did a crawl on your site using Bing's toolkit, and I did not find any errors concerning titles.
In fact, your site has the best score I have ever got from a WordPress site. Usually a WordPress site is a mess, especially with unnecessary 301s.
I found only 2 HTML errors, 1 unnecessary redirect, and multiple H1s.
Wait until the next crawl; it may come good.
-
RE: Is there such thing as a good text/code ratio? Can it affect SERPs?
There is no set ratio, but clean code is important. Large amounts of script, CSS, JSON and viewstate can affect your SEO. Messy code usually has errors, and many of today's CMS packages create messy code with errors. Search engines have to try to work out what is visible to the users, and this is no easy feat when you have messy code with errors.
Here are a few errors that Bing picks up; no doubt Google's crawler does also:
http://perthseocompany.com.au/seo/reports/violation/the-page-contains-a-large-amount-of-script-code
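To get a rough feel for your own text/code ratio, you can strip tags, scripts, and styles from a page and compare the visible text length to the total source length. A minimal sketch using only Python's standard library; the sample markup is illustrative, and no particular ratio threshold is implied by the original answer.

```python
# Rough text-to-code ratio: visible text length / total HTML length.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects text content, skipping anything inside <script>/<style>."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip_depth = 0  # nesting depth inside script/style elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.parts.append(data)

def text_code_ratio(html: str) -> float:
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / max(len(html), 1)

# Example page: the script block counts as "code", only the <p> text survives.
page = "<html><head><script>var x=1;</script></head><body><p>Hello world</p></body></html>"
print(round(text_code_ratio(page), 2))
```

A page dominated by inline script and viewstate will score low here, which is the kind of bloat the answer is warning about.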
-
RE: Sitemap Page - HTML and XML
Too many, according to Google. Make of it what you will; it does not look like it is for any technical reason anymore, but obviously there is a limit to how much of a page they will crawl.
http://www.mattcutts.com/blog/how-many-links-per-page/
You see how PageRank flows; having a lot of links on your home page works to your advantage. Using numbers from Google's original algorithm:
Assuming every page starts with 1 PR, a page passes 85% of its link juice, so if you have 100 links, that's 0.0085 each. Pass that to 100 internal pages, making them 1.0085 each. Now they all pass back 85%, that's 0.857225 each, × 100 = 85.7225 back to your home page. Now we do the sums all over again, and again, until the numbers stay static. This calculation relies on the internal pages having no other links, so you are unlikely to get figures as good as this, but you get the idea.
See the link for a better explanation:
http://www.webworkshop.net/pagerank.html (check out the calculator)
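The iteration described above can be run to its fixed point with a few lines of Python. This follows the post's simplified model exactly (every page starts at 1 PR, passes 85% of its PR split across its outlinks, internal pages link only back to home); it is a sketch of that worked example, not of the real PageRank formula.

```python
# Simplified PageRank flow from the post: one home page linking to
# n_internal pages, each linking only back to home. Each page keeps a
# base PR of 1 and receives 85% of each linking page's PR, split evenly.

def iterate_pagerank(n_internal=100, damping=0.85, rounds=200):
    home = 1.0
    internal = 1.0  # all internal pages are symmetric, so track one value
    for _ in range(rounds):
        # home's PR is split across its 100 links (0.0085 each on round 1)
        internal = 1.0 + damping * home / n_internal
        # every internal page passes 85% back to home (85.7225 on round 1)
        home = 1.0 + damping * internal * n_internal
    return home, internal

home, internal = iterate_pagerank()
print(round(home, 4), round(internal, 4))  # converges near home ≈ 309.91, internal ≈ 3.63
```

The first round reproduces the post's numbers (1.0085 per internal page, 85.7225 returned), and repeating the sums until they go static settles at the fixed point above.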
Remember, don't stuff up your linking structure for the users just for the sake of PageRank. I see it as like a golf swing after a lesson: if you try too hard to do what you just learnt, you will get all stiff and unnatural; it's better to swing naturally with what you have learnt in the back of your head.