Questions created by derderko
Compute PageRank yourself - Which OSE metrics as input
Hi, we're running a project here where we try to optimize our internal link structure by computing PageRank ourselves on your existing link graph. Right now we weight all pages the same, which is somewhat incorrect given that quite a number of our pages have external links and hence more link juice. We'd now like to include a weight that reflects the external link power of a page. Anyone got suggestions which OSE metric would be good? I was thinking of Page Authority, but I assume this is computed by taking both external and internal links into consideration. Alternatively I could sum the Page Authority values of all inlinks to a certain page, but as Page Authority seems to be exponential, this seems like the wrong approach. Anyone out there who has tried the same before? Cheers,
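For what it's worth, once you have picked a metric, one common way to fold an external-authority score into PageRank is via the teleportation (personalization) vector rather than the uniform 1/N. A minimal power-iteration sketch in Python, assuming hypothetical `graph` and `weights` dicts (not an OSE API):

```python
def weighted_pagerank(graph, weights, damping=0.85, iterations=50):
    """graph: {page: [outlinked pages]}; weights: {page: external score}.

    Pages linked to but not present as keys in `graph` are ignored.
    """
    pages = list(graph)
    wsum = sum(weights.get(p, 0.0) for p in pages)
    if wsum > 0:
        # Normalize external scores into a teleport distribution.
        teleport = {p: weights.get(p, 0.0) / wsum for p in pages}
    else:
        teleport = {p: 1.0 / len(pages) for p in pages}
    rank = dict(teleport)
    for _ in range(iterations):
        new = {p: (1 - damping) * teleport[p] for p in pages}
        for p in pages:
            out = [q for q in graph[p] if q in new]
            if out:
                share = damping * rank[p] / len(out)
                for q in out:
                    new[q] += share
            else:
                # Dangling page: redistribute its rank via teleport.
                for q in pages:
                    new[q] += damping * rank[p] * teleport[q]
        rank = new
    return rank
```

With uniform weights this reduces to plain PageRank; pages with strong external metrics get a larger teleport share and therefore a head start that then propagates through the internal links.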
Reporting & Analytics | | derderko
Schema for Price Comparison Services - Good or Bad?
Hey guys, I was just wondering what the whole schema.org markup means for people that run search engines (e.g. for a niche or for certain products) or price comparison engines in general. The intent behind schema.org was to help the engines better understand a page's content. Well, I guess such services don't necessarily want Google to understand that they're just another search engine (and thus might get thrown out of the index for polluting it with search result pages). I see two possible scenarios: either don't implement the markup at all, or implement it in a way that makes the site not look like an aggregator, e.g. by only marking up certain products with unique text. Any thoughts? Does the SEOmoz team have any advice on that? Best,
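For context, per-product markup in schema.org microdata looks like the fragment below (all values here are made-up placeholders). The question is really whether a page containing many such blocks reads to Google as a search-results page, versus a single marked-up product on an otherwise unique page:

```html
<!-- Hypothetical single-product markup; names and prices are placeholders. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="EUR">
  </div>
</div>
```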
Technical SEO | | derderko
Mask links with JS that point to noindex'ed pages
Hi, in an effort to prepare our site for Panda we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content. We still keep the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages are noindexed via "noindex, follow" - we might de-index them with robots.txt instead, though, if the "site:" query doesn't show improvements. Thanks, Sebastian
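One thing worth keeping in mind: the two mechanisms mentioned above behave differently. A meta robots tag keeps the page crawlable, so its outgoing links can still be followed, while robots.txt stops crawling of the URL entirely. The tag on each filter page would be:

```html
<!-- Keeps the page out of the index but lets crawlers follow its links: -->
<meta name="robots" content="noindex, follow">
```

Note that if those URLs are later blocked in robots.txt, Google can no longer fetch them and therefore never sees the noindex directive, so URLs that are already indexed may linger in the index (shown without a snippet).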
Technical SEO | | derderko