Questions created by raido
How to detect where Google finds indexed URLs
Google is somehow indexing links that create duplicate content. We don't understand how these links are created, so we would like to detect where Google's robots find them. We tried Moz Crawl Diagnostics, but it shows an Internal Link Count of 0 for these kinds of links. We looked for a trace from the visitor side in Google Analytics (Site Content → All Content), but there wasn't one. We also looked in Webmaster Tools under Internal Links and HTML Improvements, without finding any trace. We tried some search commands as well; is there a good one for this? We also searched for the URLs in page source code with https://search.nerdydata.com.
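One way to trace where crawlers pick up such links is to scan the HTML of your own pages for anchors whose href matches the duplicate-URL pattern. A minimal sketch in Python, assuming you fetch each page from your sitemap yourself and that the pattern string (here a session parameter) is a placeholder for whatever marks your duplicate URLs:

```python
from html.parser import HTMLParser


class LinkFinder(HTMLParser):
    """Collect href values from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_suspect_links(html, pattern):
    """Return every href in `html` that contains `pattern`
    (e.g. the query fragment that identifies a duplicate URL)."""
    parser = LinkFinder()
    parser.feed(html)
    return [href for href in parser.links if pattern in href]
```

Any page where this returns a non-empty list is a place Googlebot can discover the duplicate URL. Checking the server access log for Googlebot requests to the duplicate URLs can also show when (and how often) they are being crawled.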
Reporting & Analytics | raido
Duplicate product description ranking problems (off-site duplicate content)
We do business in a niche category and not in an English-language market. We have 2-3 main competitors who use the same product information as us; they all have the same duplicate product descriptions that we do. We and one competitor have the domains with the highest authority in this market; they may have a 10-20% better link profile (counting linking domains and total links). The problem is that they rank much better for product names than we do (with the same duplicate product descriptions and almost the same level of internal optimisation), and they haven't done any extra link building for their products. The manufacturers' websites aren't the problem, because those don't rank well for product-name keywords. Most of our new products and some old ones go to the Supplemental Results and are hidden behind "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included." Writing unique text for every product isn't an option, although when we have written unique content for a product, it seems to rank much better. So our questions are: what can we do externally to help our duplicate-description products rank better than our main competitor's, without writing unique text? How important is indexation time? Will getting indexed first give a big advantage? We have thought of using more RSS/Bing services to get faster indexation (both sites get the product information at almost the same time); it seems our competitor gets into the index quicker than we do. Also, are farm pages helpful for getting some quick low-value links for new products? We have planned to make 2-3 domains with a few links pointing to these new products, to get a little advantage right after launch, while the products don't yet have external links. Our sitemap works and new products are shown on front pages (products that still mostly don't rank well and go to the Supplemental Results).
Some new products have #1 or top-3 rankings, but that is only maybe 1/3 of those that should rank in the top 3. We have also noticed that when we get products indexed quickly (for example with Fetch as Google), they get good top-3 results at first, and then some drop out of the rankings (into the Supplemental Results).
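On the indexation-speed question: besides RSS feeds, the classic mechanism for nudging crawlers toward a fresh sitemap was the sitemaps.org "ping" convention supported by Google and Bing. A small sketch that only builds the ping URLs (sending the HTTP GET is left to you, and the sitemap URL is a placeholder):

```python
from urllib.parse import quote


def build_sitemap_ping_urls(sitemap_url):
    """Build the Google and Bing sitemap-ping URLs for `sitemap_url`,
    following the sitemaps.org ping convention. The sitemap URL is
    percent-encoded so it survives as a single query-parameter value."""
    encoded = quote(sitemap_url, safe="")
    return [
        "http://www.google.com/ping?sitemap=" + encoded,
        "http://www.bing.com/ping?sitemap=" + encoded,
    ]
```

Pinging right after the sitemap is regenerated with new product URLs is a low-effort way to shorten discovery time; it will not by itself fix the Supplemental Results problem, which is driven by the duplicate descriptions.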
International SEO | raido
#! (hashbang) check help needed
Does anybody have experience using hashbangs? We tried one to solve an indexation problem, and I'm not fully sure we are using the right solution now (the developers implemented it using Google's AJAX crawling FAQ and guide as their information source). One of our clients has a problem: in their e-shop categories, search engines aren't able to index all products. In an example category there is a "Näita kõiki (38)" ("Show all (38)") link that shows all of the category's products to users, but as I understand it, search engines couldn't index it as /et#/activeTab=tab02 because of the #. Now the #! (hashbang) is used and the URL is /et#!/activeTab=tab02. Is this the correct solution? The example category URL is now also different for better indexation, with:
/et#!/
../et

When the tabs "TOP ja uued" ("TOP and new") and "Näita kõik" ("Show all") are activated/clicked, the URLs become:

/et#/activeTab=tab01
/et#/activeTab=tab02

I tried to fetch these in Google Webmaster Tools, but it seems it didn't work. I would appreciate it if anybody could check this solution.
Intermediate & Advanced SEO | raido
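For anyone checking such a setup: under Google's AJAX crawling scheme, a #! URL is only crawlable because the crawler rewrites it into an `_escaped_fragment_` query parameter and expects the server to return an HTML snapshot at that rewritten URL. A minimal sketch of the mapping, assuming example.com as a placeholder host:

```python
from urllib.parse import quote


def to_escaped_fragment(url):
    """Map a #! (hashbang) URL to the crawler-accessible form defined by
    Google's AJAX crawling scheme: everything after '#!' becomes the
    value of an '_escaped_fragment_' query parameter, with reserved
    characters in the fragment percent-escaped."""
    if "#!" not in url:
        return url  # not an AJAX-crawling URL; nothing to rewrite
    base, fragment = url.split("#!", 1)
    # Append to an existing query string if the base URL has one.
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="/=")
```

So for the category above, the crawler would request /et?_escaped_fragment_=/activeTab=tab02, and the server must answer that request with the fully rendered product list; if it doesn't, a fetch of the #! URL in Webmaster Tools will look like it "didn't work" even though the URL format itself is correct.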