Agree with Marcus. Serving the same content with different CSS for different devices will help you avoid content duplication and many other SEO issues.
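For illustration, a minimal sketch of the idea (the class name and breakpoint are my own, hypothetical choices): one URL and one HTML document for every device, with only the CSS changing.
<!DOCTYPE html>
<html>
<head>
<style>
/* Default (desktop) layout */
.content { width: 960px; margin: 0 auto; }
/* Same markup, different presentation on small screens */
@media (max-width: 480px) {
.content { width: auto; }
}
</style>
</head>
<body>
<div class="content">The same content for every device - only the CSS changes.</div>
</body>
</html>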
de4e
@de4e
Job Title: Head of WW SEO/SEM
Company: Veeam Software
Website Description
I do marketing consulting (SEO/PPC/CRO/Analytics)
Favorite Thing about SEO
Search algorithms, statistical models.
Latest posts made by de4e
- RE: Responsive web design and SEO
- RE: Worldwide CDN with mainland China support.
Thanks for your answer, but unfortunately Azure doesn't support mainland China users (it's the only exclusion - http://bquot.com/9j8).
Anyway, I'll try the trial version and test the delivery speed.
- Worldwide CDN with mainland China support.
Hello Mozers,
Can you please recommend a Content Delivery Network (CDN) that works both in mainland China and worldwide?
Many people suggest ChinaCache, which provides services only for Asian countries, but I would prefer one ultimate worldwide solution.
Any thoughts will be appreciated.
Thanks.
- RE: How is this achieved - SPAM
Freelance.com has 12,300,000 pages in the index, and most of them are pages of this type, so it's very hard to monitor all the keywords manually. Even if only part of these pages works, the bounce rate on the others doesn't matter at all; by the way, they have a "/jobs/iPad/" page too.
User relevance is still the main goal for Google, but statistical algorithms have limitations, especially for such rare queries. For more frequent and competitive keywords this tactic will not work.
Personally, I think it's black hat with so many internal links and custom-generated pages, because it hurts user experience, but using 1-3 such internal links is OK and can positively affect positions in the SERP.
- RE: How is this achieved - SPAM
It's done to get long-tail keyword traffic. When competition is very low, internal links alone are enough to rank on the 1st page. Below I've tried to describe how they achieve this (see the sketch after the list):
1. Create a list of keywords from keyword tools or site-content data mining.
2. Create a custom URL structure for these keyword pages: /job-search/keyword/ =
3. Automatically create related links from all relevant pages with exact-match anchor text.
4. The content on such an aggregated page is highly relevant to the query and has enough internal links from other pages with high relevance. All such pages are unique. Also, the amount of content is much larger than for the separate items, so the page is easier to index.
5. Profit!
PS: It works rather well for sites with a large number of pages in Google's index and large clusters of closely related pages, like freelance.com.
And one more point - they use RSS search because Google likes fresh content, and in this case the newest pages are on top.
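As a rough sketch of steps 2-4, one such auto-generated page might look like this (the URLs, keyword, and listings below are hypothetical examples, not taken from freelance.com):
<!-- Hypothetical aggregated page at /job-search/ipad-developer/ -->
<html>
<head><title>iPad Developer Jobs</title></head>
<body>
<h1>iPad Developer Jobs</h1>
<!-- Aggregated, query-relevant content; newest items first (the RSS-search trick) -->
<ul>
<li><a href="/project/12345/">iPad developer needed for an iOS app</a></li>
<li><a href="/project/12346/">Experienced iPad developer, 3-month contract</a></li>
</ul>
<!-- Step 3: every relevant page links here with the exact anchor text, e.g.
<a href="/job-search/ipad-developer/">iPad developer</a> -->
</body>
</html>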
- RE: Hotspot area for SEO
Just today I read a good related article: http://www.seobythesea.com/2012/01/sets-semantic-closeness-segmentation-and-webtables/
Personally, I recommend starting to use HTML5 semantic tags to help Google better understand the structure of your content. Even if they are not ranking factors at this time, once HTML5 becomes a standard they will be.
Also agree with Egol - stuff at the top of the page (in the HTML code) is a lot more important now.
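A minimal sketch of such semantic markup (the structure is a hypothetical example):
<body>
<header><h1>Site name</h1></header>
<nav><!-- main navigation --></nav>
<article>
<h2>Post title</h2>
<p>The important content, high up in the HTML.</p>
</article>
<aside><!-- secondary content --></aside>
<footer><!-- copyright etc. --></footer>
</body>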
- RE: Strange bounce rate trending
Agree - possibly there are duplicate GA scripts on part of the pages.
- RE: Google is not Indicating any Links to my site
Hi,
I've got an idea.
You said:
These pages linking to our new domain are indexed
The links existed the day the site was launched so when the new pages were crawled they existed.
So the question is: were the pages linking to the new domain re-indexed after you added those links?
If not, then just add them to Google "addurl" for re-indexing.
Also, the operator is "link:", not "links:" (e.g. link:www.yourdomain.com).
- RE: Has Google changed its algorithm? My traffic has almost doubled and I don't know why.
This site resells affiliate traffic - so if you have visitors from this source, then somebody definitely registered an account and added your site. If the traffic stopped, it may have been the $10 trial.
- RE: Report site for duplicate content
I think this site violates the Google Advertising Policies - http://bquot.com/997 - so you can send a message to Google support here: https://support.google.com/adwords/bin/request.py?hl=en&display=categories
Hope it helps.
Best posts made by de4e
- RE: Robots.txt for subdomain
Robots.txt works only for the subdomain where it is placed.
You need to create a separate robots.txt for each subdomain; Drupal allows this.
It must be located in the root directory of your subdomain, e.g. /public_html/subdomain/, and be accessible at http://subdomain.root.nl/robots.txt.
Add the following lines to the robots.txt file:
User-agent: *
Disallow: /
As an alternative, you can use the Robots <META> tag on each page, or redirect to a directory root.nl/subdomain and disallow it in the main robots.txt. Personally, I don't recommend either.
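For the <META> tag variant, each page on the subdomain would need this standard robots meta tag in its <head>:
<meta name="robots" content="noindex, nofollow">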
- RE: Canonical Tag - Question
To answer these questions you need to understand why Google implemented the "canonical" tag.
Before, to determine whether content was duplicated or not, Googlebot downloaded the page content and compared it with other pages in the index via a complex algorithm. As I understand it, there is a special bot running through the indexed-pages database and searching for duplicates (that's why copy-paste sites get banned not right after indexation but some time later).
The "canonical" tag makes this task much easier: Googlebot doesn't need to download the page with the duplicate content, it just needs to check the <head> section, and maybe a hash or checksum of the content. So there is no need to download and store the same data several times (deleting stored data is hard for high-load data centers). It's a more effective and faster way to crawl large data sets like the web. Also, link- and URL-related data should, I think, be added to the primary page's data set.
I've tested this: Google downloads much less data when a page has rel="canonical" pointing to another page, compared to the primary page.
So accordingly, the answers to your questions are:
1. Links just flow as usual: all link data for duplicate pages is merged with the data for the primary page. So PR may slightly decrease in some cases, for example if you have links from the same pages to both the primary and the duplicate pages. But the impact is not critical - almost the same as with a 301.
2. No, because Googlebot checks more than just the canonical. One more point about this: Google is a statistical search engine and rates pages by topic, so in your case, even if the canonical is added to the pages, it will not help you rank better for both terms.
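For reference, the tag itself goes in the <head> of the duplicate page and points at the primary URL (the URLs here are hypothetical):
<!-- On http://www.example.com/page?sort=price (the duplicate) -->
<link rel="canonical" href="http://www.example.com/page">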
- RE: Report site for duplicate content
I think this site violates the Google Advertising Policies - http://bquot.com/997 - so you can send a message to Google support here: https://support.google.com/adwords/bin/request.py?hl=en&display=categories
Hope it helps.
- RE: "And" vs "&"
They are both stop words, and most search engines ignore them; they apply in the SERP only if you search for "and" exactly.
So (Holiday Inn and Suites) = (Holiday Inn & Suites), but if someone searches for "Holiday Inn and Suites" in quotes, they will only find pages with that exact phrase.
- RE: How is this achieved - SPAM
It's done to get long-tail keyword traffic. When competition is very low, internal links alone are enough to rank on the 1st page. Below I've tried to describe how they achieve this:
1. Create a list of keywords from keyword tools or site-content data mining.
2. Create a custom URL structure for these keyword pages: /job-search/keyword/ =
3. Automatically create related links from all relevant pages with exact-match anchor text.
4. The content on such an aggregated page is highly relevant to the query and has enough internal links from other pages with high relevance. All such pages are unique. Also, the amount of content is much larger than for the separate items, so the page is easier to index.
5. Profit!
PS: It works rather well for sites with a large number of pages in Google's index and large clusters of closely related pages, like freelance.com.
And one more point - they use RSS search because Google likes fresh content, and in this case the newest pages are on top.
- RE: How should I structure my product URLs?
Hi Dru,
Look at this from the visitor's point of view: realthread.com/products/american-apparel-2001 is much easier to read and looks better in the SERP and in backlinks.
As we know, Google's goal is searcher satisfaction, so what is good for visitors is good for Google.
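For contrast, a parameter-based URL for the same product might look like this (a hypothetical example):
realthread.com/products?category=shirts&item=2001
The descriptive slug wins on readability everywhere the URL is shown.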
- RE: Keyword tracking using Advanced Segment
I think you can try using ... for the space.
Or just create a segment like this:
Keyword exactly matches "logo design india"
OR
Keyword exactly matches "logodesign"
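If the space is the problem, a single regular-expression condition may be simpler than two exact-match rules (assuming regex matching is available in your segment settings; the pattern below is my own):
Keyword matches regexp: ^logo ?design
This matches both the "logo design ..." and "logodesign ..." variants.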
- RE: Frequent server changes
Agree with Istvan - I didn't see any negative effect on rankings, only a positive one if the new hosting is faster.
- RE: Hotspot area for SEO
Just today I read a good related article: http://www.seobythesea.com/2012/01/sets-semantic-closeness-segmentation-and-webtables/
Personally, I recommend starting to use HTML5 semantic tags to help Google better understand the structure of your content. Even if they are not ranking factors at this time, once HTML5 becomes a standard they will be.
Also agree with Egol - stuff at the top of the page (in the HTML code) is a lot more important now.
- RE: What is your onsite linking strategy?
I'm not sure the best link structure exists, but I usually use two different approaches for internal cross-linking.
The first one is easy to use and useful for database sites like shops, directories, etc. It's a simple tree hierarchy with vertical links (home<>category<>object) and horizontal links between related objects and categories.
The second one I use for sites with a large amount of content. It's more algorithmic and leads to measurable results in my tests. Here it is:
1. Select a keyword and a target page for this keyword.
2. Search in Google: "keyword" site:mysite.com
3. Google highlights "keyword" (or part of it) in the text of the relevant pages.
4. What you need is to link to the target page via these highlighted keywords; if only part of the keyword is there, update the text to an exact match or add an additional sentence (see the sketch below).
Usually I use no more than 10 links with each anchor, and it increases rankings by 1-3 positions. Be careful - too many links is not good; per-link efficiency drops when the target page is linked from more than 10 pages with the same anchor.
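A minimal before/after sketch of step 4 (the page text, keyword, and target URL are hypothetical):
<!-- Before: the keyword appears as plain text on a relevant page -->
<p>We also offer custom logo design for small businesses.</p>
<!-- After: the exact-match keyword now links to the target page -->
<p>We also offer custom <a href="/logo-design/">logo design</a> for small businesses.</p>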
I graduated as a radio engineer with a deep math and physics background, and I work as a marketing manager and consultant for a few companies of different sizes.