Agree with Marcus. Using different CSS for different devices while serving the same content helps avoid content duplication and many other SEO issues.
Posts made by de4e
-
RE: Responsive web design and SEO
-
RE: Worldwide CDN with mainland China support.
Thanks for your answer, but unfortunately Azure doesn't support mainland China users (it's the only exclusion - http://bquot.com/9j8).
Anyway, I'll try the trial version and test the delivery speed.
-
Worldwide CDN with mainland China support.
Hello Mozers,
Can you please recommend a Content Delivery Network (CDN) that works both for mainland China and worldwide regions?
Many people suggest ChinaCache, which provides services only for Asian countries, but I would prefer to find one ultimate worldwide solution.
Any thoughts will be appreciated.
Thanks.
-
RE: How is this achieved - SPAM
Freelance.com has 12,300,000 pages in the index, and most of them are pages of this type, so it's very hard to monitor all keywords manually. If even part of these pages works, the bounce rate on the others doesn't matter at all; by the way, they have a "/jobs/iPad/" page too.
User relevance is still Google's main goal, but a statistical algorithm has limitations, especially for such rare queries. For more frequent and competitive keywords, this tactic will not work.
Personally, I think it's black hat with so many internal links and auto-generated pages, because it hurts the user experience, but using 1-3 such internal links is OK and can positively affect positions in the SERP.
-
RE: How is this achieved - SPAM
It's done to get long-tail keyword traffic. When competition is very low, internal links are enough to rank on the 1st page. Below I've tried to describe how they achieve this:
1. Create a list of keywords from keyword tools or site content data mining.
2. Create a custom URL structure for these keyword pages: /job-search/keyword/
3. Automatically create related links from all relevant pages with exact anchor text.
4. The content on this aggregated page is highly relevant to the query and has enough internal links from other highly relevant pages. All such pages are unique. Also, the quantity of content is much greater than for separate items, so indexing the page is easier.
5. Profit!
PS: It works rather well for sites with a large number of pages in Google's index and large clusters of closely related pages, like freelance.com.
One more point - they use RSS search because Google likes fresh content, and in this case the newest pages are on top.
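Purely as an illustration of steps 1-2 above (the function names and URLs are my own hypothetical sketch, not freelance.com's actual code), the keyword-to-URL generation could look like:

```python
# Hypothetical sketch of steps 1-2 above: turn a keyword list into
# landing-page URLs with a custom structure like /job-search/keyword/.
def slugify(keyword):
    # Lowercase and hyphenate, e.g. "iPad developer" -> "ipad-developer"
    return keyword.lower().strip().replace(" ", "-")

def build_landing_pages(keywords):
    # Map each keyword to its aggregated landing-page URL
    return {kw: "/job-search/" + slugify(kw) + "/" for kw in keywords}

pages = build_landing_pages(["iPad developer", "logo design"])
print(pages["iPad developer"])  # /job-search/ipad-developer/
```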
-
RE: Hotspot area for SEO
Just today I read a good related article: http://www.seobythesea.com/2012/01/sets-semantic-closeness-segmentation-and-webtables/
Personally, I recommend starting to use HTML5 semantic tags to help Google better understand the structure of your content. Even if they are not ranking factors at this time, once HTML5 becomes a standard, they will be.
Also agree with Egol - stuff at the top of the page (in the HTML code) is a lot more important now.
-
RE: Strange bounce rate trending
Agree - possibly there are duplicate GA scripts on some of the pages.
-
RE: Google is not Indicating any Links to my site
Hi,
I've got an idea.
You said:
These pages linking to our new domain are indexed
The links existed the day the site was launched, so when the new pages were crawled, the links were already there.
So the question is: were the pages linking to the new domain re-crawled after you added those links?
If not, just submit them via Google's "Add URL" form for re-indexing.
Also, the operator is "link:", not "links:".
-
RE: Has Google changed its algorithm? My traffic has almost doubled and I don't know why.
This site resells affiliate traffic - so if you have visitors from this source, then somebody has definitely registered an account and added your site. If the traffic stops, it may have been the $10 trial.
-
RE: Report site for duplicate content
I think this site violates Google's Advertising Policies - http://bquot.com/997 - so you can send a message to Google support here: https://support.google.com/adwords/bin/request.py?hl=en&display=categories
Hope it helps.
-
RE: Facebook question
Agree - also, only unique people count. So if the same user shares 2 posts, only 1 is counted, but if 2 other, different people comment on those shares, that makes 3 "talking about" people.
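A tiny sketch of this counting rule (the names and data are made up; Facebook's real metric is of course computed on their side):

```python
# Hypothetical sketch: "talking about this" counts unique people, not actions.
# alice shares twice, bob and carol each comment once -> 3 people total.
actions = [
    ("alice", "share"),
    ("alice", "share"),   # second share by the same user doesn't add a person
    ("bob", "comment"),
    ("carol", "comment"),
]
talking_about = len({user for user, _ in actions})
print(talking_about)  # 3
```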
-
RE: I want tips on how to start a Lead Mangement For a Human Resource website?
Hi,
Can you please clarify what a lead means in your case? Is it a company searching for employees, an HR agency that wants to advertise its vacancies, a specialist searching for a job, or maybe all of them? And what should they do to become a lead?
-
RE: Site wide search v catalogue search
Hi,
I personally recommend Google Custom Search; it works great for me.
You can also find many search scripts on Google for the most popular platforms (PHP, ASP.NET, etc.). They will help if you don't want to add your pages to Google's index, or if you have private sections on your site (personalized, by the way).
-
RE: Why Do Links, DA and PA have no affect on this search result?
Agree, I see the same picture. Also, almost all of http://www.plumbingcourse.org.uk/'s backlinks have this key phrase in the anchor text.
-
RE: How should I structure my product URLs?
Hi Dru,
Look at this from the visitor's point of view: realthread.com/products/american-apparel-2001 is much easier to read and looks better in the SERP and in backlinks.
As we know, Google's goal is searcher satisfaction, so what is good for visitors is good for Google.
-
RE: Canonical
Watch this video: http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
This topic is covered pretty well there.
-
RE: Changing preferred domain
The way you propose is good if you can get most of the external links changed. If not, some link weight will be lost after the redirect.
I can propose an alternative solution: create a special subdomain or folder for the country with such issues, and redirect visitors by IP. Of course, in this case you should use canonical tags to avoid duplicate content. As a result, you will have the primary page's URL in the SERP, but all people from the specific country will be redirected to the working pages.
Cheers,
Vladimir
-
RE: Frequent server changes
Agree with Istvan - I didn't see any negative effect on ranking, only a positive one if the new hosting is faster.
-
RE: Getting 403 error in forum
Hi,
As I see it, access to this type of page is allowed only for registered users. Is that right?
If yes, don't worry about ranking - just disallow pages like this in robots.txt.
-
RE: Crawl Diagnostics finding pages that dont exist. Will Rel Canon Help?
Hi,
Yes, the canonical tag will work great.
But if people share this page and there is currently only a small number of links to www.completeoffice.co.uk/Products/Products.php, I recommend creating a 301 redirect instead, for a better-looking URL.
-
RE: Robots.txt for subdomain
Robots.txt works only for the subdomain where it is placed.
You need to create a separate robots.txt for each subdomain; Drupal allows this.
It must be located in the root directory of your subdomain (e.g. /public_html/subdomain/) and be accessible at http://subdomain.root.nl/robots.txt.
Add the following lines in the robots.txt file:
User-agent: *
Disallow: /
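To sanity-check those two lines, you can feed them to Python's standard robots.txt parser (a quick verification sketch; the subdomain URL is just the example from above):

```python
from urllib.robotparser import RobotFileParser

# Parse the same two rules shown above and confirm everything is blocked
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])
print(rp.can_fetch("*", "http://subdomain.root.nl/any-page"))  # False
```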
As an alternative, you can use the robots <META> tag on each page, or redirect to a directory like root.nl/subdomain and disallow it in the main robots.txt. Personally, I don't recommend that.
-
RE: Changing URL Structure
You should create 301 redirects for all old pages; you can use Apache's mod_rewrite module for this.
If it's hard to determine the rules, you can redirect the object pages to the new category page, but of course this is a less efficient way.
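For example, a mod_rewrite rule in .htaccess could look like this (the paths are purely hypothetical; adapt the pattern to your actual old URL structure):

```apache
# Hypothetical sketch: 301-redirect an old object URL to its new category page
RewriteEngine On
RewriteRule ^old-catalog/item-([0-9]+)\.php$ /products/ [R=301,L]
```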
-
RE: What is your onsite linking strategy?
I'm not sure the best link structure exists, but I usually use 2 different approaches for internal cross-linking.
The first one is easy to use and useful for database-driven sites like shops, directories, etc. It's a simple tree hierarchy with vertical links: home <> category <> object, plus horizontal links between related objects and categories.
The second one I use for sites with a large amount of content. It's more algorithmic and leads to measurable results in my tests. Here it is:
1. Select a keyword and a target page for this keyword.
2. Search in Google: "keyword" site:mysite.com
3. Google highlights "keyword" (or part of it) in the relevant pages' text.
4. What you need is to link to the target page via these highlighted keywords; if only part of the keyword is there, update the text to an exact match or add an additional sentence.
Usually I use no more than 10 links with each anchor, and it increases ranking by 1-3 positions. Be careful - too many links is not good; per-link efficiency drops when the target page is linked from more than 10 pages with the same anchor.
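Steps 2-4 can be sketched roughly like this (a toy illustration with made-up pages; in practice you'd work from the site: query results rather than a local dictionary):

```python
# Hypothetical sketch of steps 2-4 above: find pages on your own site whose
# text already mentions the keyword, so those mentions can become links.
def find_link_candidates(pages, keyword, limit=10):
    # pages maps URL -> page text; cap at `limit` to avoid over-linking
    kw = keyword.lower()
    return [url for url, text in pages.items() if kw in text.lower()][:limit]

pages = {
    "/blog/post-1": "We offer logo design for startups.",
    "/blog/post-2": "Nothing relevant on this page.",
}
print(find_link_candidates(pages, "logo design"))  # ['/blog/post-1']
```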
-
RE: Noindex duplicate content penalty?
Do you really want to double your work - parse the content, then later remove it from the forum?
I think it would be much better to rewrite the Yahoo Answers content. Of course, that takes more time and resources, but your content will be unique and you'll get search traffic much faster. It's easy to find cheap rewriters who will fill your forum very quickly.
-
RE: Sub Domains
The first way will work great. And you can always update your content later.
-
RE: What about "CAPS" in site title.
Search is always case-insensitive - http://me.lt/7ZVGv.
What it can really influence is CTR in the SERP. By the way, the #1 site for your example shows up for me.
But be careful to avoid a "looks like spam" snippet for your site.
-
RE: Additional Pages in SERP
keyword.com is totally unreachable at this stage.
I've assumed that Google limits the number of my URLs on the 1st page while mysite.com is not #1. Do you have any thoughts on this?
Anyway, now I'm going to keep all 4 pages and use videos. Thank you for the analysis.
-
RE: Best way to index backlinks
The easiest way is to add the URL here: http://www.google.ru/url?sa=t&rct=j&q=google%20add%20url&source=web&cd=1&ved=0CDYQFjAA&url=http%3A%2F%2Fwww.google.com%2Faddurl%2F&ei=tpfLTtzfNIf0-gb7i_nSDg&usg=AFQjCNEAk-snUt37grGpxHVUfBBq-DPG6g&sig2=YV7PFrWKmA7ygq6R9UrJWg&cad=rjt
Satellite builders say it works perfectly :)
Also, there are special tools for this if you need to add many pages at once and don't want to type captchas.
-
RE: Canonical URL's - Do they need to be on the "pointed at" page?
Agree.
There are many possible variations of the same URLs that are not under the site owner's control - different ?parameters, etc. So it's better to add a canonical tag to each page.
-
Additional Pages in SERP
Hi Mozers,
Can anybody help me with this.
For "keyword phrase" the SERP looks like this:
4. mysite.com/page2 ...
13. mysite.com/page3
14. mysite.com/page4
Is it possible to get mysite.com/page3 and /page4 both into the top (4th-5th), or is it better to merge these pages and promote only one?
Thanks.
-
RE: Keyword tracking using Advanced Segment
I think you can try to use a pattern for the space.
Or just create a segment like this:
Keyword exactly matching "logo design india"
OR
Keyword exactly matching "logodesign"
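If your profile supports regular expressions, the two conditions above can also be collapsed into one "matching regexp" rule; here's a quick sketch of such a pattern using Python's re module (the keywords are the ones from the question):

```python
import re

# Hypothetical sketch: one regex covering both keyword variants exactly
pattern = re.compile(r"^(logo design india|logodesign)$")

print(bool(pattern.match("logo design india")))  # True
print(bool(pattern.match("logodesign")))         # True
print(bool(pattern.match("logo design")))        # False
```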
-
RE: "And" vs "&"
They are both stop words, and most search engines ignore them; they matter in the SERP only if you search for exactly "and".
So (Holiday Inn and Suites) = (Holiday Inn & Suites), but if someone searches for the quoted phrase "Holiday Inn and Suites", they will find only pages with that exact phrase.
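The equivalence can be illustrated with a toy normalizer (my own sketch of the general stop-word idea, not Google's actual tokenizer):

```python
# Hypothetical sketch: if a search engine drops stop words like "and"/"&",
# the two hotel-name queries normalize to the same token list.
STOPWORDS = {"and", "&"}

def normalize(query):
    return [t for t in query.lower().split() if t not in STOPWORDS]

print(normalize("Holiday Inn and Suites") == normalize("Holiday Inn & Suites"))  # True
```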
-
RE: Blocking robots.txt
You can block, on the server side, all IPs except Googlebot for any file, but it may lead to a ban because of cloaking.
-
RE: Canonical Tag - Question
To answer these questions, we need to understand why Google implemented the "canonical" tag.
Before, to determine whether content was duplicated, Googlebot downloaded the page content and compared it, via a complex algorithm, with other pages in the index. As I see it, a special bot runs through the indexed pages database searching for duplicates (that's why copy-paste sites get banned not right after indexation, but some time later).
The "canonical" tag makes this task much easier: Googlebot doesn't need to download a page with duplicate content; it just needs to check the <head> section, and maybe a hash or checksum of the content. So there is no need to download and store the same data several times (deleting stored data is hard for high-load data centers). It's a more effective and faster way to crawl a data set as large as the web. Also, link- and URL-related data should, I think, be added to the primary page's data set.
I've made a test on this: Google downloads much less data when a page has rel="canonical" pointing to another page, compared to the primary page.
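The "hash of the content" idea can be made concrete with a toy fingerprint (my own speculation, not Google's actual method; real duplicate detection is far more sophisticated, e.g. near-duplicate similarity):

```python
import hashlib

# Hypothetical sketch: a normalized content hash lets a crawler flag exact
# duplicates without storing the full page text twice.
def content_fingerprint(text):
    normalized = " ".join(text.lower().split())  # collapse case and whitespace
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

a = content_fingerprint("Hello   World")
b = content_fingerprint("hello world")
print(a == b)  # True: trivially normalized duplicates share a fingerprint
```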
So, according to this, the answers to your questions are:
1. Links just flow as usual; all link data for duplicated pages is merged with the data for the primary page. So PR may slightly decrease in some cases, for example if you have links from the same pages to both the primary and the duplicated pages. But the impact is not critical - almost the same as a 301.
2. No, because Googlebot checks more than just the canonical tag. On this point I'll add one more thing: Google is a statistical search engine and rates pages by topic, so in your case, even if canonical tags are added to the pages, it will not help you rank better for both terms.