I know about the code structure and how to write it in code! My question is of a different nature: can I add the nofollow attribute to internal links? If yes, how can I judge, for each link, whether I should add the nofollow attribute or not?
Posts made by CommercePundit
-
RE: How to Define rel=nofollow Attribute for External Links?
-
How to Define rel=nofollow Attribute for External Links?
I want to define the rel=nofollow attribute for Vista Patio Umbrellas. I have designed a narrow-by-search section on the home page, and I want to define the rel=nofollow attribute for all the text links in the left navigation. So what is the best solution for that?
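For reference, adding the attribute to one of those text links would look like this (the URL and anchor text here are only placeholders based on the Vista Patio Umbrellas site, not the actual navigation markup):
<a href="http://www.vistapatioumbrellas.com/21/patio-umbrellas.html" rel="nofollow">Patio Umbrellas</a>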
-
RE: Number of Indexed Pages are Continuously Going Down
My domain has been active for one year. I want to know more about the following two statements:
Most likely, all your pages aren't "worthy" of current indexing by Google
Make sure as well to REMOVE any questionable or thin content.
BTW: Thanks for your wish!
-
Multiple Domain Tracking with One Google Analytics Account
I am working on multiple online retail stores, listed below. Right now, I have tracking set up with multiple accounts. The attached image can give you a better idea of the setup.
http://www.vistastores.com/
http://www.lampslightingandmore.com/
http://www.vistapatioumbrellas.com/
I don't want to track my statistics with multiple accounts, because it takes too much time to set up goals and run analysis for each one. Can I track all data under one account for all of the online retail stores?
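For reference, what I have in mind is something like the classic asynchronous ga.js snippet below on all three stores, assuming a single shared web property; the UA number is only a placeholder, and separate web properties per store under the same account would be the other option.
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);  // placeholder property ID shared by all stores
  _gaq.push(['_setDomainName', 'none']);       // classic cross-domain settings
  _gaq.push(['_setAllowLinker', true]);
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>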
-
Number of Indexed Pages are Continuously Going Down
I am working on online retail stores. Initially, Google had indexed 10K+ pages of my website. When I checked a week ago, the count was 8K+; today, the number of indexed pages is 7,680.
I can't understand why this is happening or how to fix it. I want as many pages of my website indexed as possible.
-
RE: Can I add NOFOLLOW or NOINDEX attribute for better organic ranking?
It sounds interesting. I can go forward with rel=canonical...
-
Can I add NOFOLLOW or NOINDEX attribute for better organic ranking?
I am working on an online retail store that is dedicated to patio umbrellas.
My website is on the 2nd page of Google web search for the keyword Patio Umbrellas.
I also have another internal page with a Patio Umbrellas text link:
http://www.vistapatioumbrellas.com/21/patio-umbrellas.html
I assume that Google is confused about which page to rank for the Patio Umbrellas keyword.
I want to set the NOFOLLOW attribute or a NOINDEX, FOLLOW meta tag for this page.
Will it help me rank higher for the Patio Umbrellas keyword? My ultimate goal is to remove the confusion around the Patio Umbrellas keyword.
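For reference, the two options I am weighing would be written like this (illustrative only; the meta tag would go in the <head> of the internal page, and the attribute on the links pointing to it):
<!-- option 1: nofollow on the internal text links pointing to the page -->
<a href="http://www.vistapatioumbrellas.com/21/patio-umbrellas.html" rel="nofollow">Patio Umbrellas</a>
<!-- option 2: NOINDEX, FOLLOW meta tag in the <head> of the internal page -->
<meta name="robots" content="noindex, follow" />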
-
RE: How to fix issues regarding URL parameters?
Your concern is that Google will crawl all of the following pages if I don't do anything with them, right?
http://www.vistastores.com/table-lamps
http://www.vistastores.com/table-lamps?limit=100&p=2
http://www.vistastores.com/table-lamps?limit=60&p=2
http://www.vistastores.com/table-lamps?limit=40&p=2
Right now, my website is on the 3rd page of Google for the Discount Table Lamps keyword.
My fear is that if Google crawls multiple pages with duplicate title tags, it may mess up my current ranking for the Discount Table Lamps keyword.
What do you think?
-
RE: How to fix issues regarding URL parameters?
Will it really work? Both pages have different content:
http://www.vistastores.com/table-lamps has 100 products, and
http://www.vistastores.com/table-lamps?limit=100&p=2 has a different, unique set of 100 products.
Another problem is the meta information: both pages have the same meta info. If Google indexes both pages, it may trigger a warning for duplicate meta info across too many pages.
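For reference, one piece of markup sometimes used for paginated series like this, where every page carries its own unique products, is the rel="next"/rel="prev" link elements that Google announced support for in September 2011; a purely illustrative sketch for the <head> of the first two pages:
<!-- on http://www.vistastores.com/table-lamps (page 1) -->
<link rel="next" href="http://www.vistastores.com/table-lamps?limit=100&amp;p=2" />
<!-- on http://www.vistastores.com/table-lamps?limit=100&p=2 (page 2) -->
<link rel="prev" href="http://www.vistastores.com/table-lamps" />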
-
RE: How to fix issues regarding URL parameters?
Honestly, I am not getting it, because I have read a Google help article about URL parameters.
It tells me something different: Google suggests using Google Webmaster Tools, but I have restricted all dynamic pages via robots.txt.
So I want to know the best practice that will help me improve my crawling and number of indexed pages.
-
RE: Natural vs UnNatural Links: How to Understand It?
Thanks for your insightful answer. Can you give me your Twitter handle so I can follow you?
-
How to fix issues regarding URL parameters?
Today, I was reading Google's help article on URL parameters:
http://www.google.com/support/webmasters/bin/answer.py?answer=1235687
I learned that Google gives value to URLs with parameters that change or determine the content of a page. There are many pages on my website with similar values for name, price and number of products, but I have restricted all of these pages in robots.txt with the following syntax.
URLs:
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100
Syntax in robots.txt:
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
Now I am confused. Which is the best solution to get maximum SEO benefit?
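For reference, one alternative that comes up alongside the Webmaster Tools parameter settings is to leave these URLs crawlable but point each parameter variant at the clean category URL with a canonical link element; a sketch, for illustration only:
<!-- in the <head> of http://www.vistastores.com/table-lamps?dir=asc&order=name -->
<link rel="canonical" href="http://www.vistastores.com/table-lamps" />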
-
RE: Natural vs UnNatural Links: How to Understand It?
That's a really great list to know about. But I have a big question about defining the quality of a website. I search many questions on Google and come across many new websites during my daily work, so I find many opportunities to create links there, but I hesitate to do so. Is there any specific method that can help us judge the quality of an external website?
-
Natural vs UnNatural Links: How to Understand It?
Today, I was reading a Google Webmaster Tools help article about Google-friendly sites.
I found that natural links give us more benefit than unnatural links, so I searched Google for natural vs. unnatural links and came across a great video.
But I am still confused about the exact meaning. I am trying to create external links on different websites, so are those natural or not?
How can I judge the value of an external website before creating a link there?
-
How can I create product page in Facebook to sell products?
Today, I found something interesting in the Google results: a few Facebook fan pages have created product pages on Facebook to sell products via Facebook, like the following.
So how can I get this done?
-
How to rank high on Bing?
Today, I was checking Google Analytics for my website and found a big variation in visits between Google and Bing.
I understand that we are not getting many visits from Bing because of poor rankings.
Then I did detailed R&D and read a few useful articles. I found an important discussion on the Bing forum related to on-page work.
I also checked Bing Webmaster Tools to learn more about my website. I found that crawling from Bing is really slow, and I let Duane know about it via Twitter.
I have read the Bing rankings cheat sheet, but I am really confused after reading it, because I am already following both the Google webmaster guidelines and the Bing Webmaster Tools guidelines.
So how can I improve my performance on Bing? Is there any rock-solid method for Bing, as there is for Google, that can help me reach this milestone?
I have one request for Rand: write a marathon blog post on how to rank high on Bing.
-
RE: How can I track search engine optimization data in Google analytics?
Yes, I am looking at the new version. I connected my Google Webmaster Tools to Google Analytics on 31st October 2011.
-
How can I track search engine optimization data in Google analytics?
My website is linked to a Google Analytics web property, but I am not able to see search engine optimization data in Google Analytics. How can I get this working?
-
How to Fix Issue for Inactive Products in Google Merchant Center?
Today, I was checking my Google Merchant Center account and found that 145 products from my product feed are inactive. I checked a few products manually and found the following error:
"The URL specified in your data feed wasn't working correctly when we reviewed this item." You can see more in the attached image.
I have checked my URLs and they work fine; there is no issue with them.
So how do I fix this issue?
-
Search Engine Blocked by robots.txt for Dynamic URLs
Today, I was checking the crawl diagnostics for my website and found warnings that pages are blocked from search engines by robots.txt.
I have added the following syntax to the robots.txt file for all dynamic URLs:
Disallow: /*?osCsid
Disallow: /*?q=
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
Disallow: /*review-form
The dynamic URLs are as follows:
http://www.vistastores.com/bar-stools?dir=desc&order=position
http://www.vistastores.com/bathroom-lighting?p=2
and many more...
So why does it show me a warning for this? Does it really matter, or is there another solution for these kinds of dynamic URLs?
-
RE: How to Solve Mysteries for Disabled Products?
That's a really good explanation. I am going to set a redirect, but a 302 rather than a 301, because I want to drill down more into the same subject and find an alternative solution to my problem.
I am not opposing you; I am just picking a parallel solution [a 302 instead of the 301] to what you recommend.
BTW: Thanks for your time on my question. I have not marked it as answered yet; let's see what happens in the next response.
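For reference, the two variants would look like this in an Apache .htaccess file; the target category URL here is only an assumption for illustration:
# temporary redirect while I evaluate alternatives
Redirect 302 /indoorlighting-patiolivingconcepts-20947.html http://www.vistastores.com/floor-lamps
# or, if the removal were treated as permanent:
# Redirect 301 /indoorlighting-patiolivingconcepts-20947.html http://www.vistastores.com/floor-lamps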
-
RE: How to Solve Mysteries for Disabled Products?
Oh... that's really great. But I don't want to set a 301 redirect to an associated product or category. You will ask why? Because similar products are already shown on my product pages; you can check any product page, so users can find similar products from there.
My other reasons are as follows:
- I work hard to get rankings for a specific product page. If I set a 301 redirect, Google will cut down my impressions for that specific keyword.
- I have observed that Google does not pass equal PageRank or similar data to the redirect target. If I have 50 products with a PageRank of 3, the redirect is not that beneficial to me.
- Web designers, product managers and other people work hard to build out each product page, so why should I redirect that page anywhere after all that work?
- You are right from an SEO point of view, but it is over my head: why would I redirect anyone to my bedroom after decorating the entire living room with light?
-
How to Solve Mysteries for Disabled Products?
I want to solve the mysteries around disabled products on my eCommerce website. I'll use one of my products as an example.
Product URL: http://www.vistastores.com/indoorlighting-patiolivingconcepts-20947.html
Product Name: Floor Lamp in Monterey Bronze Finish
- 3 months ago, this product was live on my website with In Stock status. Google had crawled the product, and it was added to the XML sitemap, to Google Merchant Center, and to many external websites during link-building campaigns.
- 15 days ago, the product was still live on my website but with Out of Stock status: visitors could view the page but could not add it to the shopping cart.
- Now, the product is disabled on the website and not available for sale. I have done a lot of work compiling content, images, PageRank and other SEO elements to rank for a specific long-tail keyword. Because the product was suddenly disabled, it returns a 404 error and redirects to the custom 404 error page.
- I am not comfortable with a 301 redirect and would rather set a 302 redirect. But is that really a good idea?
- Is it necessary to set a 301 or 302 redirect on disabled products at all?
- I will never sell this product on the website again, but what happens to my indexing, external links and page authority?
- This is creating a lot of ups and downs in my Webmaster Tools data, Merchant Center data, XML sitemap data and impression data.
What is the best solution for this? Can anyone share a good example from an eCommerce website?
-
How to enable crawling for dynamic generated search result pages?
I want to enable crawling for dynamically generated search result pages produced by Magento Solr search. You can see more via the following URLs:
http://code.google.com/p/magento-solr/
http://www.vistastores.com/catalogsearch/result/?q=bamboo+table+lamp
http://www.vistastores.com/catalogsearch/result/?q=ceramic+table+lamp
http://www.vistastores.com/catalogsearch/result/?q=green+patio+umbrella
Right now, Google is not crawling the search result pages because I have added the following syntax to the robots.txt file:
Disallow: /*?q=
So how do I enable crawling of the search result pages while following SEO best practice? Any other input in the same direction would help me get this done.
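For reference, one pattern I have seen is to keep the existing Disallow but add a more specific Allow rule for the search result path, which Googlebot treats as taking precedence because it is the longer match (illustrative sketch only; not every crawler supports Allow):
User-agent: *
Disallow: /*?q=
Allow: /catalogsearch/result/?q=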
-
Does It Really Matter to Restrict Dynamic URLs by Robots.txt?
Today, I was checking Google Webmaster Tools and found that 117 dynamic URLs are restricted by robots.txt. I have added the following syntax to my robots.txt; you can get a better idea from the attached Excel sheet.
#Dynamic URLs
Disallow: /*?osCsid
Disallow: /*?q=
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
Disallow: /*review-form
I have concerns about the following kinds of pages.
Sorting by specification:
http://www.vistastores.com/table-lamps?dir=asc&order=name
Items per page:
http://www.vistastores.com/table-lamps?dir=asc&limit=60&order=name
Paginated product pages:
http://www.vistastores.com/table-lamps?p=2
Will this hold back the organic performance of my category pages?
-
RE: How to Define Best URL Structure for Product Pages?
I want to add a response to this question after a long time, because I have made a few changes based on the discussion. You can see them in this Excel sheet.
I have changed the entire URL structure and finished the following tasks:
- 301 Redirect [Old URLs to New URLs]
- Multiple XML Sitemaps [Create Category Wise & Submit to Webmaster Tools]
- Rel=canonical for duplication
I have a very simple question about crawling: how will Google react to these changes? Will Google slow down my crawling or not? Any other input in the same direction would help!
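For reference, the category-wise sitemaps are tied together by a sitemap index along these lines (the file names below are placeholders, not my actual ones):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.vistastores.com/sitemap-table-lamps.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.vistastores.com/sitemap-patio-umbrellas.xml</loc>
  </sitemap>
</sitemapindex>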
-
RE: Delete Facebook Account to Transfer Username to My Fan Page
Thanks for your prompt reply. I am going to follow what you suggested and will let you know the status very soon.
-
Delete Facebook Account to Transfer Username to My Fan Page
I have the following Facebook fan page and want the vistastores username, which is my domain name, but I am not able to get it because it is already taken by my personal profile.
Fan page:
http://www.facebook.com/pages/Vista-Stores/160667877282170
Personal profile:
http://www.facebook.com/vistastores
I am thinking of deleting my Facebook account [profile & fan page]. Then I will re-open my Facebook account and create the fan page again.
So can I transfer my username from the personal profile to the fan page with this method?
-
How to Improve Performance on The Find Merchant Center?
I have uploaded 6,718 products to The Find merchant center. According to the dashboard, there are no errors in the data feed.
I got only 13 visits from The Find in October 2011 [1st October to 13th October].
Is there any specific method to improve performance on The Find merchant center?
Or is there any success story or experiment that would help me understand this issue better?
-
RE: How to Resolve Duplication of HTTPS & HTPP URLs?
I have set up robots.txt files for both the HTTP and HTTPS versions. You can find both files above your response. Thanks for your answer.
-
RE: How to Resolve Google Crawling Issues for My eCommerce Website?
I am aware of that, but I am confused about the omitted results. Google shows me 509 URLs in the visible portion and the remaining ones as omitted results.
I checked the same thing for my competitor for the table lamps keyword:
site:simplytablelamps.com
Google shows 2,470 pages for my competitor, and the omitted results are far fewer compared to my website.
I don't know much more about it, but it may be cutting my impressions for products that fall into the omitted results. That is my assumption. What do you think? Any further insight would help me.
-
RE: How to Set Custom Crawl Rate in Google Webmaster Tools?
Hi Daniel,
I was considering it because of the very low number of indexed pages. Thanks for your suggestion.
-
How to Set Custom Crawl Rate in Google Webmaster Tools?
This is a really silly question about setting a custom crawl rate in Google Webmaster Tools. Anyone can find that section under the Settings tab, but I am unsure what numbers to enter in the "requests per second" and "seconds between requests" fields.
I want to set a custom crawl rate for my eCommerce website.
I checked my Google Webmaster Tools and found what is shown in the attachment. So can I use this facility to improve my crawling?
-
RE: How to Resolve Google Crawling Issues for My eCommerce Website?
Hi, Liam
I am coming back to this question after a long time because I am still struggling with the crawling issue. Google is not crawling my website even after implementing the whole checklist.
Today, I read a blog post about increasing the Google crawl rate.
The blog suggests setting a custom crawl rate with the help of Google Webmaster Tools. So does it really help to improve crawling?
Which is more important and helpful: natural crawling by Google, or a crawl rate forced on Google?
-
RE: How to Define Quality of External Website During Link Building?
I wouldn't recommend spending money or much time on a directory inclusion/link placement for a page that is not in the primary index and has at least a PageRank of 1 (the page itself - not the domain)
Thanks for your great list and for sharing your experience. But I have one question about the statement above: does it really matter?
Recently, I watched the following webinars on SEOmoz.
Both webinars are excellent, and I picked up many new things about link building. They recommend doing directory submissions only in a targeted way and spending less time on them; the focus is on doing something real to gather natural links. But what counts as "something real" in link building?
I 100% agree on quality and paid directories where we can add our website with proper budget management.
In a social, natural-link environment... does it really matter to focus on directories? What do you think? Any further insight would help me understand it better. Thanks again for your answer.
-
RE: How to Define Quality of External Website During Link Building?
I checked your tool and it is really helpful. But I have one question about it.
Yesterday, I submitted a comment on a blog post on Inspired Mag. My comment is in 2nd place. It carries the nofollow attribute, so I know it will not pass any PageRank to my website.
Now, I checked that blog post URL with the tool, and it gave me the following statistics:
Total Links found: 1624
Unique Links found: 1307
Questionable Links found: 3
Pages scanned: 40
So it works only on a specific web page? If so, what about the entire website? I am not opposing you; I just want to be clearer about it. Thanks again for your insightful answer.
-
How to Define Quality of External Website During Link Building?
I want to know about a process, method or tool that can help me judge the quality of an external website during link building.
We search many questions and topics on Google to resolve our daily questions, which lands us on different websites covering different subjects.
I found that I was able to drop my website URL on those sites, but I am unsure about their quality.
For example, I sell footballs and create an external link from a baby-care website; does that make sense? My concern is: can we create external links from websites on unrelated subjects, or only from subject-related websites?
Is there any specific method that can help me understand more about an external website and make a decision about link building?
-
RE: How to Specify Canonical Link Element for Better Performing?
Got it... I am going to implement it as in the previous one. Thanks for your prompt reply.
-
RE: How to Disallow Specific Folders and Sub Folders for Crawling?
Can I use the Remove URLs facility in Google Webmaster Tools?
-
RE: How to Specify Canonical Link Element for Better Performing?
@Gianluca Fiorelli
I have added a robots meta tag that prevents indexing to all duplicate products [2 to 11], excluding the primary product [1].
I have marked this question as answered, but I want to raise one more question after observing the source code of all the product pages. I have implemented the canonical link element on all duplicate product pages, pointing to the unique primary product.
So is the canonical still required on the duplicate pages? Can I remove it from the entire website, since duplication will no longer occur because indexing is prevented for all duplicate products?
Note: I am still struggling with the crawling issue. Crawling is still very slow, and only 113 pages have been indexed by Google.
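For reference, the two elements under discussion sit in the <head> of each duplicate product page roughly like this (the exact robots directive shown is an assumption on my part; the canonical target is the primary product URL):
<!-- robots meta tag preventing indexing of the duplicate (directive shown is an assumption) -->
<meta name="robots" content="noindex, follow" />
<!-- canonical link element pointing at the primary product -->
<link rel="canonical" href="http://www.lampslightingandmore.com/50_62_10133/java-bronze-floor-lamp-with-walnut-shade.html" />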
-
RE: How to Disallow Specific Folders and Sub Folders for Crawling?
I have added Options -Indexes for the images folder in the .htaccess file.
But I am still able to find the images folder in Google's index.
How can I check whether it is working properly or not? I don't want the images folder indexed or displayed in web search any more.
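For reference, the relevant pieces look roughly like this; the X-Robots-Tag part is only something I am considering and assumes mod_headers is enabled, and Options -Indexes by itself only stops directory listings rather than removing URLs that are already indexed:
# .htaccess inside the /images/ folder
Options -Indexes
<IfModule mod_headers.c>
  # ask search engines not to index files served from this folder
  Header set X-Robots-Tag "noindex"
</IfModule>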
-
URL Parameters Monitored by Google Are Not Present on the Website
I was checking the URL Parameters section in Google Webmaster Tools. Google has detected the following parameters and excludes them from crawling:
utm_campaign
utm_medium
utm_source
I have built URLs with the following tool to track visits from vertical search engines like Google Shopping and other comparison shopping engines:
http://www.google.com/support/analytics/bin/answer.py?answer=55578
So I am quite confused by this data.
Will Google consider external URLs that carry the above parameters, or do the parameters need to exist on the live website itself?
Note: I am asking about my eCommerce website, http://www.lampslightingandmore.com/.
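For reference, a URL tagged with that tool looks something like this (the parameter values are purely illustrative):
http://www.lampslightingandmore.com/?utm_source=googleshopping&utm_medium=cpc&utm_campaign=product-feed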
-
Can I Improve Organic Ranking by Restricting Website Access to Specific IP Addresses or Geo Locations?
I am targeting my website at the US, so I need high organic rankings in US web search.
One of my competitors restricts website access to specific IP addresses or geo locations.
I have checked multiple categories to learn more. What is going on with this restriction, and why do they do it?
One SEO forum also restricts website access by location.
I can understand that it may help them stop thread spamming from unnecessary sign-ups or Q&A posts.
But why has Lamps Plus set this up? Is there any specific reason?
Can I improve my organic ranking this way?
The restriction might also help me protect and maintain user statistics such as bounce rate, average page views per visit, etc.
-
RE: How to Specify Canonical Link Element for Better Performing?
It's the manufacturer part number.
-
RE: How Google treat internal links with rel="nofollow"?
Even if you don’t want a page to rank,
Is PageRank a ranking factor? I don't think so... I am not opposing you, but in my category there are many websites performing well with low PageRank, while high-PageRank websites are still at the bottom.
Do you have any idea why?
-
RE: How to Specify Canonical Link Element for Better Performing?
Are you talking about something like this?
I have a fixed URL structure for all products, and each product can appear in multiple categories.
There will be no change in the URL structure.
-
How Google treat internal links with rel="nofollow"?
Today, I was reading about nofollow on Wikipedia. The following statement is over my head, and I am not able to understand it properly:
"Google states that their engine takes "nofollow" literally and does not "follow" the link at all. However, experiments conducted by SEOs show conflicting results. These studies reveal that Google does follow the link, but does not index the linked-to page, unless it was in Google's index already for other reasons (such as other, non-nofollow links that point to the page)."
That statement is about indexing and ranking of the linked-to page for the hyperlink text in the case of external links. I am aware of that part: the linked-to page may simply not appear in relevant results for a keyword on Google web search.
But what about internal links? I have defined the rel="nofollow" attribute on many internal links.
I found an archived blog post by Randfish on the same subject and read the following question there:
Q. Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love? [In 2007]
A: Yes – webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages
_
(Matt's precise words were: The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robot.txt'ed out), but nofollow on individual links is simpler for some folks to use. There's no stigma to using nofollow, even on your own internal links; for Google, nofollow'ed links are dropped out of our link graph; we don't even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.)
Matt also gave an excellent answer to the following question [in 2011]:
Q: Should internal links use rel="nofollow"?
A: Matt said:
"I don't know how to make it more concrete than that."
I use nofollow for each internal link that points to an internal page that has the meta name="robots" content="noindex" tag. Why should I waste Googlebot's resources and those of my server if in the end the target must not be indexed? As far as I can tell, and for years now, this has not caused any problems at all.
For internal page anchors (links with a hash mark in front, like "#top"), the answer is "no", of course.
I am still using nofollow attributes on my website.
So what is the current practice? Is it still advisable to use the nofollow attribute on internal links?
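For reference, the link-level and page-level mechanisms described above look like this; the login URL is only a hypothetical example of an internal page one might not want to pass PageRank to:
<!-- link-level: nofollow on a single internal link -->
<a href="http://www.vistastores.com/customer/account/login/" rel="nofollow">My Account</a>
<!-- page-level: the nofollow meta tag Matt refers to, applied to every link on the page -->
<meta name="robots" content="nofollow" />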
-
RE: How to Specify Canonical Link Element for Better Performing?
You are 100% right. I am not seeing significant changes in crawling after 4 days of implementation. I am thinking of adding a robots meta tag with a noindex, nofollow specification on all duplicate product pages.
Google would then crawl and index only the primary product [the unique one]. What do you think? Will it work for me or not?
-
RE: How to Specify Canonical Link Element for Better Performing?
No, I don't want the duplicate pages indexed, and I am not able to define unique attributes on all the duplicate pages. Can you suggest an alternative?
-
How to Specify Canonical Link Element for Better Performing?
I read the Google Webmaster Central blog post and the help article about rel="canonical", which were compiled by Matt:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
http://www.google.com/support/webmasters/bin/answer.py?answer=139394
I am working on an eCommerce website and have found many duplicate pages for the same product, as follows:
1. www.lampslightingandmore.com/50_62_10133/java-bronze-floor-lamp-with-walnut-shade.html
2. www.lampslightingandmore.com/48_10133/java-bronze-floor-lamp-with-walnut-shade.html
3. www.lampslightingandmore.com/48_55_10133/java-bronze-floor-lamp-with-walnut-shade.html
4. www.lampslightingandmore.com/48_57_10133/java-bronze-floor-lamp-with-walnut-shade.html
5. www.lampslightingandmore.com/50_10133/java-bronze-floor-lamp-with-walnut-shade.html
6. www.lampslightingandmore.com/50_56_10133/java-bronze-floor-lamp-with-walnut-shade.html
7. www.lampslightingandmore.com/50_63_10133/java-bronze-floor-lamp-with-walnut-shade.html
8. www.lampslightingandmore.com/63_10133/java-bronze-floor-lamp-with-walnut-shade.html
9. www.lampslightingandmore.com/68_10133/java-bronze-floor-lamp-with-walnut-shade.html
10. www.lampslightingandmore.com/68_58_10133/java-bronze-floor-lamp-with-walnut-shade.html
11. www.lampslightingandmore.com/68_59_10133/java-bronze-floor-lamp-with-walnut-shade.html
I consider the 1st URL to be the primary product and have set a rel=canonical tag on the remaining products pointing to it. The primary product also contains the same rel=canonical tag.
That was my experience of setting the canonical tag, but I am not able to see any improvement in crawling. My assumption was that Google was not crawling my pages because of the duplication. So what is the problem now? How can I fix it and specify the canonical link element properly for better crawling?
Note: I am working on compiling unique content for each product page and will make it live very soon.
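For reference, the canonical link element sits in the <head> of each of the 11 URLs and points at the first one, roughly like this (reconstructed here for illustration):
<link rel="canonical" href="http://www.lampslightingandmore.com/50_62_10133/java-bronze-floor-lamp-with-walnut-shade.html" />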