Hi Damiano,
Matt explained this very well in this video, and he basically answers all your questions.
If you have additional questions, please let me know.
No doubt, your post about Google Docs and the Fb/Twitter/SU APIs is just awesome.
But IMHO, I think these are some great pieces of content.
*** After watching EGOL's video... and Ed's Poke the Box... you can just delete my answer, lol
So it won't hurt your site to have duplicate content for the same product on your site and on Amazon's.
I also found this on Google Books: http://books.google.com/books?id=HbO4v_V8nQQC&pg=PA69&lpg=PA69&dq=listing+products+on+amazon+guide&source=bl&ots=soRDzUU_ps&sig=6ZPW52aQXXg3J7a6cUa7URa9JVk&hl=en&ei=r7G5TbGxDZK4tgfV5oDeBA&sa=X&oi=book_result&ct=result&resnum=5&ved=0CDYQ6AEwBDgK#v=onepage&q=listing%20products%20on%20amazon%20guide&f=false
Hi Cathy,
This seems like a good community to follow: www.amazonsellercommunity.com
One good tip I can share: when submitting your products to Amazon, try to use a different/reworded description of the product.
I don't think you are.
Here is the query, and basically you're doing fine:
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=danz+hot+tubs
What do you mean by serious link building? If you get 200 links in one day, it raises some flags.
Echo1,
I would go to archive.org, look at the website's history, and check whether there is anything abnormal with the site (porn, Viagra, etc.).
In case you bought a domain with a penalty already on it, I would submit a reconsideration request with Google explaining your situation.
I'll stick with Montreal.
Both IPs are in Canada. I would only take it into consideration if the IPs were in different countries, e.g. Russia and Canada.
Yes,
Bing has an Excel add-on, Microsoft Advertising Intelligence, where you can pull all the keyword tool data (by gender, by volume). Very useful.
Here is the URL: http://advertising.microsoft.com/support-center/adcenter-downloads/microsoft-advertising-intelligence
Dejan
How about the Twitter profile feed?
Let's say you use a WordPress plugin, for example, that imports your profile feed and displays it on your blog/company website... most of these plugins don't add nofollow to the links in your Twitter feed. (Not to mention Twitter directories, which republish a big portion of your feed.)
Make sense? See the quick example below.
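Just to illustrate, here is roughly how an imported tweet link ends up in the page markup, with and without nofollow (the URL is only a placeholder):
<!-- what most feed-import plugins output: a normal, followed link -->
<a href="http://www.example.com/some-page">link from my tweet</a>
<!-- what it would look like if the plugin added nofollow -->
<a href="http://www.example.com/some-page" rel="nofollow">link from my tweet</a>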
Concerning Pagination,
I would create a "View All" page where all the products in that category are listed, then add a rel=canonical link from the paginated pages to the "View All" page (example below).
It can help with your first question and with the issue of using filters.
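For example, each paginated page (page 2, page 3, and so on) would carry something like this in its <head> — example.com and the paths are just placeholders for your own URLs:
<link rel="canonical" href="http://www.example.com/category/view-all.html" />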
We can get this info through this API, so Google can as well.
You can use the LEN() function in Excel; it counts the characters inside a cell:
=LEN(A1)
In my opinion,
Linking out to business/government/local authoritative sites can help build trust in your brand in the eyes of the searcher, and Google will pick up this information about your site too (that you do exist and you are legit --> trust).
For example: a BBB badge, the NLA (National Limousine Association), FAA.gov, or your local Chamber of Commerce.
Hi Mik,
Before this year, the explanation for Google showing an alternative title was that it used to pull titles from the DMOZ directory and/or the Yahoo Directory; that's when webmasters started using the "NOODP" and "NOYDIR" robots meta values (see the example below).
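For reference, the meta tag that tells Google not to use those directory titles/descriptions looks like this:
<meta name="robots" content="noodp, noydir" />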
More recently, Google has been changing the title depending on the anchor text of backlinks, or whatever it thinks is relevant to the search query (for a higher CTR).
There is some current discussion happening here that you can learn from: http://www.seroundtable.com/google-title-selection-12989.html
Yes for both questions.
Maintaining a consistent N.A.P. (Name, Address, Phone) across your Google Places page, your website, all local directories, and basically all web properties is crucial for ranking, as shown here by David Mihm.
Having your NAP on other web properties is called a "citation," and citations are treated somewhat like links to your actual website. So the more uniform NAP citations you have across multiple web properties, the better your rankings.
You can take a look at:
Free accounts that let you create RSS feeds for any site
I think 20 days is too much... but the SEOmoz web app does provide a lot of information through the crawl. I would wait a couple of days and try to contact support at http://www.seomoz.org/about/contact
In the meantime, I would advise you to read this; maybe it can shed some light on your problem:
http://seomoz.zendesk.com/entries/409821-why-isn-t-my-site-being-crawled-you-only-crawled-one-page
Is your site more than 5,000 pages? In other words, does your website have a lot of pages?
If I am not mistaken, intitle: still exists in Bing and Yahoo, but inurl: does not anymore.
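For example, an intitle: query would look like this (the keyword is just a placeholder):
intitle:"running shoes"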
There are 2 ways to submit your products to Google Product Search.
Ref: http://www.google.com/intl/en_us/products/waystosubmit.html
Yes Asif,
Google will pass the anchor text from the first link.
Please note that if www.domain.com/widget is linked from the top nav, that link will be used first.
lol ok
When you click submit in the header response checker, you should get:
| Status: HTTP/1.1 200 OK |
Go to your site's dashboard; under "Diagnostics," go to "Crawl errors" and double-check whether Google is getting crawl errors while fetching your page.
What's going to happen is that Googlebot will fetch the page and show you what it sees.
Tyler,
I would do 2 things.
Please keep us informed of the results.
Hi Michael,
First, I think you have an extra "W" in the 3rd line; that needs to go.
To move pages to another domain, do as follows:
Redirect 301 /oldpage.html http://www.newdomain.com/newpage.html
Michael,
If you got hit on the 24th of February, this was the Panda algorithm update.
First, if you are sure that your content is 100% unique and your site is high quality, I would go to
http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en&start=800
This thread is dedicated to people who have a high-quality site that has been negatively affected by this change; a Google employee will take a closer look.
On the other hand, the things you can do to help your site are (this is my opinion; webmasters and SEOs are still trying to figure out how to get out, or which criteria triggered the Panda update on their sites):
I think you're talking about askthetrainer.com.
After a short analysis, I am sure that it is not a penalty.
Your site might have been harmed by the Google Panda update.
Yes,
It might take time to get removed from search, but crediting the base page will start with the first crawl after implementing it.
Hi Barry,
If you use these PLR articles with a standard spin, it will eventually have a negative effect (especially after the scraper update and Panda).
I would do a large-scale spin (paragraphs, sentences).
Hi Kyle,
I think you kind of misunderstood canonical tags (being on the base pages).
Basically, what is going to happen is this: when a search engine spider comes to http://www.domain.com/page.html?keyword=keyword#source=source, the canonical tag will tell it that http://www.domain.com/page.html is the original page.
It's not an issue that it resides on the base pages, and it's basically a very good solution to your client's problem (see the example below).
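Using the URLs above, the base page would simply carry this tag in its <head>:
<link rel="canonical" href="http://www.domain.com/page.html" />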
Dear inetteam,
I use OpenSiteExplorer.org and export my backlinks to an Excel workbook.
Each time there is an update (monthly), I copy the data and paste it into Sheet 1, then go to Sheet 2, adjust the "Data Source" (the list is bigger now), and click Refresh Data.
Normally, all the new links will appear below the last link from last month.
I bet other Mozzers can do this in another, or better, way.
Dear searchpl,
It would be better to add and verify blog.domain.com in your GWT and add a separate sitemap inside that account.
Hi,
Google has been keeping a closer eye on Google Places over the last year.
Stuffing custom categories with keywords is not a good idea; try to use their own categories, put more emphasis on increasing citations to your Google Places listing, and acquire more reviews from your customers.
Your competitors are kind of grandfathered in, but sooner or later Google will drop them or decrease their ranking value if the listing looks too keywordy.
Lawrence,
How about leveraging free blog platforms for the link wheel?
WordPress.com — Get a Free Blog Here
http://blogger.com
LiveJournal: Discover global communities of friends who share your unique passions and interests.
Blogsome
Bravenet - Web Hosting, Free Web Hosting and Web Tools
Friendster - Home
Knol - a unit of knowledge: share what you know, publish your expertise.
Welcome to Windows Live (MSN Spaces http://msnspaces.com )
Squidoo : Welcome to Squidoo
Sign up | Tumblr
Weebly - Create a free website and a free blog
Webs - Make a free website, get free hosting
Hubpages.com
This tactic is expensive and takes a lot of time to implement,
because you need each mini-site hosted on a different Class C IP, with different TLDs, different WHOIS details, and different GA/AdSense codes (if implemented).
This tactic is mainly popular in the gambling industry.
It won't harm to try this; add it before all your other redirects:
<code>Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/default\.htm\sHTTP/ [NC]
RewriteRule ^default\.htm$ http://www.otherdomain.com [R=301,L]</code>
Do you mind posting the URL or PMing me?
**For your first question,**
Yes, everything from <url> to </url> is 1 video,
so you have to add one entry for each video you have on your site.
**Concerning the second question,**
it doesn't matter what you name the file,
but once you finish creating the sitemap, you do need to submit it.
Cheers
Please try to check your HTTP headers.
Enter your http://www.website.com
and see what it serves you: a 200 OK or a 301 to default.htm (see the example of a redirect response below).
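If it is redirecting, the header checker will show something like this instead (the Location value is just an illustration):
| Status: HTTP/1.1 301 Moved Permanently |
| Location: http://www.website.com/default.htm |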
I have a list of 50 "general" directories, compiled over the years, that I submit every client I handle to.
But for niche- or location-relevant directories, yes, I do search for them.
No,
Disallow: /*?condition=
Disallow: /*?cat=
Disallow: /*?instructional_level=
Check the Google cache date of the homepage and the listing page.
The more frequently a page is cached, the more valuable it is.
Yes, you can block it through robots.txt, and adding a rel="canonical" link in the page code itself will also accomplish the task.
Disallow: /category/english-language-learners/*?
That is the issue:
when submitting a video sitemap, it should look like this:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/some_video_landing_page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Grilling steaks for summer</video:title>
      <video:description>Alkis shows you how to get perfectly done steaks every time</video:description>
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
      <video:player_loc allow_embed="yes" autoplay="ap=1">http://www.example.com/videoplayer.swf?video=123</video:player_loc>
      <video:duration>600</video:duration>
      <video:rating>4.2</video:rating>
    </video:video>
  </url>
</urlset>
You are using the normal sitemap layout, not the video sitemap version.
The most important information required for submitting through the sitemap is:
* url
* title
* description
* thumbnail of the video
* location of the video file, i.e. where it is hosted (e.g. /videos/video1.flv)
Can you please provide your video sitemap URL?
Hi Joseph,
First, investigate what is redirecting the root to /default... is it .htaccess or a script?
Try using:
RedirectMatch 301 ^/default\.htm$ http://www.website.com
Please follow up.