We noticed that goods offered in our email newsletters disappeared from the first page of Google search results!?
-
We noticed that goods offered in our email newsletters disappeared from the first page of Google search results. The goods were in the top 5 positions or even higher, but after the email newsletters went out we couldn't find them even in the top 100. We suspect the email sending service provider is on a blacklist. Could that be the reason? If yes, how could we check that?
-
No, that should not be an issue. A single privately sent email to customers should not affect your rankings in Google. Do your due diligence on the provider company that you worked with: use Open Site Explorer and Google Webmaster Tools to see if you have new links, and if so, where they came from. It sounds like an issue that is not related to the email, but at any rate, try to find out why the drop happened.
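If you do want to rule out the blacklist theory for deliverability's sake, one direct check (a minimal sketch of my own, assuming Node.js; the zone and IP below are just examples) is a DNSBL lookup against the sending IP your email provider uses, which you can read from the Received headers of a delivered newsletter:

// Sketch: check whether a sending IP is listed on a DNS blacklist (DNSBL).
// DNSBLs answer an A-record query for <reversed-ip>.<zone>; NXDOMAIN means "not listed".
const dns = require('dns').promises;

async function checkDnsbl(ip, zone = 'zen.spamhaus.org') {
  const query = ip.split('.').reverse().join('.') + '.' + zone;
  try {
    const records = await dns.resolve4(query); // an answer means the IP is listed
    return { listed: true, codes: records };
  } catch (err) {
    if (err.code === 'ENOTFOUND' || err.code === 'ENODATA') {
      return { listed: false };                // NXDOMAIN: not on this list
    }
    throw err;                                 // a real DNS failure, not an answer
  }
}

// Example (203.0.113.45 is a documentation address, not a real sender):
// checkDnsbl('203.0.113.45').then(result => console.log(result));

Note that a blacklist listing would affect email deliverability, not organic rankings, which is consistent with the answer above.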
Related Questions
-
Anyone use a white label SEO company?
I work on my own and am beginning to get more clients than I can handle effectively, so this is my first look into outsourcing some of the work. Does anyone have a good resource for white label SEO? Do you have any experience with the following? Others?
Sky Diamond Media
Webimax
Imprezzio (local)
Posirank
OrangeSoda
Profit By Search
Industry News | Masbro
-
How Google could quickly fix the whole Links problem...
A Thursday morning brainstorm that hopefully an important Google manager will see... Google could quickly end all the problems of link buying, spammy links, and negative SEO with one easy step: only count the 100 best followed links to any domain. Ignore all the nofollows and everything beyond the 100 best. They can choose what "best" means. Suddenly links would be all about quality; quantity would not matter. Fiverr links, comment links, and all the other mass-produced spam links would literally be ignored. Unless that's all a domain had, in which case it would surely be stomped by any domain with 100 decent natural links. Would it be an improvement over today's situation?
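For what it's worth, here is a toy sketch of how the proposed rule would behave (my own illustration; the link objects and quality scores are hypothetical inputs, not anything Google exposes):

// Score a domain by only its 100 best followed links.
function domainLinkScore(links) {
  return links
    .filter(link => !link.nofollow)           // ignore all nofollow links
    .sort((a, b) => b.quality - a.quality)    // best links first
    .slice(0, 100)                            // everything past the 100 best is ignored
    .reduce((sum, link) => sum + link.quality, 0);
}

// 5,000 mass-produced spam links lose to 100 decent natural links:
const spam = Array.from({ length: 5000 }, () => ({ nofollow: false, quality: 0.01 }));
const natural = Array.from({ length: 100 }, () => ({ nofollow: false, quality: 1.0 }));
console.log(domainLinkScore(spam));    // ~1 (100 * 0.01)
console.log(domainLinkScore(natural)); // 100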
Industry News | GregB123
-
What is the most current advice on editing your company Wikipedia page?
I've noticed a lot of incorrect and misleading information on our company Wikipedia page. Should we, as a company, be actively monitoring and updating this page when we spot factual errors? I realize Wikipedia doesn't want marketese or heavy PR on company pages, but those are the people that generally pay attention to Wikipedia entries. Also, we certainly don't want to get penalized or cause any undue media attention with our edits. If possible, can you also offer links to other sources rather than just opinion or personal experience? I'm having to present this to a large corporate board and want to be as detailed as possible. Thank you,
Kerplow
Industry News | kerplow
-
What is the best way to share good content (and help myself in the process)?
I spend a decent amount of my spare time browsing Quora, StumbleUpon, Google Reader, Pulse, etc., keeping up with all the different aspects of internet marketing. When I come across a particularly valuable piece of content, I take a few seconds and share it using HootSuite on my LinkedIn, Twitter, and Facebook (I spread them out so there is only one post a day). This rewards those who write valuable content, but I'm not doing anything to benefit my site, www.sawwebmarketing.com; I'm just establishing myself as somebody who has good stuff to share with my potential clients. What is a quick, easy way to share the high-quality things I come across that will create links for me as well (short of writing my own blog post listing the high-quality articles I found with my thoughts on each)? Thoughts? Opinions? Thanks everybody! Matthew
Industry News | Mrupp44
-
How to report rankings after the Google Venice update?
As a professional agency we focus on traffic and conversions, but rankings are still a good KPI to please customers. Unfortunately, rankings are not reliable anymore since the Google Venice update. My question is: "How can you still report on rankings without the risk that your customer sees totally different results?" Software we use: at the moment we use Rank Tracker from Link Assistant.
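One way to keep rankings reportable after Venice (a sketch under my own assumptions; checkRank is a hypothetical stand-in for whatever rank source you use, not a Rank Tracker or Link Assistant API) is to sample the ranking from several locations and report the spread rather than a single position:

// Report a keyword's ranking as a range across sample locations, so a customer
// checking from any one city isn't surprised by a single "global" number.
async function localizedRankReport(keyword, locations, checkRank) {
  const ranks = [];
  for (const location of locations) {
    ranks.push(await checkRank(keyword, location)); // e.g. 3, 7, 12, ...
  }
  return {
    keyword,
    best: Math.min(...ranks),
    worst: Math.max(...ranks),
    average: ranks.reduce((a, b) => a + b, 0) / ranks.length,
  };
}

// Usage: report "plumber" as e.g. "position 3-12 (avg 6.5) across 3 cities":
// localizedRankReport('plumber', ['Amsterdam', 'Rotterdam', 'Utrecht'], checkRank)
//   .then(report => console.log(report));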
Industry News | VanSoelen
-
Google to Target Overly SEOd Sites
I just watched the video from Barry Schwartz talking about the new update to come in the Google algorithm. Video of his Friday post: http://www.youtube.com/watch?v=fJqSPT2NXdA
I have also started reading on WebmasterWorld on this topic: http://www.webmasterworld.com/google/4429947.htm
What do you think Google has in the list of changes?
Industry News | Ben-HPB
-
How many small businesses use SEO?
I'm looking for data (not opinions) on how many small businesses use SEO, that is, either do it in-house, hire a firm, or hire a freelancer. I've poked around on SEMPO and eConsultancy but can't find what I'm looking for.
Industry News | jsteimle
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and you can use meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers will see content that is created dynamically, whereas users of non-JavaScript-enabled browsers as well as crawlers will see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
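To illustrate the Hijax idea (a minimal sketch of my own, assuming a #content container and links marked with a hijax class): the plain href is what Googlebot and non-JavaScript users follow, while JavaScript-enabled browsers intercept the click and load the same URL via AJAX.

// Progressive enhancement: links like <a class="hijax" href="/products.html">
// stay crawlable as plain HTML; with JavaScript, clicks are upgraded to AJAX.
document.querySelectorAll('a.hijax').forEach(link => {
  link.addEventListener('click', event => {
    event.preventDefault();                       // skip the full page load
    fetch(link.href)                              // same URL the crawler sees
      .then(response => response.text())
      .then(html => {
        document.querySelector('#content').innerHTML = html;
        history.pushState(null, '', link.href);   // keep a crawlable URL in the bar
      });
  });
});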
Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement (Google's AJAX crawling scheme). This agreement rests on the following:

The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results.

In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why below). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
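As a concrete illustration of that handshake (my own Node/Express sketch, not code from the guide; renderSnapshot is a stub standing in for however you generate the post-JavaScript HTML, e.g. with a headless browser):

// Pretty URL the user sees:   www.example.com/index.html#!key=value
// Ugly URL the crawler sends: www.example.com/index.html?_escaped_fragment_=key=value
const express = require('express');
const app = express();

// Hypothetical helper: would normally execute the page's JavaScript and
// return the resulting HTML snapshot.
function renderSnapshot(fragment) {
  return '<html><body>Snapshot for state: ' + fragment + '</body></html>';
}

app.get('/index.html', (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    res.send(renderSnapshot(fragment));     // crawler asked for the ugly URL
  } else {
    res.sendFile(__dirname + '/app.html');  // normal browser gets the AJAX page
  }
});

app.listen(3000);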
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Industry News | webbroi