Attracta.com / "weekly submissions to top 100 search engines"
-
I recently received an offer from Attracta.com because I have a HostGator account. They are offering different levels of service for submitting XML sitemaps on a weekly basis. Is this a good idea?
Thanks for your feedback!
Will
PS: see attached graphic.
-
There seems to be some confusion surrounding XML sitemap usage so I'd like to clear that up.
Many people question the need for an XML Sitemap, and for updating it frequently. However, creating and submitting an accurate, up-to-date, and properly formatted XML Sitemap is a fundamental part of good on-site SEO and really pays off. Websites are never hurt or penalized in any way by using sitemaps. Even if you already have an XML sitemap, adding a second one will still help. In fact, if you look at Google's own robots.txt file ( http://www.google.com/robots.txt ) you can clearly see they are using multiple sitemaps themselves.
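For reference, listing more than one sitemap in a robots.txt file looks something like the sketch below; the domain and file names are placeholders, not Google's or Attracta's actual entries:

```
# Hypothetical robots.txt - example.com and the sitemap names are placeholders
User-agent: *
Disallow:

# A site can list more than one sitemap; crawlers fetch each URL listed
Sitemap: http://www.example.com/sitemap-pages.xml
Sitemap: http://www.example.com/sitemap-products.xml
```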
Google recommends XML Sitemap submission in all of their Webmaster and SEO tutorials: "Whether your site is old or new, we highly recommend you submit an XML Sitemap" - Google Webmaster Tutorial
In fact, recommending that people NOT use an XML Sitemap is #4 on SEOmoz's list, "The Biggest SEO Mistakes SEOmoz Has Ever Made," written by a widely recognized SEO expert, the CEO of SEOmoz himself: http://www.seomoz.org/blog/whiteboard-friday-the-biggest-seo-mistakes-seomoz-has-ever-made
Attracta is the largest supplier of XML sitemaps, with over 2.8 million websites using them, which seems to be why their pages carry such a high PR. Also, Attracta sitemaps are free, show you exactly when Google and other search engines access them, and can be updated and resubmitted at any time.
-
I found your replies helpful, as I have just found a competitor site that jumped into the no. 1 position from nowhere. Their strongest link in OSE is from their sitemap page listed on http://cdn.attracta.com/sitemap/competitor-sitemap/ which seems to have very high DA/PA, so I was thinking of submitting the same.
Of course, I already have my own sitemap.xml created on my site and submitted to Google Webmaster Tools, and I don't have any crawl or indexing problems, but I was thinking that getting this additional sitemap hosted on Attracta must be passing some link juice because of the PA?
They seem to have a free submission just for the sitemap, so I thought an additional submission here would be free and harmless. Would it not be?
I guess I am hesitating after what has been written here though.
-
Hi Will,
The advice already given here is solid.
In addition to that, I will share a little personal experience.
I had a client who accepted the offer you have received when the service first came out, a couple of years or so ago. They told me that they saw a significant loss of indexed pages and traffic to their site within a short time.
I cannot say that what happened to their site was a direct result of using the service, as I was not in control of the site at that time. However, I do know that once they decided they wanted to remove the service, it took me almost 8 months to get the company to do so!
Hope that helps,
Sha
-
Do you need this service? Do you have a problem getting your site indexed?
As Casey said, as long as you follow the guidelines, search engines can quickly index your site. Building some good links from authoritative sites can also help search engines find you.
Google Webmaster Tools tells you when your sitemap was last processed, and you can check your crawl stats to see how many pages are being crawled per day.
If your site isn't getting indexed, then I'd try to find the root cause rather than pay for a "search engine nagging" service like this.
-
Hi Will,
Today, there isn't much value in search engines beyond the top three: Google, Bing, and Yahoo. The other search engines in that list of 100 add practically nothing compared to the top three.
That being said, Google, Bing, and Yahoo are all sophisticated enough to crawl and find your website without it being "submitted". It is certainly a good idea to submit a sitemap, but you don't need to resubmit it every week. If you follow the http://www.sitemaps.org/ guidelines, you can set things up so the search engines automatically revisit your sitemap and crawl your content on a daily, weekly, or monthly basis.
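To show what those guidelines amount to in practice, here is a minimal sitemap.xml sketch; the URL and date are placeholders, and the changefreq value is only a hint to crawlers about how often to revisit, not a command:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal example following the sitemaps.org protocol; example.com is a placeholder -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Once a file like this is in place, you submit its URL once through Google Webmaster Tools and the search engines re-fetch it on their own schedule; there is no need to pay for repeated weekly submissions.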
Also, if you are using WordPress, you can download a free plugin that creates your sitemap and makes it easy to set the crawl frequency.
I hope this helps.
Thanks,
Casey