What are the most trusted SEO sites?
-
Other than SEOmoz, what sites can you trust for SEO?
Is there some type of formula I can use to find out if any site is trustworthy?
-
+1
What a great list.
^^ What they said. Good luck!
-
+1
Getting news and/or advice straight from the horse's mouth is excellent. Just don't let one person's opinion on SEO guide you. You need to do your own research, because most of SEO is theory, and if you get a "bad poster on a great site" like Ryan mentioned, you might waste a lot of time!
I'm sure there are still places out there touting article spinning and blog networks, but don't be fooled!
Kevin Phelps
http://www.linkedin.com/in/kevinwphelps
-
Dana and Kevin's responses are great and deserve a thumbs up. I would add a few key sites which have not been mentioned. I would place these primary information sites above any secondary sources:
http://googlewebmastercentral.blogspot.com/
http://insidesearch.blogspot.com/
http://www.youtube.com/user/GoogleWebmasterHelp
http://www.bing.com/community/site_blogs/b/webmaster/default.aspx
Before looking to other sources, my advice is to go to the official sources of information.
If you decide you need more analysis, SEOmoz is (obviously) my preferred suggestion. I would say the above sources combined with SEOmoz are enough to keep a person busy for a year.
When you do view other sources, including SEOmoz, keep in mind there are great posters on bad sites, and bad posters on good sites. The difference is, the good sites will (eventually) correct any bad information. I prefer SEOmoz because they do an exceptional job of correcting misinformation. Even so, it may take a week for the error to be caught. With such a large, active community it is a huge amount of work to review every post made on the site.
-
Dana pretty much nailed it. You're already in the right place though (SEOmoz). To add to this list I'd include:
- http://www.searchenginejournal.com/
- http://searchenginewatch.com/
- http://searchengineland.com/
- I'm actually a writer for the SEO.com blog which you might enjoy as well: http://www.seo.com/author/kphelps/
There isn't really any "formula", just exposure, experience, and time in this industry.
I'm going to have to disagree with the Rank Tracker add-on that Dana mentioned, though. If you are checking fewer than 10 keywords you'll be fine, but anything more than that and Google will block your IP and your results will be messed up.
Does this help?
Kevin Phelps
http://www.linkedin.com/in/kevinwphelps
-
Hi Haviv,
I think you are asking this question in the right place. I am sure there will be responses different from mine that are equally good, but here are my favorite (and totally trusted) sites:
http://www.seobook.com (especially for the Firefox Rank Tracker Add On)
I am not an affiliate of any of these sites and these are not affiliate links. I just use all of these on a regular basis.
Hope that helps!
Dana
Related Questions
-
Google & Site Architecture
Hi, I've been reading the following article about Google's quality signals: https://searchenginewatch.com/2016/10/10/guide-to-google-ranking-signals-part-6-trust-authority-and-expertise/?utm_source=Search+Engine+Watch&utm_campaign=464594db7c-11_10_2016_NL&utm_medium=email&utm_term=0_e118661359-464594db7c-17828341 They mention: "3) All your categories should be accessible from the main menu. All your web pages should be labelled with the relevant categories." Does that mean every category? We have some that are, say, 3 levels deep, and they aren't all in the menu. I'd like them to be, so it would be good to make a case for it. Thank you
Algorithm Updates | BeckyKey1 -
We recently transitioned a site to our server, but Google is still showing the old server's urls. Is there a way to stop Google from showing urls?
Algorithm Updates | Stamats0 -
Duplicate Product Pages On Niche Site
I have a main site, and a niche site that has products for a particular category. For example, Clothing.com is the main site and formalclothing.com is the niche site. The niche site has about 70K product pages that have the same content (except for navigation links, which are similar but not duplicated). I have been considering shutting down the niche site and doing a 301 to the relevant category of the main site. Here are some more details:
- The niche site ranks fairly well on Yahoo and Bing, much better than the main site for keywords relevant to that category.
- The niche site was hit with Penguin, but doesn't seem to have been affected much by Panda.
- When I analyze a product page on the main site using Copyscape, 1-2 pages of the niche site do show, but NOT that exact product page on the niche site.
Questions:
1. Given the information above, how can I gauge the impact the duplicate content is having, if any?
2. Is it a bad idea to put a canonical tag on the product pages of the niche site, citing the main site as the original source?
3. Any other considerations aside from duplicate content or the Penguin issue when deciding to 301?
4. Would you 301 if this was your site?
Thanks in advance.
Algorithm Updates | inhouseseo0 -
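Since the question above weighs a cross-domain canonical against a 301, here is a minimal sketch, not from the original thread, of how one might verify what a niche-site product page currently declares as its canonical, using only Python 3's standard library. The domains are the hypothetical ones from the question, and the markup is invented purely for illustration.

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collect href values of <link rel="canonical"> tags."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if (attrs.get("rel") or "").lower() == "canonical" and attrs.get("href"):
                self.canonicals.append(attrs["href"])


# Hypothetical product page on the niche site pointing at the main site.
sample_html = """
<html><head>
  <title>Black Tuxedo</title>
  <link rel="canonical" href="http://www.clothing.com/formal/black-tuxedo">
</head><body>...</body></html>
"""

finder = CanonicalFinder()
finder.feed(sample_html)
print(finder.canonicals)  # ['http://www.clothing.com/formal/black-tuxedo']
```

Whether a cross-domain canonical or a 301 is the better call is exactly what the poster is asking; the snippet only shows how to check what a page is currently declaring.
-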
Did .org vs. .com SEO importance recently change?
I have seen previous answers in the forum about this subject, but Google seems to have again changed the playing surface. Within the past 30 days, we have seen a huge spike in organic search returns that seem to favor .org domains as authorities. Has anyone else noticed this shift, and is it just coincidence or worth factoring in? If it is a shift, will Google punish those that have .org but previously used .com for switching their redirects to serve .org first? Thanks, Jim
Algorithm Updates | jimmyzig0 -
Local SEO: 1 Location Covering Multiple Surrounding Cities
I am setting up local pages on our main site for each of our dealers. Some of them cover multiple cities. For example, one dealer is in Santa Rosa, CA, but also covers San Francisco (a 50-mile drive). While I know that with Google+ Local I can add a coverage radius or the zip codes/cities covered, what about that dealer's local page on our site? Should I create local pages for each city covered or cram local optimization into one? Keep in mind I only have one address to work with for each dealer (P.O. boxes or virtual mailboxes are NOT a good solution). Looking for any white hat tips before I implement this for all 100+ dealers.
Algorithm Updates | the-coopersmith0 -
What is the point of XML site maps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page in an XML site map that page will have no link juice and will appear very low in search results, if at all.

The priority field in XML sitemaps also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly, the most important pages will have the most links.

The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus, for most of the web the significant content of an existing page doesn't change regularly; instead, new pages are added with new content.

This leaves the lastmod field as potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The site map would not need to contain every single page of the site, just the ones that have changed recently.

From what I've seen, most site map generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, they won't put any weight on them.

It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML site maps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
Algorithm Updates | pasware0 -
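Because the question above turns on what the sitemap fields actually carry, here is a minimal sketch, not from the original thread, that writes a tiny sitemap containing only loc and lastmod with Python 3's standard library. The URLs and dates are invented for illustration.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Hypothetical pages and the dates they last changed.
pages = [
    ("http://www.example.com/", "2012-06-01"),
    ("http://www.example.com/new-article", "2012-06-14"),
]

urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod
    # priority and changefreq are optional and, as the question argues,
    # often ignored; they are simply omitted here.

# Write the file and also print the markup for inspection.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(ET.tostring(urlset, encoding="unicode"))
```

The snippet only illustrates the file format under discussion; it takes no position on whether maintaining one is worth the effort.
-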
Implications of removing all Google products from a site
Is there any data on the implications of removing everything Google from a site: Analytics, AdSense, Webmaster Tools, sitemaps, etc.? Obviously they still have their search data, and they say they don't use these other sources of data for ranking information, but has anyone actually tried this, or is there any existing data on this?
Algorithm Updates | jessefriedman0