SEO Tools You Can't Live Without?
-
Hi Guys,
I'm currently in the middle of creating a comprehensive blog post covering the SEO tools I wouldn't be able to work without. So far I've got the following down; I use these on a day-to-day basis and they make my job infinitely easier.
- SEOMoz / OSE
- Ahrefs
- BuzzStream
- Scrapebox
- Xenu / Screaming Frog
- Excel
- GWT / Analytics / Adwords Keyword Tool
What tools or subscriptions do you use on a daily basis and couldn't be without?
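Since Excel made your list: for what it's worth, when Excel starts choking on big backlink exports, a few lines of Python can do the classic "unique linking domains" dedupe for you. This is just a rough sketch — the `URL` column name is a guess at your export's header (OSE/Ahrefs name it differently), and the sample CSV is made up:

```python
import csv
import io
from urllib.parse import urlparse

def unique_linking_domains(csv_text, url_column="URL"):
    """Count how many times each root domain appears in a backlink export.

    `url_column` is a guess at the header name -- backlink tools name this
    column differently, so adjust it to match your actual export.
    """
    counts = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        domain = urlparse(row[url_column]).netloc.lower()
        # Strip a leading "www." so www/non-www count as one domain.
        domain = domain[4:] if domain.startswith("www.") else domain
        counts[domain] = counts.get(domain, 0) + 1
    return counts

# Tiny made-up export for illustration:
sample = """URL,Anchor
http://www.example.com/page1,seo tools
http://example.com/page2,tools
http://blog.other.org/post,link
"""
print(unique_linking_domains(sample))
```

Same result as a pivot table, but it won't fall over at 100k rows.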
-
Google Webmaster Tools
-
What is GWT? Can you give a description or link? Thanks in advance!
-
Most of my work is focused on offsite optimization, so here's my list:
1. SEOMoz: I like OSE and the rank tracker as well, even though its functions are pretty simple.
2. Majestic SEO: especially their site explorer.
3. GeoRanker: I use it for more advanced SERP reports by tracking search results as they are displayed to users from different countries and cities (especially useful in my local SEO campaigns).
4. Ahrefs: cannot live without it.
5. Linkdiagnosis: basic but interesting backlink checker
6. GWT: I've given up Analytics and all other Google products, but GWT I still use on a regular basis to configure sitelinks, check for backlinks, and discover search queries relevant to my sites.
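Since a few of these (OSE, Majestic, Ahrefs, Linkdiagnosis) all report backlinks, one thing I find useful is checking how much their indexes actually overlap. Here's a rough sketch of the idea with made-up domain lists — swap in the real domain columns from your own exports:

```python
def domain_overlap(domains_a, domains_b):
    """Compare two sets of linking domains (e.g. one per tool export).

    Returns (shared, only_in_a, only_in_b) so you can see which tool
    found links the other missed.
    """
    a, b = set(domains_a), set(domains_b)
    return a & b, a - b, b - a

# Hypothetical exports from two backlink tools:
majestic = ["example.com", "blog.other.org", "news.site.net"]
ahrefs = ["example.com", "forum.place.io"]

shared, only_majestic, only_ahrefs = domain_overlap(majestic, ahrefs)
print(sorted(shared))          # domains both tools report
print(sorted(only_majestic))   # found by the first tool only
print(sorted(only_ahrefs))     # found by the second tool only
```

The "only in" buckets are usually where the interesting link opportunities (or toxic links) hide.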
-
I have to add in Ahrefs. I checked them out but I never really got engaged.
Also, Citation Labs has the Link Prospector tool, which is great for link building from a list standpoint: it helps you find websites that could be viable options for links.
-
We are trialling Teamwork Live to manage our projects. It seems like a good bit of software and much cheaper than Basecamp, which we previously used.
SEOMoz and Google Docs are used daily here, along with Majestic at least weekly.
-
Hi Chad,
Thanks for your contribution. Big fan of Screaming Frog myself.
I find Majestic isn't quite as accurate as Ahrefs, but SEMrush is invaluable. I currently use SerpBook, but Authority Labs looks a bit better for the same price point!
-
Day in and day out, I have to do on-page and off-page analysis, so Screaming Frog is one great desktop tool that I use to check the lengths of title tags and meta descriptions, plus meta keywords, on-page content, keyword density, and internal linking between pages.
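If you ever want to spot-check a single page without firing up the full crawler, the title/meta-description length check is easy to script. Here's a rough, hedged sketch using only the standard library — the 70/155 character limits are just common rules of thumb, not anything official:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Pull the <title> and meta description out of an HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_page(html, title_limit=70, description_limit=155):
    # 70 / 155 chars are rough display-length rules of thumb, not hard rules.
    parser = OnPageChecker()
    parser.feed(html)
    return {
        "title": parser.title,
        "title_too_long": len(parser.title) > title_limit,
        "description": parser.meta_description,
        "description_too_long": len(parser.meta_description) > description_limit,
    }

sample = """<html><head><title>SEO Tools You Can't Live Without</title>
<meta name="description" content="A roundup of everyday SEO tools.">
</head><body></body></html>"""
print(check_page(sample))
```

Obviously Screaming Frog does far more than this, but it's handy for one-off checks in a pipeline.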
For off-site work I cannot live without the Majestic SEO backlink profile lookup, and SEMrush to see overall trends when studying the competition. These two are web-based and offer a free version; however, the pro versions are worth the money!
I use Authority Labs to track keyword rankings across all three major search engine platforms. It sends you reports showing up and down movement on all your keywords, even the long-tail keywords!
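That up/down movement report is basically a diff of two rank snapshots, which is easy to replicate yourself if you ever export the raw data. A quick sketch with hypothetical tracking data (positions in the SERP, lower is better):

```python
def rank_movement(last_week, this_week):
    """Diff two keyword -> position snapshots; negative delta = moved up."""
    movement = {}
    for keyword, new_rank in this_week.items():
        old_rank = last_week.get(keyword)
        if old_rank is None:
            movement[keyword] = "new"  # keyword wasn't tracked last week
        else:
            movement[keyword] = new_rank - old_rank
    return movement

# Hypothetical tracking data:
last_week = {"seo tools": 8, "backlink checker": 15}
this_week = {"seo tools": 5, "backlink checker": 17, "rank tracker": 22}
print(rank_movement(last_week, this_week))
```

So "seo tools" moved up three spots, "backlink checker" dropped two, and "rank tracker" is newly tracked.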
Hope these tools help with your post!