Is there a tool to find out which key phrase has been tweeted the most?
-
As the title suggests ^ ^
-
There are a few different tools I've found helpful to find out what's trending on Twitter, namely:
http://www.twee.co/ - what's trending now on Twitter
http://www.topsy.com - more trend information
Another new one I've found helpful for trends throughout the USA and world is http://trendsmap.com/
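The tools above surface trends for you; if you pull raw tweets yourself (e.g. via Twitter's API, not shown here), the counting step itself is simple. A minimal sketch with made-up tweet texts, tallying two-word phrases:

```python
# Given a batch of tweet texts, count which two-word phrases occur most often.
from collections import Counter
import re

def top_phrases(tweets: list[str], n: int = 2, k: int = 3) -> list[tuple[str, int]]:
    """Return the k most common n-word phrases across the tweets."""
    counts = Counter()
    for tweet in tweets:
        words = re.findall(r"[a-z']+", tweet.lower())
        counts.update(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
    return counts.most_common(k)

tweets = [
    "social media marketing tips",
    "great social media strategy",
    "social media is changing fast",
]
print(top_phrases(tweets))  # → [('social media', 3), ...]
```

Extending `n` gives longer key phrases; the hard part in practice is collecting enough tweets, which is what the hosted tools do for you.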
Related Questions
-
Mac-Friendly, DOM-Rendering Spidering Tool for Multiple Users
Hello! I am looking for a spidering tool that:

- Is Mac-friendly
- Can render the DOM and find JS links
- Can spider password-protected sites (prompts for a password and then continues the spider, etc.)
- Has competitive pricing for 8+ users

Screaming Frog is amazing, and maybe we're just going to have to bite the bullet there. But if anyone has any other ideas, I'd love to hear them. Thanks!
Intermediate & Advanced SEO | mirabile
-
URL structure with broad search phrase but specific intent
My question concerns some difficult URL structure decisions in an online real estate marketplace. Our problem is that our customers' search behavior is very broad, but their intent is very narrow. For real-life examples, go to objektia (dot) se. Example: "Lease commercial space Stockholm" is a common search query, in which the user searches for the **broad category** commercial space in the geography of Stockholm. The problem is that their intent is actually much more specific, since: Commercial space === [Office, Retail, Industrial, Storage, Properties]. I have previously asked the forum for help regarding the placement of products in our URL hierarchy, and I got some good answers. We chose to go the route of alternative #3, i.e. placing our products (real estate listings) directly beneath their respective category (neighborhoods). https://moz.com/community/q/placement-of-products-in-url-structure-for-best-category-page-rankings Basically, we chose the following URL structure:

Structure: domain.se/category/subcategory/product
Example: domain.se/Stockholm/suburb-of-stockholm/specific-listing-12

Now the question is: how do we deal with the **space type** modifier in our URL structure? Nobody wants to see retail space when they are after office space, so our current search page (category page) solution is the following:

Structure: domain.se/space-type/neighborhood/sub-neighborhood
All space types: domain.se/commercial-space/neighborhood/sub-neighborhood
Specific space type: domain.se/office-space/neighborhood/sub-neighborhood

Now, the problem with our current solution, combined with our intent to move our product pages into this hierarchy, is that every product page will be (and is today) linking to its specific type category. Our internal link network would be built around type categories that are extremely relevant from a UX standpoint, but (surprisingly) almost worthless from an organic traffic standpoint.
Also, every search page (category page) for each space type would be competing for the same broad search phrase. The alternative is to place the type modifier at the end of the URL:

Category page, type at the end: domain.se/neighborhood/sub-neighborhood/type
Listing page (product page), type at the end: domain.se/neighborhood/sub-neighborhood/street-address/type/listing-12
Intermediate & Advanced SEO | Viktorsodd
-
Tools to scan entire site for duplicate content?
Hi guys, just wondering if anyone knows of any tools to scan a site for duplicate content (shared with other sites on the web). I'm looking to quickly identify product pages containing duplicate content/duplicate product descriptions on e-commerce websites. I know Copyscape can check up to 10,000 pages in a single operation with Batch Search, but is there anything else on the market I should consider looking at? Cheers, Chris
Intermediate & Advanced SEO | jayoliverwright
-
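Alongside hosted services like Copyscape, the core check — flagging near-identical product descriptions — can be sketched locally with Python's standard library. This is a minimal illustration assuming you already have the page texts in hand (the paths and descriptions below are made up); unlike Copyscape, it does not search the wider web:

```python
# Flag pairs of pages whose text is nearly identical (near-duplicate detection).
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1]; 1.0 means identical text."""
    return SequenceMatcher(None, a, b).ratio()

def find_duplicates(pages: dict[str, str], threshold: float = 0.8):
    """Return (page, page, score) for every pair above the similarity threshold."""
    return [
        (u1, u2, round(similarity(t1, t2), 2))
        for (u1, t1), (u2, t2) in combinations(pages.items(), 2)
        if similarity(t1, t2) >= threshold
    ]

descriptions = {
    "/product-a": "Durable stainless steel water bottle, 750 ml, keeps drinks cold.",
    "/product-b": "Durable stainless steel water bottle, 750 ml, keeps drinks cold!",
    "/product-c": "Handmade ceramic mug with a matte glaze finish.",
}
print(find_duplicates(descriptions))  # → [('/product-a', '/product-b', 0.98)]
```

`SequenceMatcher` compares every pair, so for thousands of pages a shingling/MinHash approach would scale much better.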
Do Google Webmaster Tools and other backlink analysis tools ignore disavow data?
Hello, I have disavowed links from lots of sites. If I download my site's backlinks from Google Webmaster Tools, will Google exclude the disavowed data and give me only the other backlinks? Similarly, if I use backlink tools like Moz, SEMrush, or Ahrefs to check backlinks for my site or a competitor's site, will those tools exclude the disavowed data? If such tools aren't aware of disavow files, is it worthless to check competitors' links? Thanks! dev
Intermediate & Advanced SEO | devdan
-
Cannot Increase Ranking For a Keyword Phrase
I've been working on the keyword phrase "Niceville Assisted Living" for the website http://nicevilleassistedliving.com, and my increase in rankings has pretty much stalled. When I first started working on this website, a lot of the content was duplicated (which we took care of by adding unique content), and there were locations listed on the homepage that were throwing my rankings off. Since then, I've created blog posts each week (we've even tried posting one post every day for a week), added the Facebook feed to the homepage, corrected errors in the theme, and I'm trying to get a resources page built. I know content is a very, very large part of SEO, but it seems like the content I am adding isn't helping. There aren't any errors in Webmaster Tools, and my keyword density is fairly close to that of the website ranking #1. I think my biggest problem is backlinks: other websites have quite a few, whereas the website I'm working on doesn't have any (I'm working on that, but the number I have doesn't compare to the websites ranking in the top three). I'm stumped as to what to do next. Does anyone have suggestions to improve the ranking for this keyword phrase?
Intermediate & Advanced SEO | ReviveMedia
-
Tool that can retrieve my site's URLs from a list of pages
Hi, I'm looking for a tool that can retrieve my site's URLs from a list of pages. I am not talking about Ahrefs, Open Site Explorer, Majestic, etc. I have a list of 1,000 site URLs where my site name is mentioned. I want to get the exact URL of my site captured next to each URL I query. Example: http://moz.com/community is a URL on my list; if this page mentions my site name, I need the complete URL to my site captured. Is there any software or tool that can do this? I used one before that got me this info, but now I don't remember it. Thanks
Intermediate & Advanced SEO | mtthompsons
-
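No specific off-the-shelf tool comes to mind, but the task — fetch each page on the list and record which of its links point at your domain — is a few lines of scripting. A minimal stdlib-only Python sketch (the sample domain and the `urls.txt` filename are placeholders; the actual fetching loop is left commented out):

```python
# Extract, from a page's HTML, every link that points at a given domain.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_to_domain(html: str, domain: str) -> list[str]:
    """Return every link in `html` whose host matches `domain`."""
    parser = LinkCollector()
    parser.feed(html)
    return [link for link in parser.links if urlparse(link).netloc.endswith(domain)]

# For each page on your list, fetch it and record which of its links
# point at your site (network calls omitted here):
# import urllib.request
# for page_url in open("urls.txt"):
#     page_url = page_url.strip()
#     html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
#     print(page_url, links_to_domain(html, "mysite.com"))
```

This only catches plain `<a href>` links; mentions of the site name without a link would need a separate text search over the fetched HTML.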
Places ranking for a non-locational phrase?
http://www.google.co.uk/search?ie=UTF-8&q=coach+hire&pws=0&gl=GB The link above takes you to a SERP for a general phrase with no hint of locations involved (coach hire). However, oddly enough, a single Google Places listing has popped up at #4: Liverpool Minibus Coach hire <cite>www.localcoachhireuk.co.uk/</cite> Now if this were "Coach Hire London" I would expect places results, and indeed there is a list of places. But how do you get a Places listing ranking for a phrase without a place name? Also of interest is the fact that this website doesn't even exist! It is a 301 redirect to another site. Google seems to be picking up the 301, since it shows the redirected site in the page snapshot and has no pages indexed for this domain. So an un-indexed site with a 301 redirect is #4 for the top phrase in this industry. I have no doubt that this will only be a temporary thing, but it would be interesting to know how it was possible.
Intermediate & Advanced SEO | PPCnSEO
-
Tool to calculate the number of pages in Google's index?
When working with a very large site, are there any tools that will help you calculate the number of pages in the Google index? I know you can use site:www.domain.com to see all the pages indexed for a particular URL. But what if you want to see the number of pages indexed for 100 different subdirectories (e.g. www.domain.com/a, www.domain.com/b)? Is there a tool to help automate the process of finding the number of pages from each subdirectory in Google's index?
Intermediate & Advanced SEO | nicole.healthline
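One way to automate this is to issue one site: query per subdirectory and read back the estimated result count. A hypothetical sketch: the query-building part below is plain string work, while the commented-out fetch uses Google's Custom Search JSON API (`YOUR_KEY` and `YOUR_CX` are placeholders, and the returned `totalResults` is only an estimate, just like the count shown on a normal site: search):

```python
# Build one `site:` query per subdirectory of a domain.
def site_queries(domain: str, subdirs: list[str]) -> dict[str, str]:
    """Map each subdirectory to the `site:` query that counts its indexed pages."""
    return {sub: f"site:{domain}/{sub.strip('/')}" for sub in subdirs}

queries = site_queries("www.domain.com", ["a", "b", "/c/"])
print(queries)  # → {'a': 'site:www.domain.com/a', ...}

# With a Custom Search API key and engine ID (placeholders), the estimated
# count for one query could then be fetched like this (not run here):
# import json, urllib.request
# from urllib.parse import quote
# url = ("https://www.googleapis.com/customsearch/v1"
#        f"?key=YOUR_KEY&cx=YOUR_CX&q={quote(query)}")
# total = json.load(urllib.request.urlopen(url))["searchInformation"]["totalResults"]
```

Looping this over 100 subdirectories gives a per-section index count, but treat the numbers as rough estimates rather than exact page counts.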