What's the website that analyzes all local business submissions?
-
I recently saw a blog post here (or possibly a webinar) that showed a website where you could see all of the local sites (Yelp, Google Places, etc.) your business has been submitted to. It was an automated tool. Does anyone remember the name of the site?
-
That's the one! Thanks!
-
That has to be it.
-
Not sure if I am understanding the question correctly, but are you referring to getlisted.org?
-
I think you might be referring to Localeze or Whitespark?
Related Questions
-
I have recently redone my website. My buyer's guide and my category page are both ranking for keywords I'm after.
I have recently redone my entire site (only a few days ago). I believe Google is still re-crawling and updating (although the amount of movement on other searches has been significant). My buyer's guide is ranking very high for its intended keywords, and also for the keywords of the category page. Both are at the beginning of the second page, and I wonder if that is dragging me down. What do you think I should do? Is it too early to take action, as everything has been completely redone?
Technical SEO | Code2Chil
-
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at the moment. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when the sitemap file it is working on becomes too large, so basically it looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but that is not needed, since it comes with both of the above-mentioned tools. I also know about DeepCrawl.com, but that one is paid, and it would be very expensive for this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal on multiple websites, but that obviously does not make sense to me - it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best, most time-efficient way to work on something like this? Are there any other options? Thanks.
Technical SEO | blrs12
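For the bulk title-tag part of the job, a short script goes a long way once you already have the URL list (for example, pulled from the sites' own sitemaps). Below is a rough Python sketch rather than a full crawler: the urls.txt and titles.csv filenames are placeholders, and it does no robots.txt handling, rate limiting or JavaScript rendering.

```python
# Rough sketch: fetch each URL from a flat list and record its <title>.
# Assumes the URL list already exists (e.g. exported from sitemaps).
import csv
import re
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TITLE_RE = re.compile(rb"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def fetch_title(url: str) -> tuple[str, str]:
    """Return (url, title), or (url, error message) if the request fails."""
    try:
        req = urllib.request.Request(url, headers={"User-Agent": "title-audit/0.1"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            html = resp.read(200_000)  # the <title> is almost always in the first chunk
        match = TITLE_RE.search(html)
        title = match.group(1).decode("utf-8", errors="replace").strip() if match else ""
        return url, title
    except Exception as exc:  # keep going when individual pages fail
        return url, f"ERROR: {exc}"

def main(url_file: str = "urls.txt", out_file: str = "titles.csv") -> None:
    with open(url_file) as fh:
        urls = [line.strip() for line in fh if line.strip()]
    with open(out_file, "w", newline="") as out, ThreadPoolExecutor(max_workers=20) as pool:
        writer = csv.writer(out)
        writer.writerow(["url", "title"])
        for url, title in pool.map(fetch_title, urls):
            writer.writerow([url, title])

if __name__ == "__main__":
    main()
```
-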
Launching Website
We are developing a new website and assumed Google would not find it because of the directory we put it in (there is no homepage yet) and because there are no links to it. For example, we are building it in the directory example.com/wordpress/, but somehow Google found it and indexed pages that are not ready to be indexed. What should we do to stop this until we are ready to launch? Should we just use a robots.txt file with this in it?
User-agent: *
Disallow: /
Will this create repercussions when we officially launch?
Technical SEO | QuickLearner
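As a quick way to confirm what that blanket disallow actually blocks, here is a small Python check using the standard library's robots.txt parser. It assumes the file from the question is served at https://example.com/robots.txt; note that it only reports what well-behaved crawlers are asked to skip, and it will not remove pages Google has already indexed.

```python
# Sanity-check which URLs a "Disallow: /" robots.txt blocks for Googlebot.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

for url in (
    "https://example.com/wordpress/",              # the staging directory
    "https://example.com/wordpress/sample-page/",  # a page inside it
    "https://example.com/",                        # also blocked while "Disallow: /" is live
):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':<7}  {url}")
```
-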
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi guys, I'm working on an ecommerce website where all the product pages use dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or a special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools, which I am trying to avoid if possible. I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are the two approaches I was considering:
1. Include just http://www.xyz.com/products/ in the same sitemap as the static URLs, so spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product, OR
2. Create a separate automated sitemap that updates whenever a product is updated, with an hourly change frequency, so spiders always have a sitemap that is as close to up to date as possible when they crawl.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs
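On option 2, the generation side is usually only a few lines once you can list the live products. A minimal Python sketch under that assumption follows; get_live_product_urls() is a stand-in for however your platform queries the current catalogue, and sitemap-products.xml is just an example filename.

```python
# Regenerate a product-only sitemap whenever the catalogue changes.
from datetime import datetime, timezone
from xml.etree import ElementTree as ET

def get_live_product_urls() -> list[str]:
    # Placeholder: in practice this would query the product database.
    return [
        "http://www.xyz.com/products/product1-is-really-cool",
        "http://www.xyz.com/products/product2-is-even-cooler",
        "http://www.xyz.com/products/product3-is-the-coolest",
    ]

def build_sitemap(urls: list[str], path: str = "sitemap-products.xml") -> None:
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = now
        ET.SubElement(entry, "changefreq").text = "hourly"
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(get_live_product_urls())
```

Regenerating this file on a schedule helps, but serving a 301 (to a relevant category) or a 410 for expired product URLs, rather than a plain 404, is arguably what does most to quiet the Webmaster Tools error reports.
-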
What is Google's Penguin effect on SEO?
I want to know about Google's Penguin update - specifically, how it works against spam links and similar tactics, and how I can protect my site from problems with it. Kind regards, John
Technical SEO | JohnDooley
-
Two companies merge: website A 301 redirects to website B. Problems?
Hi, last December the company I work for merged with another company. The website of company A was taken offline and its home page was 302 redirected to a page on website B. This page had information about the merger and the consequences for customers. The deeper pages of website A were 301 redirected to similar pages on website B. After a while, the traffic from the redirected home page decreased and we thought it was time to change the 302 into a 301 redirect to the home page, because there are still a lot of links to the home page of website A and we wanted to preserve the link juice. Two weeks ago we changed the 302 redirect from website A into a 301 redirect to the home page of website B. Last week the Google Webmaster Tools account of website B showed the links from the 301 redirected website A. The total number of links doubled, and the top anchor text is now the name of company A instead of company B. This, of course, could trigger an alarm at Google, because we got a lot of new links with different anchor text - a tactic used by spammers/black hats. I am a bit worried that our change will be penalized by Google, but the change is legitimate: it is to the advantage of our customers to find us if they search for the name of company A or click on a link to website A. We haven't submitted a change of address for domain A in Google Webmaster Tools yet. Is it a good idea to submit a change of address from domain A to domain B? Are there other precautions we can take?
Technical SEO | NN-online
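If it helps to double-check what Google now sees, a short script can confirm that each old URL returns a single 301 hop to the intended target. This is only a sketch: website-a.com and website-b.com are placeholders (the real domains are not named in the question), the URL pairs are illustrative, and it assumes the requests library is installed.

```python
# Spot-check that old-domain URLs 301 directly to the mapped new-domain pages.
import requests

REDIRECT_MAP = {
    "https://www.website-a.com/": "https://www.website-b.com/",
    "https://www.website-a.com/some-deep-page/": "https://www.website-b.com/similar-page/",
}

for old_url, expected in REDIRECT_MAP.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    ok = status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{'OK ' if ok else 'FIX'}  {old_url} -> {status} {location}")
```
-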
Forwarding URLs Have No SEO Value?
Good morning from a -3°C, paths-not-gritted Wetherby, UK 😞 Imagine this scenario: http://www.barrettsteel.com/ has been optimised for "steel suppliers" & "steel stockholders". After running an on-page SEO Moz report, the recommendation is that the target terms should be placed in the URL, e.g. www.steel-suppliers.co.uk. Now, the organisation will not change the URL but thinks that setting up a forwarding URL - e.g. registering www.steel-suppliers.co.uk and forwarding it to www.barrettsteel.com - will be of benefit from an SEO perspective. But I think not. So my question is: "Is a forwarding URL of no value, whereas a permanent URL (struggling for the terminology to describe the URL a site is set up with) such as www.steel-suppliers.co.uk would be of value?" Any insights welcome 🙂
Technical SEO | Nightwing
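One practical wrinkle worth checking: registrar "web forwarding" is often implemented as a frame or meta refresh rather than a server-side 301, and only a genuine 301 stands any chance of passing equity. A rough Python check under that assumption (requests assumed installed, domain taken from the scenario above):

```python
# Inspect how the forwarding domain redirects, without following the hop.
import re
import requests

resp = requests.get("http://www.steel-suppliers.co.uk/", allow_redirects=False, timeout=10)

if resp.status_code in (301, 308):
    print("Permanent redirect to", resp.headers.get("Location"))
elif resp.status_code in (302, 307):
    print("Temporary redirect to", resp.headers.get("Location"), "- worth switching to a 301")
elif re.search(r"<frame|http-equiv=[\"']refresh", resp.text, re.IGNORECASE):
    print("Frame/meta-refresh forwarding - passes little or no link equity")
else:
    print("No redirect detected; status", resp.status_code)
```
-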
What's the max number of links you should ever have on a page?
Our homepage has a few hundred links, and our index pages (pages that link to our spintext pages) have about 900 links on them with no content. Our SEO guy said we have to keep the links under 1,000, but I wanted to see what you guys think.
Technical SEO | upper2bits
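If you want a number rather than a guess, a short audit script can count the anchors on any page. A standard-library Python sketch (the example.com URL is a placeholder, and the 1,000 threshold simply echoes the figure from the question):

```python
# Count <a href> links on a page, split into internal and external.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCounter(HTMLParser):
    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href or href.startswith(("#", "javascript:", "mailto:")):
            return
        host = urlparse(urljoin(self.base_url, href)).netloc
        if host == self.base_host:
            self.internal += 1
        else:
            self.external += 1

def count_links(url: str) -> None:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    counter = LinkCounter(url)
    counter.feed(html)
    total = counter.internal + counter.external
    note = "  <-- over 1,000, worth trimming" if total > 1000 else ""
    print(f"{url}: {total} links ({counter.internal} internal, {counter.external} external){note}")

if __name__ == "__main__":
    count_links("https://www.example.com/")
```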