Mission Possible? You have 3 hours to do Local SEO. Which top 5 sites do you pick for social bookmarking, local search engine submission and directory listing?
-
Mission Possible? Here is a test. Suppose you had 3 hours (okay, 7) to go and submit links on social bookmarking sites, local search engines and directories. Which top 5 or more of each would you do? (Assuming your on-page SEO is already sweetened.)
I just got 2 more clients and I need to get started on a few things for each.
Thankful for all your advice.............
-
Well, you can start by authenticating your site in Google and Bing, then move on to brightlocal.com / whitespark.ca, the main social profiles (Twitter, Google+, Facebook, LinkedIn) and social bookmarking sites (5-10 tops).
Web directories take a lot of time to get into, but do try ODP, JoeAnt and BOTW, and then aim for local directories plus links from your customers' connections.
Do you need a list of sites or just some pointers?
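(On the authentication step: the verification itself is usually just a meta tag in the home page's head, if a DNS record isn't easier. A minimal sketch; the content values below are placeholders, and the real tokens come from Google Search Console and Bing Webmaster Tools.)
<!-- Placeholder verification tags; swap in the tokens your webmaster tools accounts generate -->
<meta name="google-site-verification" content="YOUR-GOOGLE-TOKEN" />
<meta name="msvalidate.01" content="YOUR-BING-TOKEN" />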
-
Without knowing your client's industry, generally speaking, if I had 3 hours to work on Local SEO I would split it up between Google+/Google+ Local and Yelp. If I had any time left over, I would try to make connections with reporters from local news websites.
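(One small on-site check that supports both of those listings: make sure the business name, address and phone number on the site are marked up so they match the citations exactly. A minimal LocalBusiness sketch; every value below is a made-up example, not a real client.)
<!-- Hypothetical example values only -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Clinic",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "url": "http://www.example.com/"
}
</script>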
-
Patch.com is a good start (even commenting on related posts)
-
How many started to hum the Mission: Impossible theme as you were reading this post?
-
I would not do social bookmarks or search engine submissions.
I would:
- search for a few good quality industry or local directories to submit to
- try to get links from vendors, local businesses, etc. the client has a relationship with
Related Questions
-
Can you combine YouTube and on-site hosting as part of a Video SEO strategy?
My question is sparked by how Moz uses its Whiteboard Friday videos. We are currently capturing video stories from our customers. It's excellent and engaging content we'd love to share with a wider audience. I'm putting together a strategy for video SEO to drive traffic to our site, and Moz's approach intrigues me.
As we know, the world of video rich snippets changed in 2014: their appearance in universal search reduced dramatically, and what remained was almost entirely (90%+) YouTube snippets. Useless if you're looking to drive traffic to your own site. Of course, it's still possible to earn SERPs for video in Google video search, but I imagine the search volume is greatly reduced.
From what I can see, Moz first hosts their Whiteboard Friday video on Wistia, complete with transcript and whiteboard capture. Surprisingly, I see no Schema markup for the video. Can anyone shed light on why this might be a good idea? 3-6 months later the same video is then uploaded to YouTube, with the same title and a similar description. The end result is multiple SERPs in universal search, almost always in the following order:
- the original post on Moz
- a YouTube result complete with a video rich snippet
This has me asking the following questions. I have some theories, but I'd love your input:
- Why use two platforms to upload and host the video? Why not just YouTube?
- Why avoid using Schema on the Wistia video hosted on the original post? Surely this would allow an additional result in Google Video Search?
- Why wait 3-6 months after the first post to upload the YouTube video?
Intermediate & Advanced SEO | RobertChapman
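(For reference on the schema point: video markup for a self-hosted player is typically a VideoObject block in the page. A rough sketch below; every URL and value is a placeholder, and this is not Moz's actual implementation.)
<!-- Hypothetical VideoObject markup; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "VideoObject",
  "name": "Customer Story: Example",
  "description": "A short customer interview hosted on our own site.",
  "thumbnailUrl": "http://www.example.com/videos/customer-story-thumb.jpg",
  "uploadDate": "2015-06-01",
  "contentUrl": "http://www.example.com/videos/customer-story.mp4",
  "duration": "PT3M20S"
}
</script>
-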
Huge organic drop following new site go live
Hi guys, I am currently working on a site whose organic traffic suffered (and is still suffering) a huge drop: from a consistent 300-400 organic visits a day to almost zero. This happened as soon as the new site went live, and I am now digging to find out why.
301s were put in place (over 2,500 of them) and there are still over 1,100 outstanding after reviewing Search Console this morning. Having looked at the redirect file that was put in place when the new site went live, it all looks OK, apart from the fact that the redirects look like this: http://www.physiotherapystore.com/ to http://physiotherapystore.com/ - the new URLs are missing the www. I am concerned this is causing a large duplicate content issue, as both the www and non-www versions work fine. Am I right to be concerned, or is this something not to worry about?
Intermediate & Advanced SEO | HappyJackJr
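(If the www/non-www split turns out to be the culprit, the usual fix is a blanket 301 from one hostname to the other, placed ahead of the page-level redirects. A sketch below, assuming an Apache server with mod_rewrite; this is not the site's actual config.)
# Hypothetical .htaccess rules: 301 every www request to the bare domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.physiotherapystore\.com$ [NC]
RewriteRule ^(.*)$ http://physiotherapystore.com/$1 [R=301,L]
-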
On-site Search - Revisited (again, *zZz*)
Howdy Moz fans! Okay, so there's a mountain of information out there on the web about internal search results, but I'm finding some contradiction and a lot of pre-2014 stuff. I'd like to hear some 2016 opinion, specifically around a couple of thoughts of my own, as well as some I've deduced from other sources.
For clarity, I work on a large retail site with over 4 million products (product pages), and my predicament is this: I want Google to be able to find and rank my product pages. Yes, I can link to a number of the best ones by creating well-planned links via categorisation, silos, efficient menus, etc. (done), but can I utilise site search for this purpose? It was my understanding that Google bots don't/can't/won't use a search function. How could they? It's like expecting them to find your members-only area: they can't log in. How can they find and index the millions of combinations of search results without typing in "XXXXL underpants" and all the other search combinations? Do I really need to robots.txt my search query parameter? How/why/when would Googlebot generate that query parameter?
Site search is B.A.D. I read this everywhere I go, but is it really? I've read: "It eats up all your crawl quota", "search results have no content and are classed as spam", "results pages have no value". I want to find a positive SEO output to having a search function on my website, not just try to stifle Mr Googlebot.
What I am trying to learn here is what the options are, and what their outcomes are. So far I have:
- Robots.txt - remove the search pages from Google
- No Index - allow the crawl but don't index the search pages
- No Follow - I'm not sure this is even a valid idea, but I picked it up somewhere out there
- Just leave it alone - some of your search results might get ranked and bring traffic in
It appears that each and every option has its positive and negative connotations. It'd be great to hear from the community on their experiences in this practice.
Intermediate & Advanced SEO | Mark_Elton
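(For what it's worth, the first two options above look roughly like the sketch below, assuming the internal search URLs follow a /search?q=... pattern; the real path and parameter name will differ.)
# Hypothetical robots.txt rules: keep crawlers out of internal search results
User-agent: *
Disallow: /search
Disallow: /*?q=
The alternative is to leave the results crawlable and put <meta name="robots" content="noindex, follow"> on the results template. The two shouldn't be combined, because a URL blocked in robots.txt can't be crawled, so its noindex tag is never seen.
-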
How to remove my site's pages from search results?
I have tested hundreds of pages to see if Google will properly crawl, index and cache them. Now I want these pages to be removed from Google search, except for the homepage. What should the rule in robots.txt be? I use this rule, but I am not sure if Google will remove the hundreds of pages (from my testing):
User-agent: *
Disallow: /
Allow: /$
Intermediate & Advanced SEO | esiow2013
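(A caution on that approach: Disallow stops crawling, but URLs that are already indexed can linger in results as URL-only listings. The usual way to actually remove indexed pages is to leave them crawlable and add a noindex directive to each one, roughly like this sketch:)
<!-- Hypothetical tag for the head of every page that should drop out of the index -->
<meta name="robots" content="noindex">
Once the pages have dropped out, a robots.txt block like the one above (the Allow: /$ pattern keeps only the homepage crawlable for engines that support the $ anchor) can go in to keep crawlers away.
-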
Does Automated High Quality Content Look Like Low Quality to Search Engines?
I have 1,000+ pages that all have very similar writing, but different results.
Example:
- Nr of days on market
- Average sales price
- Median sales price
- etc.
All the results are very different for each neighborhood. However, as per the above, the wording is similar. The content is very valuable to users. However, I am concerned search engines may see it as low-quality content, as the wording is identical across all these pages (except the results). Any view on this? Any examples to back up such views?
Intermediate & Advanced SEO | khi5
-
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/
How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those directory listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site?
Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing referenced above - http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML
This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff
As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that directory listing page and, provided that we have this URL in our sitemap (http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff), solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | danatanseo
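(If blocking crawling feels too risky, another option, assuming the site runs on Apache with mod_headers, is to leave the /jsp/ listings crawlable but serve a noindex header for anything under them; the /StoreFront/category/ pages live at different URLs, so they would be unaffected. A sketch for the server or virtual host config, not a tested snippet for this site:)
# Hypothetical Apache config: noindex everything under the raw directory listings
<LocationMatch "^/StoreFront/jsp/">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
-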
HTML 5 sites, segmentation and Meta data?
Hello Mozers, I am currently building an HTML5 site and I've run into a couple of issues. While implementing segmentation in each of my main menu items, I am able to plug in meta data only for one segment (or the page); I am unable to insert meta data for each of the segments.
For example, I have (main menu) Services ----> Submenu (Teaching, Upgrading, Dancing). I can implement meta data for Services but not for Teaching, Upgrading and Dancing, as they are segments of the same page. What's the best logic to get around this?
Intermediate & Advanced SEO | waspmobile
-
Subdomain versus separate domains: which is better for search engine purposes?
We are pitching to a hotel client to build two new websites, a summer website and a winter website: two completely different-looking websites. The client wants to automatically switch their domain name to point to one or the other, depending on the time of year. The customer does not want to use a landing page where you would choose which site to visit; they want the domain name to go directly to the relevant website.
Our options:
- Set up two new domain names and optimise each website based on the holiday season and facilities offered at that time of year, then change the existing domain name to point at the website that is in season.
- Or use the existing domain name and set up two subdomains, switching the home page as necessary.
We have been chewing this one over for a couple of days. The concern that we have with both options is loss of search visibility. The current website performs well in search engines; it has a home page rank of 4 and sub-pages ranking 2 and 3's. When we point the domain at the summer site (the client only has a winter website at present), we will lose all of the search engine benefits already gained. The new summer content will be significantly different from the winter content. We then work hard for six months optimising the summer site and switch back to the winter site, and the content will be wrong.
Maybe because it's Friday afternoon we cannot see the light for the smoke of the cars leaving the car park for the weekend, or maybe there is no right or wrong approach. Is there another option? Are we not seeing the wood for the trees? Your comments highly welcome. Martin
Intermediate & Advanced SEO | Bill-Duff