Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Large Competitor closed, how to capitalize in search. Any ideas?
-
Hey Mozzers,
One of our biggest competitors closed down in several US cities on January 1st, 2020, though they did stay open in some areas. The competitor's website is www.execucar.com. This is a very large company with a presence at almost all major US airports. It's a private car service, similar to Uber but aimed at wealthy individuals.
For example, when you search "lax car service" they are still #3 on Google, and for "car service to lax" they are still #2.
What can we do to capture more of their traffic and actual business? Has anyone done something like this before, or does anyone know quick and easy tactics to win over their clients? We have a local landing page, https://dcacar.com/lax-car-service, that ranks 9 through 11 for those same keywords.
Thanks for your thoughts and time.
Davit
-
Hi Miriam,
Hope you're doing well. Thanks for always answering my questions.
Yes, they closed at LAX in Los Angeles and in many other US cities as well, but they still rank very well for that location. It has been almost two months since they stopped their operation at LAX. And to answer your second question: yes, we do serve LAX. We're not ranking as well as we would like, but I was hoping to capitalize on this opportunity.
One important thing to mention: their website does not say that they no longer serve these locations; in fact, their city pages still list them as served. But if you try to get a quote online, or even call the phone number, you get a response that they no longer serve the location.
The company was sold to a venture capital firm, which decided to close some of the unprofitable cities. I reached out to the venture firm to see if it was possible to buy their local phone number, customer email list, and so on, but they are not interested in selling. So I was wondering what else I can do to win over their customers. They have been in business for almost 20 years and have a large customer list.
-
Hi Davit!
Some questions:
Are you saying that this business closed its location that services LAX, but they are still ranking for it?
Or, are you saying they still have a location open there?
Do you serve LAX?
Please, let me know. Thanks!
-
The same thing happened with https://www.devicesprice.com/. I have hired a worker to help me with this, but I am also looking for good ideas.
Related Questions
-
Google Image Search - Is there a way to influence the related icons at the top of the image search results?
Google recently added related icons at the top of the image search results page. Some of the icons may be unrelated to the search. Are there any best practices to influence what is positioned in the related image icons section? Thank you.
Intermediate & Advanced SEO | | JaredBroussard1 -
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know the best practice for doing that. For reference, the reason we are looking for ways to speed it up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and unfortunately Google indexed them much faster than it is taking to deindex them. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although that should work in theory, we are looking for white-hat methods that are faster than "being patient and waiting it out," since that would likely take months if not years at Google's current crawl rate for our site.
Intermediate & Advanced SEO | | teddef0 -
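As a side note on the sitemap-of-noindexed-URLs idea mentioned in the question above, here is a minimal sketch of what such a file generator could look like, assuming a plain text file with one URL per line as input; the file names are hypothetical. Keep in mind that sitemap files are capped at 50,000 URLs each, so 500,000 URLs would need to be split across multiple files referenced from a sitemap index.

```python
# Sketch: build an XML sitemap from URLs that already carry a noindex tag,
# so crawlers revisit them (and drop them from the index) sooner.
# "urls-to-deindex.txt" and the output file name are hypothetical examples.
from xml.sax.saxutils import escape

def write_deindex_sitemap(urls, out_path="sitemap-deindex.xml"):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

if __name__ == "__main__":
    with open("urls-to-deindex.txt", encoding="utf-8") as f:
        write_deindex_sitemap(line.strip() for line in f if line.strip())
```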
Which search engines should we submit our sitemap to?
Other than Google and Bing, which search engines should we submit our sitemap to?
Intermediate & Advanced SEO | | NicheSocial0 -
Crawled page count in Search console
Hi Guys, I'm working on a project (premium-hookahs.nl) where I have stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindexed the filter pages, excluded those filter pages in Search Console and robots.txt, and canonicalized the filter pages to the relevant category pages. This, however, didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally I expected a drop in crawled pages, but they are still sky high; I can't imagine Google visits this site 40 times a day. To complicate the situation, we're running an experiment to gain positions on around 250 long-term searches: a few filters will be indexed (size, color, number of hoses, and flavors) and three of them can be combined, which results in around 250 extra pages. Meta titles, descriptions, H1s, and texts are unique as well. Questions: Should excluding pages in robots.txt result in Google not crawling those pages? Is this number of crawled pages normal for a website with around 1,000 unique pages? What am I missing?
Intermediate & Advanced SEO | | Bob_van_Biezen0 -
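On the robots.txt question above, one quick sanity check is to confirm that the filter-page URL patterns are actually disallowed for Googlebot. Below is a minimal sketch using Python's standard-library robots.txt parser; the domain and test URLs are hypothetical placeholders. Also worth remembering: a URL blocked by robots.txt cannot be crawled at all, so any noindex tag or canonical on that page will never be seen, which can leave already-indexed URLs lingering in the index.

```python
# Sketch: verify which URLs a site's robots.txt blocks for Googlebot.
# The domain and test URLs are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()  # fetch and parse the live robots.txt

test_urls = [
    "https://www.example.com/shishas?color=red&size=large",  # a filter page
    "https://www.example.com/shishas",                        # the category page
]
for url in test_urls:
    verdict = "allowed" if robots.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict:8s} {url}")
```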
Is it a good idea to remove old blogs?
So I have a site right now that isn't ranking well, and we are trying everything to help it out. One of my areas of concern is that we have A LOT of old blog posts that were not well written and, honestly, are not very relevant. None of them rank for anything, and they could be causing a lot of duplicate content issues. Our newer blog posts are written in more of a Q&A format and seem to be doing better. So my thought is basically to wipe out all the blog posts from 2010-2012 -- probably 450+ posts. What do you guys think?
Intermediate & Advanced SEO | | netviper1 -
Should I noindex the site search page? It is generating 4% of my organic traffic.
I read about some recommendations to noindex the URL of the site search. I checked in analytics that the site search URL generates about 4% of my total organic search traffic (<2% of sales). My reasoning is that site search may generate duplicate content issues and may prevent the more relevant product or category pages from showing up instead. Would you noindex this page or not? Any thoughts?
Intermediate & Advanced SEO | | lcourse0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's the best practice when "the damage is done"? I have a project where a substantial part of our visitors and income (about 3%) lands on internal search pages, because Google has indexed them. I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- They make for a bad user experience.
- The search pages are (probably) stealing rankings from our real landing pages.
- We received the webmaster notification "Googlebot found an extremely high number of URLs on your site" with links to our internal search results.
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I'm looking forward to your answers! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | | HrThomsen0 -
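For anyone auditing a setup like the one described above before (or after) adding a meta noindex,follow tag, a small script can confirm what the internal search URLs actually return. The sketch below uses only the Python standard library; the example URL is hypothetical and the meta-tag check is deliberately simplified (it assumes the name attribute appears before content). Note that Google has to be able to recrawl a page to see a newly added noindex, so blocking the same URLs in robots.txt at the same time would hide the tag.

```python
# Sketch: fetch a URL and report whether it signals noindex, either via an
# X-Robots-Tag response header or a robots meta tag in the HTML.
# The URL below is a hypothetical internal search page.
import re
import urllib.request

def has_noindex(url):
    with urllib.request.urlopen(url) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="replace")
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    content = meta.group(1) if meta else ""
    return "noindex" in header.lower() or "noindex" in content.lower()

if __name__ == "__main__":
    print(has_noindex("https://www.example.com/search?q=widgets"))
```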
Link Building Ideas for a health site
Hi, I am trying to rank a health-related website. This is the URL: www.ridpiles.com. The domain is 1 year and 6 months old. So far I have done directory submissions, blog comments and forum posts, social bookmarks, and a few article submissions. I have also done competitor analysis. All of my competitors just have links from directories and some link exchanges, plus links from quality sites like the Yahoo directory. I know my site is far better than my competitors' and has 100% unique content. I have submitted to the Yahoo directory, but no luck so far; I haven't been accepted into it. I am considering a sponsored review, but I don't know whether the link would be worth that much money. That leaves guest blogging, which I see as the only option for me to build links. But the competition is very tough; I must compete with highly reputed sites like webmd.com, so I need more good links, and I can't figure out what other ways there are to get authoritative links. If guest blogging is the only option for me, how many posts do I need to do daily? And can someone suggest good guest blogging sites? Any help would be appreciated.
Intermediate & Advanced SEO | | Indexxess0