Bing ranking a weak local branch office site of our 200-unit franchise higher than the brand page - throughout the USA!?
-
We have a brand with a major website at ourbrand.com. I'm using stand-ins for the actual brand name.
The brand name is a unique term, the brand has 200 local offices with pages at ourbrand.com/locations/locationname, the site is structured with best practices, and it has a well-built sitemap.xml. The link profile is diverse and solid. There are very few crawl errors and no warnings in Google Webmaster Central. Each location page has schema.org markup that has been checked with markup validation tools. No matter what tool you use or how you look at it, it's obvious this is the brand site. DA 51/100, PA 59/100.
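For reference, the markup on each location page looks roughly like this (a minimal sketch with placeholder NAP values and a made-up Seattle example, not our actual data):

```html
<!-- Illustrative only: placeholder NAP values for a single franchise location page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "OurBrand - Seattle",
  "url": "https://www.ourbrand.com/locations/seattle",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Seattle",
    "addressRegion": "WA",
    "postalCode": "98101",
    "addressCountry": "US"
  },
  "parentOrganization": {
    "@type": "Organization",
    "name": "OurBrand",
    "url": "https://www.ourbrand.com/"
  }
}
</script>
```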
A rogue franchisee has broken their agreement and built their own site for one city on a different domain, ourbrandseattle.com. The site is clearly optimized for that city and has a weak inbound link profile. DA 18/100, PA 21/100. The link profile has low diversity and is generally weak. They have no social media activity. They have not linked to ourbrand.com <- my leading theory.
**The problem is that this rogue site is OUTRANKING the brand site all over the USA on Bing**, even where it makes no sense at all. We are using whitespark.ca to check our rankings remotely from other cities and to strip out the effects of local personalization.
What should we do? What have I missed?
-
Hi Scott,
No offense intended. We all run into problems we don't know how to solve. This is pretty specialized - it calls for someone who not only knows Bing really well, but who also knows whether there are issues with Bing's local algorithm that could result in a branch outranking a main office. It's the Bing part of this that I think makes the question hardest. I feel like I'm pretty good at Local SEO, but because of Google's dominance in this sphere, I simply don't know enough about Bing to have ready, helpful examples for you. Sorry about that, and I really do hope you can get this sorted out. I'd love to hear what you learn.
-
I used to think of myself as a heavy-hitter SEO, LOL.
I'd welcome any other answers on this, and will try to talk to Bing.
If I find a solution I'll report back.
-
Hi Scott,
I have seen a similar, but not identical, issue reported numerous times in the Google Places Help Forum, in which a branch office is outranking the main headquarters of the business in Google. See threads like this one:
Basically, when this comes up, the advice typically given is that there is no way to force the search engine to choose one location over another, and that the only thing the owner can do is to work on building the authority of the main location in hopes that Google will catch on.
Your situation is different in 3 important ways:
1. You're talking about Bing, not Google.
2. You're dealing with a broken contract and apparently don't have control over the branch office.
3. Your assessment is that the authority of the main office already far surpasses that of the branch office.
Unfortunately, I don't have an easy answer for you here, but I do have a suggestion. I recommend that you try Bing's support chat to see if you can talk to a live person about the issue:
You will need to share the actual details of the business - not stand-ins - of course. Without the real data, no one will be able to assess what is going on, so hopefully you have the client's permission to share their info through something like a live chat. I'd be very interested to hear what the rep tells you.
If you don't get anywhere with this, I would recommend hiring a heavy-hitting SEO, with whom you can share full details, to help you audit the situation. Someone who is highly experienced with both Bing and Local would be my pick, though such a person may not be easy to find, as Local SEO is so Google-centric.
I hope I've given you at least a starting point here.