Structured data and Google+ Local business page are conflicting
-
Hi,
A few months ago (almost eight now) we added structured data to our website, which according to the testing tool should work. (Our URL: https://www.rezdy.com)
However, when searching for our company name, our old local business page from Google+ still shows up. I have reached out to Google to tell them that we aren't a local business anymore and want the data from that page removed, but this all takes painfully long. I want my search result to look like those of the large businesses (examples: AdRoll, HubSpot), including logo, Twitter feed, and so on.
Will this work? If so, is there a way to speed up the process? Any suggestions?
-
Hi Niek,
You know, this is a very tricky one. Frankly, I wouldn't advise reporting the business as closed to Google, because you don't really want to send a signal that the place has gone out of business, even if you're no longer operating as a local company. Normally I'm helping business owners who want to be sure they're doing well locally, so this question is reverse engineering for me. Obviously, you can't force Google to show a different type of knowledge panel (like the one they show for HubSpot). I can confirm that what they are showing for a branded search on your business is a local-type knowledge panel that clicks through to the local data in the Maps view, but trying to close the listing might not be the right thing to do.
I seldom make a recommendation like this, but if I were in your shoes, I'd see if I could get Joy Hawkins at Imprezzio Marketing to consult with me. She has made a special study of particular aspects of Google listings, including closing them, duplicates, etc. She would be the person I'd reach out to for some consulting time to see if you can erase a local footprint without harming your overall signals to Google that this business is, in fact, in operation but simply not in local operation anymore. Hope this recommendation is helpful.
-
Hey Nitin,
I've done that, and according to the tool my https://schema.org/Organization markup is filled in and should work.
But the Local Google+ page seems to be in the way.
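For anyone comparing notes: a minimal schema.org/Organization payload can be sketched as below. All values here are placeholders, not Rezdy's actual markup, and the social profiles in sameAs are the usual way to tie a Twitter account to the brand entity. Building the JSON in Python and serializing it with json.dumps is a cheap way to guarantee the embedded JSON-LD is syntactically valid before pasting it into the page:

```python
import json

# Placeholder values -- substitute your real organization details.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
        "https://twitter.com/example",       # social profiles help Google
        "https://www.facebook.com/example",  # connect the brand entity
    ],
}

# json.dumps guarantees the embedded JSON-LD is well-formed JSON.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(organization, indent=2)
)
print(snippet)
```

The resulting script tag goes in the head of the homepage; Google's structured data testing tool should then report the Organization type without errors.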
-
Related Questions
-
Link to AMP VS AMP Google Cache VS Standard page?
Hi guys, during link building, which version should I prefer as a destination: the normal version (PHP page), the AMP page of the website, or the AMP page in the Google cache? The main doubt is between the AMP version of the website and the standard version. Does the canonical meta tag make the situations equivalent, or is there a better solution? Thank you so much!
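For context on why the canonical question matters here: the common pattern is a reciprocal pairing in which the standard page declares its AMP twin with rel="amphtml" and the AMP page points back with rel="canonical", so link equity pointing at either URL should consolidate on the standard page. A hedged sketch of that tag pair, using made-up URLs:

```python
# Sketch of the reciprocal link tags that tie a standard page to its AMP
# version (URLs are hypothetical examples, not from this thread).
standard_url = "https://www.example.com/page.php"
amp_url = "https://www.example.com/amp/page"

# Goes in the <head> of the standard (PHP) page:
standard_head = '<link rel="amphtml" href="{}">'.format(amp_url)

# Goes in the <head> of the AMP page:
amp_head = '<link rel="canonical" href="{}">'.format(standard_url)

print(standard_head)
print(amp_head)
```

With that pairing in place, building links to the standard URL is the safe default, since the Google cache copy of the AMP page is not a URL you control.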
Technical SEO | Dante_Alighieri
-
Google Results Title vs My Page Title
I'm having some trouble with the titles of a new site. It has been online for around two months now, and I'm getting weird titles for most indexed pages. Since my site is focused on finding courses, the course title format is the following:
URL: https://www.maseducacion.com/estudios/programacion-curricular--tecnigrap-2982
My Title: Course - Institute | Mybrand
Google Search Title: Course - Institute | Mybrand - Educativa
Half of my results have that word at the end. I don't know where it comes from; that word is only included in two links. Any idea how to fix it?
Technical SEO | JoaoCJ
-
How can I get Google to forget an https version of one page on my site?
Google mysteriously decided to index the broken https version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over https, and the CSS doesn't load). The page already has many incoming links to the http version, and it has a canonical URL pointing to http. I resubmitted it over http in Webmaster Tools. Is there anything else I could do?
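Beyond the canonical tag, a server-side 301 from the https URL to its http counterpart is the strongest signal that the https copy should be dropped. A minimal sketch of the redirect target mapping (a hypothetical helper, not the asker's code):

```python
from urllib.parse import urlsplit, urlunsplit

def downgrade_to_http(url):
    """Return the http:// equivalent of an https:// URL.

    Serving a 301 from the https version to this target, on top of the
    existing canonical tag, gives Google an unambiguous signal to drop
    the https copy from the index.
    """
    parts = urlsplit(url)
    if parts.scheme == "https":
        parts = parts._replace(scheme="http")
    return urlunsplit(parts)

print(downgrade_to_http("https://www.example.com/broken-page"))
# -> http://www.example.com/broken-page
```

In practice the redirect would be configured at the web server for just that path; the function only illustrates the URL transformation involved.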
Technical SEO | BostonWright
-
Google Webmaster Tools showing 0 indexed, yet I can see them all in Google search?
I can see all the pages showing up in Google when I search for my site. But in Webmaster Tools, under the Sitemaps section, the red bar for indexed pages shows 0, even though they seem to be indexed. Any idea why it shows this? I don't really think it's that important, since the pages are still indexed, but it just seems odd. Please see the attached image.
Technical SEO | Perfect007
-
Google Local Gone Loco
I am a bankruptcy attorney in Southern California. I have been doing my own SEO since I had a couple of bad experiences paying someone to "do" it in the past. If you want it done right, do it yourself, I suppose. Anyway, I have been ranking well in Google local results. At first I peeked in at 3/3 showing on the first page of the searches. Then I climbed to number 2 in local searches, probably as a result of finding sites and making sure my addresses, phone numbers, and business names were all correct. However, this week (as I climbed to the #3 spot in the local search for my city + bankruptcy attorney), my Google local result dropped to page 2. One of my employees rated me on Google Local and gave me a Google +1, which is now gone, and the pictures that I uploaded to Google Local are gone too. I don't know if this is some kind of penalty because an employee gave me a rating (they were completely up front about working for me), or if something else is going on. I was also trying to claim my business on Yahoo, which resulted in some kind of "Account Suspension". I have no idea what is going on. You can take a look at my site if it helps: http://ashcraftfirm.com. We are trying to rank for "murrieta bankruptcy attorney". Thanks for any help you can provide.
Technical SEO | gcashcraft
-
Domain structure for US Local Sites
We are planning on opening localized versions of our website throughout the world and in the US. For other countries these websites will be www.site.co.uk, www.site.fr, etc. For the US, would it be better to add the states onto part of the domain name or use a sub-folder? What are the advantages/disadvantages of each? Meaning, should it be nj.site.com or site-nj.com?
Technical SEO | theLotter
-
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest...
We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat-map purposes, so for instance www.three-clearance.co.uk/apple-phones.html could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told about the ref parameter, and the canonical meta tag was used to indicate our preference. As expected, we encountered no duplicate content issues and everything was good.
This is the chain of events:
1. Site migrated to the new platform following best practice, as far as I can attest to. The only known issue was that the verification for both Google Analytics (meta tag) and GWMT (HTML file) didn't transfer as expected, so between relaunch on 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified.
2. URL structure and URIs were maintained 100% (which may be a problem, now).
3. Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expanding the report, the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I. Run, not walk, to Google and do some Fu: http://is.gd/yJ3U24 (9 versions of the same page in the index, the only variation being the ?ref= URI).
4. Checked Bing, and it has indexed each root URL once, as it should.
Situation now:
- The site no longer uses the ?ref= parameter, although of course some external backlinks that use it still exist. This was intentional and happened when we migrated.
- I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today is at over 1,000 (another wtf moment).
- I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and the HTML site-map page.
The ?ref= URLs in the index have the disadvantage of actually working, given that we transferred the URL structure, and of course the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist, and won't drop them from the index but will instead apply a dupe-content penalty. Or maybe call us a spam farm. Who knows.
Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄) include:
A) robots.txt-ing *?ref=* — but to me this says "you can't see these pages", not "these pages don't exist", so it isn't correct.
B) Hand-removing the URLs from the index through a page removal request per indexed URL.
C) Applying a 301 to each indexed URL (hello Bing dirty-sitemap penalty).
D) Posting on SEOmoz because I genuinely can't understand this.
Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use, because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting. I have no idea why and can't think of the best way to correct the situation. Do you? 🙂
Edited to add: As of this morning the "edit/reset" buttons have disappeared from the GWMT URL Parameters page, along with the option to add a new one. There's no message explaining why, and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
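If the server can issue redirects (option C in the post), the mapping from any ?ref= variant to its canonical target is mechanical. A hedged sketch, using the URL pattern from the thread (strip_ref_param is a hypothetical helper, not code from the site):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def strip_ref_param(url):
    """Return the URL with any legacy ?ref= tracking argument removed.

    A 301 from each ?ref= variant to this target tells Google those
    URLs are permanently gone, which a canonical tag alone may not.
    """
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k != "ref"]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_ref_param("http://www.three-clearance.co.uk/apple-phones.html?ref=menu"))
# -> http://www.three-clearance.co.uk/apple-phones.html
```

Other query arguments, if any, are preserved; only the ref key is dropped, so the rule is safe to apply site-wide at the web server.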
Technical SEO | Tinhat
-
Landing Page URL Structure
We are finally setting up landing pages to support our PPC campaigns. There has been some debate internally about the URL structure. Originally we were planning on URLs like:
domain.com/california, domain.com/florida, domain.com/ny
I would prefer to have the URLs for each state inside a "state" folder, like:
domain.com/state/california, domain.com/state/florida, domain.com/state/ny
I like having the folders and pages for each state under a parent folder, to keep the root folder as clean as possible. Having a folder or file for each state in the root will be very messy. Before you scream "URL rewriting" :-), our current site is still running under Classic ASP, which doesn't support URL rewriting. We tried HeliconTech's ISAPI rewrite module for IIS but had to remove it because of too many configuration issues. Next year, when our move to MVC is complete, we will use URL rewriting. So the question for now: is there any advantage or disadvantage to one URL structure over the other?
Technical SEO | briankb