Verifying Site Ownership & Setting Up Webmaster Tools for Clients Who Use HubSpot
-
We are a HubSpot partner agency. I'm trying to find the best route for managing Google's tools as an extra resource for insight, not as the primary basis for our marketing effort. I also want to explore AdWords in more depth. I'm finding that many of our clients are missing Google Analytics, Webmaster Tools, or both.
- Can I verify site ownership to set up Webmaster Tools simply by having admin access to a client's Analytics account, or does that require ownership of the Analytics account? With Google merging things together these days, I'm not sure of the best approach to take.
- Usually clients have their site hosted somewhere, built on some platform, and then add a HubSpot blog, landing pages, CTAs, and other HubSpot tools on a subdomain hosted by HubSpot. HubSpot has a spot in its website settings for adding Google Analytics (really just a field for pasting code into the header area). If a client has Universal Analytics on their primary domain, do I still need to add a separate Analytics property for the subdomain and install it through HubSpot's tools? Or can I just use the same code from their primary domain and add it to the HubSpot header? What is the best route?
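In case it's useful context, this is roughly what would get pasted into HubSpot's header field: the standard Universal Analytics snippet with the 'auto' cookie setting, which lets the same property track both the primary domain and a subdomain (UA-XXXXX-Y is a placeholder for the client's actual property ID):

```html
<!-- Standard Universal Analytics snippet; UA-XXXXX-Y is a placeholder. -->
<!-- With the 'auto' cookie setting, the _ga cookie is written at the -->
<!-- top-level domain, so www.example.com and a HubSpot-hosted subdomain -->
<!-- can report into the same property. -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
</script>
```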
Any additional thoughts on this subject are welcome. With so much updating and changing coming from Google (and from HubSpot, as we implement COS 3.0), I'm trying to avoid wasted effort, outdated methods, and so on.
Thanks!
-
We were able to successfully verify ownership using the Meta Tag method from WMT. We copied the meta tag, logged into HubSpot, and under Content Settings > Page Publishing, pasted the meta tag into the Site Header HTML box.
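For anyone trying this, the tag WMT gives you looks like the following (the content value here is a made-up placeholder; copy the exact tag your own Webmaster Tools account generates):

```html
<!-- Goes in HubSpot under Content Settings > Page Publishing > Site Header HTML -->
<!-- The content token below is a placeholder; use the one from your WMT account -->
<meta name="google-site-verification" content="your-token-from-wmt" />
```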
Hope this is helpful!
-
I'm curious - were you able to figure out how to verify HubSpot content in Webmaster Tools?
-
I'm a HubSpot user and have been unable to verify the subdomain we are using in WMT, as the root returns a 404, and Webmaster Tools will only accept a URL with a 200 response.
HubSpot will not allow you to put files in the root directory, which eliminates the file-upload verification option.
I'm curious to learn whether you were able to find a way to verify.
-
If they don't have a Webmaster Tools account, you don't need Analytics to set one up. You just need your own Webmaster Tools account and can add their site there.
The second scenario is exactly as you describe: find and remove the old verification code from the website, then verify again.
-Andy
-
Thanks for the feedback.
Andy, what about when they don't have a Webmaster Tools account in place? Most of the time they don't, so I want to set one up for them and use Analytics to verify site ownership. It sounds like Bill is doing it this way.
If they already have a Webmaster Tools account in place and don't know it (someone created it for them and either didn't tell them, or it got lost in personnel changes, and so on), what's the remedy? Find and remove the code from the website and start fresh?
-
Hi Lisa,
If you want to verify a website, then you need access to their Webmaster Tools account. They will need to set you up as an admin, or you will need to add the site to your own profile.
As an admin you can do most things, but you will never be able to do things like submit a disavow file; that requires the owner.
-Andy
-
Lisa, if you have admin access to the Google Analytics account, that is generally enough to verify the site in Google Webmaster Tools. I have only run into verification issues when the site doesn't have the latest Google Analytics tracking code on it.
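Adding to that: before relying on GA-based verification, it can help to confirm the tracking code is actually present in the page source. A rough sketch of that check (the helper names and sample HTML are just illustrative; in practice you'd run it against the fetched source of the client's pages):

```python
import re

def find_tracking_ids(html):
    """Return any Universal Analytics property IDs (UA-XXXXXXX-Y) found in page source."""
    return sorted(set(re.findall(r"UA-\d+-\d+", html)))

def has_verification_tag(html):
    """True if a google-site-verification meta tag is present in the source."""
    return re.search(r'name=["\']google-site-verification["\']', html) is not None

# Illustrative page source (made up for this example):
sample = """
<head>
  <meta name="google-site-verification" content="abc123" />
  <script>ga('create', 'UA-1234567-1', 'auto');</script>
</head>
"""

print(find_tracking_ids(sample))    # ['UA-1234567-1']
print(has_verification_tag(sample)) # True
```

If the page source shows neither a current tracking ID nor a verification tag, that would explain a failed GA-based verification.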