Should we add our site to Google Webmaster Tools?
-
Hello,
Should we add our site, nlpca(dot)com, to Google Webmaster Tools?
Everything's very white hat, but we do have a section on each of our 4 sites for "Our other Sites" that links to the others. It's been there for many years.
We're looking for clues as to why we've dropped in rank.
Thanks!
-
Another useful feature is the ability to remove pages from the index. I've needed to do this a couple of times. It's free and only takes a few minutes to set up, so I'd do it for every site. Google seems to be developing and refreshing a lot of things at the moment too, so I am sure Google Webmaster Tools will only improve with time.
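If it helps, here's a rough Python sketch of a pre-check I'd run before requesting a removal. The URL is just a placeholder and it assumes the requests library is installed; the idea is that a removal request generally only sticks long term if the page 404s/410s or carries a noindex directive.

```python
import re
import requests

def check_removal_readiness(url):
    """Check whether a URL already blocks indexing, which a removal
    request generally needs in order to stay out of the index."""
    resp = requests.get(url, timeout=10)

    # A 404/410 response is enough on its own.
    if resp.status_code in (404, 410):
        return f"{url}: returns {resp.status_code}, safe to request removal"

    # Look for a noindex directive in the X-Robots-Tag header...
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        return f"{url}: noindex set via X-Robots-Tag header"

    # ...or in a robots meta tag in the HTML.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', resp.text, re.I):
        return f"{url}: noindex set via meta robots tag"

    return f"{url}: still indexable -- a removal request may only be temporary"

# Hypothetical page used purely for illustration.
print(check_removal_readiness("http://www.example.com/old-page.html"))
```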
-
The search query report will be of great help for analyzing organic CTR and rankings. Even though the data is only available for the last 35 days, the report is very valuable.
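If you want to dig into that report offline, here's a rough Python sketch for pulling CTR out of a downloaded CSV export. The column names ("Query", "Impressions", "Clicks") and the file name are assumptions; adjust them to match whatever the actual download uses.

```python
import csv

def top_queries_by_ctr(csv_path, min_impressions=100):
    """Compute CTR per query from a search query report CSV export."""
    rows = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Assumed column names; numbers may contain thousands separators.
            impressions = int(row["Impressions"].replace(",", ""))
            clicks = int(row["Clicks"].replace(",", ""))
            if impressions >= min_impressions:
                rows.append((row["Query"], impressions, clicks, clicks / impressions))

    # Highest CTR first, so strong (or weak) titles and snippets stand out.
    for query, impressions, clicks, ctr in sorted(rows, key=lambda r: r[3], reverse=True):
        print(f"{query:40s} {impressions:>8d} {clicks:>6d} {ctr:6.1%}")

# Hypothetical file name for illustration.
top_queries_by_ctr("search_queries.csv")
```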
-
I don't see any downside to using WT. If I read you right, you've avoided WT hoping to hide this reciprocal-links page from Google? Rest assured, Google sees it whether you're on WT or not!
As to diagnosing your rankings drop, WT will not just give you an answer, nor will Google point to the portion of your link profile or content that they don't like. The data they give you on queries and clicks is vague and imprecise (intentionally so), and I don't think it would help much with your issue. WT is very useful for finding crawl errors, uploading a sitemap, and seeing +1 activity. It's just one more tool to help you make your site more useful.
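On the sitemap point: if you don't already have one to upload, here's a rough Python sketch of one way to generate a minimal sitemap file. The example.com URLs are placeholders; a real list would come from your CMS or a crawl.

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Hypothetical URLs for illustration only.
pages = [
    "http://www.example.com/",
    "http://www.example.com/about/",
    "http://www.example.com/contact/",
]

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(pages))
```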
-
Yes! Google Webmaster Tools is a great help in diagnosing crawler issues, queries, duplicates, etc.
Google Analytics can now be linked with Webmaster Tools, allowing for even better data analysis. There is no reason not to have every site you manage up and running with Webmaster Tools.
-
Will it provide all of this even though we are only now adding it? What information will it give us about drops in ranking, given that we're just now getting set up in GWT?
-
Hi Bob,
I think you should definitely add nlpca to Google Webmaster Tools, as it provides clues about the general health of your site.
In GWT you can look at some of the following clues for drops in ranking:
- Malware errors
- Crawl rates
- Duplicate meta data
- Changes in search query/landing page impressions and positions
- Recent server-side issues, e.g. whether a certain page has been detected as a 404 (see the sketch below)
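On that last point, here's a rough Python sketch for spotting 404s and 5xx responses yourself across a list of URLs. The example.com URLs are placeholders, and it assumes the requests library is available; a real run would use the URLs from your sitemap.

```python
import requests

def find_broken_pages(urls):
    """Report any URL returning a 4xx/5xx status, the kind of
    server-side issue GWT surfaces under crawl errors."""
    problems = []
    for url in urls:
        try:
            # HEAD keeps the check lightweight; fall back to GET if a
            # server doesn't handle HEAD requests well.
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code >= 400:
                problems.append((url, resp.status_code))
        except requests.RequestException as exc:
            problems.append((url, str(exc)))
    return problems

# Hypothetical URLs for illustration only.
for url, status in find_broken_pages([
    "http://www.example.com/",
    "http://www.example.com/old-page.html",
]):
    print(f"{url} -> {status}")
```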
Hope this helps,
Vahe