Yes, agreed, but if you see that scraper sites outrank your site in the SERPs, then you should fill out the form.
Thanks
Hi Justin,
I have never used myseotools, so I can't speak to its authenticity. One thing I would like to mention is that a huge number of links from the same site is not a good thing, and Google can easily catch such links.
I would suggest checking in some other tools as well; if you see a similar number of links there too, then you should take it seriously.
Thanks
Hi,
I assume you mean that when you search for a particular keyword on the front end it appears 40 times, but when you check the page source it appears 100 times.
If my assumption is true, first I would like to let you know that keywords used in certain tags don't show on the front end, such as keywords in the meta title, meta description, and other tags. But in your case the difference is huge; it looks like keywords are being hidden by some spam method, which is absolutely not good for SEO.
If you are violating any Google guideline, it will not be good for your site, and your site could be penalized. Use keywords in a natural way to give a better experience to users, not only to the search engine bot.
Hope this helps
Thanks
Hi John,
No, you can upload the XML feed at any time, so in your case you should update the XML feed immediately after the price update on the website. Google doesn't mention anywhere that you have to submit/update the XML feed at 12 AM only; I submit the feed whenever I want.
If your prices change daily at a specific time, then scheduling a fresh data feed upload once those changes go live would be a good idea. This removes any possibility of human error, such as forgetting to upload. Make sure you have an email confirmation set up, though, so you can watch for any upload failures.
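As a sketch of that scheduling idea (the script path and log location are hypothetical; substitute whatever actually regenerates and uploads your feed), a crontab entry could run shortly after the daily price change:

```
# Hypothetical: run the feed upload script every day at 00:15,
# shortly after the daily price change, and log the result.
15 0 * * * /usr/local/bin/upload_feed.sh >> /var/log/feed_upload.log 2>&1
```

The email confirmation mentioned above then acts as your safety net if the scheduled run fails.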
Now, coming to the price difference: although your dashboard may report pricing discrepancies, if you have uploaded your feed and it matches the price listed on your site, it could be that Google crawled your page just before you changed the price on the site, or just before you uploaded the feed reflecting the change.
Hope it helps.
Thanks
Hi Justin,
Yes, you can do that, and it would be best if they remove the link. In the worst case, if they don't remove it or ask you to pay, leave them. You can use the disavow tool and disavow that entire directory domain.
Hope this helps
Thanks
Hi Lewis,
I think you need to give it a little more time. I recently launched a new site and the business info only appeared on the right-hand side of the SERPs when searching for "www.domain.com" rather than the business name. A month later, a search for the business name is pulling up the Google+ info correctly.
You can speed the process up by obtaining high-quality backlinks, ideally with Pea Soup Digital as the anchor text, and/or by sharing great content on your site.
I couldn't replicate the privacy policy issue, so it may well have sorted itself out. To be safe you could block it using robots.txt.
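If you do block it, a robots.txt rule along these lines would work (the path is an assumption; match it to your actual privacy policy URL):

```
User-agent: *
Disallow: /privacy-policy/
```

Just keep in mind robots.txt blocks crawling, not necessarily indexing of already-known URLs.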
Nice site btw
Hi Rob,
I assume you are aware that Google only shows sitelinks for results when it thinks they'll be useful to the user.
If you don't want "Kolea 10A" to show as a sitelink, you can demote it. As far as best practice is concerned, Google says:
"There are best practices you can follow, however, to improve the quality of your sitelinks. For example, for your site's internal links, make sure you use anchor text and alt text that's informative, compact, and avoids repetition."
Thanks
Are you ranking well for all or most of the keywords? If yes, then I wouldn't suggest doing that, because even if you use a 301 redirect you will lose some link juice. If that is not the case, you can go ahead.
My suggestion was based purely on Google's recommendation and on readability.
Thanks
Hi Sanjay,
As far as I know there is no such tool available, and I'll go one step further: Moz is never going to make such a tool.
I'm quoting Moz on this: "Since 2001, not only has search engine submission not been required, but it has become virtually useless."
Read the full post here @ http://moz.com/beginners-guide-to-seo/myths-and-misconceptions-about-search-engines
You can submit to Google & Bing manually.
Thanks
Hi Justin,
A similar question was asked in this post @ http://moz.com/community/q/webmaster-tools-indexed-pages-vs-sitemap
Hope this helps you.
Thanks
Hi,
I don't think that placing an SEO audit tool on your site will give it any ranking boost. You should check your competitors' on-page optimization techniques and their backlinks.
Thanks
Hi Shawn,
It can take months for Google to start deindexing old URLs. There are several posts on this issue; I'm sharing one that will help you resolve yours.
http://moz.com/community/q/proper-301-in-place-but-old-site-still-indexed-in-google
Thanks
Hi ,
Yes you are right.
I'm sharing a comment from a legitimate source on the same topic:
"Due to Wikipedia's model, its a necessity for them to have nofolow external links. It dissuades spam which is a massive problem with their model where anyone can edit and contribute. To answer your question, which is already answered here in the article, no, you do not need to nofollow your outgoing links if they are natural referring links. If you are selling links for traffic then nofollow is the norm. Thinking about "keeping" link juice is in the history books and we really need to stop thinking like its still 2002. Link out when useful to your readers without fear."
You can read more about this here @ https://www.rebootonline.com/blog/long-term-outgoing-link-experiment/
Hope this helps.
Thanks
Hi David,
According to Moz, 'Use Keywords in Your URL' is of moderate importance, so it could affect things a bit, but the on-page grade won't go beyond a B.
I don't think you should worry about it, because in a URL we can usually use only one keyword, and in most cases we target several keywords on a single webpage and they still rank well; it's not possible to use all targeted keywords in a URL.
Thanks
Hi Justin,
If those pages just refresh to a button on your site, that means they aren't useful for users, so you should deindex them.
You can remove the URLs via Google Search Console and also place a meta robots noindex tag on those pages.
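As a sketch, the noindex directive goes in the head of each page you want removed:

```html
<!-- Tells crawlers not to index this page -->
<meta name="robots" content="noindex">
```

Google needs to be able to recrawl the page to see the tag, so don't block those URLs in robots.txt at the same time.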
Thanks
Hi Justin,
I agree with Jonathan. If the site is decent then there is no problem getting links from there; that would be great for you.
Scrapers can't hurt your site, and if scrapers outrank your website you can report it here @ https://docs.google.com/forms/d/1Pw1KVOVRyr4a7ezj_6SHghnX1Y6bp1SOVmy60QjkF0Y/viewform
Thanks
Hi,
IMO, if you are a one-person team or don't have dedicated resources to help you build, maintain, or troubleshoot a website, I would recommend trying Squarespace.
You can also check a comparison between WP & Squarespace @ http://www.websitebuilderexpert.com/squarespace-vs-wordpress/
Hope this helps.
Thanks
Hi Stephen,
The first one is better; the second one looks like keyword stuffing, and since plutobeach is already in the URL there is no need to use the same word again. I wouldn't suggest using the second one.
Thanks
Hi David,
First of all, as far as I know a paid campaign doesn't help organic ranking. Google has repeatedly said that paid campaigns don't affect organic rankings.
As far as I know, Google says that showing one version to users and another version to bots is called cloaking, and we must not use it, but Google hasn't said anything about paid vs. non-paid visitors.
Even if I assume that a paid campaign could help organic ranking, the only way it could affect ranking is through CTR.
I have run an AdWords campaign for my website for over 8 years with a minimum CTR of 10%, but I have never noticed the paid campaign helping rankings.
I wouldn't suggest you do that.
Please also check this once @ https://support.google.com/adwordspolicy/answer/6020954?hl=en&rd=1#701
Hope this helps you.
Thanks
Hi Justin,
Many SEO experts think that links from directories are generally low-quality links, and they don't want to submit to directories.
In that case my simple question is: if directory links didn't work, why did Google deindex a few hundred directories in 2012?
Links from the right directories can still do wonders for your site. So you should check the spam score of those directories, and if the spam score is low and the directory is relevant to your website, you can go for it.
Thanks
Hi,
I'll share my personal experience with my website, which is on a .uk extension. On category pages I show the VAT-excluded price. When a customer clicks the 'buy now' button they go to the product page, where I show both prices, including and excluding VAT, but I have highlighted the VAT-inclusive price with bold text and some graphics. For me it works well and the conversion rate is fine.
Thanks
Completely agree with Moosa. You can also check the post below.
http://blog.woorank.com/2013/03/a-guide-to-clean-urls-for-seo-and-usability/
Hope this helps.
Thanks
Hi Amaury,
If your brand has recognition and clout then there is no need to use it in titles; otherwise, you should use it only in the homepage and contact-us page titles. For product/service pages you can leave the brand name off the title tag.
Hope this helps.
Thanks
Hi,
I have seen 'near me' and other such keywords give great conversions in AdWords, because in AdWords we can target a specific city or even down to the zip-code level.
For SEO, please check the thread below; it will help you learn how to optimize for such keywords.
Thanks
Hi,
Google says "Server Location Mostly Irrelevant For SEO".
I'm quoting Google's John Mueller in response to the question of SEO and server location:
"For search, specifically for geotargeting, the server's location plays a very small role, in many cases it's irrelevant. If you use a ccTLD or a gTLD together with Webmaster Tools, then we'll mainly use the geotargeting from there, regardless of where your server is located. You definitely don't need to host your website in any specific geographic location -- use what works best for you, and give us that information via a ccTLD or Webmaster Tools."
He also answered the question "Is the server location important for geotargeting?" directly:
"If you can use one of the other means to set geotargeting (ccTLD or Webmaster Tools' geotargeting tool), you don't need to worry about the server's location. We do, however, recommend making sure that your website is hosted in a way that will give your users fast access to it (which is often done by choosing hosting near your users)."
Original thread @ https://productforums.google.com/forum/#!topic/webmasters/k6po9mnpI8c/discussion
Hope this helps.
Thanks
Hi Matt,
A 301 redirect should not affect QS, because the landing page is the page the user lands on after all redirects. Both pages are the same (design, content), so the below-average landing page score is not due to the 301 redirect.
Remember that the most important component of QS is CTR.
Hope that helps
Thanks
Hi Michael,
I don't think so, and as far as I know we should only be concerned when important images are blocked, e.g. product images.
Thanks
Hi David,
There is no guideline from Google on this, so I would like to quote Rand here, from a very old blog post that is still very relevant.
He says: "There's no hard and fast answer. The number can be as low as 1 and as high as 15 (maybe 20). You can target as many as makes sense for a visitor, a potential buyer, and those who will link."
I would suggest using at least 2,000 words when targeting competitive keywords and various long-tail keywords.
To learn how to optimize your webpage properly, check the articles below.
https://moz.com/blog/visual-guide-to-keyword-targeting-onpage-optimization
http://backlinko.com/on-page-seo
Hope this helps.
Thanks
Hi,
To check broken links/URLs (the HTTP '404 not found' response) on your website, you can use the Screaming Frog SEO Spider, which is free in its lite form for up to 500 URLs.
http://www.screamingfrog.co.uk/broken-link-checker/
Gary Illyes from Google says "Whoever came up with the idea that having 404s gives a site any sort of penalty, you're wrong. Utterly wrong."
Please also read this post @ https://plus.google.com/+JohnMueller/posts/RMjFPCSs5fm
Hope this helps.
Thanks
Hi Justin,
I have been through exactly the case you are facing now, so I'll tell you how I handled it.
I started a PPC campaign for the purpose of keyword research and search-volume data, with a very low daily budget; my aim was to appear on page one only (e.g. positions 5-7), because I didn't want users to click on my ad frequently, I just wanted to know the number of impressions for each keyword.
I ran the campaign for 3 months, spent less than $100, and got several new keywords to target with high search volume and low competition.
So if you run a campaign just for new keyword research, it will be helpful and won't cost you much.
I presume you are aware of the search terms list in Google Search Console (Google Webmaster Tools).
If you have a SEMrush paid account you can find your competitors' organic and paid keywords and much more.
I'm sharing one article on how to use the SEMrush tool effectively.
http://www.robbierichards.com/review/competitor-research/
Hope this helps you.
Thanks
Hi,
The only way to find missing meta descriptions on a website is by checking the page source. I hope you are aware that if a site doesn't have a meta description, Google picks one from the content on that page.
So to find missing meta descriptions you can use https://chrome.google.com/webstore/detail/meta-seo-inspector/ibkclpciafdglkjkcibmohobjkcfkaef?hl=en or do it manually.
I would also suggest waiting for more answers from top community members.
Hope this helps.
Thanks
Hi,
There should be no effect as long as your HTTP pages 301-redirect to their HTTPS equivalents. For the second part of your question, see the thread below.
https://moz.com/community/q/proper-301-redirect-code-for-http-to-https
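As a sketch, a typical Apache .htaccess rule for that redirect looks like this (assuming mod_rewrite is available; adjust for your server setup):

```
RewriteEngine On
# Send any HTTP request to its HTTPS equivalent with a 301
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

On nginx or other servers the syntax differs, but the idea is the same: one permanent redirect covering every HTTP URL.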
Thanks
Hi,
I'm not Andy, but the answer is you can use either of them; both are absolutely fine. There is no disadvantage from an SEO perspective.
Thanks
Hi Amelia,
If you are running a PPC campaign on Google AdWords, you should try Call-Only Campaigns, launched by Google in February 2015.
Call-only campaigns allow you to focus on getting more people to click-to-call you straight from your ads.
For organic, you can define a CTA in the meta title/meta description and on the website as well.
Thanks
Hi,
First of all, I would like to let you know that having keywords in the URL won't give any ranking boost. Second, I would go with 'yourdomain.com/business/personal-trainer-software'; it will help users know what the page is about.
Thanks
Hi,
There are no direct SEO effects of having favicons. However, there may be indirect benefits; a full article covering them is here @ https://goo.gl/7l5tYV
Thanks
Hi Ryan,
5 Tools to Help You Identify a Google Slap @ http://www.iacquire.com/blog/5-tools-to-help-you-identify-a-google-slap
Hope this helps.
Thanks
Hi,
You can set up non-e-commerce goals in GA. Please check the post below on how to set up a goal in GA.
https://support.google.com/analytics/answer/1032415?hl=en
Hope this helps.
Thanks
Hi,
You can set goals for events (instead of pages). The only thing you need to do is send an event after a success.
An event can be a button click, a form submission, a download, a video play, and so on; each event has a category and an action, plus an optional label and value.
Read more: http://www.optimizesmart.com/event-tracking-guide-google-analytics-simplified-version/
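As a sketch, assuming the classic analytics.js snippet is already installed on the page, sending an event after a successful form submission could look like this (the category and action names are hypothetical placeholders):

```html
<!-- Placed on the form's thank-you page; fires one GA event -->
<script>
  ga('send', 'event', 'contact-form', 'submit-success');
</script>
```

You would then create a goal of type Event in GA that matches that category and action.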
Hope this helps.
Thanks
Hi,
I would track both. Post-Hummingbird, Google has gotten much better at understanding synonymity - if you choose to optimize for one consistently, Google is going to understand your pages are still relevant for related queries, even when you may not have specifically optimized for them. This is why a lot of people are noticing that Google is modifying how their Title Tags display in the SERPs - they're tailoring them to the queries for improved CTR, but they can do that only because they know the modifications are still highly relevant to the listed page.
Hope this helps.
Thanks
Hi,
IMO there's no need to include LLC in the meta description; just write the meta description in a way that makes users click your listing in the SERP after reading it.
Thanks
Hi Rich,
Patrick has already recommended a few articles and suggested which links to disavow. I would like to add one more article with a step-by-step method for finding which links to disavow @ The Ultimate Guide To Cleaning Up Your Link Profile Step By Step
Hope this helps.
Thanks
Hi Matt,
They are certainly using some black-hat techniques to get backlinks; otherwise it would not be possible to get that number of backlinks. Google sometimes misses these, but it will penalize the website when it catches them. After Penguin 4.0, I assume they will be caught soon.
If you want you can report Google about this by filling this form @ https://www.google.com/webmasters/tools/paidlinks?pli=1
Hope this helps.
Thanks
Hi Alan,
The cost per click depends on many factors, such as the keywords you are going to use, the target location, and competitors' bidding. So as far as I know, without running a campaign it is very difficult to predict the exact cost per click.
Since the budget you want to use is low, IMO you should start the campaign with long-tail keywords and should not bid on broad match.
You can use broad match modifier (BMM) to target long-tail keywords,
e.g +Manhattan +office +space +rates
+Manhattan +office +space +lease
You should also consider Call-Only Campaigns; with these there is a chance of getting more leads at a lower cost.
Hope this helps.
Thanks
Hi,
Please follow all the steps mentioned @ https://support.google.com/sites/answer/100283?hl=en
I would also suggest submitting your sitemap to Bing if you haven't yet @ https://www.bing.com/webmaster/help/how-to-submit-sitemaps-82a15bd4
Hope this helps.
Thanks
Hi Rajiv,
Use ga('send', 'event', 'category', 'action'); and the tracking will work; you can test it by clicking on the link yourself.
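For example, a link wired up this way (the URL, category, and action here are placeholders; analytics.js is assumed to be loaded) would fire the event on click:

```html
<a href="/brochure.pdf"
   onclick="ga('send', 'event', 'downloads', 'brochure-click');">
  Download brochure
</a>
```

After clicking, the event should appear in GA under Real-Time > Events within a few seconds.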
Hope this helps.
Thanks