Local Search ROI = ((Lead Value X Campaign Value) – Cost of Campaign) / Cost of Campaign
http://streetfightmag.com/2013/09/18/calculating-the-roi-of-local-search-campaigns/
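As a quick numeric sketch of the formula above (all figures below are made-up examples, not from the article):

```python
def local_search_roi(lead_value, campaign_value, cost_of_campaign):
    """Local Search ROI = ((Lead Value x Campaign Value) - Cost of Campaign) / Cost of Campaign."""
    return (lead_value * campaign_value - cost_of_campaign) / cost_of_campaign

# Hypothetical numbers: $100 per lead, campaign value of 50, $2,000 campaign cost
roi = local_search_roi(lead_value=100, campaign_value=50, cost_of_campaign=2000)
print(f"ROI: {roi:.0%}")  # prints "ROI: 150%"
```

A positive result means the campaign returned more than it cost; 1.5 here means 150% return on the spend.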
Hope this helps,
Tom
Hi Chris,
I agree with Moosa,
Use a 301 redirect, and if you have valuable content you can check that using Moz or Google Webmaster Tools (look for top pages) or your analytics.
Check out the method described here, where you redirect page to page, not domain to domain:
http://moz.com/learn/seo/redirection
You'll want to 301 redirect instead of just pointing the name servers over, because if you do that you are not really going to help your site out at all. This is a great article regarding merging two websites:
http://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing
Here is some more helpful data:
http://blog.woorank.com/2013/05/the-flow-of-link-juice/
http://www.seoblog.com/2014/06/link-juice-lost-301-redirect/
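If the old site runs on Apache, the page-to-page mapping described above can be sketched in the old domain's .htaccess; every path and the destination domain below are hypothetical placeholders, not from this thread:

```apache
# Map each old URL to its closest equivalent on the new domain (301 = permanent)
Redirect 301 /old-about.html    https://www.newdomain.com/about-us/
Redirect 301 /old-services.html https://www.newdomain.com/services/
```

The point is one rule per page, so each backlink lands on the most relevant new page rather than everything dumping onto the homepage.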
hope this helps,
Thomas
Hi Ilka,
http://www.feedthebot.com/mobile/configure-viewport.html
By placing your site's URL in the link below, you will be given instructions on how to configure your exact site so it is mobile friendly, as well as some other helpful tips.
http://www.feedthebot.com/mobile/
Hope this helps,
Thomas
EGOL just nailed that answer.
Everybody knows EMDs carry a small amount of weight, but I think the deciding factor should be: what else were you going to spend that capital on? You can create a very good site for $42,000 unless your niche is literally uber-competitive. If you're not going to sacrifice the rest of your website, and you don't expect the domain to hold your site up by itself, then go for it.
I honestly would really investigate that traffic. I would want to see not just traffic but how much money in conversions the site is able to make. What is the ROI? And if it was making money hand over fist, why are they selling it?
What is the backlink profile? When you update the WHOIS, Google may take this into account.
I don't know enough about your particular niche or what you plan to do with the site to tell you what to do, but if it's the better half of your budget I would spend it on making sure my site is the best possible site compared to my competitors.
Look for a domain that contains the word or words you want and has enough space for a name or abbreviation. It'll cost you about 12 bucks on most domain registrars. Google's domain registration is awesome at showing alternatives.
See https://domains.google.com/ ; I would have to know much more about what you're doing to give you any more advice than what I have.
https://domains.google.com/about/index.html
Hope this helps,
Tom
Hi, I want to tell you I could not agree more with what EGOL has stated. If you are not losing traffic and, more importantly, your conversions are staying up, then regular domain fluctuation is normal and nothing to worry about unless it gets very serious.
You can utilize tools like MozCast as well as SERPs to figure out if it is just Google making minor changes that don't really affect your business, or if it is something more serious. If you were to tell me you dropped down to a 25 I would be worried; however, as EGOL said much better than I can, normal fluctuation is nothing to worry about.
Here are the links to the tools if you want to utilize them (MozCast & SERPs):
https://serps.com/tools/volatility
Sincerely,
Thomas
Hi David,
I personally am very against using an exact match domain, or a made-up name that you think will do well in the SERPs, in order to rank well when you have access to your company's real name.
Unless your company has a name like Viagra, or something equally frowned upon by Google, I would simply go with the domain name for your company.
HEMA DESIGN CENTER
Could become
Remember, you are going to have a lot of people that will attempt to circumvent Google and just type your company name in as a URL.
You will also have a huge problem with dual names; people will not know what to call you, and it will be a mess.
If you legally change your company name then I have no issue with it; however, you should be able to rank well without having to change your company name when you already have the word "design" in your company name.
I am not going to lie and say that if you are a cabinet company, the word "cabinets" is not going to give you a boost. But honestly, there is a strong chance that in the very near future you will lose 100% of the boost it gives you (and please do not take this as me saying it is going to help you a lot, because it will not).
Keep your existing company name, trademark it, brand it, and then run a solid search campaign doing the right things. Starting off by not trying to trick Google with a partial exact match domain simply to get more visitors is the right way to start your business online. And remember: never try to trick Google.
Just think of the confusion you will cause by having two names.
Sincerely,
Thomas
Yandex is Russia's answer to Google. It can give you data, and you can rank in it just like any other search engine; however, it is specific to Russia.
If you do a lot of international search you would care quite a bit. It also has some unique local tools that are not as good as Moz Local, but not bad at all compared to most local toolsets. If you find yourself in Europe or Russia, or need to target that area, you will want to be friends with Yandex.
I use it; it has some pretty cool tools.
Hope that helps,
Tom
I believe featured snippets will be almost impossible to change; it seems that once they are cemented in for more than a month or two they are no longer volatile, so we cannot capture them as easily.
Google will continue to give more SERP real estate to Ads / PPC to take it away from organic especially on mobile. Also, no click answers will become more and more prominent.
Content will have to get better and better, because the Internet is growing (and please don't quote me on this) something like 10 times every three months. More websites with great content means less space at the top of the SERPs, which calls for higher quality and more expertise, authoritativeness, and trust (E-A-T).
I agree with you that voice search is a huge game-changer. I want to see it adopted a little bit more but I do agree with you.
Honestly, money spent on high-quality content and high-quality work will go up not a very sexy answer but an honest one.
All the best,
Tom
Run it through Moz OSE, Ahrefs, Majestic & GWT;
that is what you will need to see before purchasing something so expensive.
If it has been sitting waiting for somebody to purchase it, I don't think you're going to see anything close to the traffic they're talking about. If its name servers or hosting have changed, Google is very smart and may want to take it from the top. Run the domain through http://who.is and see what the NS records are.
• No hiccups; you have complete control of the code. If you know HTML you have 2 choices: you can either edit from inside the WordPress site itself on the backend, or you can do it over FTP, whichever is your favorite option. As far as search-engine-related things it cannot do, there's nothing I can think of that is a negative about WordPress, aside from the fact that it has a database and, in my opinion, needs a managed WordPress hosting company. After being placed on a managed WordPress host, a WordPress site can be faster than a 100% HTML website.
Some advice use a managed WordPress host the security, speed and access to incredible free knowledge regarding WordPress is well worth it.
Read this fantastic post by a true WordPress expert Dan Shure of evolving SEO
http://moz.com/blog/setup-wordpress-for-seo-success
WordPress is ideal for search engines, and if you want to see how quick WordPress sites can be,
check out any of the links I've given you below. They're all hosted on an assortment of different hosts, but they all run very quickly.
This is worth the extra attention. Simply put, these managed WordPress hosting companies will answer these questions, as they are WordPress experts as well as a host.
My advice is to go forward with WordPress; it is an excellent platform, and Matt Cutts uses WordPress for his blog.
For on-page SEO & more, use http://yoast.com/wordpress/seo/ ; it is free and the best. If you combine that with powerful hosting, a good developer & preferably the Genesis 2.0 HTML5 framework, you will have one hell of a site. The site the plug-in is found on is itself made with Genesis 2.0.
Check out the feature list:
http://yoast.com/wordpress/seo/
I would use the Genesis 2.0 framework; it is 100% HTML5 and will give you the ability to make whatever you want your site to be. I think it is the best framework on the market and well worth it. You can find it along with themes at studiopress.com.
Also, when you're on the StudioPress website you can learn quite a bit. If you are commissioning a designer and then a developer to make the design into a site, I would stick to the Genesis 2.0 HTML5 framework. WooThemes is also HTML5; however, if you're making a custom project I would strongly recommend a Genesis-based website. In fact, the other WordPress sites I mention below are all built on Genesis 2.0.
If you are looking for a developer I can recommend Gregreindel.com. He was instrumental in building on the Genesis 2.0 framework, making it available far before anyone else had it; even Yoast.com was built using his code that was ahead of everyone else. You can go to his site and see quite a bit about all aspects of WordPress code itself.
I strongly recommend him as a developer. He has just started his own agency literally the other day and has not completed the site yet however it is called AuthorityDev http://www.authoritydev.com/
Some other theme and framework companies to look at: http://www.woothemes.com/ is a solid company as well. Look at Moz.com/perks for big savings on WooThemes. Moz is such a WordPress-savvy site that, when it comes to perks, 2 of the 3 hosting companies are managed WordPress hosts.
(If you use a Moz.com/perks host for WordPress, I would not use any host other than Pagely.com, 30% off for life, or WPengine.com, 4 months free hosting. Even though SiteGround says it is a WordPress hosting company & is a great host, it just does not have the same WordPress-only-centric advice and performance as the other two.)
It has many excellent attributes for search engine optimization.
One thing I would recommend if this is your 1st WordPress site is using a managed WordPress hosting company.
WordPress has a database, so it is slightly slower than HTML sites; however, by using a managed WordPress host, not only will you gain back the speed you lost, you will also have a company with a tremendous amount of WordPress knowledge when you need it.
I would recommend Pressable (ZippyKid, rebranded recently), WP Engine, Flywheel, Pagely and Web Synthesis, as every one of them will help you out with a lot more than just hosting your website.
Cost-wise, getflywheel.com is only $15; also look at moz.com/perks for discounts from WP Engine as well as Pagely. ZippyKid, renamed Pressable, gives you a lot for the money. Web Synthesis gives you the Genesis framework for free, allowing you to buy themes for roughly $30; alternatively, you can purchase a theme for $100 that comes with the framework, and if you want additional themes you never have to repurchase the framework, so pick your best fit. I can tell you Flywheel is very eager to help and the least expensive.
If you're worried at all about WordPress being a bad thing for search engine optimization, I cannot think of anything negative about WordPress other than what I've stated. Remember, it might be slower on a regular host, and in my opinion it should not be hosted on a regular shared hosting platform; however, it definitely can be if needed.
You will want to install the Yoast WordPress SEO plug-in; it is one of the best on the market, if not the best. There are 2 versions, premium and free. The only differences are that the premium comes with an automatic 301 redirect ability and support; other than that, the free one is probably your best bet, because the premium is $80.
If you would like to learn more about WordPress I would suggest checking out Yoast.com, GregReindel.com, Webdevstudios.com
If I can help you in any other way with this please let me know. I prefer WordPress over any other platform currently and have tons of sites on many different hosts. So if you want to see what you can do with WordPress, check out a client's site, Garrisonbespoke.com; it was featured on studiopress.com as one of their top picks for Genesis-based websites. In addition, the site stood up to 100,000 visitors last month when we made Time magazine and GQ. It is hosted on http://getflywheel.com/ ; however, I know that every host I just mentioned is capable of far more than that.
Sincerely,
Thomas
Under no circumstances, if you want to rank for local results, should you use the same telephone number for multiple locations or different companies. That will destroy your ranking, as you will compete against yourself and confuse Google. There are many excellent methods of getting additional phone numbers that are legitimate.
do not mess with your NAP
Name
Address
Phone number
Level 3, Jive Communications, Grasshopper, Twilio, 8x8, Vonage, & RingCentral are just a few that come to mind.
I believe you can save money on Grasshopper via Moz.com/perks. I can personally vouch for Jive, Grasshopper, Twilio & 8x8 for inexpensive phone numbers; Twilio is tough to beat but may not be exactly what you're looking for. I encourage you to look at all the phone systems.
http://grasshopper.com/blog/should-you-have-a-local-number-for-business/
http://grasshopper.com/blog/local-numbers-have-benefits-too-ya-know/
Can you give me a snapshot of your analytics? Also, I don't want to alarm you, but Google's Pigeon update affects sites having to do with real estate, and it really hit the United States hard. It has now been rolled out to Canada and the UK; I am not sure if it has reached India yet, but I'm guessing it might have just hit.
http://www.homeadda-sobhaaspirationalhomes.in/
The URL itself is extremely hard to remember, but that probably did not do it. You do not have very many backlinks, and local real estate is becoming similar to organic search.
https://www.seroundtable.com/googles-local-pigeon-algorithm-global-19620.html
http://searchengineland.com/google-pigeon-update-rolls-uk-canada-australia-211576
http://moz.com/local-search-ranking-factors
I hope that you have not been hit, and I wish you luck. Let me know if I can be of any more help,
Tom
Thank you, Effect Digital, great post!
I agree E-A-T showed a big difference in 2019 & I think it is just the start.
All the best,
Tom
Hi Gary, you're more than welcome; I am happy to be of help. I would say the exact match domain, to me, is not enough to stand on its own. Meaning, if it's an extremely competitive marketplace you will have to put quite a bit into the site in order to keep it competitive.
If you search for the keywords that you want the domain to show up for, does the site already come up? I would use tools like SEMrush and AdWords to gather your keywords; run it through every tool and check the search visibility of the domain.
Unless this domain is something really spectacular, you are still going to have to do a lot of work to stay competitive.
I have compiled a list of things that are needed before you launch. Run a Moz campaign and see if there are any errors; do this by allowing rogerbot to index but not Googlebot - http://moz.com/learn/seo/robotstxt
Allow a specific web crawler (rogerbot) to visit a specific web page while blocking all other bots:
User-agent: *
Disallow: /no-bots/block-all-bots-except-rogerbot-page.html
User-agent: rogerbot
Allow: /no-bots/block-all-bots-except-rogerbot-page.html
and http://moz.com/tools/crawl-test
Then use https://moz.com/researchtools/on-page-grader and https://marketing.grader.com/
I would buy Screaming Frog SEO Spider to crawl your website if it is over 500 pages, and be certain there is no possible duplicate content and that a simulated Googlebot can index the site properly. This is most important. You can also check the way Google would see your website through the eyes of Googlebot using http://feedthebot.com or http://www.screamingfrog.co.uk/seo-spider/ . This will show you any errors when you run it through all the tools available.
While this may sound very obvious: make sure noindex/nofollow is in place right now if you do not want to go live & you are using a staging server. When you do launch, 301 redirect the staging links to your site, or use a rel canonical link pointing from the pages that were on the former staging server to your new domain's pages. For instance, if you have an alias/staging server named example.staging-server.com, you would want to follow the example below on the staging server to make sure there is no duplicate content. This is rare but does happen - https://example.staging-server.com/shoes/hightops/
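For reference, a rel canonical on a staging page could look like this minimal sketch; the live-site URL below is a hypothetical placeholder for the real page that replaces the staging copy:

```html
<!-- In the <head> of https://example.staging-server.com/shoes/hightops/ -->
<link rel="canonical" href="https://www.example.com/shoes/hightops/" />
```

This tells Google the staging copy is a duplicate and which URL should get the credit.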
You can use this if running Magento - https://yoast.com/tools/magento/canonical/. Everything else is listed below:
Sincerely,
Thomas
#1
read this
Do you have a business that has a location in the towns you wish to do business in?
You'll need to do face-to-face business in those towns, which it sounds like you're doing, to use Google's local business listings.
Describe something unique about the area in which your company is going to be doing business; show them that you know the neighborhood and town. Is your business near the ocean? Are there any significant geographical differences that would make your work as an electrician different? For instance, sea air might call for a different type of metal. Or do you work on boats?
Think of something unique about every town and make sure it is in there and authentic.
Optimize your title tags:
http://www.localvisibilitysystem.com/2014/01/02/50-examples-of-title-tags-that-rock-at-local-seo/
http://www.whitespark.ca/blog/post/49-3-words-that-define-local-seo
http://www.whitespark.ca/blog/category/12-citation-building
Then incorporate Moz Local to have citations built for your business.
Use schema / structured data rich snippets, like you did here:
https://www.otexelectrical.co.uk/windsor-24-hour-emergency-electrician
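A minimal schema.org sketch of what that structured data could look like for a local electrician; every value below is a placeholder for illustration, not taken from the site above:

```json
{
  "@context": "https://schema.org",
  "@type": "Electrician",
  "name": "Example Electrical Ltd",
  "telephone": "+44 1234 567890",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Windsor",
    "addressRegion": "Berkshire",
    "addressCountry": "GB"
  }
}
```

Drop it into a `<script type="application/ld+json">` tag on the location page and run it through Google's structured data testing tool.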
I hope this is of help,
Tom
I agree with Patrick; we need more info.
There's no way to run them side-by-side efficiently without using hreflang.
You will have to pick a www. or non-www. domain.
Using a tool like https://www.deepcrawl.com/ is going to make this easy to check. You can do this many ways, including via the sitemap.
http://moz.com/learn/seo/hreflang-tag
http://www.themediaflow.com/tool_hreflang.php
http://www.internationalseomap.com/hreflang-tags-generator/
http://www.internationalseomap.com/
http://www.searchenginejournal.com/getting-a-better-understanding-of-hreflang/60468/
Let's assume you were to pick website.com as your default
http://moz.com/blog/hreflang-behaviour-insights
By geotargeting your subfolder (e.g. /ca/) in Google Webmaster Tools you can accomplish your goal; that would look something like this.
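A minimal hreflang sketch under those assumptions (website.com and the /ca/ folder are just the examples from this thread; substitute your real URLs and languages):

```html
<link rel="alternate" hreflang="x-default" href="http://website.com/" />
<link rel="alternate" hreflang="en-us" href="http://website.com/" />
<link rel="alternate" hreflang="en-ca" href="http://website.com/ca/" />
```

These go in the <head> of every version of the page; the same annotations can alternatively live in your XML sitemap.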
If you end up choosing to 301 redirect your existing site to a new site make sure it is page to page not just redirecting the entire domain.
http://moz.com/learn/seo/redirection
I hope this helps. Please tell me more about what you are actually planning to do, because it is kind of tough to guess.
All the best, Tom
Thank you EGOL!
That is kind of you to say & you improved my answer.
I agree with everything you said my friend!
All the best,
Tom
This is not my strong suit; however, I do know Google is coming down hard on people pinging their links all at once. I would use a proxy if I were doing it, and if I were you I would release the links over the course of the next 2 to 3 weeks. If Google sees a spike, it will think that's not natural. Here's some info below; I hope I was of some help to you.
Sincerely,
Thomas Zickell
"I think about 6 months to 1 year+ ago, mass pinging tons of backlinks doesn't work anymore... anyone notice if you bulk ping thousands of backlinks all point to your website to mass ping servers doesn't work any more?
It seems like Google has implement a filter to strip away ping spam, and if you find that pinging with a lot of backlinks in the shortest time doesn't work, how about spreading them 100 sites per ping to fewer ping servers, would that work?
Yes, it does, and my recommendation is - ping to less than 10 ping servers at a time, and ping only 100 backlinks or less (that link to the same money site) at a time work much better! Anyone had similar experience?"
Unfortunately, DMOZ,
The Open Directory Project (ODP), also known as DMOZ (from directory.mozilla.org, its original domain name), is severely understaffed; I've heard of people waiting 2 years to get their site listed. I would not bother, to be completely honest with you, unless being listed there is something very important to you. I do know it will not have a big effect on your rankings whatsoever; all they can really do is change the description of your website, which Google still honors for some reason. I would honestly put my time into something more productive than playing with DMOZ; I've literally known people that have gotten jobs there to try to get their site listed, and even that did not work. Sorry about the bad news, but I hope this is of help, Tom
Armor.com HHVM, Redis, FAST
https://www.engineyard.com/magento HHVM, Redis, FAST
mgt-commerce.com HHVM, Redis, FAST
rackspace.com PHP7 redis fast
peer1.com fast
http://www.peer1.com/cloud-hosting/mission-critical-cloud HHVM, Redis, FAST
hope this helps,
tom
Please do not take this as anything negative; however, I think the boards are meant for actual questions. You could maybe post this under a discussion. Better yet, if you really want to get on the radar, do a YouMoz post.
I hope that helps,
Tom
You have a 9 MB homepage with 199 requests to the server. That is way too big a page for anyone on a mobile device, and extremely slow even for people on the fastest connections.
You are not blacklisted, but you will want to evaluate the number of links you have, as you normally do not rank immediately with 10 backlinks from directories with exact match anchor text.
I have put together a list of problems with your site and ways to fix them.
“despite trying to "clean the slate" and continue to build better backlinks (mostly through reputable directories) I'm still getting little to know rank increases.”
Building backlinks only through directories, with the "exact match" anchor text Google does not want people creating, is not a good sign to Google.
Yes, relevant and high-quality directories that add something to the web may be worth something. But that is where you got all 10 of your backlinks, so it means very little in the eyes of Google.
Would you be kind enough to explain how you went about moving the domain? If you had backlinks from other places, they would show up, unless you swapped domains less than a week ago.
You have a domain authority of 15 out of 100, combined with 10 total low-quality directory backlinks.
What you need to do is a whole lot of work to build this into a site that is flat-out better than everything else in your niche. That includes essentially taking the best site you know of in your industry and topping it.
You do not have a lot of content, plus you lack exceptional content (e.g. 10X content). See https://moz.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday
When changing the domain you should expect a drop in traffic for at least 3 to 4 months, even if you did every single thing right with your domain migration.
May I ask you how you went about the migration? I do not see the old domain redirecting anywhere.
These are the reasons your site is not ranking well.
What you can do about it: start by opening a Google Webmaster Tools account. If your site was blacklisted you would know it, as long as you have a Google Webmaster account; if you do not have one, sign up using the links provided below.
https://www.google.com/webmasters/
http://www.bing.com/toolbox/webmaster
If you're talking about being actually blacklisted, I ran a check:
https://sitecheck.sucuri.net/results/viplane.com.au/
I found one reason: the site is extremely slow when checked from Australia using Pingdom here
http://tools.pingdom.com/fpt/#!/ekOK16/http://viplane.com.au/ Australia
http://tools.pingdom.com/fpt/#!/uAw1A/http://viplane.com.au/ New York
Out of New York you're at about 15 seconds, and it appears that the time I got in Australia is the fastest your site has performed according to its history. Cloudflare is an inexpensive/free CDN. I also found an agency that specializes in WordPress in Australia and has a deal with WP Engine based out of Australia; I would ask them if you can purchase hosting from them, as well as get them to fix your site if you feel you need help.
You need to use a plug-in to combine your JavaScript, and your site will be faster.
A quality host out of Australia
http://www.anchor.com.au/managed/wordpress-hosting/
http://www.quora.com/Are-there-any-Wordpress-only-hosts-with-data-centre-s-in-Australia
Because WP Engine offers only an enterprise-level plan in Australia, I would ask this company how much they charge to host one site. They look like they can help with the development issues as well.
http://thedma.com.au/wordpress-hosting-and-maintenance/
For images, you need to crop them appropriately, then run them through JPEGmini ( http://www.jpegmini.com/ ) in addition to a tool like https://kraken.io/ ;
that will take care of the PNGs & GIFs.
Afterwards, keep a minification plug-in installed. You may prefer something with more or fewer options, so I would look at the plug-ins that are updated for WordPress 4.2.2 and do some trial and error. Here is where to look:
https://wordpress.org/plugins/tags/minify
I hope this helps,
Tom
http://viplane.com.au/wp-includes/js/jquery/jquery.js?ver=1.11.2
http://viplane.com.au/wp-includes/js/jquery/jquery-migrate.min.js?ver=1.2.1
http://viplane.com.au/wp-content/plugins/SocialGallery/js/socialGalleryEpic.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/SocialGallery/js/jgestures.min.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/pa-faq/assets//js/default.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/pa-faq/assets//js/custom.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/slider-pro/js/slider/video.min.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/mashsharer/assets/js/mashsb.min.js?ver=2.3.5
http://viplane.com.au/wp-content/plugins/menufication/js/jquery.menufication.min.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/menufication/js/menufication-setup.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/wordpress-social-stream/js/jquery.social.stream.wall.1.6.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/wordpress-social-stream/js/jquery.social.stream.1.5.11.min.js?ver=4.2.2
http://viplane.com.au/wp-content/plugins/nextcellent-gallery-nextgen-legacy/js/owl.carousel.min.js?ver=2
http://viplane.com.au/wp-content/plugins/gravity-forms-placeholders/gf.placeholders.js?ver=1.0
http://viplane.com.au/wp-content/plugins/SocialGallery/js/socialGalleryPlugin.js
http://viplane.com.au/wp-content/themes/vip/nggallery/libraries/js/jquery.mousewheel.js
http://viplane.com.au/wp-content/themes/vip/nggallery/libraries/js/jquery.jscrollpane.min.js
http://viplane.com.au/wp-content/themes/vip/nggallery/assets/js/jquery.dop.NextGENThumbnailScroller.js
http://viplane.com.au/wp-content/plugins/slider-pro/js/slider/jquery.videoController.min.js?ver=3.9.3
http://viplane.com.au/wp-content/plugins/slider-pro/js/slider/jquery.easing.1.3.min.js?ver=3.9.3
http://viplane.com.au/wp-content/plugins/slider-pro/js/slider/jquery.advancedSlider.min.js?ver=3.9.3
http://viplane.com.au/wp-includes/js/jquery/ui/core.min.js?ver=1.11.4
http://viplane.com.au/wp-includes/js/comment-reply.min.js?ver=4.2.2
http://viplane.com.au/wp-content/themes/vip/assets/js/bootstrap.min.js?ver=1.0
https://platform.twitter.com/widgets.js
http://assets.pinterest.com/js/pinit.js
http://ajax.googleapis.com/ajax/libs/jqueryui/1.10.3/jquery-ui.min.js
http://widgets.easyweddings.com.au/scripts/widget.js
Hi Marisa,
Thank you for the kind words. It seems not many people understand the point system; I am not blaming anybody by pointing this out.
Hope you're doing well.
your friend,
Thomas
Link building is something that is getting harder every day. I think only you will know the answer to whether or not it is time to move to a professional search engine optimization service and request link building.
Mostly you can earn links with content; things like schema and video help out quite a bit, and when I say video I mean Wistia, not YouTube.
I would look at evolvingSeo.com
Internet marketing ninjas
seer interactive
or any recommended companies shown in the link at the bottom of the page.
Link building obviously has its place but is done differently now. Simply asking somebody to build links is a gamble, and you want to make sure that company is extremely reliable; it is worth spending the money on the best if you are going to go down this path.
Sincerely,
Thomas
I strongly suggest **YOU DO NOT REDIRECT TO THE HOMEPAGE.** The only way you're going to keep the backlinks from being devalued by Google is to add HR content alongside the plant content, **or have a very relevant, almost identical page to redirect to.**
Sorry for the very long post. I just found this and realized it explained everything very clearly.
"IMPORTANT
Redirecting pages to somewhere relevant is key. Google treats irrelevant 301 redirects as soft 404’s, so there’s no real advantage of redirecting unless you’re doing so to a similar and relevant page.
Google’s John Mueller explains more in this video. https://youtu.be/nIDZmac_rMI
If you don’t have a similar or relevant page, and you still have a 404 page with lots of high-quality backlinks then, honestly, it may be worth republishing the content that used to exist at that location.
Think of it like this:
If the dead page was valuable enough to attract high-quality backlinks in the first place, then it’s worth questioning why it no longer exists. I mean, it’s clearly a topic people are interested"
This case often happens when you are too lazy to investigate all of your 404 URLs and map them to the appropriate landing page.
Cite: https://www.searchenginejournal.com/technical-seo/redirects/
"According to Google, they are still all treated as 404s.
@JohnMu, replying to @p3sn @jdevalk: Yeah, it's not a great practice (confuses users), and we mostly treat them as 404s anyway (they're soft-404s), so there's no upside. It's not critically broken/bad, but additional complexity for no good reason - make a better 404 page instead.
5:10 AM - Jan 8, 2019
If you have too many pages like this, you should consider creating beautiful 404 pages and engaging users to browse further or find something other than what they were looking for by displaying a search option."
It is strongly recommended by Google that redirected page content should be equivalent to the old page. Otherwise, such redirect may be considered as soft 404 and you will lose the rank of that page."
If you do anything do not redirect to the homepage that would be the worst possible outcome.
Doing that will simply create a soft 404 for all your backlinks, and they will be devalued by Google.
In order to continue to reap the benefits of your 500 root domains pointing to your plant URL, you should:
Start writing about how your software's user interface keeps people relaxed, and how that helps the business grow. Spend one week with a memo to the entire office saying you will give $500 to whoever rewrites this best; it will be done very well, I promise.
link from it to one of the pages that you need the most boost with most needed pages**,**
Simply add this type of information into what you already have. I'm not going to say it's going to be the easiest thing, but it will help your site. If you'd like some help, I'm happy to have one of my writers come up with something that would work.
You should also take the rewriting/merging of two pages as an opportunity to better serve search intent and give searchers what they’re looking for. If there are a lot of top 10 lists ranking for the target keyword, make your new revamped post a top 10 list. If there are a lot of how-to guides, well… you get the idea!
NOTE: That has nothing to do with 301 redirects, but it’s worth doing if you want to maximize the ROI of your efforts.
Publish your revamped page and implement the 301 redirect(s)
Now it’s finally time to publish your revamped post/page.
If either of the old URLs is a good match for your new post, then feel free to republish at the same URL. You can then delete the other post/page and add a 301 redirect to the new post.
You may recall that’s what we did with our _skyscraper technique_ post. We reused the /skyscraper-technique/ URL.
If neither of the old URLs is a good match for your new post/page, then it’s also perfectly fine to 301 redirect both pages to a totally new URL.
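If you handle redirects at the server level, here is a minimal sketch for Apache (the paths are hypothetical placeholders; swap in your real slugs):

```apache
# .htaccess sketch: both old post URLs 301 to the new revamped URL.
# /old-post-one/ and /old-post-two/ are placeholders for your real paths.
Redirect 301 /old-post-one/ /new-revamped-post/
Redirect 301 /old-post-two/ /new-revamped-post/
```

This keeps the mapping page-to-page, which is exactly the approach described above.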
**I would concentrate on HR's role in keeping people happy and breathing clean air. You have so much ammunition; use it** to show how you're trying to actually help people's mental state.
Human resources provides care for workers and makes sure that they are healthy and in a good mental state. HR does everything it can to keep people from getting sick and to make sure they are happy; plants do the same thing. This is a slam dunk: you can make this very HR-focused with plants. Do you want to get a real lot of attention? Do your own study of COVID-19 and plants in the home office or workplace.
Information that is positive about plants and working.
You Have so much good stuff that you can mix.
https://www.thebalancecareers.com/how-hr-provides-career-help-4125267
**Example: **While performing the responsibilities of the job, these work environment characteristics are representative of the environment the job holder will encounter. Reasonable accommodations may be made to enable people with disabilities to perform the essential functions of the job.
While performing the duties of this job, the employee is occasionally exposed to moving mechanical parts and vehicles. The noise level in the work environment is usually quiet to moderate. The passage of employees through the work area is average and normal.
Research shows that plants improve air quality, health, mood, and productivity. Here’s how:
In addition, you should probably look at some of the research and consider what HR does; you are providing a beneficial service for people in offices as well as for people working at home during a pandemic. Make sure you reference COVID-19, update the content to reflect the home office, and then add a bit of your software into the content in a clean, easily made relevant way.
Talk about how your company cares about workers; this is in fact advice, and it is actually beneficial to them at this time. It is not taking away from your site or overpowering your other pages.
THIS IS JUST ANOTHER OPPORTUNITY TO MIX ONE AND TWO TOGETHER.
I know I added a lot of content here, and I'm sorry for that. But I want you to understand that you can combine these two things, because human resources and healthy workplaces go hand in hand, and so does the prevention of COVID-19, since plants clean the air.
Do not redirect the plant page to the homepage. Combine the plant URL's content with something HR-related and publish it.
I hope this was of help
Tom
I agree with Umar perfect place for it.
You want to know if there is a way of grading a landing page vs. grading a keyword?
Absolutely. You can use the campaign tool in SEOmoz to find the individual pages and see how well each page ranks for that keyword.
If the landing page ranks very high for the keyword but is not well optimized for it, then the ranking is most likely due to something other than the landing page.
However, if the landing page is well optimized for the keyword, the ranking is most likely because of your good work on the landing page.
It is pretty easy to determine, as long as the keyword is not an exact match domain; even then, you can use the on-page SEO and link tools to determine how well the page is made.
I hope I have been of help.
Sincerely,
Thomas
I think you have so much good information already stated that the only thing I can recommend is a service called removeem http://www.removeem.com/
They are owned by Virante, a company recommended by Moz. The reason I think they're a good pick is that they offer two choices: you can have them do the work (which I think you might want to lean towards, as this is a very serious thing you're doing), or you can save money and use their system yourself, which makes the work a little easier on you. When you are removing links in this manner, you want to be extremely careful that you do not hurt yourself in Google's eyes.
I won't add any more, as everything has already been said.
All the best,
Thomas
Because your site is responsive, it will serve the same description and title tag whether you are on mobile or desktop.
The best tools for fixing this issue are in my opinion
deepcrawl.com ( $80 a month)
&
http://www.screamingfrog.co.uk/seo-spider/ (free for up to 500 URLs, or I recommend the Pro version at £99.00 excl. VAT for one year)
Each will help you find and fix:
Page Titles – Missing, duplicate, over 65 characters, short, pixel width truncation, same as h1, or multiple.
Meta Description – Missing, duplicate, over 156 characters, short, pixel width truncation or multiple.
Page Title & Meta Description Width – This option provides the ability to control the character and pixel width limits in the SEO Spider filters in the page title and meta description tabs. For example, changing the minimum pixel width default number of ‘200’, would change the ‘Below 200 Pixels’ filter in the ‘Page Titles’ tab. This allows you to set your own character and pixel width based upon your own preferences.
Other – These options provide the ability to control the character length of URLs, h1, h2 and image alt text filters in their respective tabs. You can also control the max image size.
See
One way to check is using https://varvy.com/mobile/
https://varvy.com/titleandalttags.html
With Screaming Frog you can change the title tag and description knowing it will fit on mobile, then export it and import it into Magento.
I hope this is of help please let me know if I can help anymore.
Sincerely,
Thomas
If you have a pharma hack, you should use Sucuri. Even if you have had your site cleaned up via Google Webmaster Tools, I would run it through the free site check at the link below. Then look at all the advantages: for just $89 a year, if you ever get hacked again during that year, it is fixed 100% free. They're fast, very good, and will have your site running normally again.
Run your website through this free scanner and read below:
http://sitecheck.sucuri.net/scanner/
Whether or not you are running a CMS like WordPress does not make much of a difference; however, this tutorial/discussion regarding the pharma hack is focused on WordPress. All of these things can happen to any website.
http://blog.sucuri.net/2010/07/understanding-and-cleaning-the-pharma-hack-on-wordpress.html
&
http://blog.sucuri.net/tag/pharma
I even have a live one right here for you. I would not give you the actual URL, but this is the cleanup URL:
http://sitecheck.sucuri.net/results/worldluxurynetwork.com
July 13, 2010, by David Dede
In the last few weeks, the most common questions we’re receiving are related to the “Pharma” (or Blackhat SEO Spam) Hack on WordPress sites.
This attack is very interesting because it is not visible to the normal user and the spam (generally about Viagra, Nexium, Cialis, etc) only shows up if the user agent is from Google’s crawler (googlebot). Also, the infection is a bit tricky to remove and if not done properly will keep reappearing.
Because of this behavior, many sites have been compromised for months with those spam keywords and no one is noticing. A quick way to check if your site is compromised is by searching on Google for**“inurl:yoursite.com cheap viagra or cheap cialis”** or using our security scanner.
For example, this is the result of our scanner against wpremix.com (which was infected at the time we were writing this post):
The Pharma Hack has various moving parts:
1 – Backdoor that allows the attackers to insert files and modify the database.
2 – Backdoor inside one (or more) plugins to insert the spam.
3 – Backdoor inside the database used by the plugins.
If you fix one of the three, but forget about the rest, you’ll most likely be reinfected and the spam will continue to be indexed.
As always, we recommend that you update your WordPress instance to the latest version. This goes for all of your plugins, themes, etc. WordPress is typically very secure; it’s when you’re running old versions and/or out-of-date plugins/themes that you run into trouble. Keep your stuff up to date, and it will minimize the risk of infection significantly.
This is the first step in the infection. Generally attackers do large scale scans and try to inject the backdoors into compromised sites. They do this by searching for vulnerable WordPress installations (older versions), vulnerable plugins, hosting companies with known security weaknesses, etc.
When the backdoor is added, it is not immediately executed. Sometimes it stays for months without ever getting called. The common places for these backdoors are:
wp-content/uploads/.*php (random PHP name file)
wp-includes/images/smilies/icon_smile_old.php.xl
wp-includes/wp-db-class.php
wp-includes/images/wp-img.php
Characteristically in the past, these files have had an "eval(base64_decode"; ultimately that’s what most people recommend searching for. However, in the pharma attack, the backdoor starts with:
<?php $XZKsyG='as';$RqoaUO='e';$ygDOEJ=$XZKsyG.'s'.$RqoaUO.'r'.'t';$joEDdb
='b'.$XZKsyG.$RqoaUO.(64).'_'.'d'.$RqoaUO.'c'.'o'.'d'.$RqoaUO;@$ygDOEJ(@$j
oEDdb('ZXZhbChiYXNlNjRfZGVjb2RlKCJhV1lvYVhOelpY… (long long string)..
So, it still calls "eval(base64_decode", but using variables, making it harder to detect. In fact, none of the WordPress security plugins are able to find it. Our suggestion is to also search for "php $[a-zA-Z]*='as';". After being decoded, this is the content of the backdoor: http://sucuri.net/?page=tools&title=blacklist&detail=3ec33c4ab82d2db3e26871d5a11fb759
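To make that search concrete, here is a sketch of the grep fingerprints described above, demonstrated against a demo file (the directory and file names here are illustrative only; on a real site you would point the greps at your actual wp-content tree):

```shell
# Create a demo "infected" plugin file carrying the obfuscated-loader fingerprint.
mkdir -p wp-content/plugins/akismet
printf "<?php \$XZKsyG='as';\$RqoaUO='e';\n" > wp-content/plugins/akismet/db-akismet.php

# Obfuscated loader: variables assembling "assert"/"base64_decode".
# The dots stand in for the quote characters around 'as'.
grep -rEl 'php \$[a-zA-Z]+=.as.;' ./wp-content

# The classic backdoor signature is still worth a pass as well:
grep -rl 'eval(base64_decode' ./wp-content || true
```

The first grep lists any file whose PHP assembles its payload from innocuous-looking string variables; the second catches the older, unobfuscated style.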
If you do an inspection of the code, you will see that it scans for wp-config.php, gets the database information, acts as a remote shell and retrieves a lot of information about the system.
That’s the first thing you have to remove before you do anything else.
This is the second part of the attack. After successfully creating a backdoor into the system, a file will be created inside one of the existing plugins. Example:
akismet/wp-akismet.php
akismet/db-akismet.php
wp-pagenavi/db-pagenavi.php
wp-pagenavi/class-pagenavi.php
podpress/ext-podpress.php
tweetmeme/ext-tweetmeme.php
excerpt-editor/db-editor.php
akismet/.akismet.cache.php
akismet/.akismet.bak.php
tweetmeme/.tweetmem.old.php
Note that they will infect one or more of your enabled plugins and use names like wp-[plugin].php, db-[plugin].php, ext-[plugin].php, or something similar. We do not recommend you rely only on those samples for your search; try looking for any plugin file with the "wp_class_support" string in it.
$ grep -r "wp_class_support" ./wp-content/plugins
If you are infected, you will see things like (full content of the file here):
./wp-content/plugins/akismet/db-akismet.php:if(!defined('wp_class_support')) {
./wp-content/plugins/akismet/db-akismet.php: define('wp_class_support',true);
Make sure to remove those files. To be 100% sure your plugins are clean, I would recommend removing all of them and adding them back from scratch (not possible for all sites, but this is probably the most secure way of doing it).
This is the last step, and equally important. This is where the spam itself is hidden. They have been using the wp_options table with these names in the “option_name”:
wp-options -> class_generic_support
wp-options -> widget_generic_support
wp-options -> wp_check_hash
wp-options -> rss_7988287cd8f4f531c6b94fbdbc4e1caf
wp-options -> rss_d77ee8bfba87fa91cd91469a5ba5abea
wp-options -> rss_552afe0001e673901a9f2caebdd3141d
Some people have been seeing “fwp” and “ftp_credentials” being used as well, so check there too.
These SQL queries should clean your database:
delete from wp_options where option_name = 'class_generic_support';
delete from wp_options where option_name = 'widget_generic_support';
delete from wp_options where option_name = 'fwp';
delete from wp_options where option_name = 'wp_check_hash';
delete from wp_options where option_name = 'ftp_credentials';
delete from wp_options where option_name = 'rss_7988287cd8f4f531c6b94fbdbc4e1caf';
delete from wp_options where option_name = 'rss_d77ee8bfba87fa91cd91469a5ba5abea';
delete from wp_options where option_name = 'rss_552afe0001e673901a9f2caebdd3141d';
Tricky stuff! The attackers are getting better and we have to learn how to protect our sites and our servers. If you need any help cleaning up the mess or you need a partner to help with your security needs, Sucuri is here to assist.
Protect your interwebs!
Our removal process uses our proprietary engine. It has been collecting malware definitions since 2004. Its history can be traced to early open-source projects we released before becoming closed source in 2008, later forming a company, Sucuri, in 2010. You can find information on the early incarnation of the engine by looking at Owl, version .1, and the Web Information Gathering System (WIGS).
The cleanup process has been refined over the past few years. It’s very effective, but continues to evolve. The process is both manual and automated. The automated elements are quite restricted. Every cleanup is handled by a malware analyst whose responsibility it is to look through the results, identify anomalies, and clean manually as required. The beauty of it is that the cleanup is included in every package for no additional fees.
Yes – cleanup is included in every plan!
As malware evolves, so will our service. Under the current cleanups we include remediation for the following:
In most instances our cleanups are conducted remotely, using preferably SFTP, but also HTTP and FTP. Because of the challenges with HTTP, specifically time-outs and other connection issues, we may request secure shell (SSH) access.
Once we have access to your server we load tools that allow us to authenticate with the mothership. This connection allows us to traverse your server files and databases.
The internal ticket system uses the same notification options set in the alerting section. When a ticket is updated, you are notified via email; you then log in to the system and update the ticket.
Unfortunately, no, not at this time.
Here’s why:
Sincerely,
Thomas
Think about the industry you're talking about: advertising. You can simply create brilliant content (easier said than done, yes) about advertising on your website. This will separate you from the junk, and if you're very good at it, one day you may be considered something like NOLO.com compared to a legal directory.
The first and most powerful reference, in my opinion, is behind a paywall; you will have to purchase a https://www.distilled.net/u membership to view it. I think it's well worth the money.
Look at this example of a music site and how you can get tons of organic links from out-of-the-box content.
http://www.concerthotels.com/worlds-greatest-vocal-ranges
For a wonderful piece on creating content, please see
https://moz.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday
** Other great references**
http://contentmarketinginstitute.com/topic/building-audience/
http://contentmarketinginstitute.com/2013/02/b2c-content-marketing-vs-b2b-big-debate-video/
I really hope this is of help to you, and I believe you will learn a lot from the content guides.
Daniel is 100% right about that. Whether it's hreflang or simply stuffing a bunch of links onto every page of your website:
It kills your crawl budget.
See; https://www.deepcrawl.com/knowledge/webinars/top-5-tips-successful-seo-audit/
Daniel's advice is excellent. I have a client who received 20x as much traffic after approximately seven months of removing links that were on every single page. This led to Googlebot indexing much deeper and being able to index things it was ignoring before.
I know this is not a domain migration; however, it should be treated almost exactly like one.
I can tell you from personal experience on quite a few jobs that changing the link structure of a website along with redesigning it (meaning putting it on a CMS like WordPress, not that WordPress is bad, I love it) means the links are not going to be the same. Let's say you had 40 links to example.com/example.php; now you will have the URL example.com/example/ or example.com/keyword-example/ (only if appropriate).
Now you have a new website that looks better and is more user-friendly, and the URLs are easy to understand because of the way they're written. You would think it should rank better.
Unfortunately, Google will not allow that every time. If you do not map the 301 redirects correctly, your site will lose an immense amount of traffic. In fact, I would say don't make the change unless you can take the time to create the new 301 redirects, pointing each old URL to whatever the new version of that page is; it must be done correctly and given the right amount of time.
Make sure to tell Google that you're making a change
Thumbs up to Jeff; he has told you exactly how you should conduct the 301 redirects and how important they are.
Please read the URLs below as well.
What Robert says in this link is right on the money, and the URLs below are regarding the SEOmoz.org to Moz.com transition. While I understand that's not a cheap or quick transition, it does have a very effective method for not losing value when making changes like you are doing.
Please read these 3 links
http://moz.com/community/q/changing-domains-how-much-link-juice-is-lost-with-301-redirect
http://moz.com/blog/domain-migration-lessons
Although Robert already posted this URL, I think it should be posted again; it's about telling Google what you're doing ahead of time.
https://support.google.com/webmasters/answer/83106?hl=en
http://moz.com/blog/achieving-an-seo-friendly-domain-migration-the-infographic
I don't know if the photograph I attached is big enough; however, here is a link to a version that is large enough. I will link to the author, Aleyda Solis; it is her page, to give credit where it's due.
This is a full-size infographic about what will affect the domain during a migration:
http://www.aleydasolis.com/images/seo-website-domain-migration.gif
You said you are doing this for local SEO. The same woman who made the image linked above, Aleyda Solis, is a local SEO expert; these are some other things you should be concentrating on:
http://www.aleydasolis.com/seo-local-google-places/
You will want to do a complete audit of your website before you move so you're not losing links. That means, to me at least, using Majestic SEO, Ahrefs & Moz OSE.
I love Moz; however, if you want to utilize all 3 of those tools and you're on a budget, you might want to try the free 30-day trials from Raven Tools & SERPs, which give you information from all 3 sources. Though I do prefer Moz.
Once you have done this, use Screaming Frog, a tool that is free for up to 500 pages: http://www.screamingfrog.co.uk/
It will audit your site and let you map your 301 redirects; the Pro version is about €100 and well worth the money.
Last but not least: when you make this transition, you may need to change from Apache to Nginx (I'm basing this on Nginx being close to the second most popular web server in the world right now). There are tools that let you drop in your Apache code and convert it to Nginx code; you take what was in the .htaccess file and drop the converted code into the Nginx config file. All of these tools are excellent; the first one is probably the most proven, and the third lets you see both versions simultaneously, so you can write Apache and watch it come out as Nginx in real time. All three work; your choice.
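As a sketch of what such a conversion produces (the URLs here are hypothetical; your converter's output may differ slightly):

```nginx
# Apache .htaccess line being converted:
#   Redirect 301 /example.php /keyword-example/
# Equivalent block for the Nginx server config:
location = /example.php {
    return 301 /keyword-example/;
}
```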
I hope this helps and I think everyone here has put some good information About this in here.
Best of luck,
Thomas
Just in case all these excellent answers are not compounding enough I want to let you know I concur with everyone.
Today you want to be careful with anchor text. I'm not going to lie and tell you it is no longer potent; however, the best way to get it is naturally, and most people will not give you incredible anchor text when linking to you naturally.
It's fine to get a lot of "click here", "check out the site", the site's name, or the bare URL. Also, Google is not stupid; they have an excellent idea of whether anchor text is a little over-optimized and whether, hypothetically, a buddy helped. Let's say you sell tennis shoes for clay courts: "clay-court tennis shoes" looks fine only if it happens rarely, and people will do that on their own, so don't do it yourself.
Backlinks and anchor text can still get you removed from the SERPs by Google; if the anchor text is over-optimized, it's easy to tell that you have manipulated the links.
Put it this way: Google looks at every link with its machine-learning algorithms; it knows what normal is and what it is not. You will always get some highly optimized anchors naturally, and that's fine.
One rule I have with Google: if you think something could possibly become an issue, stay away from it, or do what you did here and gather good information.
** excellent references on Natural Anchor Text vs Unnatural Anchor Text**
What you have to deal with if you do not pay attention to what Google tells you to do.
Tools For checking your back links.
Tools for checking your external links on your site:
** I hope that this helps and you have avoided some issues with Google.**
** All the best,**
** Tom **
Hi Donna,
Thank you.
The report is something you can customize to be completely white-labeled very easily, in less than 10 minutes. The tool is https://www.deepcrawl.com; it starts at a very reasonable $80 a month and is capable of so much.
I would recommend DeepCrawl to anyone; they're fantastic.
All the best,
Tom
"Now that https is the canonical version, should I block the http-Version with robots.txt?"
Absolutely not. GWT will handle all of it. Think about backlinks to both the https:// and http:// URLs; you do not want to cut off the flow of link juice.
Remake robots.txt with
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
But use https:// for the xml sitemap.
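To keep that link juice flowing, a minimal .htaccess sketch (assuming Apache with mod_rewrite enabled) that 301s every http:// request to its https:// twin instead of blocking it:

```apache
# Send http:// traffic, and the equity behind it, to the canonical https:// version.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```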
I cannot imagine a situation in which one site needs to link to another 5 million times, and another 3.2 million, and then 1.5 million, unless you are talking about Time or Newsweek or the Huffington Post linking to each other.
If there is actually related content and the links are warranted, then no worries. Otherwise I would be very concerned and start disavowing things, although I can't say for certain without understanding the size of the sites, how many other links they have and from what sources, and without being able to look at the link profile.
Getting that many links from one source makes this look incredibly suspicious in my opinion, unless that site is a well-known brand.
I would need to look at the domain and understand what type of site you are talking about to tell you, but reciprocal linking is considered a no-no unless it is warranted.
It sounds to me like somebody either ran negative SEO or spent a lot of money on Fiverr creating bad backlinks.
How many of the URLs are actually related to what the site is about or the pages they are linking to? If they're all related, you have nothing to worry about; if the site has 5 million pages it still sounds suspicious, but I'm giving you the best-case scenario.
I hope that is of help,
Tom
Here is a photo showing that deep crawl is hard at work.
I've also added this URL, which has a zip file that contains all your URLs, your Screaming Frog crawl review, the pages that contain Google Analytics, the pages that do not, and which UA number is attached to each page. http://d.pr/f/1f1AL
I want to clear one thing up: you cannot get duplicate hits using Google Analytics twice on one page as long as the UA numbers are different; the only thing that will happen is your bounce rate will go down dramatically. (I am talking about full JavaScript snippets; you can have two UAs in the same snippet and be okay.)
I started the crawl; here is a bigger photo, and I will post more when I'm done. Look at the information I've sent you so far; I think you will find it very valuable.
http://i.imgur.com/i0F739b.png
All the best,
Tom
Lazy loading images is not as good as deferring images, because lazy loading can cause JavaScript issues that will not occur if you defer the images instead.
Defer images and you will have an easier time. The method discussed here does not hurt search engine optimization; in fact, it helps, because increased load speed (or what people perceive as increased load speed) always helps the end user.
Here is the best way
https://www.feedthebot.com/pagespeed/defer-images.html
In the scenario of a one page template, there is no reason to do all the things that lazy loading does (observe, monitor and react to a scroll postion).
Why not just defer those images and have them load immediately after the page has loaded?
To do this we need to markup our images and add a small and extremely simple javascript. I will show the method I actually use for this site and others. It uses a base 64 image, but do not let that scare you.
The html
The javascript
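The snippets those two headings refer to did not survive the paste, so here is a sketch of the markup and script the feedthebot method uses (the base64 string is a 1x1 transparent GIF placeholder, and the image path is a placeholder to replace with your own):

```html
<!-- Tiny inline GIF as the placeholder src; the real file waits in data-src. -->
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
     data-src="your-image-here.jpg" alt="image description">

<script>
// After the page has finished loading, copy each data-src into src
// so the real images download without blocking initial render.
function init() {
  var imgDefer = document.getElementsByTagName('img');
  for (var i = 0; i < imgDefer.length; i++) {
    if (imgDefer[i].getAttribute('data-src')) {
      imgDefer[i].setAttribute('src', imgDefer[i].getAttribute('data-src'));
    }
  }
}
window.onload = init;
</script>
```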
All done. Use the URL provided to go through every issue on your site. Remember, you can click on anything to see more; there are thousands of problems you can look at directly and see a fix for.
In the URL you will see green arrows to click on; you can also click on anything highlighted in red, or on any word or number, to see a lot more than what you see at the very beginning.
You have a lot of broken links. It also appears your canonical URL is non-www, but you 301 redirect to www.
USE THIS URL TO SEE EVERYTHING:
Use this for a summary: http://d.pr/f/bvxB
Sorry it took me so long
Tom
Hi Chris, sorry for the late reply. Absolutely, you can do this by using a plugin, Cloudflare, or PHP code.
Another plugin that provides this solution with an administration area to configure it manually is Autoptimize, which allows you to define specific CSS code independently of your theme's CSS stylesheet.
The solution to this problem is removing those render-blocking scripts. But if you remove them, some plugins may not work properly. So the best solution for smooth rendering is:
1. Remove them from your website source page.
2. Use a single script, hosted by Google as the alternative.
3. Push down the new script to the end of the page (before the closing </body> tag).
Here is how to do it.
Copy the code from the following link and paste it into your theme’s functions.php file.
function optimize_jquery() {
    if (!is_admin()) {
        // Remove WordPress's bundled copies (use the real handle names).
        wp_deregister_script('jquery');
        wp_deregister_script('jquery-migrate');
        wp_deregister_script('comment-reply');
        // Match the protocol of the current request without triggering a PHP notice.
        $protocol = (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] == 'on') ? 'https:' : 'http:';
        // Register Google's CDN copy; keep the version argument in sync with the URL.
        wp_register_script('jquery', $protocol.'//ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js', false, '1.9.0', true);
        wp_enqueue_script('jquery');
    }
}
add_action('template_redirect', 'optimize_jquery');
Save the file and you are done! Now recheck the source of any page and you won’t see those two scripts in the head section. Instead, you will see the Google-hosted JavaScript source at the end of the page.
That’s all! Now the visible section of your page will be rendered smoothly.
Another suggestion from the Google PageSpeed tool is “Defer JavaScript”. This problem happens when you use inline JavaScript such as the scripts for the Facebook like box or button, the Google Plus button, the Twitter button, etc. If you defer the JavaScript, the scripts are triggered after the entire document has loaded.
1. Create a JavaScript file and give the name as defer.js.
2. Place the JavaScripts codes that you want to defer into the defer.js file. For instance, if you want to defer Facebook like box script, paste the following at that file.
(function(d, s, id) {
var js, fjs = d.getElementsByTagName(s)[0];
if (d.getElementById(id)) return;
js = d.createElement(s); js.id = id;
js.src = "//connect.facebook.net/en_GB/all.js#xfbml=1&appId=326473900710878";
fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));
3. Save the file and upload at your theme folder.
4. Now, copy the following code and paste it into the head section of the page. In WordPress, open your theme's header.php file and paste the code before the closing head tag.
Make sure to put the correct path of defer.js. For example, the source path should be like this:
/wp-content/themes/theme_name/defer.js
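For completeness, a sketch of the loader that goes before the closing head tag (this is the standard deferred-load pattern; adjust the defer.js path to match your theme):

```html
<script type="text/javascript">
// Create the script element only after the full page has loaded,
// so defer.js never blocks rendering.
function downloadJSAtOnload() {
  var element = document.createElement("script");
  element.src = "/wp-content/themes/theme_name/defer.js";
  document.body.appendChild(element);
}
window.onload = downloadJSAtOnload;
</script>
```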
______________________________________________________________________________________________
I hope that helps,
Tom
It sounds like a case of all the URLs pointing to your homepage. Do you have data on your backlinks? Use Moz Open Site Explorer and make sure that you have some URLs pointing to these pages. Otherwise, with a new or weak site, you will not rank well for any page regardless. There are so many factors; take a look at the Learning section here on Moz. The Beginner's Guide to SEO is a great read for anybody, even a seasoned professional.
It sounds like you need page authority.
Hope this helps,
Tom
I think that Tom gave you one of the best answers possible.
However, I hope this helps: your site structure should be very similar to the one contained in the two URLs.
If I may add a little bit of information that I thought was helpful
WHERE TO ADD YOUR HREFLANG TAGS
You can add hreflang tags to your sitemaps, in the HTTP response headers, or on the page itself.
The best place to add hreflang is in your sitemap, as including it in the headers or on the page adds weight to every single page request.
The following example will inform Google about the English version from the German version of the website:
<url>
  <loc>http://www.example.com/deutsch/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
  <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
</url>
This method would need to be repeated in full for every page on the site and for all the international websites.
Hreflang tags can also be added to the HTTP header:
Link: <http://www.example.com/english/>; rel="alternate"; hreflang="en"
Link: <http://www.example.com/deutsch/>; rel="alternate"; hreflang="de"
Or in the head tag in the HTML:
<link rel="alternate" hreflang="en" href="http://www.example.com/english/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/" />
& because you will be creating a new site
https://www.candidsky.com/blog/the-seo-2015-guide-to-website-migration/
It would come down to your backlink profile. If it were me, I would use
Moz Open Site Explorer, Majestic, Ahrefs, and Google Webmaster Tools to determine whether or not I would be receiving enough backlinks for a subdomain or separate TLD; otherwise I would use a subfolder and an extremely fast hosting method (Fastly is excellent, and there are many other great options as well).
Hope this helps,
Tom
PS use
http://hreflang.ninja/ to check
To crawl using a different user agent, select ‘User Agent’ in the ‘Configuration’ menu, then select a search bot from the drop-down or type in your desired user agent strings.
http://i.imgur.com/qPbmxnk.png
&
Video http://cl.ly/gH7p/Screen Recording 2016-05-25 at 08.27 PM.mov
Also see
http://www.seerinteractive.com/blog/screaming-frog-guide/
https://www.screamingfrog.co.uk/seo-spider/user-guide/general/#user-agent
https://www.screamingfrog.co.uk/seo-spider/user-guide/
https://www.screamingfrog.co.uk/seo-spider/faq/
Ryan gave an excellent answer. Google uses other clues to pick up on new pages. I'm not saying don't submit your sitemap; I'm just saying add structured data/schema to your store as well. Check your crawl budget and see if your site is eating up too much of it and not being indexed properly by Google.
A simple test to see if something is being blocked is to run your site through https://varvy.com/
If you do not know, I would suggest using DeepCrawl or Screaming Frog SEO Spider.
Navigation is often one of the many things that can cause problems where your site will not be crawled correctly.
To determine whether or not you have a crawl budget issue, you can also make independent image sitemaps and a sitemap index file, just to be sure that Google is getting what you want it to.
Like Ryan said check out
https://support.google.com/webmasters/answer/183669
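If you do split pages and images into separate sitemaps, a sitemap index file simply points at the individual files; a minimal sketch (the file names here are placeholders, not from the original answer):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-images.xml</loc>
  </sitemap>
</sitemapindex>
```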
Here is a Magento dynamic site map http://i.imgur.com/QKS0bgU.png
Validate your sitemap and check it for problems:
http://tools.seochat.com/tools/site-validator/
https://moz.com/learn/seo/schema-structured-data
http://www.searchmetrics.com/news-and-events/schema-org-in-google-search-results/
https://blog.kissmetrics.com/seo-for-marketplaces-ecommerce/
JSON-LD Microdata
https://builtvisible.com/micro-data-schema-org-guide-generating-rich-snippets/#json
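To make the JSON-LD option concrete, here is a minimal hand-written Product snippet (the product name, URL, and values are made up for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "http://www.example.com/widget.jpg",
  "description": "Placeholder product used to illustrate the markup.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

You can paste a snippet like this into Google's structured data testing tool to confirm it parses before rolling it out across a store.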
I hope this helps,
Thomas
Yes, it is relatively easy to do; there are specialized tools that will make it extremely clear what content has been changed or added.
Moz Content is outstanding for doing this.
You can run DeepCrawl, and it will keep track of all URLs and how much content has been added or changed.
You can also keep track of added URLs using
https://www.screamingfrog.co.uk/seo-spider/
Kapost is a good way of grabbing all the content and seeing the differences.
https://app.kapost.com/auditor
You can also keep tabs on how many pages or posts are in Google's index. This is not necessarily the best way of showing that content was created, but it will tell you whether it's in Google's index. Type
site:http://www.example.com into Google's search.
I hope this helps,
Tom
I use Wistia as well and recommend them, but I do not recommend using their plug-in.
You can defer loading of the video so that the site loads very quickly and page speed is almost not affected at all.
To do this we need to mark up our embed code and add a small and extremely simple piece of JavaScript. I will show the method I actually used for this page.
The html
<iframe width="560" height="315" src="" data-src="//www.youtube.com/embed/OMOVFvcNfvE" frameborder="0" allowfullscreen=""></iframe>
In the above code I took the embed video code from Youtube and made two small changes. The first change is that I made the "src" empty by removing the url from it as below.
src=""
The second change I made is I put the url I cut from "src" and added it to "data-src".
data-src="//www.youtube.com/embed/OMOVFvcNfvE"
The javascript
This code should be placed in your HTML just before the </body> tag (near the bottom of your HTML file). "defer.js" is the name of the external JS file.
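The script itself is not shown above, so here is a minimal sketch of what defer.js can contain. This is my reconstruction of the data-src technique described with the embed code, not the author's exact file: on page load, copy each iframe's data-src attribute into src so the video is only fetched after everything else has rendered.

```javascript
// Copy a deferred URL from data-src into src, but leave frames
// that already have a src alone.
function swapDataSrc(el) {
  var dataSrc = el.getAttribute('data-src');
  if (dataSrc && !el.getAttribute('src')) {
    el.setAttribute('src', dataSrc);
  }
}

// Find every iframe on the page and swap in its deferred URL.
function initDeferredVideos() {
  var frames = document.getElementsByTagName('iframe');
  for (var i = 0; i < frames.length; i++) {
    swapDataSrc(frames[i]);
  }
}

// Run after the page has finished loading; the typeof guard just
// lets this file be loaded outside a browser without throwing.
if (typeof window !== 'undefined') {
  window.onload = initDeferredVideos;
}
```

Because the swap only happens on window.onload, the browser renders the rest of the page first and the embed no longer blocks it.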
I hope this helps, Tom
For a free content delivery network, the best one would be https://www.cloudflare.com/. It is completely free, very high quality, very simple to set up, and will accelerate your site quite a bit, with a network that keeps growing. I would say if you're looking for completely free, this is the only way to go. Here's a larger photo of the one below:
http://i.imgur.com/CikgNp4.png
The same goes for Incapsula.com's free plan, but if you do not log in often enough, they will turn it off on the free plan. If you get their base paid offering, it is an incredible CDN as well.
If you're looking to spend a tiny bit more money but want a pure CDN, check out KeyCDN, CDNify, CDN77, or MaxCDN; Rackspace CDN uses Akamai, and it's a bargain.
If you are looking for great content delivery networks: Fastly, CacheFly.com, EdgeCast, Turbobytes.
http://www.cdnplanet.com/cdns/
If you are looking for a higher-quality, pure content delivery CDN, or just more information on CDNs, the list above reviews the capabilities, cost, and comparisons of each CDN.
I hope this helps,
Tom
In **Apache**, "Redirect permanent" and "RedirectPermanent" are the same as "Redirect 301".
By default, the "Redirect" directive establishes a 302, or temporary, redirect.
If you would like to create a permanent redirect, you can do so in either of the following two ways:
- Redirect 301 /oldlocation http://www.domain2.com/newlocation
- Redirect permanent /oldlocation http://www.domain2.com/newlocation
https://www.aleydasolis.com/htaccess-redirects-generator/
If no <var>status</var> argument is given, the redirect will be "temporary" (HTTP status 302). This indicates to the client that the resource has moved temporarily. The <var>status</var> argument can be used to return other HTTP status codes:
<dl>
<dt>"permanent" (same as "Redirect 301")</dt>
<dd>Returns a permanent redirect status (301) indicating that the resource has moved permanently.</dd>
<dt>"temp"</dt>
<dd>Returns a temporary redirect status (302). This is the default.</dd>
<dt>"seeother"</dt>
<dd>Returns a "See Other" status (303) indicating that the resource has been replaced.</dd>
<dt>"gone"</dt>
<dd>Returns a "Gone" status (410) indicating that the resource has been permanently removed. When this status is used the <var>URL</var> argument should be omitted.</dd>
</dl>
https://httpd.apache.org/docs/2.4/mod/mod_alias.html
To 301 redirect a page with RedirectPermanent:
RedirectPermanent /old-file.html http://www.domain.com/new-file.html
To 301 redirect a page with Redirect 301:
Redirect 301 /old-file.html http://www.domain.com/new-file.html
https://i.imgur.com/PTEj5ZF.png
https://www.aleydasolis.com/htaccess-redirects-generator/
Permanent redirect from pageA.html to pageB.html.
.htaccess:
301 Redirect URLs.
Redirect 301 /pageA.html http://www.site.com/pageB.html
https://www.aleydasolis.com/htaccess-redirects-generator/page-to-page/
<IfModule mod_rewrite.c>
RewriteEngine On
Redirect 301 /pageA.html /pageB.html
</IfModule>
https://www.htaccessredirect.net/
# Rewrite to www
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^site.com [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,NC]

# 301 Redirect Old File
Redirect 301 /pageA.html /pageB.html
You asked about Regex
https://mediatemple.net/community/products/dv/204643270/using-htaccess-rewrite-rules
.htaccess
Regular expressions
Rewrite rules often contain symbols that make up a regular expression (regex). This is how the server knows exactly how you want your URL changed. However, regular expressions can be tricky to decipher at first glance. Here are some common elements you will see in your rewrite rules, along with some specific examples.
See the guide above for more on regex.
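As an illustrative sketch (the paths here are made up, not from the original answer), here is a rewrite rule that uses a capture group, with the regex pieces broken down:

```apache
# Redirect any .html file under /old/ to the same name under /new/:
#   ^             start of the URL path
#   ([a-z0-9-]+)  capture group: one or more letters, digits, or hyphens
#   \.            a literal dot (escaped, because . alone matches anything)
#   $             end of the path
#   $1            in the target, whatever the capture group matched
RewriteEngine On
RewriteRule ^old/([a-z0-9-]+)\.html$ /new/$1.html [R=301,L]
```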
Hope this helps
Tom