Cheers guys, I've looked into the methods before, just always struggled integrating them into the site's poor architecture. Will try again. Much appreciated.
Tim
Hey all,
I have GTM set up on a site and I am trying to enable the Ecommerce tracking functionality.
My problem is this: I have access to the basket variables all the way through the checkout funnel to the payment page, but they are then lost on the payment completed page. Does anyone know how I can pass the value of the basket across to the payment completed page, so I can mark the transaction completed with an actual value present? I cannot post hidden input fields or change the URL GET variables, as the third-party system that is used fails.
I have thought maybe JavaScript calls, but have not managed to get them to work as yet. Can the Google session pass and hold them in the cookies that are set?
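The sort of JavaScript I have been toying with stores the basket in a first-party cookie before the hand-off and reads it back afterwards. A rough, untested sketch - the basket shape, cookie name and dataLayer keys are all placeholders, not from the actual platform:

```javascript
// Serialise the basket so it survives a cookie round trip.
function encodeBasket(basket) {
  // JSON + URI-encoding keeps the value cookie-safe
  return encodeURIComponent(JSON.stringify(basket));
}

function decodeBasket(raw) {
  return JSON.parse(decodeURIComponent(raw));
}

// On the payment page (browser only):
//   document.cookie = 'gtmBasket=' + encodeBasket({ total: 49.99, items: 2 }) + '; path=/';
//
// On the payment completed page:
//   var match = document.cookie.split('; ').filter(function (c) {
//     return c.indexOf('gtmBasket=') === 0;
//   })[0];
//   if (match) {
//     var basket = decodeBasket(match.slice('gtmBasket='.length));
//     window.dataLayer = window.dataLayer || [];
//     window.dataLayer.push({ event: 'transactionComplete', transactionTotal: basket.total });
//   }
```

No hidden fields or URL changes needed, so in theory it should survive the third-party hop, as long as both pages sit on the same domain.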
Any help greatly appreciated.
Cheers
Tim
I would recommend taking a look at the following if you are struggling with ghost-referral-type traffic. It's a great post that was touted around last year, when referral traffic spiked for a lot of people. Hopefully this is what you are after.
https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
I use MaxCDN at my place of work; it has proved itself to be very good, easy to set up and relatively inexpensive, depending on the plan. As per Thomas and Thomas, CloudFlare is great as a free option.
It still looks to be rolling out; lots of turbulence still. I have been lucky so far to see only small movements and not the big drops some have seen. Fingers crossed it stays that way.
I don't think you are the only one... it would seem that there has been a major core Google update. See here from SEO Round Table - and it is still rumbling on.
All of the major search-watch sites are also reporting a lot of activity, some showing gains/recoveries, some showing massive drops.
OK Moz peeps...
Right then, I have just been reading an article over on SEO RoundTable from Barry Schwartz.
NEW Local review guidelines for businesses - take a look.
It in effect alludes to Google stamping all over review schema and snippets, third party review solutions/providers and really trying to limit how they are used. I have interpreted the new guidelines to say that you can no longer mark up and use external stats on your own site in the form of aggregate ratings from the likes of TrustPilot, Feefo, Revoo (some uk review sites) and more....
These were the two key lines - take a look at the full guidelines for more.
What does everyone else think? And how soon before people get penalised (if ever) for marking up external stats to make their own site and services look more favourable?
It could definitely be a slap in the face for SERP CTR and on-page conversion optimisation.
Also how do people expect this to affect PPC review rating going forward. Will Partner sites become a thing of the past?
Looking forward to a good discussion here
PS - I am not staff at Moz; I just have a t-shirt, which is my avatar. I am not sure why it says I am Staff below my avatar due to the tag added to it. Is anyone else getting that on their profile too?
This has been bandied around so many times. In general I would advocate a sub-folder as opposed to a sub-domain, purely due to the value it adds to your site's main chosen domain; if it were on a sub-domain, that is where the value would sit, and it would not pass value around as well as a sub-folder could. It is only potentially a small benefit, but a benefit nonetheless.
A good recent discussion on it was had back in January: https://moz.com/community/q/the-great-subdomain-vs-subfolder-debate-what-is-the-best-answer
Personally, I find it easy enough - as simple as a couple of quick clicks. The homepage of Moz.com is, in my opinion, more of a landing page aimed at attracting new clients with their products than at serving existing users. Login and other elements are where I would expect them to be placed on the screen/menu. I would imagine most people here are also advanced web users and have bookmarks and the like to make it a one-click process as opposed to two or three.
How would you improve on them? Surely that is more constructive
I agree with both Dan and Linda on this - hot-linking in general is not the way to go. However, as Linda mentioned, they may own both domains and be using the second like a content delivery network to parallelise their requests, helping to speed up the site, as multiple browser requests can take place at the same time and reduce load time. You could also use a third-party CDN for greater speed if you are a global brand.
Hope the below helps.
30x redirects in .htaccess
# Permanent redirect
Redirect 301 /url-linked-to.html /new-page-you-want-to-link-to.html
# Temporary redirect for short-term changes
Redirect 302 /url-linked-to.html /temporary-page-you-want-to-link-to.html
Canonical Tag
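For the canonical option, a tag along these lines in the head of the duplicate page does the job (URL below is a placeholder):

```html
<!-- In the <head> of the duplicate/variant page,
     pointing at the page you want to consolidate value to -->
<link rel="canonical" href="http://www.example.com/new-page-you-want-to-link-to.html" />
```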
Cheers
Tim
Are these new links? It may be that Moz needs to rerun its crawl of your website in order to pick up any changes. Check back after a little while, or maybe try a recrawl test - https://moz.com/researchtools/crawl-test
Failing that, send them a polite message
Snap, me too! Any ideas as to when it will return? It's been down a few hours now. I didn't know about the Moz Health page - thanks for that.
As per the other guys in the discussion, I too have seen this taking place; as a national brand, the 4-pack (and not so much the local pack) has caused a drop in our organic traffic across brands.
We also noted a larger than normal increase in ad spend when some of our ads switched to the main ad block area from the sidebar; this is, however, now back under control.
Unfortunately, the organic nature of online search is being pushed further away from the search user, who in many cases is now just getting ads.
Have you considered also adding an image sitemap.xml to your site to make it easier for Google to crawl all of the files you want etc. You could use something like Screaming Frog to generate one for you.
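For reference, a minimal image sitemap entry looks like this (URLs are placeholders; Screaming Frog can generate the full file for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/product-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```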
Hi Bondara, this is a common occurrence and is simply representative of the size of the index that Moz is working off. You will find that it happens quite often as the data is refreshed; nothing is likely wrong at all, and it will probably be affecting more sectors and people than just yourself.
There is nothing to worry about in this instance unless you find your own stats doing something vastly different.
I can definitely see where you are coming from. Personally, if a product has expired, I would try to 301 to a similar product, serve a 404, or if possible keep the page active with a message that the product is no longer in stock, plus some cross-sells for similar products.
The issue is how many of these pages are still indexed and now point to nothing if no 404 is set up. I think it is about managing user expectations when they land: would you rather they had something to action, or nothing but an error?
Also, why delete and not simply disable? What happens if the product comes back in stock or a trade agreement is reignited?
Personally, if these other domains are inactive and have no content on them they are of very little use and just using them to link back to your master site is a little spammy to me. I would suggest rather than spending time on these low level peripheral sites, the best thing you can do is concentrate your time and effort on your master domain where more rewards will be had in the long run.
If you are looking to disallow url parameters you could use something like the following as a convention.
Disallow: /? or, if you want to be more accurate with specific parameters, Disallow: /?dir=&order=&p=. There have been a few Moz questions of this type over the last few years that are worth a search if you do look to remove the parameters.
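Put together, a robots.txt along these lines would do it. Note the `*` wildcard form catches parameters on any path, not just the site root - Google honours wildcards, though not every crawler does:

```
User-agent: *
# Block any URL containing a query string
Disallow: /*?
# Or, more surgically, block only specific parameters
Disallow: /*?dir=
Disallow: /*?order=
Disallow: /*?p=
```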
Also try and ensure that the product pages you have listed are well canonicalised and point to the original product etc. A good review on how to do this can be found here. This will in most cases be enough to remove any indexation/duplicate issues.
Hey Moz,
Bit of a random bug bear of mine, each month I log in to download the latest invoice receipt to hand over to accounts for processing etc.
After I click the Billing and Subscriptions link from the profile drop-down menu, I am presented with the subscriptions page. The payment history panel located on the right always shows my first 3 invoices. As this is a snapshot in time, would it not be more beneficial from a UX perspective to show the most recent 3 invoices? That way I can get the info I require that tiny bit quicker, rather than being presented with invoices that are 3 years old.
I know it is a tiny thing, but I wondered if it could be accommodated.
Many thanks
Tim
I am not sure if this would help, but I would use the needs, wants and demands method and determine at what stage you are with your customers.
As per your heaters/boilers example - are you targeting all of them, or solely those you feel you can connect with emotionally, e.g. those whose boiler has packed in and "need" a new one, or those whose boiler is a bit old and "want" improved efficiency? Or are you targeting someone who simply has no financial constraints and can "demand" the best they can afford?
Inevitablesteps has a great bit of content that helps to describe each segment a little further.
I hope this is of some use.
Cheers
As per Mark, if you are only planning on using one sole language then I would not bother creating a sub directory for your site to reside in when it would sit best in the root.
If however, eventually you plan to branch out again and become a more global operator I would keep your master site in the root and then branch out for supplementary languages e.g .com/fr or .com/de depending upon what you have planned for the future.
Again as per Mark, sub folders are my preferred option rather than sub domains, but both have their pros and cons.
If you do indeed look to branch out, also look into serving alternate-language (hreflang) set-ups, and definitely check out this article from Search Engine Land on multilingual sites - a great resource.
I agree with Alick300; after looking at your source code, it looks like your //www.gstatic.com/wcm/loader.js script is not closed properly, and thus in effect the meta is not being found in the normal HTML flow. I would suggest checking your WordPress header.php file to see if there is an issue in the code that you can close off; alternatively, if it is a problem plugin, maybe update or deactivate it.
Cheers
Tim
Absolutely agree with Alick3000: it depends what your profit margins are per product. Knowing what you have left once post-sale costs, tax etc. have been accounted for gives you your likely maximum cost per acquisition, which can be used to determine your bidding activity. If you do not have it already, make sure conversion tracking is in place so you can see how many clicks lead to a sale/lead.
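As a worked example (all figures invented): if each sale clears £20 after product cost and tax, and you want to keep at least half of that as profit, your bids should never push cost per acquisition above £10.

```javascript
// Toy illustration of the margin maths - the numbers are made up.
function maxCpa(profitPerSale, minKeepRatio) {
  // Whatever you don't insist on keeping is available to spend on ads
  return profitPerSale * (1 - minKeepRatio);
}

// maxCpa(20, 0.5) -> 10, i.e. a £10 ceiling on cost per acquisition
```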
Why not also do some of the Google AdWords training over at Google Partners to give you a better understanding of bid strategies.
First port of call is the Beginner's Guide to SEO from Moz... it genuinely gives you the best starting point, along with lots more great articles here in the Learn SEO and Search Marketing section of Moz.
Also see if the recent Webinar on how to perform a Website SEO audit from Moz is available yet, it will help you gain some real insight into numerous measurable facets of your site and its outreach.
Finally keep coming back to the Q&A section of Moz, there are lots of real people giving real and genuinely great insight into lots of topics.
Good luck
Tim
Have you quite literally just made it live? If so, it could take a while to index naturally. I would suggest ensuring that your site is linked up to Google Search Console, where you can request a crawl of your site and also submit a sitemap of all your new pages. This should help Google to index your site much sooner. Also check that your robots.txt or meta robots directives are not blocking crawlers.
Unfortunately, there is no instant way for your site to be shown; it will take a little time, especially if it is brand new. Also do not forget that if you are in a competitive niche, it could take time for you to rank higher than your competitors.
I would also suggest setting up all your local site links from Yelp, Google Maps etc., and further social links from Facebook, Twitter etc., to help get your name out there. This can help: https://moz.com/local/search
Hope that helps.
Cheers
Tim
As per Eric, it has indeed been turned off. To be fair, I had not used it for a while due to it being unmaintained, and I wasn't sure it was still accurate.
Other solutions that I have used for authority signals are developed by Moz. I would suggest installing the MozBar toolbar extension and also taking a look over Open Site Explorer. You can get some of the information you want from there, like Page Authority and Domain Authority, plus Spam Score and more.
Hey Armen,
It can depend upon how you have it structured and what type of schema markup you have in place. Is it an aggregate review for the business or product as a whole, or an individual review from a customer/testimonial?
Also, just because the review shows by page for other search queries, this does not mean it is broken for branded search to the homepage; Google kind of has the final say on whether it chooses to show the snippet. Sometimes it might, sometimes it may not; it may also not show yours in favour of other sites, thus not flooding the SERPs with stars everywhere.
Hope that helps.
Tim
Based on what I understand, this is quite a normal side effect of going to https://, and in many cases the benefits you may have gained from performing multiple page-speed upgrades, responsive edits etc. may be reduced in their effectiveness from a page-speed perspective.
However - and this is a big however - you will also benefit from an SEO standpoint, as Google now considers use of an encrypted protocol a ranking factor, albeit a small one at present. Second to this, before too long many of the browsers will have fully adopted HTTP/2, which is based on SPDY (developed by Google) and reportedly brings massive speed gains. In order to adopt HTTP/2 you have to have already made the leap to SSL. HTTP/2 allows requests to load in parallel, which will give you much greater speeds going forward, as requests load at the same time, speeding things up even further.
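Once the SSL is in place, turning HTTP/2 on is typically a one-line server change. For example on nginx 1.9.5 or later (assuming an nginx stack, which may not match your setup; Apache uses mod_http2 instead):

```
server {
    # ssl + http2 on the same listener; browsers only speak HTTP/2 over TLS
    listen 443 ssl http2;
    server_name www.example.com;

    ssl_certificate     /etc/ssl/certs/example.crt;
    ssl_certificate_key /etc/ssl/private/example.key;
}
```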
Based on your current scores, I would imagine there are a few more changes you could make to further improve your site's speed whilst still using an SSL: CDNs, caching, compression and more.
Use sites like tools.pingdom, gtmetrix, webpagetest.org to see what else can be improved.
Hope this offers some further insight.
As per Dirk and Logan, personally I would try not to have two sets of content delivered; the point of responsive design is really to have one page that is simply presented consistently on all device types.
With regards to it being considered duplicate content, this is a little harder to determine, and I would imagine Google would not penalise you for it unless it is really spammy and used for keyword stuffing/cloaking etc. Here is an old video from Matt Cutts, although it may not be totally relevant in today's SEO landscape and is probably more geared towards duplicate pages.
I agree, GA can be a little hit and miss, but it has at least given you some reason to look into it further and see if there is actually something wrong. As per Dmitrii, I would suggest running a series of tests to establish whether your site is truly running slower. There could be a multitude of reasons why it is, so eliminating the issues one by one may help.
Places to test your site include
tools.pingdom.com
gtmetrix.com
Google page speed test
Webpage Test.
Also as per Dmitrii, in browser with developer tools
As it is a new site, are there any new elements that were not included before - new scripts, images etc.? Consider also using a CDN to deliver your content for added speed gains, and check the waterfall tables to see if any elements are struggling.
Have you also moved to SSL when redesigning your responsive site? That can also cause a slight reduction in speed.
Hope this helps
Tim
Personally I would doubt that you would get penalised for it, especially as these are genuine reviews. We use a similar tactic here: we post individual user reviews gathered locally on site and then also feature an aggregate review for an overall score located on a third-party site.
As you are using TrustPilot, which is a Google Partner for reviews, I would imagine this will only benefit you in the long run, but check with them about featuring their logos on your site; some review sites are quite particular about how you display their logo/artwork and may also require a link back. Due to TrustPilot being a Google Partner, you also have the added benefit of star ratings in any Google AdWords activity you may be doing, helping to further increase their positive impact on CTAs and conversions when people land.
Hope that helps. As per Robert, just make sure you get your schema correct to ensure accurate portrayal of the review data so it is picked up.
Finally, there is a chance that you will get your review snippet on Google, but this is not guaranteed even if your schema markup is 100% correct.
Good luck
Tim
As per David above you can also use OpenSiteExplorer to delve into a competitor/relevant sites links to see which are passing the most value. This is great if you are trying to find relevant pages that are linking to your competitors and that will also be of benefit to your own site.
I think the hardest thing for you is to prove that the content is originally yours. I would suggest that when creating content you log the original date created, and before making it live, inform any parties involved in the policing of your industry that the content is yours.
If the other party is simply copying and pasting for offline use, it can be a bit harder to prove, but should you be able to get your hands on any hard copies that were created after your publication date, you could possibly prove breach of copyrighted material - providing they have not simply rewritten the content to make it completely bespoke to them, which again makes it harder to prove they did not come up with it.
If they are copying and pasting to their website, make sure you add in a few links back to you as a source, but also include any details/images with your own meta, watermarks, EXIF image data etc., again tying it back to you. Eventually they will slip up and you will be able to prove they have been plagiarising your work.
If it is online, you could also file a spam report with Google to try and get their copied content removed from the SERPs etc.: https://www.google.com/webmasters/tools/spamreport This may also help regarding copyright infringement in the US - http://copyright.gov/help/faq/faq-infringement.html / http://www.copyright.gov/
Failing all that I would imagine a court ruling would be the final steps to stop them using your content.
Personally, unless you are linking to some other content or maybe offering a larger image to view, I would not bother adding a link to the image. To ensure the image is optimised, make sure you add a relevant alt tag related to the contents of the image. That's my opinion on it anyway; others may see it differently.
Just doing a quick test in Open Site Explorer, it would appear that http://www.scadgells.co.uk has a couple of external backlinks from a website called furniturebuyer.info, whereas http://scadgell.co.uk has no backlinks. The links are from a relevant source, but I doubt they will provide you with a huge boost.
If you can access OSE, I would suggest looking at the following link for the www. domain too, just in case there are a few more elements.
https://moz.com/researchtools/ose/pages?site=http%3A%2F%2Fwww.scadgells.co.uk As well as the backlinks, it also appears to show a possible 7 indexed pages, which you may want to redirect or even 404, depending on what they were originally about. None of the pages deeper than the homepage appear to have any backlinks.
Hope that also helps.
Cheers Tim
I would suggest having a bit of a read over this old blog post which gives you the necessary info to implement the rel=canonical tag correctly.
https://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions
I do not think that having the rel=canonical tag pointing to itself would necessarily harm your site, but it is probably best avoided if possible, as it is redundant code. If you have a dynamic meta/header include, this might be the best solution for you if you cannot control it manually or by editing the code. I often have rel=canonical running on numerous pages, especially to help stop MVT pages from being indexed.
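To illustrate the self-referencing case (URL is a placeholder):

```html
<!-- On http://www.example.com/page.html itself -->
<link rel="canonical" href="http://www.example.com/page.html" />
<!-- Harmless but redundant here; it only earns its keep on
     parameterised variants, e.g. /page.html?sessionid=123 -->
```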
Have you performed any more disavow processes since? If you have not, simply login to your Search Console and head to the disavow section.
https://www.google.com/webmasters/tools/disavow-links
If you have not added any more, simply click the profile you are working with, and it should open up a box with a link to your most recent added file. See my linked image which shows the pop up. You can then download the text file that you/they added.
Hope that helps.
If it is an older file, I would suggest talking to Google about seeing previous versions of the txt file - that is, if they hold onto them.
I don't normally tend to put a time-scale on being competitive; for me it is more a case of: if your site is built and optimised to the best of its ability, then you should soon begin to climb. There is no reason why it could not be as little as a few weeks if you nail all of your items down well.
Ensure you follow some of the quality guides that you can find on Moz, to get your site in the best condition possible to give you the best opportunity to climb well.
1. https://moz.com/beginners-guide-to-seo
2. Moz guide to link building - Paddy Moogan
3. How to Rank - Cyrus Shepard
4. Lots more in here and all over the web.
Very cool... it would be nice if it could be extended to the top 100.
Note: just seen the edit - great to see it is now showing pretty much everyone that is active.
Yes, lots still going on; whether it is linked to the previous update or some other activity, we will no doubt find out in time. However, for now I'd expect more movements, both up and down, until things settle.
Good article and insight from SEORoundTable
Hi Nicolas, the Dropbox link 404'd. The schema you are showing above is for individual reviews, not for aggregate reviews.
Reviews are per individual item and should relate to the product on the page etc. This will not display any stars in your organic SERP results in Google.
For aggregate results you will need to use another method that in effect calculates a total score based on all of the reviews accrued. You can definitely use this for service-based products etc.
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Service</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">3.5</span>/5 based on <span itemprop="reviewCount">11</span> customer reviews
  </div>
</div>
Something like this should do the trick. If you are storing your reviews via an external site, I would also add in an itemprop for url to tie them back to the original source for justification.
The above should result in you achieving a star rating in your organic SERPs.
You can also visually add in your own stars on the site to make it look more appealing.
The comments above are right, you do indeed need at least 30 reviews in the last 12 months that have been posted to one of Google's review partner websites. See here for a list of locations you could use. Once you have accrued a number of reviews they will be appended to your adverts.
Secondly, providing you are competitive in your SERP positions you could also look to implement Aggregate Rating Schema into your sites code, this will allow you to also display star ratings in your organic listings helping you to again stand out from your competitors when people search. You can look into this further here.
There are a couple of methods to help reduce any ghost/referral spam you may be getting: one is to use your .htaccess (this only affects spam that actually crawls your site), and the other is to set up filters in GA.
This post from Carlos Escalera, is one of my favourites. https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
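For the .htaccess side, the usual pattern is a referrer-based rewrite rule (assumes Apache with mod_rewrite; the domain below is a placeholder for whatever is hitting you):

```
RewriteEngine On
# Return 403 Forbidden to requests referred by a known spam domain
RewriteCond %{HTTP_REFERER} known-spam-domain\.example [NC]
RewriteRule .* - [F]
```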
Hope this helps.
There are a few; in previous roles I was exposed to Hootsuite, which is good, but I have recently moved over to Buffer, which so far seems far more intuitive to use. It also has the benefit of a mobile app and Chrome extensions for quick publishing and scheduling.
Have you tried getting in touch with the people at Buffer to see if there is anything they can do to help?
I would imagine it is due to the results being driven from two separate indexes: Moz will be returning error results from its set and Google from theirs. There will also be differences due to how regularly each recrawls and reindexes.
Ah, so you already have something in place; from the initial question it sounded like you had list view but not grid view. In that case I would definitely run a bit of a split test that defaults 50% of users to list and 50% to grid; you can then measure a series of selected metrics to see which variant is most likely to drive further engagement.
I did a few things like this in my time at VOW, Caboodle and the Post Office Shop. It is really more of a user-preference thing than a case of one being better than the other.
What works best for your content - is it text- or image-intensive when it comes to your product listings? I often found that if there was more image than text, a grid worked better, and vice versa if there was more text.
Also consider mobile - a list solution may be the best option on a smart phone and grid on a tablet.
The optimum solution in my opinion would be to allow the consumer/user to select their preferred layout. You could implement this as a checkbox/button that simply switches the css to display in either grid or list format.
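A sketch of that preference switch - the class names, storage key and markup are all made up for illustration:

```javascript
// Map the stored preference to a CSS class; default to list view.
function layoutClass(preference) {
  return preference === 'grid' ? 'product-grid' : 'product-list';
}

// Browser wiring, roughly:
//   var listing = document.querySelector('.products');
//   var pref = localStorage.getItem('layoutPref') || 'list';
//   listing.className = layoutClass(pref);
//
//   toggleButton.addEventListener('click', function () {
//     pref = (pref === 'grid') ? 'list' : 'grid';
//     localStorage.setItem('layoutPref', pref);
//     listing.className = layoutClass(pref);
//   });
```

Storing the choice means returning visitors land straight on the layout they picked last time.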