I don't know what the PrestaShop CMS is like. I was more referring to the Google Search Console if you have that set up. I'm afraid I can't really comment on what PrestaShop is like.
Good luck, I hope you find what you are looking for!
Hello,
It's worth having a look at Webmaster Tools, within the Crawl Errors section. It is possible that, if your URLs changed with the site migration, the old URLs no longer direct to the correct pages. This would result in a lot of old referral traffic hitting 410 or 404 pages. This would also help work out if the traffic is getting reclassified or not.
I hope this gives you a place to start.
Hello,
I believe that in URLs, a % sign followed by letters/numbers translates into a different character. For instance, %20 is a space and %21 is an exclamation mark. W3Schools has a reference guide here: http://www.w3schools.com/tags/ref_urlencode.asp.
I don't know offhand why yours would translate into East Asian characters, but non-ASCII characters get UTF-8 encoded and then percent-encoded as several %XX sequences in a row, so that may give you a place to start your investigation.
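You can see this in action with Python's standard urllib.parse module; the sketch below shows how one non-ASCII character becomes several %XX sequences, which is a likely explanation for the characters you're seeing:

```python
# Percent-encoding/decoding with the Python standard library.
from urllib.parse import quote, unquote

assert unquote("%20") == " "    # %20 decodes to a space
assert unquote("%21") == "!"    # %21 decodes to an exclamation mark

# Non-ASCII text is UTF-8 encoded first, then percent-encoded,
# so a single character can become three %XX sequences:
assert quote("日") == "%E6%97%A5"
assert unquote("%E6%97%A5%E6%9C%AC") == "日本"
```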
Hope this helps.
Cheers,
Luke
That is a very fair point. It is a completely new site and I hadn't even thought about things like the domain age. It does show up under a "site:http://www.____.com" search, I was just wondering if this is one of those things Google keeps a memory of, if that makes sense.
Thanks for your response Mike.
That is a very good suggestion. I'll try it (a useful URL also so thanks for sharing).
Thanks for the response Matthew.
A client has a WordPress blog that sits alongside their company website. They kept it hidden whilst they were developing its design, keeping it un-searchable by Search Engines. It was still live, but WordPress put a blocking robots.txt in place. When they were ready, they removed the block by clicking the "allow Search Engines to crawl this site" button.
It took a month and a half for their blog to show in Search Engines once the robots.txt was removed.
Google is now recognising the site (as a "site:" test has shown); however, it doesn't rank well for anything. This is despite the fact they are targeting keywords with very little organic competition.
My question is: could the fact that they developed the site behind a robots.txt (rather than offline) mean the site is permanently affected by it in the eyes of the Search Engines, even after that robots.txt has been removed?
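For reference, that kind of WordPress block boils down to a two-line robots.txt, and Python's standard urllib.robotparser can confirm what it allows and blocks (the example.com URLs below are placeholders):

```python
# Check robots.txt rules with the Python standard library.
from urllib.robotparser import RobotFileParser

# The blocking file served while the blog was hidden:
blocked = RobotFileParser()
blocked.parse([
    "User-agent: *",
    "Disallow: /",   # blocks every crawler from every path
])
assert not blocked.can_fetch("Googlebot", "https://example.com/blog/")

# The permissive file served once crawling is re-allowed:
allowed = RobotFileParser()
allowed.parse([
    "User-agent: *",
    "Disallow:",     # an empty Disallow means everything is allowed
])
assert allowed.can_fetch("Googlebot", "https://example.com/blog/")
```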
Thanks in advance for any light you can shed on the situation.
Hello,
I don't know a way of finding every page in a list, but there is a brilliant Chrome plug-in for finding the meta-data and tags on a specific page. It's called Meta SEO Inspector (currently v2.0.8; the icon is a yellow lightbulb over a blue one).
What it does is list the meta-title, meta-description, and the number of heading tags on the current page.
One method you could use, depending on the size of the site, is to go through the sitemap page by page, checking which pages have H1 tags and which ones don't. It's a bit labour-intensive and could take a while depending on the site size, but it would certainly give you a definitive answer. Alternatively you could look at the source code of each page, but I imagine that would take forever.
Someone else may be able to recommend a specific piece of software to you, but that's how I think I would do it. I hope this helps.
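If the site has an XML sitemap, a short script can do the page-by-page check for you. This is only a sketch using Python's standard library; the sitemap URL is a placeholder, and a real run would want error handling and polite delays between requests:

```python
# A sketch of an H1 audit: read URLs from an XML sitemap, fetch each
# page, and report the pages that have no <h1> at all.
import urllib.request
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags as the HTML streams through the parser."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def urls_from_sitemap(sitemap_url):
    """Return every <loc> URL listed in an XML sitemap."""
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    xml = urllib.request.urlopen(sitemap_url).read()
    return [loc.text for loc in ET.fromstring(xml).iter(ns + "loc")]

def pages_missing_h1(urls):
    """Fetch each page and collect the ones with no <h1>."""
    missing = []
    for url in urls:
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        counter = H1Counter()
        counter.feed(html)
        if counter.h1_count == 0:
            missing.append(url)
    return missing

# Example (placeholder URL):
# print(pages_missing_h1(urls_from_sitemap("https://example.com/sitemap.xml")))
```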
Cheers,
Luke
Hello,
If you have access to the analytics of a site you can look in Behaviour and then see the Behaviour Flow. This will break down the pages people land on, how many click through, and how many drop off at each point in their journey.
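To make the idea concrete, here is a hypothetical sketch (with made-up session data) of the drop-off numbers a Behaviour Flow report is built from:

```python
# Compute, for each step of the journey, how many sessions reached it
# and how many dropped off there. The session paths are made up.
from collections import Counter

def drop_off_by_step(paths):
    """Return {step: (sessions_reaching, sessions_dropping_off)}."""
    reached = Counter()
    for path in paths:
        for depth in range(len(path)):
            reached[depth] += 1
    return {d: (reached[d], reached[d] - reached.get(d + 1, 0))
            for d in sorted(reached)}

paths = [
    ["/home", "/products", "/checkout"],
    ["/home", "/products"],
    ["/home"],
    ["/home", "/about"],
]
# drop_off_by_step(paths) -> {0: (4, 1), 1: (3, 2), 2: (1, 1)}
```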
I hope this is of some help,
Cheers,
Luke
Hello,
Completely agree with Logan. He raises a very good point about Google Analytics being good at subcategorising traffic.
There is another potential solution that may be worth considering. It requires looking at the timescale and quality of the traffic that is coming through from email.
Google uses all kinds of things as ranking factors for SEO, including site usability. If email contributes a large amount of traffic to your site, but has a high bounce rate/low pages viewed/reduced time on site, it is possible that these could be negative ranking signals to Google. This would mean that, although a page could be getting more traffic, it is losing ranking.
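For illustration, the metrics mentioned above are straightforward to compute; this sketch uses made-up session data (each inner list is the pages one visit viewed):

```python
# Bounce rate = single-page sessions / total sessions;
# pages per session = total pageviews / total sessions.
sessions = [
    ["/landing"],                          # bounce
    ["/landing"],                          # bounce
    ["/landing", "/pricing"],
    ["/landing", "/pricing", "/signup"],
]
bounces = sum(1 for s in sessions if len(s) == 1)
bounce_rate = bounces / len(sessions)                         # 0.5
pages_per_session = sum(map(len, sessions)) / len(sessions)   # 1.75
```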
I would recommend having a look in Webmaster Tools and, under Search Analytics, have a look at Pages and Position. If you see a decrease then I would suggest looking at the quality of the traffic from email. You could even look at the conversion optimisation of your landing pages.
I would recommend this blog for a fairly in-depth look at ranking factors. If you find you want to look at Conversion Optimisation then there is a fantastic book called "Thinking Fast and Slow" by Daniel Kahneman (if you haven't already read it).
It's always hard to recommend SEO solutions but I hope this helps. If nothing else it should give you somewhere to start.
Cheers,
Luke
Hi James,
I recently came across a blog post on Optimize Smart that covers several different solutions for dealing with spam in Google Analytics. It covers so many different kinds of spam that it may be of some help; it's mainly about referral spam but does touch on organic.
Hope you find what you need,
Luke
Hello,
100% agree with everything Patrick put here.
I just thought I would add that meta-data is also relevant and important for conversion purposes. Without a meta-description, Google will just take the title on the page and pull roughly the first 155 characters of the body for the SERP snippet. Supplying your own meta-data replaces this and, depending on how engaging it is, it can also increase the CTR. A higher CTR is considered a ranking factor.
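If you want to audit which pages actually have a meta-description (and whether it fits in roughly 155 characters), a small sketch with Python's standard html.parser can check a page; the sample HTML below is made up:

```python
# Extract the <meta name="description"> content from a page's HTML.
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Capture the content of the meta description tag, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content")

html = ('<head><meta name="description" '
        'content="Hand-made widgets with free delivery."></head>')
p = MetaDescriptionParser()
p.feed(html)
assert p.description is not None
assert len(p.description) <= 155   # within the length Google tends to show
```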
Luke