Hi Andy
This is why I asked the question: the wiki is on its own domain, so these aren't internal links.
Andrew
Hi everyone
This may seem a bit obvious, but I am getting conflicting answers on it. We have a client with a wiki that is basically an online manual for their software.
They do it like this because the manual is so big and is constantly developing. There are thousands of pages with loads of links pointing to relevant sections of the main site as well; the majority of these are nofollow, but I have noticed they have a single link in the navigation that points directly to their main site and is a followed link, and obviously this is sitewide.
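For clarity, the navigation link is just a standard followed link, while the in-content links carry rel="nofollow", so roughly this (the URLs are placeholders):

<!-- the sitewide navigation link (followed) -->
<a href="https://www.example-main-site.com/">Main Site</a>

<!-- the in-content links to the main site (nofollow) -->
<a href="https://www.example-main-site.com/feature-x/" rel="nofollow">Feature X</a>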
Would this be seen as detrimental to the main site? Should I have this set as nofollow as well?
Thanks in Advance
From my understanding, if you are using domains just for the sake of redirecting them, they don't help much and can even hurt your rankings. This used to work years ago, but in this day and age, if the domain has had no content or gained any real value in the eyes of the search engines, why would you redirect it?
Hi Tymen
It's not really my area of expertise, but I have read a really good article on Moz, 'Enabling HTTPS Without Sacrificing Your Web Performance' by Billy Hoffman, that may be of some assistance.
https://moz.com/blog/enabling-https-without-sacrificing-web-performance
Hope it helps and good luck with improving things to your satisfaction.
Andy
Hi
You will certainly have to update your profile in Search Console (Webmaster Tools) on both Google and Bing. Most modern analytics can handle https without any change, but if you have older code on the site you may have to update the script.
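As an illustration only (your setup may well differ), an older hard-coded tracking snippet referencing http:// would need its URL updating once the site moves over:

<!-- older, hard-coded http reference -->
<script src="http://www.google-analytics.com/ga.js"></script>

<!-- updated to https after the switch -->
<script src="https://www.google-analytics.com/ga.js"></script>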
There is a really good post on switching to https here on the Moz blog that may help with other considerations you may have forgotten to check.
https://moz.com/blog/seo-tips-https-ssl
Hope this helps
Andy
There is a first-rate post on this subject here that explains it far better than most of us could.
https://moz.com/learn/seo/page-authority
Hope this helps.
Andy
I wouldn't particularly class it as thin content, but it is almost certainly going to be classed as near-duplicate content, as the pages only vary by a small amount, even though your descriptions appear varied and well written.
It may be better in this instance to focus on one of the pages as the main page and then canonicalise or noindex the others.
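As a rough sketch (the URL is a placeholder), each of the supporting pages would either point a canonical tag at the page you choose as the main one, or carry a noindex, in its <head>:

<link rel="canonical" href="https://www.example.com/main-version-of-the-page/" />

or

<meta name="robots" content="noindex, follow" />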
Andy
Hi Davit
If, as you say in your own words, 'her audience is not our target customer' and you would not get anything from this mention in the way of a paying customer, then I would ask yourself: if her audience is not relevant, how would Google see a mention of your site on her blog as relevant?
Are all the other posts on her blog in a similar vein, or is she a genuine blogger discussing a particular topic consistently? If it's the former, then I probably wouldn't bother.
Hope this helps.
Simple answer is yes: if you build a landing page based on that keyword, then it will be competing with your homepage for that term. The worst-case scenario is that it could even cause the homepage's ranking for that term to drop.
Ideally, every page on your site should target a single keyword, and every page should target a different one. Ultimately, you should still be creating your pages for the users and not the search engines.
Hope this helps
Hi Tom
Agree with Martijn that it depends. For example, the robots.txt is generally the first port of call for bots, as it allows them to understand where you want them to spend their finite time crawling your site. You can also give direction to all bots at once or specify a subset. It is generally the best option for blocking pages such as your /cart/ etc., where they don't need crawling.
The problem with robots.txt is that it doesn't always keep pages from being indexed, especially if there are other external sources linking to the pages in question.
The meta noindex tag, on the other hand, can be applied to individual pages, and you are actually instructing the robots NOT to index the relevant page in the SERPs. Use this option if you have pages you don't want appearing in Google (or other search engines) but the page may still be relevant for authority or able to acquire links (make sure to use noindex, follow), as you still want the robots to crawl the page. Otherwise use noindex, nofollow. Hope that this helps.
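To make the difference concrete, a rough sketch of both options (the /cart/ path is just an example):

# robots.txt - keep bots from crawling the cart area
User-agent: *
Disallow: /cart/

<!-- meta robots tag placed in the <head> of an individual page -->
<meta name="robots" content="noindex, follow">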
Thanks Eric, a lot better said than I put it.
From my understanding, Google Trusted Stores is a way of informing your buyers that Google trusts you as a good place to shop online, and as such I don't believe you would be able to get this for your website.
It can happen that PA is higher than DA on some sites, but from my understanding both factors can be important and it depends on the situation. They are both measures of ranking power.
Thanks for clarifying this, Rand. It is a question that often crops up from savvy clients, and you explain it so much better than I could.
Hi Radi
No, I don't believe any of the above tools do what you are looking for, and I'm not sure there is anything that will do exactly what you want anymore; there used to be many tools that did this, like 'Its Trending', but most of them are no longer available.
Maybe some of the social monitoring/listening tools will do what you need. Hope you find a solution, and I would be interested to know if you do.
Andrew
Hi Peter
Can only agree with Dirk on this; I have only ever seen the markup in one area when it has been implemented using JSON-LD. We did recently do some markup for a client using Magento, which uses a third-party search system.
We implemented the code in the normal manner, which shows up in the html blocks as Dirk mentioned. This was rewritten by the third-party search system to use JSON-LD, and the code was then all in one nice neat block. But I have never heard of this being a requirement.
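For reference, the JSON-LD version ends up as a single script block in the page, along these lines (a simplified, made-up product purely for illustration):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "description": "Placeholder description for illustration only."
}
</script>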
Andrew
I have checked all of the URLs that you added in your question above, and I can confirm that they are all clean and green according to the Structured Data Testing Tool highlighted by Dirk above (https://developers.google.com/structured-data/testing-tool/), so I'm not sure what the SEO company is looking at.
There are quite a few tools out there that can assist with finding trending content, but a few that we have used in the past or still use are:
BuzzFeed: Can be a bit annoying at times, but the editorial staff do try and stay on top of trending news.
BuzzSumo: Another good one, especially for looking at what content has done well in the past year.
Google Trends: We find this invaluable for tracking trends and content across the internet and, to be fair, it is an undervalued and underrated tool.
You might also want to take a look at such things as Reddit.
Hope this helps and good luck with your search.
Just as an update on where we were with this: I have seen this article this morning where many others are reporting significant activity around the 20th/21st, but it still doesn't appear to be clear what is causing it.
Still it is an interesting read and update.
http://searchengineland.com/google-update-their-ranking-algorithms-some-webmasters-believe-so-234185
Hope it helps
Would agree with Silkstream on this one. This app used to be called Just Unfollow, so you have probably heard of it before; it is quick to set up and easy to use, allowing you to search on various aspects as described above.
Definitely a useful tool to have in your arsenal.
No problem and good luck with your endeavours.
Even though the products are identical in what they do, I would still go down the route of having a separate page for each. We have a client that sells many flea products, but we have still managed to give them unique product descriptions.
If you are concerned about keyword cannibalisation, make the focus for each manufacturer their brand name rather than the product or what it does.
The only other mention I can see of this is at RankRanger, where they are discussing the recent Hacked Sites Algorithm and there is also a distinct Rank Risk indicator for Oct 20th.
https://www.rankranger.com/google-algorithm-updates
Can't see any effect on any of our sites at the moment; one to keep an eye on, maybe.
I can't honestly speak for Bigcommerce, as it is a platform I have never used; however, we do use Wordpress and recommend it all the time to many of our clients.
That isn't really your issue, though. To get the best out of Wordpress, with its full functionality and access to its many plugins etc., and for it to benefit your store on an SEO level, it would need to be on the same domain as the Bigcommerce site, which I understand can't be done directly on their servers, as is the case with many hosted stores.
In this instance you would have to put the Wordpress blog on a subdomain, and there is a good article about how this can be achieved here: http://inkblotdigital.com/2014/wordpress-blog-bigcommerce-store/
Whichever solution you choose I wish you well with your endeavours.
We had a similar thing happen with a client a while ago, and it was down to point 3 as mentioned by Tom. It turned out that the site had been hacked and had some very adverse and unwanted links added to the footer that were invisible both to the naked eye and to a search of the code.
We were recommended a little plugin for Chrome called User Agent Switcher, which identified and revealed these hidden links on the site; once they were dealt with, the site recovered to where it was previously.
Have always found Cognitive SEO to be a reliable and very in-depth tool for this kind of work. It is a paid-for model, but you can have a free trial to see if it will do what you need it to.
There is a great article on link audits here as well: https://moz.com/blog/link-audit-guide-for-effective-link-removals-risk-mitigation
Hi Darcy
Looking at what has been mentioned previously I would agree with the train of thought that a more focussed sitemap would generally be advantageous.
Andrew
Hi Darcy
I don't know about scaling the sitemap down, but you could make use of an area of the sitemap to optimise things and make the crawl more efficient.
The area in question is the priority field, which basically tells the search engines which pages on your site are the most important. The theory is that pages with a higher priority (say 1.0) are more likely to get indexed by the search engines than pages with a lower priority (say 0.1), although not everyone in the industry agrees.
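For illustration, priority is set per URL in the XML sitemap on a scale of 0.0 to 1.0 (the URLs here are placeholders):

<url>
  <loc>https://www.example.com/</loc>
  <priority>1.0</priority>
</url>
<url>
  <loc>https://www.example.com/old-archive-page/</loc>
  <priority>0.1</priority>
</url>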
Firstly, I cannot actually think of any legitimate reason for hiding a category or making it invisible; however, if you had one and it means you don't want the content to be indexed, then you would be best blocking that category within your robots.txt file.
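If you did go the robots.txt route, blocking a category only needs a couple of lines (the path is just an example):

User-agent: *
Disallow: /hidden-category/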
If it is for any other reason, and the content is indexable by ANY search engine, not just Google, then you run the risk of being penalised.
In fact, Google’s guidelines state that “hiding text or links in your content to manipulate Google’s search rankings can be seen as deceptive and is a violation of Google’s Webmaster Guidelines”.