Some years ago people used to make this claim about W3C validation badges too. A badge is a badge is a badge. It really is unlikely to affect your search rankings one way or another.
Posts made by AlexMcKee
-
RE: Does DMCA protection actually improve search rankings (assuming no one's stolen my content)
-
RE: Clarifications on the Moz Analytics package (Medium - $149 per month)
Due to the extraneous text at the bottom of your question, I am answering each question with its subject prepended to the answer in parentheses. So in answer to your questions:
1. (tools available) Moz Analytics, Followerwonk, Open Site Explorer, Fresh Web Explorer, Rank Tracker, Keyword Analysis (including keyword difficulty), On-Page Grader.
2. (crawl entire site) Yes, add your site as a campaign and Moz will crawl your site and inform you of any problems and factors affecting your rank. It will also track changes over time without manual intervention.
3. (10 campaigns) Yes, each domain is a separate campaign.
4. (subdomains) This is configurable. You can set in campaign settings whether to track only this subdomain or all subdomains or even just a specific sub-folder.
5. (keywords) The number of keywords relates to how many keywords can be tracked in your account as a whole across all of your campaigns. 750 over 10 campaigns gives you 75 keywords per campaign. The rank of your site's pages is tracked for each keyword, pages are graded for their ranking ability against specific keywords and you can check the rank of any page for any keyword you are tracking.
6. (social accounts) You can track the performance of your social media accounts by connecting them with Moz Analytics. You can also track your competitors accounts to measure your performance against competitors in some aspects such as the level of interaction (comments and shares).
7. (branded reports) Branded reports allow you to export reports on your data with your own branding, a useful feature for agencies and consultants.
I am a Moz Pro subscriber and highly recommend it; Moz Analytics and the various other tools are extremely useful.
-
RE: Need help in understanding why my site does not rank well for our best content?
Your site is competing in a highly competitive field. I checked your example page's targeted keyword (nokia lumia 830 review) in Moz Keyword Difficulty and it scores as highly competitive. The Google India SERP for the keyword is dominated by high domain authority sites with high page authority pages.
-
RE: Weird problems with google's rich snippet markup
I'm afraid it is the obvious. Ensure the rich snippets are relevant to the content of the page, and ensure that your page is ranking for relevant queries to raise the chance of the rich snippet being shown.
-
RE: Weird problems with google's rich snippet markup
Google only shows rich snippets when it thinks it will be useful to the searcher.
As you have said you have had some issues with maintenance, check the structured data with Google's Structured Data Testing Tool. However, it is more likely that Google isn't showing the rich snippets because it believes your page quality to be low or that the structured data is not relevant to the user's query.
-
RE: Csv file for moz local
Great to know that Moz is working on support for other countries!
-
RE: Csv file for moz local
Moz Local presently only supports the United States. I ran into this myself a while ago. Hopefully they'll get around to supporting UK and Germany soon.
-
RE: Structured Data + Meta Descriptions
Very interesting! I don't recall seeing that before but I checked the Internet Archive's Wayback Machine entry for that URL and the quoted extract has been there since at least 2013.
Elsewhere Google has been pretty insistent on structured data being part of the document itself as much as possible, so it does seem like somewhat contradictory advice. As you say, perhaps they've simply forgotten to update that particular entry to reflect current thinking.
-
RE: Structured Data + Meta Descriptions
Once upon a time it was possibly a good use of the meta description to include some salient structured data but today we have a proper way of marking up structured data. The meta description is best used for compelling, relevant copy to attract the user to click through to your site as the meta description is your one best hope of affecting what is shown to the user in the SERPs.
Search engines haven't shown any inclination to parse the meta description and I doubt they would do so in future. Structured data belongs in the document itself, marked up accordingly.
-
RE: How many redirects on a redirect can you have?
In 2011 Matt Cutts advised that Google does place a limit on redirect chains - he indicated that Googlebot won't follow more than around 3 or 4 hops. There is no limit on the total number of single-level 301s.
In your specific situation I would redirect both the original and the first replacement to the new replacement so that users and bots can reach the new page in a single redirect hop.
-
RE: GWT does not play nice with 410 status code approach to expire content? Use 301s?
410 means "Gone" and is used to indicate a resource no longer exists. If it exists, use 200 (OK). If it is out-dated, place a notice to that effect but still serve it with 200. Putting a noindex instruction in the robots meta element should be sufficient to remove it from the Google index, though it may take some time. Nofollow is probably not what you want, as this will destroy any link value flowing through those pages. If it is so out-dated that it is considered valueless then it should be deleted and 410'd. A 301 redirect can be used where a new resource that substantially replaces the old resource has been created.
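For concreteness, here is a minimal .htaccess sketch of the two cases, assuming Apache with mod_alias enabled (the paths are placeholders):

```apache
# Resource deleted with no replacement: serve 410 Gone
Redirect gone /old-guide-with-no-successor.html

# Resource substantially replaced: 301 to the successor
Redirect 301 /old-guide.html /new-guide.html
```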
Not sure why you would want to keep Google's index of your site 'lean' unless you have a lot of resources competing for the same keywords and are concerned about cannibalization.
-
RE: How can you perform a simulated search query from another location?
You could use a VPN perhaps but it might get a bit expensive if you need multiple locations.
-
RE: Language Usage for SEO in Hong Kong
My apologies for overlooking your response. A good tool is https://github.com/jpatokal/script_detector. It is a Ruby library, and you can call it from the interactive Ruby console (irb) to identify the script of a piece of text easily. If you need help/guidance with that drop me a reply here or by private message and I'd be happy to help.
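If a Ruby environment isn't to hand, the same idea can be roughed out in JavaScript. This is my own sketch, not part of the script_detector library: it guesses whether text is simplified or traditional Chinese by counting characters from small sample sets that exist in only one of the two systems.

```javascript
// Rough sketch only: tiny sample sets of characters whose forms differ
// between simplified and traditional Chinese. A real detector would use
// far larger tables, as the script_detector library does.
var SIMPLIFIED_ONLY = '简体书发门东车马龙';
var TRADITIONAL_ONLY = '簡體書發門東車馬龍';

function detectChineseVariant(text) {
  var simplified = 0;
  var traditional = 0;
  for (var i = 0; i < text.length; i++) {
    var ch = text.charAt(i);
    if (SIMPLIFIED_ONLY.indexOf(ch) !== -1) simplified++;
    if (TRADITIONAL_ONLY.indexOf(ch) !== -1) traditional++;
  }
  if (simplified > traditional) return 'simplified';
  if (traditional > simplified) return 'traditional';
  return 'unknown'; // no distinguishing characters found
}
```

Run it over a sample of your target pages or search queries to see which variant dominates.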
-
RE: Cached status(date & time) not showing
The cache info box shown in Google's cached pages is actually present but is being hidden by the website's design, specifically the div element with the classes "wsb-canvas body".
-
RE: Dashboard shows low number of visits
When you say that you know you've had over 100 visits, how do you know this? Server logs, for instance, record hits not visits or unique visitors.
If the visitor stats in Google Analytics and Moz match - which they should - then it seems unlikely there is a problem. Remember to filter by the same period in Google Analytics as your timeframe in Moz Analytics.
-
RE: Rel = no follow?
I wouldn't advise a client to link out to one site so much unless it was adding significant value to the page. This kind of "list of links" page is not particularly useful to visitors, who could probably have found the WebMD pages anyway, and it doesn't add much to the content of the website. I suspect it was created in a misguided attempt to satisfy a criterion of linking to authority sites.
The page is also titled "Saint Louis Links" but AFAIK WebMD is not a St Louis company.
You need to ask: "Am I doing this purely for search engines/SEO purposes, or for the visitors?" If the latter, you should nearly always go ahead with it but of course optimize it for SEO. If the former, you should probably drop it, as you should be building for your audience, not for search engine algorithms.
Edited to answer your direct question: if links are adding value to your content you should make them followed.
-
RE: For SEO... - Display Graphs in HTML5 or Image?
Great question. Search engines presently don't index Highcharts or other graphs presented using HTML+JS combinations. However, they can't index the information in images either, just the image itself.
Search engines have become increasingly sophisticated at indexing content rendered/presented using Javascript, so the day may well come when these charts do become indexed. Extracting information, especially structured information, from images is probably going to remain a harder problem to solve than traversing the DOM and interpreting the structure of the charts.
Another factor is the native format of the data. If you use a dynamic charting solution like Highcharts to render data present in the document then search engines will already be able to index the table and access the data. That isn't going to be the case for images. So I would recommend, wherever possible, putting the data in the document as an HTML table and using Javascript to present it as a dynamic chart. This will also mean those folks who browse without Javascript enabled will get to see your data, albeit in a different presentation.
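To sketch that table-first approach (my own illustration, not Highcharts API documentation): keep the figures in an HTML table and derive the chart series from it with a small helper. The tableToSeries function below is the pure part; the commented lines show how the rows might be gathered in the browser, assuming a table with the placeholder id "sales".

```javascript
// Convert rows of [label, value] pairs (as read from table cells) into a
// structure a charting library can consume.
function tableToSeries(rows) {
  return {
    categories: rows.map(function (row) { return row[0]; }),
    data: rows.map(function (row) { return Number(row[1]); })
  };
}

// In the browser the rows could be gathered from the table like this:
// var rows = Array.prototype.map.call(
//   document.querySelectorAll('#sales tbody tr'),
//   function (tr) {
//     var cells = tr.querySelectorAll('td');
//     return [cells[0].textContent, cells[1].textContent];
//   }
// );
// var series = tableToSeries(rows); // then hand series.data to the chart
```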
-
RE: 404 Pages. Can I change it to do this without getting penalized ? I want to lower our bounce rate from these pages to encourage the user to continue on the site
There's no law that says a 404 page has to be dull and unengaging. Back in the palaeolithic era of the web if we saw a lot of hits to the 404 page in the server logs we rarely knew why (finding broken links was a lot harder in those days) so we tried to capitalize and added engaging graphics and search boxes, copy designed to improve the retention of all these poor lost souls.
Working on your 404 page can actually be a really good experience. With the tools at developers' disposal today it should be super easy to work out the context of the 404 error and show something useful to the user and win them over.
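As a sketch of working out that context (my own example, not tied to any particular CMS): derive search terms from the requested path and use them to prefill a search box or suggest links.

```javascript
// Turn a requested path like /blog/red-widgets.html into search terms
// that can seed a site search on the 404 page.
function termsFromPath(pathname) {
  return pathname
    .split('/')
    .pop()                          // last path segment
    .replace(/\.[a-z0-9]+$/i, '')   // drop any file extension
    .split(/[-_+]/)                 // slugs usually separate words this way
    .filter(function (term) { return term.length > 0; });
}

// On the 404 page itself you might then do something like (assuming a
// search input with the placeholder id "search"):
// document.querySelector('#search').value =
//   termsFromPath(location.pathname).join(' ');
```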
All that said if you find yourself relying on this technique in 2014 it is probably a sign something has gone wrong with the site's information architecture. Restoring the category page but serving a 404 is probably a no-no - you're essentially saying "no, this doesn't exist" to automatons (user agents and search crawlers) but you are showing the user the page they were presumably looking for. Finding yourself in a situation where you are sending deceitful HTTP headers is a clear sign something is wrong.
If the pages are useful and visited, restore them and work on making them better. If they aren't useful enough then you should probably 301 to a relevant useful page. Don't worry about having too many 301s, redirecting is the technically correct thing to do in such situations and your search engine of choice can hardly penalize you for using HTTP features correctly.
-
RE: WordPress Slideshow Gallery Not Showing Alt Tags or any relavent info ?
This will depend on what slideshow plugin you are using. If you respond with that information it will be easier for someone to give you a definite answer, and I will try to do so myself.
Edit: for what it is worth, I am seeing the alt tags on your photography website.
-
RE: Is there a better way to wrap schema on a testimonial page? Example website shown
Shawn, a quick question to open up possible avenues: did you extend the Flawless theme or use it as-is?
-
RE: What are the most common reasons for a website being slow to load
Your question is somewhat ambiguous. "Too many requests" could mean that your website's design is pulling in a lot of resources and this, particularly with Javascript, can cause load speed issues. If it is this, you can look at merging some of your Javascript files, loading Javascript asynchronously where advisable, and reducing CSS bloat (stripping unused rules, etc).
"Too many requests" could also mean that the server is suffering under load. If it is this latter meaning reducing any design-related resource loading issues may actually help somewhat but the site will still not reach reasonable speeds. To resolve this you'll need to look at reducing load on the server. This can be accomplished by making use of caches where appropriate.
Use the PageSpeed extension in Chrome or Firefox to diagnose some of the problems that might be affecting your site. If you suspect server load and if you have SSH access assuming it is a *nix machine try the "uptime" utility and have a look at the load averages.
-
RE: I want to track product click so how to create project object or how to pass project object ?
Hi Mitesh, the product object needs to be a JavaScript object with the following properties expected:
{
  'name': productObj.name, // Name or ID is required.
  'id': productObj.id,
  'price': productObj.price,
  'brand': productObj.brand,
  'category': productObj.cat,
  'variant': productObj.variant
}
Creating such an object and passing it to the analytics could be done with a simple function like the following. You say you have a list of products. Let's assume for the purposes of a demonstration that each list item is marked up with schema.org microdata along these lines:
<li itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example product</span>
  <span itemprop="brand">Example brand</span>
  <a class="addtobasket" href="#">Add to basket</a>
</li>
You could listen for clicks on the basket add link (which you are probably doing already, so your developer should be able to hook the new functionality into the same event handler, but for the purposes of a demonstration I've included an example of the event listener too). Then find the product related to the current "add to basket" link. I've included two very simple functions using pure Javascript to demonstrate how this can be done: one using ordinary class selectors and one using the microdata attributes.
var basketLinks = document.querySelectorAll('.addtobasket');
for (var i = 0; i < basketLinks.length; i++) {
  basketLinks[i].addEventListener('click', function(event) {
    console.log('Basket link clicked');
    event.preventDefault();
    var product = getSchemaOrgProductData(this.parentNode);
    console.log(product.name);
    console.log(product.brand);
  });
}

function getProductData(context) {
  var product = {};
  product.name = context.querySelector('.product-name').textContent;
  product.brand = context.querySelector('.product-brand').textContent;
  return product;
}

function getSchemaOrgProductData(context) {
  var product = {};
  product.name = context.querySelector('[itemprop="name"]').textContent;
  product.brand = context.querySelector('[itemprop="brand"]').textContent;
  return product;
}
That should be enough for your developer to understand how to create a product object that can be passed to the analytics function. I haven't implemented methods for fetching and setting all relevant properties but the developer can copy the example.
-
RE: Ever seen this tactic when trying to get rid of bad backlinks?
They've messed up in general, really. They should be blocking robots from what appears to be the CMS for their clients' use, as there are surely numerous effects on their clients (cannibalization caused by the duplication of pages, for instance). As Mike said, they've not taken into account the SEO aspects of the way they've implemented their system.
-
RE: International SEO Domain Structure
Good post. However, be aware that "UK" is not the ISO 3166-1 Alpha-2 country code for the United Kingdom - it's GB, so use en-GB for English-speaking users in the United Kingdom.
-
RE: Ever seen this tactic when trying to get rid of bad backlinks?
Michael has it right. Online Agency (onlineagency.com) build websites for travel agencies. In the URLs you gave, Patrick, you can see some sort of ID for the site (starmandstravel.com). I guess that this content.onlineagency.com subdomain is the content management system to allow the travel agencies to update their content.
Google may be interpreting lots of similar/related websites on the same infrastructure as an attempt to game its algorithms (they have the same nameservers, and although there are different C-blocks involved, many of the other sites built by that agency also share the same C-block [..170.140]).
-
RE: Language Usage for SEO in Hong Kong
Interesting question. The Government of Hong Kong uses standard written Chinese, predominantly in traditional characters, on its website, while the population largely speaks Cantonese, which now has a written form frequently used online and in social settings.
I suggest you have a look at Google Trends for Hong Kong and see if the bulk of searches are being made with simplified characters or traditional characters. The demographic your website will be targeting will also be a factor.
-
RE: Google Structured Data Verses Yandex Structured Data Validator Query
I think it is important to distinguish the purpose of the automated validation services offered by Google and Yandex, which is to ensure that the properties utilized by the respective search engine are present, and the actual schema.org structured data initiative which doesn't place many requirements on publishers.
With that in mind when Yandex states that the address and telephone properties are required for http://schema.org/Organization, it doesn't mean they are required by schema.org but rather that they are required by Yandex. Google's Structured Data Testing Tool doesn't state that these are required because for Google's purposes they are not.
So both are correct but for purposes of ensuring your structured data is showing up you do need to test in all of the relevant tools. For less mission-critical structured data it is OK just to follow the Schema.org documentation and wait for the providers to implement support.
-
RE: Rich Snippets Not Displaying - Price Error?
I've seen quite significant delay before Webmaster Tools and the Structured Data Testing Tool show the same results. I just searched on the URL http://www.evo.com/skis/line-sir-francis-bacon.aspx and for me the price is showing in the rich snippet so if there is still an error showing in Webmaster Tools at this time it can probably be safely ignored.
-
RE: Will a GEO Localization site create thousands of duplicates?
For the most part I'm going to restrict my answer to the technical part of your question but do have a think about whether the pages are actually duplicates. If they represent different franchises of companies then they may well be relevant content pages. Think in terms of resources.
You can't assume the query string will remain undiscovered by Google. It only takes someone linking with the query string attached for it to be found. Over time many of the near-duplicates may be discovered in this way. You can instruct Google on how to handle URL parameters.
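As a further safeguard for any parameterised URLs that do get discovered, a canonical link on each page pointing at the preferred URL tells search engines which version to index (the URL below is a placeholder):

```html
<link rel="canonical" href="http://www.example.com/locations/springfield/">
```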
-
RE: YouTube Filtering Business Videos as Inappropriate Making them Unavailable When Safety Mode Enabled
This is a known problem with YouTube's safety mode filter but there is not much information about it. There is no official appeals process but you can report it on the support forum thread, where one of the community volunteers, Epontius, has been helping to escalate some users' false-positive reports to YouTube's staff. You could report the false positive there and ask for Epontius's help. Hope this helps.
-
RE: Can you market to someone 30 days AFTER they visit your site via PPC?
For best results Google recommends that all the rules should be the same except the membership duration. Then combine them, yes.
-
RE: Can you market to someone 30 days AFTER they visit your site via PPC?
Hi Cole, OK. The remarketing service can be used to show adverts to visitors a set number of days after they visited your website. Visitors can be in multiple lists. So you can target those who visited or converted with one list (say "Converted") that has a duration of 60 days, then use another list, "Sales Cycle", with a membership duration of 30 days. By excluding the "Sales Cycle" list from the "Converted" list you will arrive at a list of visitors who converted 30 or more days ago and are ready to be shown the offer/coupon.
-
RE: Can you market to someone 30 days AFTER they visit your site via PPC?
Yes. Use remarketing with a custom audience: create a 90 day list, for example, and a 30 day list for your customers that have made a purchase. By excluding the 30 day list you will be targeting them at +30 days.
-
RE: Duplicate Content Home Page http Status Code Query
Try using something like LiveHTTPHeaders to view all of the HTTP requests and responses involved. You should see the request going to the redirected domain (GET domain.co.uk) and then the response such as HTTP/1.1 301 Moved Permanently followed by a new request to the new domain (GET domain2.co.uk) and the response will naturally be HTTP/1.1 200 OK because the server at the new domain has answered the request successfully.
-
RE: Magento webshop ranks bad on Category pages
Quite a few things going on I think. Firstly it is an extremely common keyword so ranking well is going to be hard anyway. I did a search and see that a lot of the results that came up for me contained the keyword in the domain name. Do you have relevant inbound links to the category pages?
I had a look at the example category page you posted. I suspect your category pages may be being seen as relatively thin pages. The main text is far below the fold and the links to the product pages are probably sending signals that the category pages are not going to answer the searcher's query directly (click and click again), when we know that Google prefers to direct people to the most relevant endpoint for their query.
I notice that the paragraphs of text on the supplementary pages are not contained within HTML paragraph elements. This could be contributing to a loss of ranking potential, especially when combined with the text being low down on the page.
-
RE: Non-Unicode Fonts and SEO
I would suggest using the Padauk font from SIL instead. This uses the Unicode standard and you should be able to use CSS font embedding to maximise the support for other browsers.
Google should be able to understand the content as long as it is encoded in a standard character set (UTF-8, for example). Even though Zawgyi is not compliant with the Unicode standard your pages should be being served with the UTF-8 character encoding so Google should be able to index the content. I don't think Google pays any attention to the font in use by the design but it is an interesting point which I haven't particularly considered before and which I had to think about carefully before posting this response.
-
RE: Confused About Problems Regarding Adding an SSL
Ruben, I had a look at your website and your URLs all have HTTP in them so these would need to be updated all across your site before you make the switch to HTTPS. Because you are using WordPress this should be as simple as updating the site URL to https://www.kempruge.com.
The tip by @Highland about using Firebug is excellent. This will allow you to quickly debug if there are non-HTTPS links remaining - in the WordPress theme or template, for example.
Have a look at the WordPress HTTPS documentation also.
-
RE: Confused About Problems Regarding Adding an SSL
Implementing SSL should be straightforward for the most part.
You need to ensure that links around your site (including canonical links) are updated to use HTTPS (so https://example.com/link as opposed to http://example.com/link where example.com is your domain name). If you are already using a protocol-less linking pattern (//example.com/link) you don't need to update the links.
You can also configure your web server to only serve HTTPS. If your web server is Apache you can do this with the SSLRequireSSL directive.
<Location "/">
    SSLRequireSSL
</Location>
HTTPS also adds some overhead as the browser and the server negotiate a secure connection. If your site has already been optimized for speed it should not cause a problem, but if in doubt revisit that process and ensure that you are getting the best possible speed for your visitors.
The article by Cyrus has a great checklist to double check everything.
-
RE: Top content keyword in WMT is crap
That looks to be a good solution. Behind the scenes the plugin will be adding "Options -Indexes" to your .htaccess file, which you could do directly and skip the need for a plugin if you so wish. I am glad I was able to help you.
-
RE: Is it ok to correct someone who spelled and styled our name incorrectly in a blog post?
It is absolutely fine to reach out and gently ask for a correction, in my opinion. I have done it myself several times recently and all individuals contacted were quite happy to oblige me. The important thing is to be sincere and genuine in your message and remember that the person on the other end is likely busy so keep your message to the point and give precise instructions to minimise any need for back-and-forth email clarifications. Be patient - but chase after a month or so, if necessary - and do follow up to thank them for correcting once they have done so.
-
RE: Duplicate Meta Descriptions in Press Releases
That's a tricky situation. I think you need to assess whether the duplicates are doing more harm than having no meta descriptions on the press release pages. Obviously the best solution would be to ensure unique content in the meta description - could content be extracted from the press release? The press release itself has presumably been reviewed and approved for distribution.
-
RE: Top content keyword in WMT is crap
Your uploads directory is viewable. I would configure the hosting to disallow directory browsing.
Because WordPress creates a directory for each month you have two dozen pages being indexed by Google that show nothing but filenames ending with the .png extension. This is most likely where the keyword in Webmaster tools is coming from.