Hi Graeme
Are you saying you want to add URLs manually or automatically? If automatic, then http://www.xml-sitemaps.com but manually then no I don't sorry.
Peter
Hi
To answer your last question first, yes, Google can generally understand different words joined together, but hyphenated is better for human readability.
Regarding your domain name choice, it really depends what keywords you are using in the domain as to what will work the best and what may or may not appear spammy. The spammy feel, if there is one, will really be to do with how people read the domain - and also maybe how long-winded a long domain name can be to type in.
Re Google, I'm not sure you are going to get much advantage from an SEO perspective by using multiple keywords in your domain, especially in the move towards semantic search, but as I say, it really depends on what those keywords are.
I hope that helps,
Peter
Hello Matei
I don't know what you mean by "auto". Please can you explain?
Peter
Hi Rajat,
I don't think you need to be concerned about these links all going to the same landing page. Essentially, it is a navigation button that points people to a page. The same is true, albeit not typically as an image, of an item on a menu, e.g. a link to an About Us page. That menu link will appear on every page as it is part of the main menu system for the site. Your subscription button is really no different in terms of helping users get to where they need to go.
Peter
Hi Rajat
In terms of your site having 1,000+ images, in reality it is just one image reused. In my view, if your pages are 2,000+ words long then the re-use of the subscribe button shouldn't be a problem. All I would say on that is: does that button need to appear on every page, or just on some types of pages? Only use it where needed.
Yes, you could add a nofollow link, but again ask yourself why the button is there. Is it there just as a call to action for users, or does the page it directs to have more related information, and therefore more value for Google to index and provide extended information? If it is just a call to action, then it is probably best to make the links nofollow and have follow links on one page only.
On the legal disclaimer, I say keep it as text. If the repeated content is small then it isn't an issue.
I hope that helps,
Peter
Hi Bill
Yes, those two pages are competing with each other, but then so are all the other results on that page. The fact that those two pages are yours makes no difference. Removing one of them won't push the other up. Instead it would allow another result to come in its place.
So, yes, be glad you have two results turning up on the first page. Now see what you can do to push them both up a peg or two higher.
I hope that helps,
Peter
Hi, I am not familiar with Yoast SEO, but it's not what Google is doing as Google is just indexing the pages it finds on your website. As I said, it is to do with both of those pages using rel="canonical" code that points to themselves. I suspect that is the same with other pages. Follow one of the suggestions I have already given and you should be able to resolve this.
I don't know what you mean by your Google+ profile not showing in the results for some of the keywords you already rank for. The Google+ profile listed as the rel="author" of your Jerry Billett page links to a profile that has a total of 2 posts. It's not going to rank for anything.
Peter
Hi Catherine
I am really sorry to hear this and it's obviously been very distressing to yourself and your family.
If you go to the user's profile page there is a little down arrow underneath their picture which allows you to Report/Block them and then gives you a range of options to say why. That may be a start and hopefully Google will be able to respond to you.
But if you think the approach was possibly a criminal offence, I would also take it up with your local police.
I hope that helps - a little.
Peter
OK, I think I understand now.
Essentially, you have two URLs that answer the same query of "Jerry Billett". In that case Google makes a decision on which page they think best answers the user's query.
Looking at the two pages, I would be inclined to agree that the non-tagged page does, but for some reason Google doesn't. If it's important for you that the other page is the one that surfaces in search then you have some options:
Remove the tag page from your URLs and set up a 301 redirect to the non-tagged page. That will mean the existing link in Google's index will immediately redirect through to the non-tag page, and once Google has updated its index, the link itself will be updated.
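As a sketch of the mechanics, assuming an Apache server and made-up paths for your tag and non-tag pages, the 301 redirect in your .htaccess file would look something like this:

```apache
# Hypothetical paths - replace with your actual tag and non-tag URLs.
# Permanently (301) redirect the tag page to the preferred page.
Redirect 301 /tag/jerry-billett /jerry-billett
```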
Alternatively, change the rel="canonical" code you have in your tag page to point to the non tag page. At the moment it is pointing to itself. This is what Google's support page says on using rel="canonical":
Adding this link and attribute lets site owners identify sets of identical content and suggest to Google: "Of all these pages with identical content, this page is the most useful. Please prioritize it in search results."
Source: https://support.google.com/webmasters/answer/139394?hl=en
Whilst the two pages are not identical (hence my scoring through the word 'identical' in the extract), by using rel="canonical" to point to the non-tag page you are suggesting to Google that the non-tag page is the most useful and asking them to prioritise it in their search results.
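For illustration, with a made-up URL, the tag page's head section would carry something like this in place of the self-referencing canonical it has now:

```html
<head>
  <!-- Hypothetical URL - point this at your preferred (non-tag) page -->
  <link rel="canonical" href="http://www.example.com/jerry-billett" />
</head>
```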
I hope that helps,
Peter
PS. I don't know if it is just me, but I don't think the audio that automatically plays on your site when you visit a page will help people to stay on your site. I think it's a bit bewildering as it is not clear where it is coming from and you don't know how to turn it off which may prompt people to quickly click away. If they do, that will negatively affect your SEO. I understand the desire to promote the free 60 day course, but I think you would be much better to give people the option to hear the audio play by allowing them to click a button to choose to play it (e.g. "Listen to our free offer to turn you into a marketing rock star") rather than feeding it to them automatically. I think you could find your conversion rate would go up. Sorry to mention this, but I know you want to make your site work better for you so I thought it better to share what I think could help you with that too.
Hi David
I'm not sure there is a way to give visibility to your blog fast unless it is remarkable and, for example, you get PR through the media picking up on it.
Apart from being fortunate enough to get the above, I would give one word as my recommendation: Google+, but with a big caveat...
Getting people to engage with your blog content through social media, and in particular Google+, starts when you engage with their content. Just turning up on Google+ and posting your blog will not work. Instead, seek out communities that fit the target market you want to reach, and join them. But again, don't push your own content to start with. Instead get to know the people there and allow them to get to know and trust you. Then you will have earned the right to share what you have to say.
In truth, unless you are fortunate as I said above, I don't think there is a quick way to get your blog content out fast. As good as your content may be, your best route to success will be to give time to doing so - which sometimes means putting in some hard miles to get there.
I hope that helps,
Peter
Whoops! I forgot about that one! One day I will go to that conference - but it would be good if they held it in London too one day.
I'll shift my vote
Peter
I am still not really clear. So are you saying you are wondering why it is the page linked to a tag in your blog that is ranking, rather than the actual article page itself? I couldn't find another, untagged copy of this page, if there is one.
Peter
Hi Nick
If you are ranking well for a number of private equity definitions and terms, and have been increasing rank for those terms, then I would consider that a win. That you are losing rank for "private equity firms" is possibly down to Google understanding the intent of search queries better and deciding that your site is better served by, and more authoritative for, the other terms.
I know that sounds strange as your domain is an exact match, but Google won't be ranking it because of that. As your site offers a service via subscription to private equity firms, that takes your site one step away from Google providing a result for that query, which probably hampers your site compared to other sites that rank higher.
I hope that helps,
Peter
Hi Brad
If I could - though I can't, and it would make little sense for me really as I am UK-based - I would go to the SMX conference. Details here: http://searchmarketingexpo.com/east/
Peter
Hi Matt
Sorry, but I am unclear about what question you are asking that you need help with. Please explain more.
Thanks,
Peter
Hi Arben
Google's definition of a canonical page is "the preferred version of a set of pages with highly similar content". (https://support.google.com/webmasters/answer/139394?hl=en)
Looking at the two pages you have linked to I don't think they fall into the category of having "highly similar content".
Similar content isn't just defined as what is the same textually but also functionally. These two pages serve different user needs to find first aid courses in two different locations. To my mind that makes them different.
I hope that helps,
Peter
Typically, press releases are issued by companies not people, unless the person is representing their own brand.
So in the case of a company, I would say no, because whilst you may have written it a press release is really a statement and not something you would consider as 'authored'.
For blog articles, however, definitely yes. Link the article to the individual's Google+ account by using the rel="author" attribute in the page code.
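For illustration, with a made-up profile URL and author name, the authorship link in the article page would look like this:

```html
<!-- Hypothetical Google+ profile URL - use the author's own profile -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Jane Author</a>
```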
I hope that helps,
Peter
Hi, I don't think there is any SEO benefit that's been proven. If you had asked the question a couple of years or so ago, the answer would have been make sure you use nofollow on your links.
But the web is changing and Google is rewarding authenticity in what you do online.
If you were to write a technical article in a magazine for example, you would typically cite anyone you referenced in your article to give them credit for the piece you referred to. So, if you write a blog post for your site, why shouldn't you do the same? It seems normal and authentic to do that and if you are going to credit them, why wrap a nofollow around it?
Technically, you are passing SEO value from your page to theirs and diluting your own page's SEO value. But I don't know now if Google sees it and treats it that way.
So, that may not have answered your question but it may give something to discuss further.
Peter
Hi
To answer your questions:
For a site that is only one to two months old, what is considered a natural amount of inbound links if your site offers very valuable information, and you have done a marketing push to get the word out about your blog?
This is really a "How long is a piece of string?" question. It depends. If the site is for an established brand launching, say, a new site, then inbound links escalating to thousands or even tens of thousands during that time wouldn't be unrealistic to expect. For an unknown, who knows? In one sense, it doesn't really matter. What matters is that those inbound links are producing results, both from people clicking on them and from the SEO benefit to your site.
Even if you are receiving backlinks from authority websites with high DA, does Google get suspicious if there are too many inbound links during the first few months of a site's existence?
Again, it would depend on whether or not the new site was for an established brand or for an unknown, but suspicion isn't necessarily based on numbers - although it would be fair to say that the higher the number the more it might flag up an issue. The main thing though is that Google's algorithms are sophisticated and able to detect link quality on the basis of a number of metrics, e.g. the social profile of a site. You could just have 10 links and it could flag an issue.
I know there are some sites that blow up very fast and receive thousands of backlinks very quickly, so I'm curious to know if Google puts these kind of sites on a watchlist or something of that nature. Or is this simply a good problem to have?
As I said above, the more links accrued in a short space of time, the more likely a yellow or red light might start flashing on Google's dashboard, but again it comes down to link quality, which is evaluated on a number of metrics that will determine if there is an issue.
I hope that helps,
Peter
Hi Guy
Giveaways are definitely a good thing to encourage inbound traffic and help to raise the profile of your brand, but that does not necessarily correlate to a higher rank in organic SERPs.
Pages in search results are delivered on the basis that the page is relevant to the search query, but where a page ranks in SERPs - if relevant to a query - is also dependent upon the competition for that query and a lot of other metrics.
I hope that helps,
Peter
I'm not really clear on why you would want a query using either variation of this to be the same.
If the page is ranking differently for each of those queries, I don't think there is anything you can do to make sure they both achieve the same ranking. Where the page ranks based on a search query is down to Google and will be affected by all sorts of metrics including where that query is entered and who entered it etc.
My advice would be to concentrate on ranking for the term that is most commonly used of the two of them and optimise for that. Inevitably, you will see the page ranking for the other spelling and that is good if you do, but you cannot really control where it ranks.
Peter
Hi Steve
I don't think it would make any difference if you used one or the other of these keywords. If they mean the same thing - which I assume they do - then it's likely that Google would rank the page for a query using either of them. If you have a concern, I would just be inclined to use both spellings in the meta Title of the page they need to appear on.
I hope that helps,
Peter
Hi Mike
I think this is unlikely just because they have an exact match domain.
The domain you have given as an example is nearly 5 years old so it is likely they have had their site working for them for some time. There are a decent number of pages on the site with information all based around the same subject and they have a registered address in Birmingham AL.
There's likely to be other factors at work too, but as I said, having an EMD won't be the main factor.
I hope that helps,
Peter
Hi Michelle & Blake
SEO has been changing a lot, especially over the last few months in particular. I would encourage you to optimise your product pages with clear structured categorised titles. They will inevitably incorporate keywords, but don't focus on the keywords as such, focus on clearly explaining the products to your site visitors in the subsequent product description.
In the future, sites that use multiple keywords per product are not going to gain an advantage by doing that - in fact it may even have a negative impact. SEO needs to be about writing for your audience, your site visitor, and making sure the content of your pages answers the questions they may be asking about your products and perhaps what related products you have.
It may help your SEO in future to use structured data markup with your products (see http://schema.org/), but at this stage focus on getting your pages right as they are.
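When you do come to structured data, a minimal schema.org Product snippet in JSON-LD - with made-up values here that you would replace with your own - looks something like this:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Red Sofa",
  "description": "A three-seater sofa upholstered in red fabric.",
  "offers": {
    "@type": "Offer",
    "price": "499.00",
    "priceCurrency": "USD"
  }
}
</script>
```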
Using the same keywords for multiple products under the same category will be fine, but just distinguish them by using other words that describe them. For example, red sofa, black sofa, white sofa all repeat the word "sofa" but they are different.
Duplicate content will only come into play if you repeat a large proportion of the content of a page across two or more other pages on your site. If you focus on providing good clear descriptions for each product, you won't go far wrong.
I hope that helps,
Peter
I think Anthony's idea on this is a good one and would be worth considering. I guess it depends on how many of these you have rotating at any one time and how many URLs you therefore have to maintain.
My thought on it would be to create a custom 404 page for your site that is a more pleasant landing page than regular 404 pages are. It could provide a nice apology that the content the link took them to is no longer available, and show a number of anchored links for where the visitor may like to go next, e.g. where to view other promotional flyers, or other pages on the site which may be important to the visitor.
By doing that you provide a better visitor experience. Here are the recommendations from Moz's page on HTTP status codes. I think these make a lot of sense:
"When visitors reach 404 pages, they should be given navigational options so they do not leave the given site. Web optimized 404 error pages should contain:"
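As a sketch of how the custom page is wired up, assuming an Apache server and a hypothetical page location, you point the server at your page in .htaccess:

```apache
# Serve the custom, visitor-friendly page for any 404 response.
# /custom-404.html is a hypothetical path - use your own page's location.
ErrorDocument 404 /custom-404.html
```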
I hope that helps,
Peter
Hi, you are correct, this type of dynamic content does present a problem for search engines to index, but there is a way to help Google and other search engines to do that.
You should find the following blog post from Moz last year by Rob Ousbey helpful:
Create Crawlable, Link-Friendly AJAX Websites Using pushState()
Peter
Hi Robert
It shouldn't take that long. It's difficult to know why.
Do you have any warnings about anything that may have caused it to still be pending in your Google Webmaster Tools? Is it possible that access to it is blocked in some way?
Peter
Hi Sarah
I have seen some odd bouncing around of local pages since the late summer, but it's difficult to be sure whether that can be attributed to Hummingbird. With Hummingbird designed to refine search results based on search intent, I think it's possible that Google is looking for 'richer' content for local results, and for more guarantees, from a semantic analysis of pages in a locality, that they meet what local searchers are looking for.
Based on that, I think your thoughts on the link ratio between .com and .co.uk would seem to have some basis for concern. However, I don't think it is the ratio so much but rather the genuine value or lack of it of a link in a local search result.
Google will be looking at citations for the service your business provides in each locality and it would seem reasonable therefore that they will discount links coming from .com sites and therefore possibly non-UK locations. They may even treat those links negatively, but on that I am speculating. A link from a non-UK site for a local search result does seem to have little relevance semantically so if your pages are relying heavily on non-UK citations then that is likely to affect the performance of your pages in local search.
Miriam Ellis has a great deal of knowledge on local search ranking factors. I recommend reading her excellent, "Top 20 Local Search Ranking Factors: An Illustrated Guide".
I hope that helps,
Peter
Hi Kim
Moz's On-page grader tool is used to give you an understanding of how well optimised the page is for SEO. As you will have seen, it covers a range of different tests checking each one against the keyword you have told the tool you are optimising that page for, and gives it a score A to F based on how well optimised it is for the technical on-page SEO factors.
However, a grade A score does not give you any guarantee of how well that page ranks on Google. What it means is that most, if not all, of the on-page factors are set up correctly for SEO. How highly a page ranks in Google is based on over 200 elements, some of which are covered by the on-page grader, but many of which won't be, such as off-page ranking factors including backlinks, social etc. Consider the on-page grader one of the tools in your SEO toolkit.
In terms of worse pages than yours that rank higher, it can be confusing and frustrating, but there is no simple answer. However, I recommend looking at doing a competitor analysis for the keyword to compare your page with what competing pages are doing. Look at the keyword difficulty tool to do this: https://moz.com/researchtools/keyword-difficulty.
You may also find this Whiteboard Friday by Rand Fishkin useful: Why You Might Be Losing Rankings to Pages with Fewer Links, Worse Targeting, and Poor Content.
I hope that helps,
Peter
Hi Andrea
It will depend on which system / CMS you have used for your site.
For the blog, you need to install that in a sub-folder off of the root folder of your site (e.g. /blog sub-folder).
Then, within your main site you should be able to create a blog menu item type which allows you to point it to an external link. Although the link you are creating is not external to your domain, it can still be used internally.
Your link should be something like http://www.yoursite.com/blog which will point to the home page of your blog site (for the example above). Within your blog site you just need to provide a navigation link to go back to your home page at the root of your domain (i.e. http://www.yoursite.com). If you wanted to make the links back to your main site more user friendly you could add more menu items in your blog navigation to replicate those that you have as the main links on your main site.
I hope that helps and the above makes sense.
Peter
Hi Luke
You don't need to as essentially the 301 redirects you put in place will address that. So, for example, where you have had duplicate content pages you can redirect both old pages to one new page on the new site.
I hope that helps,
Peter
Hi Luke
I wouldn't say keyword density is totally irrelevant, but what I mean by that is that you would expect to see on any page the keywords related to the subject of that page. But attempting to add keywords to a page to increase density to make it more indexable is not what you should be doing.
The focus of a page for semantic search needs to be the subject as a whole so content should be written for the whole in much the same way as you would write offline and include related content where relevant.
I'm not sure if there really is a safe percentage as such for keyword density, but suffice to say that the higher the percentage the more likely a page will be seen as spammy. I would have thought in most cases though <3% should be fine.
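As a rough illustration of the arithmetic (a sketch only - search engines don't evaluate pages this simply, and the function and sample text below are made up), keyword density is just occurrences of the phrase as a share of the total words on the page:

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` taken up by exact,
    case-insensitive occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words or not target:
        return 0.0
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)

sample = ("Red sofas and black sofas: our sofa range "
          "has a sofa for every room.")
print(round(keyword_density(sample, "sofa"), 1))  # → 14.3
```

Even that short, natural-sounding sample lands well above 3% for "sofa", which shows why density alone is a crude measure - readability matters more than hitting a number.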
Peter
No problem. The JCE Editor doesn't set the meta Title tag, just the content that is contained in an article. It sounds, though, as if you have now found the relevant place. So is it working for you now?
Peter
Hi Iain
In Joomla you can control the meta Title tag through the menu item by going to the Page Display Options for the relevant menu item and entering your Title into the "Browser Page Title" field. You do not need to set "Show Page Heading" field underneath to Yes as that is a separate function.
I hope that helps,
Peter
Hi Luke
No problem. You asked: How do you manage onsite keywords in content these days?
I am not clear what you are asking. Please can you clarify?
Peter
Hi Luke
For sure, carving away two-thirds of your previous site is a big chunk, but I don't think that should overly concern you.
If you had said you were thinking of doing this a couple of years ago, I would have encouraged you to think again on the basis that the more pages your site had, the more weight it had, the more pages could be optimised and the more entry points there were from search.
With the changes to Google search in recent months, in particular the move towards semantic search and away from Boolean search, having a keyword-rich site with many well-optimised, 'correct keyword density' pages shouldn't be the focus any more.
I'm not suggesting that having 35 pages compared to 107 pages is better. What I am saying is that it is better to have 35 sharply focused, high quality pages than 107 pages that don't have the same definition and focus. The measure should most definitely be quality over quantity, both on a page count basis and even on a word count basis.
What I would focus on with your 35 pages is making sure they are well structured (so many on-page SEO rules still apply - so make sure the faulty parts you mentioned are fixed) and the navigation is clear.
I am sure you know this, but make sure that your pages are customer-focused, so that they answer the type of questions your customers are asking in the language of your customer, and where related questions could occur, make sure there are good internal links between related content pages.
Finally, when you do the switch, I would just make sure that you think about your 301 redirects. Where an old page no longer exists on the new site, then redirect it to the closest related page.
I hope that helps,
Peter
OK, I see what you mean now. I am not sure how they are doing that. You would normally see that if the content of the page was output through an iframe, but I couldn't find one in the source code. It's possible it is something OpenCart is doing.
It's not a great idea from a user experience basis really, but search engines are not going to read the URL in the address bar. They will use the links that are in the pagination links at the bottom. For example, this is page 2:
http://www.manhattanfruitier.com/index.php?route=product/category/shopall_Data&osc=0&page=2
So that is what will be indexed.
Peter
I see you have posted the same question earlier. Is there a reason you asked the same question in two separate posts?
Hi, that's a lot of variables and answers may therefore vary from different people, but here's my best shot:
**1. If we don't offer the PDF option, people would have to visit our site to read the content (unless they bought a hard copy).** If the goal is getting more readership and attention for these books then I think you should provide a PDF.
**2. If visitors were able to download a free PDF, they wouldn't need to return to our site to read it.** No, but the benefits of (1) above apply, I think.
**3. If our corporate clients (nearly all of our clients are corporations) could download a PDF, they could then post it on an intranet instead of posting a link to our site.** I think you could put information in the PDF, and on the page it can be downloaded from, to say that you do not give permission for the PDF to be hosted on another site. You cannot guarantee that they wouldn't, but I think it is unlikely a corporate would want to be seen to be going against your wishes on that. Alternatively, you could say that the PDF can be hosted on other sites, providing they include a follow link to your site attributing your site as the source.
**4. In general, do you think a visitor would be less likely to link to our site if he or she were able to download the PDF? Or would the appeal of the PDF option make it more likely that people would visit and link to the site?** Yes - the second option.
**5. Also, if we offer the PDF option, are there any SEO issues related to duplicate content?** Not really, provided that sites posting the PDF include a link to its source on your site.
**6. Finally, if we did offer the free PDF download, would you recommend that we ask for an email address before giving the PDF?** It depends what your motivation for doing so is. If you want to build an email list that you can then send newsletters and special offer emails to, then yes - as long as you make it clear that is what you will do and you get the visitor to click a checkbox confirming they accept that condition - although of course you have to give them the option to unsubscribe at a later date when you send newsletters and offer emails.
I hope that helps,
Peter
Hi,
I'm not sure what you mean by this, as every time I refresh the page you linked to I get the same page content. It could be, though, that the page content changes from time to time. That would be produced by an extension of some sort for the OpenCart shopping system being used on this site. For example: http://www.opencart.com/index.php?route=extension/extension/info&extension_id=5490
Is this the best way to handle it from an SEO perspective? If it was every page on your site then I would say yes, but I would guess this only affects the home page and in that case it doesn't really matter. If anything it may help a little because the search engines will see the content changed every time they visit that page.
I hope that helps.
Peter
Don't you just love SEO sometimes Robert?!
It is very frustrating when you think you have done everything right but don't see the results you are expecting. Obviously, I don't know the full details of the keyword and the page you are trying to rank it on, but I don't think I have ever seen the result you describe after doing what you have described.
Recently I have seen some odd bouncing around in SERP rankings - nothing very dramatic, but bigger bounces than I would expect, and negative ones at that when I would have expected positives. My only thought is to wonder whether, with the shake-up in search by Google and the growing move to semantic search and away from a Boolean keyword-based algorithm, this is the cause of such volatility, of which we can expect to see more.
You mention 'keyword'. I know 'keyword' tends to be synonymous with what is more accurately a keyword phrase, but is the keyword you are trying to rank for just one word or a multi-word phrase? If one word, then I would expect to see a big swing in ranking results, because in many, if not most, cases you cannot draw a semantic understanding of the searcher's intent from a single word. I don't think it's linear and, although this is a generalisation, I think the longer your keyword phrase, the less volatility.
I don't think the above is really an answer for you, but I hope it helps maybe in some way.
Peter
Hi, you need to fix this by using the rel="canonical" in the Head section of pages so that where there are duplicates they all point to one of them as the original.
You can see more information here: https://support.google.com/webmasters/answer/139394?hl=en
In Joomla and other Content Management Systems, duplicate pages can sometimes be an issue because of the way search engine friendly URLs are created. With these systems you can generally find an extension / plugin to help resolve the issue. For Joomla there doesn't seem to be much on offer in their Extensions Directory, but you can see what is available here: http://extensions.joomla.org/extensions/site-management/seo-a-metadata/url-canonicalization-
I hope that helps,
Peter
Hi, the occasional server error won't affect your site in SERPs, but persistent issues over a period of time are likely to. With those you would also be likely to see the errors reported in Google Webmaster Tools.
Essentially, if the search engines cannot index a site then they don't have anything to include in SERPs.
I hope that helps,
Peter
You could split them up based on where they are needed but that would become complicated. The advantage of splitting CSS on a large site is really to better organise the functionality of the CSS, e.g. system.css.
Peter
I don't really think it's possible to calculate. A multiplier for one keyword will be different from another. I don't think there is an average figure that anyone could calculate with any degree of accuracy.
Peter
It really depends on how big your site is and how complex your CSS is. On a small site, or one with minimal CSS, one file is perfectly adequate. On a larger site with lots of pages and CSS it makes sense to break the CSS down around its function.
Peter
Hi, don't forget that search results in Google these days are personalised so what you see someone else will see differently. I am seeing it 10th, but then I am in the UK. Moz's rank checker which provides (as much as possible) de-personalised results, shows it as 5th.
The reason the site ranks higher is likely to be due, in some part, to the Title tag "Bail Bonds Redondo Beach", as Redondo Beach is in Los Angeles.
I hope that helps,
Peter
Hi Jason
Well, I have run a few checks and I must admit I cannot explain why your site is not ranking in the top 50 SERPs for some of the key phrases you are targeting. Some of the content is a bit thin in places, but I have checked quite a few pages I would have thought you should be in the top 50 for as a minimum, and you are not.
So, like you I am scratching my head a little. I presume you have a Google Webmaster Tools account for this site. Is there anything reported in there that would indicate an issue?
Peter
Hi Jason
Are you able to give more information? What do you mean specifically by your "site not progressing at all"?
A lot has changed with Google over the last year. If you started with a new site, then that site needs to build up its authority. Your focus on producing good content and social activity is good, but it's important to check whether you have optimised that content correctly: is it customer-focused and targeted at the questions and information your target market are asking and looking for? With social, are those you are engaging with sharing it through likes, tweets/retweets, +1s etc.? Have you checked your site against competitors? How do they compare?
There are a lot of things to look at and check, so with what you have said I can only give you ideas about where the issues may lie, but it's likely to be an issue with one or more of the above.
Peter