Perfect! Thanks for your help Kristen
Posts made by RG_SEO
-
RE: Are stackoverflow links follow or nofollow?
Thanks for the tip. I've never spotted that before!
-
Are stackoverflow links follow or nofollow?
I've tried to find the answer to this question myself, but I've found differing opinions. The conclusion I've come to is that Stackoverflow allows follow links when the user that posts the link has sufficient reputation or if the link receives user validation.
Has anyone else here used Stackoverflow.com that knows the answer to this question?
-
RE: If I put a piece of content on an external site can I syndicate to my site later using a rel=canonical link?
Thanks for answering my question Dirk! I found the deeper follow up conversation interesting as well.
-
If I put a piece of content on an external site can I syndicate to my site later using a rel=canonical link?
Could someone help me with a 'what if' scenario please?
What happens if I publish a piece of content on an external website, but then later decide to also put this content on my own website? I want my website to rank first for this content, even though the original location for the content was the external website.
Would it be okay for me to put a rel=canonical tag on the external website's content pointing to the copy on my website? Or would this be seen as manipulative?
-
RE: How bad is it to have duplicate content across http:// and https:// versions of the site?
Thank you both - and sorry for not replying earlier. It sounds like we have some work to do
-
How bad is it to have duplicate content across http:// and https:// versions of the site?
A lot of pages on our website are currently indexed on both their http:// and https:// URLs. I realise that this is a duplicate content problem, but how major an issue is this in practice?
Also, am I right in saying that the best solution would be to use rel canonical tags to highlight the https pages as the canonical versions?
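To illustrate what I mean (with a made-up URL), I'd put something like this in the head of both the http:// and https:// versions of each page, pointing at the https:// URL:
<link rel="canonical" href="https://www.example.com/products/example-page/" />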
-
RE: Clarification around 301 redirects.
Thanks for your reply Monica. The blog is a landing page where the separate blog posts were listed, which is what I think you are suggesting, so I'll go ahead and recommend that we do the re-directs to the corresponding pages.
Thank you all for your replies - it's helped to get my thinking right
-
Clarification around 301 redirects.
I’ve come across numerous blogs recently that suggest that SEOs should NOT do bulk re-directs to a category page. This has come as something of a surprise (doh!!) and I feel like I should already know this. It does seem like there is lots of disagreement here, so I thought that I’d ask what people’s opinions were to make sure that I get my thinking straight. I've read all the main Moz blog posts on this topic and, although really useful, they've left me none the wiser on a few specific questions.
Here’s some more detail about the situation. We’re currently consolidating a lot of content into a main blog, which will be the focal point for new blog posts that are created. This is different to the past, where we tended to create separate blogs for different products on separate domains. I’m currently considering how we move content across from one of the older blogs to this new blog (which will soon sit on a subfolder of our main domain).
I have three (!) questions:
1) Could you confirm that doing bulk re-directs to a category page is bad? I already know that doing them all to the homepage is an error.
2) Should I re-direct the home page of the old blog on a separate domain to the relevant category page on the new site? The category page is related, but does not cover the EXACT topic - it covers our replacement product offering. If I shouldn't do this, where should I re-direct the old blog domain to?
3) I’ve recommended that we set up 301 redirects on a one-to-one basis, redirecting each piece of content on the old site to its new location (something like the sketch below). What about content that has been earmarked for removal and for which there is no obvious alternative? My previous recommendation has been to re-direct these pages to the most relevant category page on the new blog. Would it be better to let these 404 or, as an alternative, create a custom 404 for users on the new blog highlighting the new content that we offer?
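To illustrate question 3, the one-to-one rules I have in mind would look something like this in an Apache .htaccess file (the paths and URLs are made-up examples, and I realise the exact mechanism will depend on our server setup):
# One-to-one 301s from old blog posts to their new homes (hypothetical URLs)
Redirect 301 /blog/old-post-about-widgets http://www.example.com/blog/widgets-guide
Redirect 301 /blog/another-old-post http://www.example.com/blog/another-new-post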
Any help would be appreciated
-
RE: Training events - optimisation and avoiding cannibalisation
Thanks for your thoughts Linda - much appreciated
-
Training events - optimisation and avoiding cannibalisation
This is quite a broad question I’m afraid – any help would be appreciated.
I’m trying to find the best way of optimising our new training pages. These events are aimed at teaching our customers how to use our software to do different tasks. Inevitably, the themes and naming of these training workshops overlap with some of our products. A close example would be, to make up a product, ‘Keyword Ranker’ and ‘Keyword Ranker Training’.
Someone has raised the concern that the training pages might start outranking the pages for our main tool, particularly as the training will be heavily promoted via social media. Also, the on-page content talks about similar topics. They've suggested that we use rel=canonical tags pointing from each training page to the related product page to prevent this from happening.
I myself don’t think this is a good idea, as this is not what rel=canonical tags are designed for. I think that they might prevent the events pages from ranking for any query at all, which is not what we want. Also, I believe that the training pages and the products are different enough that Google will work out which to rank for relevant queries. Has anyone else had any experience of doing this? Are there any approaches that people would recommend? Or is this something that we shouldn’t be worried about?
A few other thoughts that I’ve had:
- Using schema.org event markup to emphasise what the events pages are about (see the sketch below).
- Making sure to remove old events once they have expired. I thought it best to let these 404, as I’ve read that 301s to a category page can cause Google to penalise content.
- Putting internal links from the product pages to the relevant training workshop pages.
- Using the meta unavailable_after tag on events pages, so that once an event has happened the page is dropped from Google’s index.
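To make the first and last of those thoughts concrete, here is the sort of head markup I have in mind for an event page (all names and dates are made-up placeholders):
<!-- schema.org Event markup in JSON-LD, placeholder values throughout -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Event",
  "name": "Keyword Ranker Training",
  "startDate": "2015-09-01T09:30",
  "location": {
    "@type": "Place",
    "name": "Example Training Centre",
    "address": "London"
  }
}
</script>
<!-- asks Google to drop the page from its index once the event has passed -->
<meta name="robots" content="unavailable_after: 1-Sep-2015 17:00:00 GMT">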
-
RE: Does paying a reviewer for an impartial review violate Google's guidelines?
Yes I do - I mean a review where the writer can make up their own mind about the product. It sounds from both your answers that it is best to be careful in these circumstances and make sure that everything is up front.
Thank you both for your help!
-
Does paying a reviewer for an impartial review violate Google's guidelines?
When a company pays for an impartial review from a website, should these links be no-followed? I am confident that paid positive reviews are seen as a manipulation of search, but is paying for an impartial review okay?
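Just to be clear about what I mean by no-followed, I'm talking about the review site marking the link up like this (made-up URL):
<a href="http://www.example.com/product" rel="nofollow">Example product</a>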
-
RE: Does a subdomain benefit from being on a high authority domain?
Thanks for the help Ryan
-
Does a subdomain benefit from being on a high authority domain?
I think the title sums up the question, but does a new subdomain get any ranking benefit from being on a pre-existing high authority domain? Or does the new subdomain have to fend for itself in the SERPs?
-
RE: Should I buy a keyword rich domain to prevent competitors from buying it
Excellent - thanks for your responses guys, that's a great help!
-
Should I buy a keyword rich domain to prevent competitors from buying it
Some people in the company I work for have suggested that we buy a keyword rich domain that matches a new product line that we're planning to release.
I've advised that this in itself is not a good idea, as we'll need to produce high quality content for that site rather than just having it exist for ranking purposes. We already have a section on our main site focussed on this product line, so I don't think having the keyword match domain would really add anything unless we worked out what we'd use this site for.
That said, I was wondering whether it might be worth buying the exact match domain anyway, in order to prevent a competitor from using it?
-
RE: What is better for web ranking? A domain or subdomain?
Thanks for the clarification Don - much appreciated
-
RE: What is better for web ranking? A domain or subdomain?
Thanks for your response Donford and for the link.
The question I am asking is slightly different (I think!). If you were setting up a completely new website, would that website rank better on its own separate domain?
For example, if I set up a website called www.widgets.com, would Google prefer that to widgets.maindomain.co.uk? Or is there no difference at all?
I guess what I'm trying to find out is whether there is any difference at all between setting up a new website on a subdomain or domain, or whether Google treats these as the same.
-
What is better for web ranking? A domain or subdomain?
I realise that it is often better to put content in a subfolder rather than a subdomain, but I have another question that I cannot seem to find the answer to.
Is there any ranking benefit to having a site on a .co.uk or .com domain rather than on a subdomain? I'm guessing that the subdomain might benefit from other content on the domain it's hosted on, but are subdomains weighted down in any way in the search results?
-
RE: .ac.uk subdomain vs .co.uk domain
They are a society that is linked to an academic institution. I don't know if there are any other factors pushing them from ac.uk to co.uk, I'm afraid. It sounds like this is something that I should clarify with them when I talk to them.
Thanks for the advice!
-
.ac.uk subdomain vs .co.uk domain
I'd be grateful if I could check my thinking...
I've agreed to give some quick advice to a non-profit organisation who are in the process of moving their website from an ac.uk subdomain to a .co.uk domain. They believe that their SEO can be improved considerably by making this migration.
From my experience, I don't see how this could be the case. Does the unique domain in itself offer enough ranking benefit to justify this approach? The subdomain is on a very high authority domain with many pre-existing links, which makes me even more nervous about this approach.
Does anyone have any opinions on this that they could share, please? I'm guessing that it is possible to migrate safely and that there might be branding advantages, but that from a pure SEO point of view there is not much benefit? It looks like most of their current traffic is branded traffic.
-
RE: URL Optimisation Dilemma
Makes sense - I understand now. Thanks for the clarification
-
RE: URL Optimisation Dilemma
Thanks for your response Sheena, it's great to hear that I'm on the right track with this!
I was wondering if you could further explain the following part of your answer:
"What I can say is that the 'better way' depends on what words might already be in the domain, as I try to not be redundant (when possible) so it doesn't appear spammy/kw stuffed."
Are you suggesting that you'd tend towards not including a keyword if it appears elsewhere on the site and so search engines have enough context? Also, what do you mean by 'redundant'?
-
URL Optimisation Dilemma
First of all, I fully appreciate that I may be over analysing this, so feel free to highlight if you think I’m going overboard on this one.
I’m currently trying to optimise the URLs for a group of new pages that we have recently launched. I would usually err on the side of leaving the URLs as they are so that any incoming links are not diluted through the 301 re-direct. In this case, however, there are very few links to these pages, so I don’t think that changing the URLs will harm them.
My main question is between short URLs vs. long URLs (I have already read Dr. Pete’s post on this). Note: the URLs I have listed below are not the actual URLs, but very similar examples that I have created.
The URLs currently exist in a similar format to the examples below:
http://www.company.com/products/dlm/hire-ca
My first response was that we could put a few descriptive keywords in the url, with something like the following:
http://www.company.com/products/debt-lifecycle-management/hire-collection-agents
I’m worried though that the URL will get too long for any pages sitting under this.
As a compromise, I am considering the following:
http://www.company.com/products/dlm/hire-collection-agents
My feeling is that the second approach will give the best balance between having the keywords for the products and trying to ensure good user experience. My only concern is whether the /dlm/ category page would suffer slightly, but this would have ‘debt-lifecycle-management’ in the title tag.
Does this sound like a good approach to people? Or do you think I’m being a little obsessive about this? Any help would be appreciated
-
RE: Google Analytics: Different stats for date range vs single month?
I agree with Bendall that sampling is what is most likely causing your problem here. You may find this post useful: https://support.google.com/analytics/answer/1042498?hl=en-GB
I suspect that the reason your sampled total closely matches the total of the individually downloaded monthly data is that sampled data is Google's 'best guess' when the data in the date range becomes too large. This best guess is based on sound statistics, but ultimately there will be some variance. I've found these variances get bigger the larger your site and the larger your date range. Google do this to reduce the load on their servers, as crunching the numbers for large data sets can be very resource-intensive. You COULD upgrade to Google Analytics Premium, which gives you unsampled data as standard, but this is very expensive and only really suitable for large organisations.
There is no easy way round this I'm afraid. I'd suggest that you think about what level of sampling you are comfortable using - sampled data can still give you valuable insights and trends. Some people are comfortable using sampled data (I am now too). I believe there are some tools that allow you to download unsampled data from Google Analytics via their API, but I have not tried these.
-
RE: User generated content - manual warning from Google
Thanks both - I wasn't expecting that answer. I suppose you learn something every day. I have now submitted the reconsideration request so hopefully that will go through fine!
-
User generated content - manual warning from Google
Over the weekend our website received large amounts of spammy comments / user profiles on our forums. This has led to Google giving us a partial manual action until we clear things up. So far we have:
- Cleared up all the spam, banned the offending user accounts, and temporarily enabled admin approval for new sign-ups.
We are currently investigating upgrading the forum software to the latest version in order to make the forums less susceptible to this kind of attack. Could anyone let me know whether they think it is the right time for us to submit a reconsideration request to get the manual action removed? Will the temporary actions we have taken be enough to get the ban lifted, or should we wait until the forum software has been updated?
I'd really appreciate any advice, especially if there is anyone here who has experienced this issue themselves
-
RE: Are ALL duplicate title tags bad??
Thanks for those answers, that's really useful. It sounds like this is not something to worry about too much, but something that is not ideal for the site's appearance in the search results!
-
Are ALL duplicate title tags bad??
We’ve had some success recently by reducing the number of duplicate title tags on our website. We have managed to fix all the simple cases but there are a number of stubborn examples that we don’t know how to fix.
A lot of the duplicate tags come from the website’s forums. Many questions have been asked multiple times over the years, with users phrasing the question in exactly the same way. This has led to many cases where different forum posts have the same title tag. For example, there are six title tags consisting of the words 'need help'! These are being highlighted as duplicates, and we currently have several thousand of them. Would this be a problem? I’d be tempted to say that we should leave them, as they don’t seem unnatural to me.
One solution we are considering is to append the forum name to the title of any post after the original, falling back to appending the date if that doesn’t distinguish it (see the examples below).
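To illustrate, the scheme would produce title tags something like these (the forum names and date are made-up):
<title>Need help!</title>
<title>Need help! | SEO Help Forum</title>
<title>Need help! | SEO Help Forum | 12-Mar-2014</title>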
Do people think that this is a good solution to implement or would it be better to leave these duplicate title tags as they are?
Any help would be appreciated
-
RE: Should I disavow a particular site (no warnings in WMT)?
Thanks all that's good advice. It sounds like this is a fairly grey area. Based on this, we've decided to run some checks on the domain first before proceeding.
-
Should I disavow a particular site (no warnings in WMT)?
I’m currently getting a lot of external links to my website from ‘xyz’. This is a series of sites with near-identical content and duplicate URLs (xyz234.com, xyz63.com, xyz456.com etc.). There are 15 of these sites, which are contributing 6236 external links.
Would you agree that these URLs are candidates to be disavowed? I currently have no unnatural link warnings in GWT but I’m concerned with watching out for negative SEO and keeping our link profile healthy. Would pruning spammy links like these be a good step?
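For reference, if we do go ahead, my understanding is that the disavow file uploaded to Google would simply list the domains, along these lines (the same hypothetical names as above):
# xyz network of near-identical sites
domain:xyz234.com
domain:xyz63.com
domain:xyz456.com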
Any help would be appreciated!
-
RE: Will I lose traffic from Google for re-directing a page?
Thanks both - it's interesting that there is no 'standard' method, but it makes sense that this would very much depend on the situation.
-
Does Google penalise content that sits behind a read gate?
Does Google penalise content that sits behind a read gate? Currently, most of the content on our site sits behind one: people have to register before they can view the detailed content. Our forums, however, are accessible to all, which draws a lot of long-tail traffic.
Google does seem to be indexing some of our gated content, but can someone advise me on how Google views this kind of content more generally, please?
-
Will I lose traffic from Google for re-directing a page?
I’m currently planning to retire a discontinued product and put a 301 redirect in place to a related (although not identical) product. The thing is, I’m still getting significant traffic from people searching for the old product by name. Would Google send this traffic to the new pages via the re-direct? Is Google likely to display the new page in place of the old page for similar queries, or will it serve other content? I’d like to answer this question so that I can decide between the two following approaches:
1) Retiring the old page immediately and putting a 301 redirect to the new related pages. This will have the advantage of transferring the value of any link signals / referring traffic. Traffic will also land on the new pages directly without having to click through from another page. We would show a dynamic message telling users that the old product had been retired, depending on whether they had visited our site before.
2) Keep the old product pages temporarily so that we don’t lose the traffic from the search engines. We would then change the old pages to advise users that the old product was now retired, but that we have other products that might solve their problems. When this organic traffic decreases over time, then we will proceed with the re-direct as above. I am worried though that the old product pages might outrank the new product pages.
I’d really appreciate some advice with this. I’ve been reading lots of articles, but it seems like there are differing opinions on this. I understand that I will lose between 10% and 15% of PageRank through the re-direct, as per the Matt Cutts video.
-
Best practice for retiring old product pages
We’re a software company. Would someone be able to help me with a basic process for retiring old product pages and re-directing the SEO value to new pages? We are retiring some old products to focus on new products. The new software has much of the same functionality as the old software, but with more features.
How can we ensure that the new pages get the best start in life? Also, what is the best way of doing this for users?
Our plan currently is to:
- Leave the old pages up initially with a message to the user that the old software has been retired. There will also be a message explaining that the user might be interested in one of our new products and a link to the new pages.
- When traffic to these pages drops off, we will delete them and re-direct the URLs to the homepage.
Has anyone got any recommendations for how we could approach this differently? One idea that I’m considering is to immediately re-direct the old product pages to the new pages, and then show a message explaining that the old product has been retired but that the new, improved product is available. I’d also be interested in pointing the re-directs at the most relevant new product pages rather than the homepage, so that they get the value of the old links. I’ve found in the past that old product retirement pages can outrank the new pages, as until you 301 them all the links and authority continue to flow to the old pages.
Any help would be very much appreciated
-
Will multiple domains from the same company rank for the same keyword search?
I'm trying to convince people that we need good marketing reasons for starting multiple domains, as it will be more difficult to rank multiple sites. Does anyone know if Google actively discourages multiple domains from the same company appearing in the search results for the same keyword? We are creating a separate content website which is related to an existing company website. Would you agree that it is best to have these sites on one domain, with the content site on a subdomain perhaps? I'm worried about duplication of effort and cross-keyword targeting in particular.
These sites would not have duplicate content.
-
RE: Best server-side sitemap generators
Excellent advice Federico. My first reaction was, "but that's not a server-side sitemap generator". I just looked at their website though and it turns out that it is! Looks like I need to read things more carefully!
I'll look into that as an option, but if anyone else has any server-side sitemap generators that they'd recommend then I'd be really interested to hear about them
-
Best server-side sitemap generators
I've been looking into sitemap generators recently and have a good understanding of what creating a sitemap for a small website of under 500 URLs involves. I have successfully generated a sitemap for a very small site, but I’m now trying to work out the best way of handling a large site with millions of URLs.
I’ve decided that the best way to cover such a large number of URLs is to generate the sitemap server-side, but this is an area that doesn’t seem to be covered in detail on SEO blogs and forums. Could anyone recommend a good server-side sitemap generator? What do you think of the automated offerings from Google and Bing? I’ve found a list of server-side sitemap generators from Google, but I can’t see any way to choose between them. I realise that a lot will depend on the technologies we use server-side, but I'm afraid that I don't know what they are at this time.
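From my reading of the sitemaps.org protocol, whatever we build server-side would output the URLs in chunks of up to 50,000 per file, tied together by a sitemap index file along these lines (made-up URLs):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-posts-1.xml</loc>
    <lastmod>2015-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-posts-2.xml</loc>
    <lastmod>2015-01-15</lastmod>
  </sitemap>
</sitemapindex>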
-
RE: Should I remove all meta descriptions to avoid duplicates as a short term fix?
Thanks Marc for answering what is in many ways an unfair question.
I definitely agree that the long-term objective should be different and relevant meta descriptions, as you say. It's also good to know that each of the approaches I suggested was ultimately bad practice, even if one of them is less bad than the other.
-
Should I remove all meta descriptions to avoid duplicates as a short term fix?
I’m currently trying to implement Matt Cutts' advice from a recent YouTube video, in which he said that it is better to have no meta descriptions at all than duplicates.
I know that there are better alternatives but, if forced to make a choice, would it be better to remove all the duplicate meta descriptions from the site (perhaps leaving a lone meta description on the home page) than to keep the duplicates? This would be a short-term fix prior to making changes to our CMS to allow us to add unique meta descriptions to the most important pages.
I’ve seen various blogs across the internet which recommend removing all the tags in these circumstances, but I’m interested in what people on Moz think of this.
The site currently has a meta description which is duplicated across every page on the site.
-
RE: Submitting XML Sitemap for large website: how big?
Thanks Matt, that's really useful
-
RE: Submitting XML Sitemap for large website: how big?
Thanks for both your replies - I will check out the tools and recommendations you suggested.
I'm sure I remember reading somewhere a recommendation that it was only necessary to submit the basic site structure in a sitemap. It sounds like this is not the case and that a sitemap should, if possible, be comprehensive.
Would it be better to have a basic sitemap giving the main navigational URLs than having nothing at all?
-
RE: Having problems resolving duplicate meta descriptions
That sounds like an interesting suggestion and definitely something to look into, thank you. Sadly, the developer for the site is on holiday until next Monday, so I won't be able to get an answer until next week.
Theoretically, if the changes were not possible, would it be better to have one single meta description on the home page and none across the rest of the site? Or would it be better to leave the site as it is?
-
RE: Having problems resolving duplicate meta descriptions
Hi there, thanks for the reply. We are using an in-house CMS.
-
Submitting XML Sitemap for large website: how big?
Hi there,
I’m currently researching how I can generate an XML sitemap for a large website we run. We think that Google is having problems indexing the URLs, based on some of the messages we have been receiving in Webmaster Tools, which also shows a large drop in the total number of indexed pages.
Content on this site can be accessed in two ways. On the home page, the content appears as a list of posts. Users can search for previous posts and can search all the way back to the first posts that were submitted.
Posts are also categorised using tags, and these tags can also currently be crawled by search engines. Users can then click on tags to see articles covering similar subjects. A post could have multiple tags (e.g. SEO, inbound marketing, Technical SEO) and so can be reached in multiple ways by users, creating a large number of URLs to index.
Finally, my questions are:
- How big should a sitemap be? What proportion of the URLs of a website should it cover?
- What are the best tools for creating the sitemaps of large websites?
- How often should a sitemap be updated?
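For context on the first question, my understanding of the sitemaps.org protocol is that each individual sitemap file is capped at 50,000 URLs, with entries along these lines (made-up URL):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/posts/example-post</loc>
    <lastmod>2015-01-15</lastmod>
  </url>
</urlset>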
Thanks
-
Having problems resolving duplicate meta descriptions
Recently, I’ve recommended to the team running one of our websites that we remove duplicate meta descriptions. The site currently has a large number of these and we’d like to conform to SEO best practice. I’ve seen Matt Cutts' recent video entitled 'Is it necessary for every page to have a meta description?', where he suggests that webmasters use meta descriptions for their most tactically important pages, but that it is better to have no meta description than duplicates. The website currently has one meta description that is duplicated across the entire site.
This seemed like a relatively straightforward suggestion, but it is proving much more challenging to implement across a large website. The site’s developer has tried to resolve the meta descriptions, but says that the current meta description is a site-wide value. It is possible to create 18 distinct replacements for 18 ‘template’ pages, but any sub-pages of these will inherit the value and create more duplicates. Would it be better to:
- Have no meta descriptions at all across the site?
- Stick with the status quo and have one meta description site-wide?
- Make 18 separate meta descriptions for the 18 most important pages, accepting that their sub-pages will still inherit them and create 18 sets of duplicates?
Or…is there a solution to this problem which would allow us to follow the best practice in Matt’s video?
Any help would be much appreciated!
-
RE: Can a website be punished by panda if content scrapers have duplicated content?
Thanks everyone - those are great responses
-
Can a website be punished by panda if content scrapers have duplicated content?
I've noticed recently that a number of content scrapers are linking to one of our websites and have duplicated our content on their pages. Can content scrapers affect the original website's ranking? I'm concerned that the duplicated content, even if hosted by scrapers, could be a bad signal to Google. What are the best ways to prevent this from happening?
I'd really appreciate any help as I can't find the answer online!
-
RE: Impact of simplifying website and removing 80% of site's content
Great answers guys - thanks. It's good to know that my gut feeling was close to the mark!