Questions created by DougRoberts
Search Console - Mobile Usability Errors
A site I'm looking at for a client had hundreds of pages flagged with Mobile Usability errors in Search Console. I found that the theme appends parameters to the URLs of some of its resources (.js/.css) as version strings, and these resources were being blocked by a rule in the robots.txt: "Disallow: /*?". I've removed this rule, and when I now inspect the URLs and test the live versions of the pages, they are reported as mobile friendly.

I then submitted validation requests in Search Console for both of the errors ("Text too small to read" and "Clickable elements too close together"). My problem now is that the validation has completed and the pages are still being reported as having the errors, even though they're fine if I inspect them individually. Does anyone else have experience clearing these issues in Search Console? Any ideas what's going on here?
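For illustration, a minimal robots.txt sketch of the situation described above (the asset path is a hypothetical placeholder): if the blanket parameter rule is still wanted for other URLs, the versioned assets can be re-allowed explicitly instead of dropping the rule altogether.

    User-agent: *
    # The blanket rule that was blocking versioned theme assets such as
    # /themes/example/style.css?ver=1.2 (hypothetical path):
    Disallow: /*?
    # Googlebot resolves conflicts by the most specific (longest) matching rule,
    # so these re-allow the blocked .css/.js files while other parameterised
    # URLs stay disallowed:
    Allow: /*.css
    Allow: /*.js

Removing the Disallow entirely, as described above, is the simpler fix if nothing else depends on it; either way, Search Console only reflects the now mobile-friendly pages once Google has recrawled them.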
Technical SEO | DougRoberts
Embedding PDF previews and maintaining crawlability/link-equity.
One site that I'm working on has previously had a great deal of success with the PDF preview content on the site. The PDF previews are quite substantial and rank for many long-tail terms that drive a reasonable amount of traffic back to the site to purchase the full version of the product. As part of a site redesign, the way the PDF previews are embedded/presented on the page is changing slightly. In the proposed modal pop-up on the new site, the code looks like this:

    <object data="my-pdf-preview.pdf" type="application/pdf"
            style="width:100%; min-height:600px; max-height:100%;">
        <embed src="my-pdf-preview.pdf" type="application/pdf">
    </object>

Whereas the old code looked like this:

    <object data="my-pdf-preview.pdf#view=FitH,50&scrollbar=1&toolbar=0&statusbar=0&messages=0&navpanes=0"
            type="application/pdf"
            width="100%"
            height="600">
        It appears your Web browser is not configured to display PDF files.
        No worries, you can <a href="my-pdf-preview.pdf">download the PDF file here</a>.
    </object>

Note how previously the code contained a plain, standard link to the PDF document. My worry is that without this link, search engines won't a) be able to discover/crawl the PDF content, or b) pass any link equity to these PDFs. Does anyone have any experience/recommendations about this? I'd like to have some information before I request that they add a plain link to the PDF previews back onto the on-page content.
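By way of illustration, a minimal sketch of the change being requested above: keeping the proposed modal embed, but restoring a plain, crawlable link alongside it (the file name is a hypothetical placeholder).

    <!-- Hypothetical sketch: the proposed modal embed, unchanged, plus a plain
         HTML link so search engines can still discover the PDF and pass
         link equity to it. -->
    <object data="my-pdf-preview.pdf" type="application/pdf"
            style="width:100%; min-height:600px; max-height:100%;">
        <embed src="my-pdf-preview.pdf" type="application/pdf">
    </object>
    <p><a href="my-pdf-preview.pdf">Download the PDF preview</a></p>

The fallback content inside the object element is only rendered when the plugin is unavailable, so a plain anchor in the normal page flow is the safer bet for discoverability.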
Intermediate & Advanced SEO | DougRoberts
Impact of changing domain from ccTLD to .com
I've got a couple of clients who have an international market for their products or services. Both of these clients have a .co.uk domain. For one site the US market is the major audience; for the other it's European countries. At the moment, neither of these clients has translated pages or content targeted to a specific country, and there are no plans at this stage to create such content. Google considers the .co.uk domain to be targeted to the United Kingdom, and the assumption is that changing it to .com will increase their international reach. For both domains, referral and direct traffic is much more diverse than organic traffic (which, as you'd expect, is heavily UK-weighted, though there is some international organic traffic).

Does anyone have any experience making such a change? How did the change affect your international reach/visibility? Does anyone have any metrics that they'd like to share that could be used to make a case to clients? (Note: I'm not interested in how you'd go about handling the domain change - I'm happy/confident about doing this.)
International SEO | DougRoberts
Can a large fluctuation of links cause traffic loss?
I've been asked to look at a site that has lost 70-80% of their search traffic. This happened suddenly around the 17th April: traffic dropped off over a couple of days and then flat-lined over the next couple of weeks. The attached screenshot (LVSceCN.png) shows the impressions/clicks reported in GWT. When I investigated I found:

- There had been no changes/updates to the site in question
- There were no messages in GWT indicating a manual penalty
- The number of pages indexed shows no significant change
- There are no particular trends in the keywords/queries affected (they all were)

I did discover that ahrefs.com reported a large number of links lost on the 17th April (17k links from 1 domain). These links reappeared around the 26th/27th April, but traffic shows no sign of any recovery. The links in question were from a single development server (that shouldn't have been indexed in the first place, but that's another matter). Is it possible that these links were, maybe artificially, boosting the authority of the affected site? Has the sudden fluctuation in such a large number of links caused the site to trip an algorithmic penalty (Penguin)?

Without going into too much detail, as I'm bound by client confidentiality: the affected site is really a large database, and the links pointing to it are generated by a half dozen or so article-based sister sites based on how the articles are tagged. The links point to dynamically generated content based on the URL. The site does provide a useful/valuable service/purpose - it's not trying to "game the system" in order to rank. That doesn't mean to say that it hasn't been performing better in search than it should have been. This means that the affected site has ~900,000 links pointing to it that are the names of different "entities".

Any thoughts/insights would be appreciated. I've expressed a pessimistic outlook to the client, but as you can imagine they are confused and concerned.
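As a tangent to the aside above about the development server that shouldn't have been indexed, here is a minimal, hypothetical sketch of one way to keep a dev environment out of the index (an Apache/.htaccess setup is assumed here):

    # Hypothetical .htaccess for the development host (requires mod_headers):
    # asks crawlers not to index or follow anything served from this server.
    Header set X-Robots-Tag "noindex, nofollow"

Putting the dev server behind HTTP authentication is the more robust option, since it also stops its outgoing links being picked up by third-party tools in the first place.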
Intermediate & Advanced SEO | DougRoberts
Low-value link building to sitemap.xml
During some competitive research recently I discovered that one of my client's competitors' sites had an interesting backlink profile. Looking at the top-pages report in Open Site Explorer, the home page was the #1 page (as you'd expect) with 2.5k links from about 500 linking root domains. The second page was the sitemap.xml (~1.5k links, 400 linking root domains) and the third was their /feed page (again, ~1.5k links, 350 linking root domains). Links to these two pages aren't something that would happen naturally (particularly the sitemap.xml). There's a whole load of evidence of nasty low-quality link building, such as over-optimised keyword-rich anchor text, comment spam, and even some blog/article-based link networks. It's a pretty nasty niche with lots of cut-throat affiliate marketing. My guess is that someone may have made a mistake using an automated link-building tool, but I'd be interested in what you think. Have you seen this before? (Sorry, I can't reveal the domains in question as I'm bound by an NDA.)
Affiliate Marketing | DougRoberts
What is the best way to optimise a service-area business's website for local search?
Service-area businesses are a little caught in the middle when it comes to local optimisation and can really struggle. So what approaches would you recommend for optimising a service-area business, assuming the following, perhaps obvious, goals:

- Make sure your site visitors can clearly understand whether their location is covered by your service areas
- Compete for search traffic for service + location searches (including those with an implied location)

One common tactic I see being used is to generate humongous lists of locations, maybe in a footer. Obviously, the more areas you include, the less relevant (and more spammy-looking) your targeting looks. And of course, with loads of locations, you're not going to be able to get them all into your titles (even if you wanted to!). Maybe if some locations are more important to you, you might create specific landing pages for those locations? If you've got the kind of business where the locations you target are determined by where you can reasonably get to during a working day, it's really tough to find a realistic way to target specific locations on your site. I'd be really interested to hear how you approach these kinds of sites.
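One hedged illustration of declaring coverage on-site, complementary to the location landing pages mentioned above: a LocalBusiness schema sketch using areaServed (the business type, name, URL and place names are hypothetical placeholders).

    <!-- Hypothetical sketch: structured data describing the areas a
         service-area business covers, without implying a storefront there. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Plumber",
      "name": "Example Plumbing Ltd",
      "url": "https://www.example.co.uk/",
      "telephone": "+44 1234 567890",
      "areaServed": [
        { "@type": "City", "name": "Reading" },
        { "@type": "City", "name": "Basingstoke" },
        { "@type": "City", "name": "Newbury" }
      ]
    }
    </script>

Markup like this helps clarify coverage for search engines and visitors, though local visibility still tends to lean heavily on the service-area settings in the business's Google listing and on the dedicated location pages discussed above.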
Image & Video Optimization | DougRoberts
How to handle large numbers of comments?
First the good news: one site that I've been working on has seen an increase in traffic from 2k/month to 80k! As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of slowing down - approximately 3,000 comments in total and growing! What is the best approach for handling this? I'm not talking about the review/approval/response workflow, just the way these comments are presented on the website, taking both SEO and usability into account. Does anyone have any particular recommendations? Options I've considered are:

- Just show the most recent x comments and ignore the rest (nobody is going to read 3,000 comments!)
- Paginate the comments (risk of duplicate content? using Ajax could hide the long-tail phrases in the comments?)
- Show all comments (page load speed is suffering, and this is likely to be causing problems for mobile visitors)

Also, how do active comments on a page contribute to an article's freshness? Any thoughts would be greatly appreciated.
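As a hedged sketch of the pagination option above (the URLs and markup are hypothetical placeholders): rendering only the most recent batch inline, with older comments on plain, crawlable paginated URLs, keeps the long-tail comment text discoverable without Ajax hiding it or a single page becoming enormous.

    <!-- Hypothetical sketch: latest comments inline, older ones on paginated,
         plainly linked URLs that crawlers can follow. -->
    <section id="comments">
        <!-- ...the 20 most recent comments rendered here... -->
        <nav class="comment-pagination">
            <a href="/article-slug/comment-page-2/">Older comments</a>
            <a href="/article-slug/comment-page-3/">Oldest comments</a>
        </nav>
    </section>

If duplicate content is the worry, the paginated comment URLs can canonicalise back to the main article, at the cost of those pages being consolidated rather than ranking in their own right.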
Technical SEO | DougRoberts
Google automatically adding company name to SERP titles
Maybe I've been living under a rock, but I was surprised to see that Google had algorithmically modified my page titles in the search results by adding the company name to the end of the (short) title: <title>About Us</title> became "About Us - Company Name". Interestingly, this wasn't consistent: sometimes it was "Company Name Limited" and sometimes just "Company Name". Has anyone else noticed this, or is it a recent change?
Algorithm Updates | DougRoberts