effectdigital
@effectdigital
Job Title: SEO
Company: Effect Digital Ltd.
Effect Digital is a productive partner to ambitious brands. Connecting organisations and audiences in a digital-first world
Favorite Thing about SEO
The blend of art and science
Latest posts made by effectdigital
-
RE: Google Adding Incorrect Location to the end of Title Tags in SERPs
It might just be Google being Google, using too many off-page signals (e.g. links, local citations and accepted directory listings). I wonder if there are inbound signals contradicting the on-page factors
-
RE: Need help understanding this Moz Chart comparing link metrics against competitors...
From the looks of it, that just means that 60% of your external backlinks are 'followed', meaning they can contribute to Google's rankings. External links are the links your site has gained from other domains (sites which link to you), so they are probably not under your control anyway
-
RE: URL Parameters
Just so you know, if a URL returns a 5XX server error then it usually won't render its canonical tag to begin with! You might also want to review your XML sitemap, to check that it's not 'undoing' your canonical tags by feeding these URLs to Google. Indexation tags must be perfectly aligned with your XML sitemap, or you are sending Google mixed messages (e.g. a URL is in the sitemap so Google should index it, but when it is crawled it contains a canonical tag citing itself as non-canonical, which is the opposite signal)
Everything which Gaston said is right on the money
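If it helps, here's a rough sketch of the kind of check I mean: pull every URL from the XML sitemap and flag any that error out or that canonicalise elsewhere. This is only an illustration (the sitemap URL is a placeholder, and it assumes the requests and beautifulsoup4 packages), not a definitive audit tool:
```python
# Sketch: flag sitemap URLs that error out or whose canonical points elsewhere.
import requests
from xml.etree import ElementTree
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder sitemap location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ElementTree.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        print(f"{url} -> HTTP {resp.status_code} (should this be in the sitemap?)")
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = None
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            canonical = (link.get("href") or "").strip()
    if canonical and canonical != url:
        print(f"{url} canonicalises to {canonical} - mixed signal vs the sitemap")
```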
-
RE: No index for http version of website
No, you don't need to change anything. In fact, you actively DON'T want the HTTP sitemap to be feeding Google a list of HTTP URLs, which I am sure you are trying to steer Google away from. Only feed Google the HTTPS URLs, and delete the HTTP sitemap from Search Console if you can, so that it doesn't keep flagging false positives and feeding Google bad (insecure) URLs
-
RE: How can I optimise my key pages for new (related) key phrases as they arise, without compromising the original optimised keywords?
Expand the content with new sections thematically bound to the new keywords. If you dilute your page by making it about too many things, your original ranking could suffer, so be careful how you proceed! As long as the new keywords are highly related to the old ones, it shouldn't cause too many problems. Another concern would be expanding the content to the point where the page's performance (loading speed) suffers. If you embed too much non-optimised rich media, that could be a problem. Check your before and after URLs using Google PageSpeed Insights, and learn how to properly optimise images / videos (and other media) before you upload them into your content. Don't rely 100% on CMS plugins to do this for you
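To illustrate the kind of pre-upload optimisation I mean, here's a minimal sketch using the Pillow library; the filenames, the 1600px cap and the quality setting are just example values, not recommendations:
```python
# Sketch: resize and recompress an image before uploading it into the CMS (uses Pillow).
from PIL import Image

source = "hero-original.jpg"     # hypothetical export straight from design
target = "hero-optimised.jpg"    # the version that actually gets uploaded

img = Image.open(source)
img.thumbnail((1600, 1600))      # cap the longest edge; keeps aspect ratio
img.save(target, "JPEG", quality=82, optimize=True, progressive=True)
print(f"Saved {target} at {img.size[0]}x{img.size[1]}px")
```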
-
RE: Google Adding Incorrect Location to the end of Title Tags in SERPs
Are there references to the London address on the non-London URLs? For example, on the Scotland store page, look for references to the London address (e.g. in the footer of the page). You may need to alter some on-page instances to correct this, but the structured data should help
-
RE: Does &pws=0 still work?
This post from Jan 2020 seems to assume that you can still use &pws=0
... but I don't know how reliable it is!
-
RE: Moz bot not discovering important links (high DA sites link)
Rogerbot will take time to crawl links on the web. Lots of those big sites, where the links are very valuable, have thousands or hundreds of thousands of pages. One thing that I do think would be nice is if Moz Pro users could have some crawl allowance devoted to them, and could ask Rogerbot to ping URLs (and update Moz's index). It would obviously have to be limited so as not to skew Moz's regular crawling operations, but it could be really helpful for some users
-
RE: Inbound Links - Redirect, Leave Alone, etc
If you want to disavow and redirect at the same time, you probably wouldn't want to use a 301, which passes SEO authority (and negative equity) along to the resultant page. I'd probably use a 302 or a 307, and then disavow the linking domain (or page) in Google's disavow tool. I might also try to no-index the redirecting URL, though with a redirect in place this can't be done within the HTML / source code; you'd have to deploy the no-index directive via the HTTP header instead, using X-Robots-Tag
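To illustrate the X-Robots-Tag approach, here's a minimal sketch using Flask; the /old-page route and target URL are hypothetical, and your own server or CDN would have its own way of setting the same header:
```python
# Sketch: serve a 302 redirect that also carries an X-Robots-Tag noindex header (Flask).
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")  # hypothetical URL that attracted the unwanted links
def old_page():
    resp = redirect("https://www.example.com/new-page", code=302)
    resp.headers["X-Robots-Tag"] = "noindex"  # no-index delivered via HTTP header, not HTML
    return resp

if __name__ == "__main__":
    app.run()
```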
-
RE: Entities and SEO
You're thinking along the right lines, especially focusing on corpus-based co-occurrences between words. The corpus isn't the keyword, though; it's the body of text which you check the keyword against (and it has to be substantial to be accurate, in the gigabyte range). I don't know of any pre-built software to assist you, so you would have to develop some scripting knowledge
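To show what I mean by scripting it yourself, here's a very crude sketch that counts which words co-occur near a keyword across a plain-text corpus; the corpus file, keyword, window size and tokenisation are all simplifying assumptions:
```python
# Crude sketch: count terms co-occurring within N words of a keyword across a corpus.
import re
from collections import Counter

KEYWORD = "entities"        # hypothetical keyword of interest
WINDOW = 10                 # words either side to treat as "co-occurring"

with open("corpus.txt", encoding="utf-8") as fh:  # hypothetical corpus file
    tokens = re.findall(r"[a-z']+", fh.read().lower())

co_occurrences = Counter()
for i, token in enumerate(tokens):
    if token == KEYWORD:
        window = tokens[max(0, i - WINDOW):i] + tokens[i + 1:i + 1 + WINDOW]
        co_occurrences.update(window)

for term, count in co_occurrences.most_common(25):
    print(f"{term}\t{count}")
```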
Best posts made by effectdigital
-
RE: How Much Time It Will Take To Lower the Spam Score?
If you're talking about the Moz spam score of the domain, it's higher than many would like but it's not extremely high:
https://d.pr/i/4WwfDq.png (screenshot)
A score of 80 or higher is indicative of very, very spammy sites. Although lots of strong language is used within the tool, 50% isn't super awful really. By the way, Moz is not connected to your disavow file and cannot see it (something many people have requested many times, and which I continue to request at any given opportunity). As such, disavow work will not decrease your Moz spam score
Moz's spam score is not something which Google use within their ranking algorithm(s). Google have private spam metrics which they do not share with webmasters. Moz's spam score is simply an attempt by our industry to make our 'best guess' at how spammy Google 'might' think a website is. Ultimately though, it's just an indicator and a 'shadow metric', it's meant to mimic the decisions that Google might make but Google (again) does not actually use any Moz metrics (at all) within their ranking algorithm(s)
Your disavow file goes straight to Google, so even if Moz doesn't see it and their best 'guess' is that your spam score is still high, you know that 'actual Google' have seen your disavow work and thus Moz's spam metric is not likely to be accurate for your domain (which is why it's only an indicator, even when looking at other domains, as you don't know what link removal and disavow work they have carried out)
If you want your actual Moz spam score to go down (though there is no reason for such vanity, as Google doesn't use Moz metrics) then you have to actually remove the links and that's that (sorry)
Remember that the spam score is derived based on factors which Moz perceives as being common to penalised websites:
Spam Score: "Represents the percentage of sites with similar features we've found to be penalized or banned by Google." ~ Moz
This is not necessarily linked to backlink features in isolation; I would expect that some on-page features may be counted. The site just doesn't look and feel very legit:
1.) A review site, with only seven reviews, one of which appears to be for a gun or fire-arm (paintball or not, it's a gun)
2.) No seeming ability for any users to add their own reviews, so this is just one person's biased voice. Why does the internet need this website?
3.) Logo is blurry and low-res and doesn't look 'proper'
4.) Only three pages seem to exist. One of these pages is a 'disclaimer'. Webmasters put up disclaimers when they should be taking more responsibility for the content of their own website but refuse to do so. Disclaimers are a low-quality signal, and unless there are thousands of contradictory positive signals (which there are not, for this domain) then this is how it will be viewed
5.) The site has no value-add for end-users, or unique value-proposition. People can find more in-depth reviews from product critics they trust, or shallow yet more numerous reviews from review aggregators like Trust Pilot. Either way, they would be on a better site with a better value-proposition for the end user. Why would Google rank this site?
6.) Site claims to be a review site, yet does not make good usage of review schema and star-ratings for prettier SERPs. Seems more like a blog with aspirations to be a review site, which didn't quite make it. The site marks up the supposed 'reviews' with BlogPosting schema, not with review schema
7.) Content is dry with poor layout and feels boring. In most reviews no numerical evaluation is made and no star ratings are given. There's no point at which the author accepts: "I am a reviewer now, I must give an opinion, I must give something useful to the user which they could use at a glance". The unwillingness to take responsibility for giving an opinion, combined with the disclaimer which reinforces the author's 'shunning' of their own content, undermines the whole site (are they worried their own content is bad? Why are they so careful not to give or take responsibility for opinions? The images all look like stock images - are these fake reviews? Right now it feels like yes, they are)
8.) It feels as if this site has been made 'for the sake of' SEO. That's not the kind of site Google wants to rank
... so as you can see, even if you tackle your poor backlinks, this site doesn't really have much hope of ranking well on Google. Google is ultimately looking for trust and a value proposition. Over the materials which Google already has indexed on their first page of results for the reviewed products, the pages on this site don't really add anything. In addition, the domain is giving off multiple off-page AND on-page mistrust signals, which will really stand against it in the rankings
In this case I think you'd better head all the way back to the drawing board
Look at this video in which Maile Ohye from Google (I think she's an ex-Googler now) outlines the #1 common SEO mistake as 'working without a value proposition':
You only need to watch her outline issue #1, the rest of the video isn't that relevant to you
Also watch Moz's video on how unique content isn't good enough to rank any more:
https://moz.com/blog/why-good-unique-content-needs-to-die-whiteboard-friday
... and how you should craft 10x content to replace the prior 'plague' of 'good unique content':
https://moz.com/blog/how-to-create-10x-content-whiteboard-friday
After watching these videos, you should begin to understand why what you're doing isn't working and why it won't work
-
RE: Does Google Understand H2 As Subtitle?
Yeah, Google is perfectly able to interpret an H2 as a sub-heading. It's treated more as a hint than an absolute rule; for example, if you crammed loads of H2s into your footer and made them really small, Google would be able to tell that the H2 was being deployed illegitimately
In your case you seem to be using the H2 correctly. It gives you some space to add a little extra context to your pages, and I think that's a really good idea! I might use the space a little differently though
This is what you have:
H1: Flavour & Chidinma – 40 Yrs
H2: 40 Yrs by Flavour & Chidinma - Mp3 Download
They essentially say exactly the same thing just with the difference of "MP3 Download"
I might use the H1 more as the news heading and the H2 for the additional context of what exactly the reader will be getting
H1: Flavour and Chidinma Release 40yrs Everlasting EP
H2: 40 Yrs by Flavour & Chidinma - Mp3 Download & Video
I gave the page a schema scan:
Nice usage of Article schema. You could also think about using AudioObject schema for the MP3 download. Google have recently come out and said that whilst some schemas don't result in visual changes in the SERPs, they're still a good structural framework for Google to work with (in terms of contextualising information), so I always push for the maximum Schema.org implementation possible
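For example, a JSON-LD AudioObject block for the MP3 could be generated along these lines. This is only a sketch: the URL, duration and dates are placeholders, and you'd want to validate the final markup with Google's structured data testing tools:
```python
# Sketch: build a JSON-LD AudioObject snippet for the MP3 download (all values are placeholders).
import json

audio_object = {
    "@context": "https://schema.org",
    "@type": "AudioObject",
    "name": "40 Yrs - Flavour & Chidinma",
    "contentUrl": "https://www.example.com/downloads/40-yrs.mp3",  # hypothetical URL
    "encodingFormat": "audio/mpeg",
    "duration": "PT3M45S",       # ISO 8601 duration, placeholder
    "uploadDate": "2020-01-01",  # placeholder
}

print('<script type="application/ld+json">')
print(json.dumps(audio_object, indent=2))
print("</script>")
```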
Did you know that MP3 files also contain their own metadata, inside the file? You can inspect and modify this metadata with industry-standard audio-editing software, or with simple applications such as MP3Tag
This is what your MP3 file looks like in terms of its internal MP3 tag metadata:
Screenshot: https://d.pr/i/iUxtv0.png
I have boxed in red the field "Album Artist", which has not been filled out. Most media players and media apps actually categorise music into artists by the "Album Artist" field and not by the "Artist" field (makes no sense, I know!)
You might consider copying the Artist text into the Album Artist field and re-saving the file, then re-uploading it. There are a lot of sites that illegally rip music and upload it in hopes of search rankings and ad-revenue. Much of the time, those sites fail to correctly fill out their MP3 file Meta (sometimes everything is 100% blank) and that's often a piracy signal
I don't think that's what you're doing, but it might pay to verify you have correctly amended the MP3 metadata before uploading the files to your site (especially as a UX thing: if people download the MP3 and then can't find it in their media player, it won't get many listens)
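If you'd rather script the fix than use a desktop tool, a sketch like this with the mutagen library would do it (the filename is hypothetical):
```python
# Sketch: copy the Artist tag into the empty Album Artist tag of an MP3 (uses mutagen).
from mutagen.easyid3 import EasyID3

tags = EasyID3("40-yrs.mp3")              # hypothetical local file
if "artist" in tags and not tags.get("albumartist"):
    tags["albumartist"] = tags["artist"]  # most players group tracks by Album Artist
    tags.save()
print(dict(tags))
```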
Fun track by the way, thanks for the listen
-
RE: Root domain change - how do we best handle existing backlinks from our own content platforms on youtube, etc?
You're better off amending the links if at all possible. 301 redirects are great, but they can break down for many reasons. One reason a 301 can be refused equity flow is if the old content is too 'dissimilar' to the new content (think Boolean string similarity, not 'what humans think'). If the old content and new content are 60% similar, don't expect 100% of the authority to go through. If the old and new content are only 20% similar, barely any SEO authority (if any) will translate across
This is to combat SEO-authority sculpting through redirects. If webmasters decided they voluntarily, editorially wanted to link to one old page, but the new page is barely similar to that old resource - are the old hyperlinks still 'valid' in terms of contributing SEO authority? In many cases, no they are not (the webmasters or editors, may not have chosen to link to the new content - even though they did link to the old content). Past a certain point, content has to re-prove itself
Amending the hyperlinks circumvents that judgement, though links do also decay over time. In general, I have found link amends to be superior to 301 redirects, 90% of the time
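For a rough feel of what 'machine similarity' can look like, here's a sketch using Python's difflib to compare the visible text of an old and a new URL. It's just one crude way to do it (the URLs are placeholders, and Google's own approach will be far more sophisticated):
```python
# Sketch: crude similarity ratio between the visible text of an old and a new page.
import requests
from difflib import SequenceMatcher
from bs4 import BeautifulSoup

def visible_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return " ".join(soup.get_text(" ").split())

old = visible_text("https://old.example.com/page")  # hypothetical old URL
new = visible_text("https://www.example.com/page")  # hypothetical new URL
print(f"Similarity: {SequenceMatcher(None, old, new).ratio():.0%}")
```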
-
RE: My product category pages are not being indexed on google can someone help?
This is probably more of a ranking authority problem, rather than an indexation problem. If you can force Google to render one of your category URLs within its search results, then it's highly likely the page is indeed indexed (it's just not ranking very well for associated keywords)
Follow this link:
https://www.google.co.uk/search?q=site%3Askirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
As you can see, the category URL which you referenced is indexed. Google can render it within their search results!
Although Google know the page exists and it is in their index, they don't bother to keep a cache of the URL: http://webcache.googleusercontent.com/search?q=cache:https%3A%2F%2Fwww.skirtinguk.com%2Fproduct-category%2Fmdf-skirting-board%2F
This probably means that they don't think many people use the page or that it is of low value.
What you have to keep in mind is that lower-value long-tail terms (like product keywords or part-number keywords) are much easier to rank for. Category terms are worth more in terms of search volume, so competition for them is higher. If your site ranks for product terms but not for category terms, it probably means your authority and / or trust metrics (as well as UX metrics) may be lower. Remember: Google don't consider their ranking results to be a space to advertise lots of companies. They want to render the best results possible for the end-user (that way people keep 'Googling' and Google continue to leverage revenue from Google AdWords etc)
Let's look at your site's domain-level metrics and see if they paint a picture of an 'authoritative' site which should be ranking for such terms...
Domain Level Metrics from Moz
- Domain Authority: 24 (low)
- Total Inbound Links: 1,200+
- Total Referring Domains (much more important than total link count!): 123 - This is too many links from too few domains IMO
- Ranking keywords: 38
Domain Level Metrics from Ahrefs
- Homepage URL Rating: 11 (very low)
- Domain Rating: 11 (very low)
- Total Inbound Links: 2,110+
- Referring Domains: 149 - Again, the disparity here could be causing problems! Not a diverse backlink profile
- Ranking Keywords: 374 (Ahrefs usually finds more, go with this figure)
SEO Traffic Insights: Between 250 and 380 visits (from SEO) a day on average; not much traffic at all from SEO before November 2016, when things improved significantly
SEMRush Traffic Insights (to compare against Ahrefs): Estimates between 100 and 150 visits from SEO per day. This is narrowed to UK only, though. It seems to tally with what Ahrefs is saying; the Ahrefs data is probably more accurate
Domain Level Metrics from Majestic SEO
- Trust Flow: 5 - This is extremely low and really bad! Basically Majestic track the number of clicks from a seed set of trusted sites through to your site. A low number (it's on a scale of 0 to 100, I think) indicates that trustworthy seed sites aren't linking to you, or that where you are linked, people avoid clicking through to your site (or visiting it)
- Citation Flow: 24 - low but not awful
What do I get from all of this info?
I don't think your site is doing enough digital PR, or making 'enough of a difference to the web' to rank highly for category related terms. Certainly the site looks very drab and 'cookie-cutter' in terms of the template. It doesn't instil a sense of pride in the business behind the website. That can put people off linking to you, which can cause your SEO authority to fall flat on its face leaving you with no ranking power.
A lot of the product images look as if they are fake, which probably isn't helping. They actually look a lot like ads, which often look a bit cartoony or CGI-generated, with a balance between blue and white (colour deployment). Maybe they're being misinterpreted as spam due to Google's PLA (Page Layout Algorithm). Design is not helping you out at all, I am afraid!
So who is ranking for MDF skirting board? The top non-PPC (ad-based) result on Google.co.uk is this one:
https://skirtingboardsdirect.com/products/category/mdf-skirting-boards/
Ok so their content is better and deeper than yours (bullet-pointed specs or stats often imply 'granular' content to Google, which Google really likes - your content is just one solid paragraph). Overall though, I'd actually say their design is awful! It's worse than the design of your site (so maybe design isn't such a big factor here after all).
Let's compare some top-line SEO authority metrics on your site against those earned by this competitor
- Domain Authority from Moz: 24
- Referring Domains from Moz: 123
- Ahrefs Homepage URL Rating: 11
- Ahrefs Domain Rating: 11
- Ahrefs Referring Domains: 149
- Majestic SEO Trust Flow: 5
- Majestic SEO Citation Flow: 24
Now the other site...
- Domain Authority from Moz: 33 (+9)
- Referring Domains from Moz: 464 (+341)
- Ahrefs Homepage URL Rating: 31 (+20)
- Ahrefs Domain Rating: 65 (+54)
- Ahrefs Referring Domains: 265 (+116)
- Majestic SEO Trust Flow: 29 (+24)
- Majestic SEO Citation Flow: 30 (+6)
They beat you in all the important areas! That's not good.
Your category-level URLs aren't Meta no-indexed or blocked in the robots.txt file. Since we have found evidence that Google are in fact indexing your category-level URLs, it's actually a ranking / authority problem, cleverly disguised as an indexation issue (I can see why you assumed that). These pages aren't good enough to rank prominently for keywords which Google knows hold lucrative financial value. Only the better (or more authoritative) sites will rank there
A main competitor has similar design standards but has slightly deeper content and much more SEO authority than you do. The same is probably true for other competing sites. In SEO, you have to fight to maintain your positions. Sitting back is equivalent to begging your competitors to steal all of your traffic...
Hope this analysis helps!
-
RE: If website users don't accept GDPR cookie consent, does that prevent GA-GTM from tracking pageviews and any traffic from that user that would cause significant traffic decreases?
This is a common and over-zealous implementation of GDPR tracking compliance. Lots of people have lost lots of data, by going slightly overboard in a similar way. Basically you have taken GDPR compliance too far!
GDPR is supposed to protect the user's data, but in terms of - is there a 1 or a 0 in a box in an SQL database for whether an anonymous user visited your site or not (traffic data, not belonging to the user) - it's actually fine to track that (in most instances) without consent. Why? Because the data cannot be used to identify the user, ergo it's your website data and not the user's user data
There used to be a GA hack which Google patched, which forced GA to render IP addresses - but even before it was patched, they banned people (who were using the exploit) from GA for breaking ToS. That kind of data (PII / PID), unless you have specifically set something up through event tracking that records sensitive stuff - just shouldn't even be in Google Analytics at all (and if you do have data like that in your GA, you may be breaking Google's ToS depending upon deployment)
If the data which you will be storing (data controller rules apply) or sending to a 3rd party to store (in which case you are only the data processor and they are the data controller) does not contain PID (personally identifiable data, e.g. email addresses, physical addresses, first and last names, phone numbers etc.) then it's not really covered by GDPR. If you can say that these users have an interest in your business and show that a portion of them transact regularly, you're even less at risk of breaking GDPR compliance
If you're worried about cookie stuff:
"Note: gtag.js and analytics.js do not require setting cookies to transmit data to Google Analytics."
It's possible that with some advanced features switched on (like remarketing-related features) this might change. But by default at least, it seems as if Google themselves are saying that the transmission of data and the deployment of cookies are not tied to each other, and that without cookies those scripts can still send data to GA just fine
If you are blocking even basic, page-view-level data which is not the user's data (which is not PII / PID), then you are over-applying GDPR. The reason there aren't loads of people moaning about this problem is that it's only a problem for the minority of people who have accidentally over-applied GDPR compliance. As such it's not a problem for others, so there's no outcry
There's lots more info here: https://www.blastam.com/blog/gdpr-need-consent-for-google-analytics-tracking
"This direction is quite clear. If you have enabled Advertising features in Google Analytics, then you need consent from the EU citizen first. Google defines ‘Advertising features’ as:
- Remarketing with Google Analytics.
- Google Display Network Impression Reporting.
- Google Analytics Demographics and Interest Reporting.
- Integrated services that require Google Analytics to collect data for advertising purposes, including the collection of data via advertising cookies and identifiers."
If you aren't using most, many or any of the advanced advertising features, your implementation is likely to be way too aggressive. Even if you are using those advanced features, you only need consent for those elements and specifically where they apply and transmit data. A broad-brush ban on transmitting all GA data is thoroughly overkill
Think about proposing a more granular, more detailed approach. Yes it will likely need some custom dev time to get it right and it could be costly, but the benefit is not throwing away all your future data for absolutely no reason at all
Don't forget that, as the data 'storer' (controller), a lot of the burden is actually on Google's side
Read more here: https://privacy.google.com/businesses/compliance/#!?modal_active=none
Hope this helps
-
RE: Why Would My Page Have a Higher PA and DA, Links & On-Page Grade & Still Not Rank?
Steve's answer is really great. Basically in SEO we have to cater to Google's PageRank algorithm. We used to be able to see a very watered down, simplified version of PageRank using the Google toolbar for Firefox (before Chrome became big) and using various Chrome extensions thereafter
Google figured out that people were misusing this data and shut off the API which supplied the (very, very simplified version of) PageRank (a number from 0-10 for each URL on the web). PageRank still exists and Google still use it in their ranking algorithms, but no one except Googlers (and even then, only certain ones) can see it. Arguably no one could ever really see it, as TBPR (Toolbar PageRank) was really simplified and watered down; it was never a full view of a page's 'actual' PageRank
Suddenly, marketers had no way to evaluate the SEO authority of each web page they were looking at. Many stepped in to fill this hole (Ahrefs supply a URL and domain rating metric, Majestic SEO supply Citation Flow and Trust Flow metrics, Moz of course were first with PA and DA)
These metrics are our industry's attempt to fill a hole left by Google's removal of bad data from the public eye. Moz attempt to use various signals and metrics (link counts, search traffic estimates for URLs) to re-build TBPR as PA and DA
... but Google don't use PA and DA. Google use PR (PageRank). PA and DA are 'shadow metrics', they indicate and mimic but they are indicators only and cannot (read: absolutely must not) be taken at face value
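As an aside, the core idea behind PageRank is simple enough to sketch as a toy power iteration over a made-up four-page link graph; real PageRank is vastly more complicated than this:
```python
# Toy sketch of PageRank power iteration on a tiny, made-up link graph.
links = {                       # page -> pages it links out to (hypothetical graph)
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about", "contact"],
    "contact": ["home"],
}
damping = 0.85
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):             # iterate until the scores settle
    rank = {
        p: (1 - damping) / len(pages)
           + damping * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```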
For example, although link counts affected Google's old TBPR (Toolbar PageRank) metric, other things did too: if a site was blocked from Google, or if a site had a penalty or algorithmic devaluations, those things could lower or nullify the TBPR rating of a website. Since Google and Moz are not 'connected' in data terms, Moz's metrics miss many of the 'true' authority-nullifying circumstances which could occur - thus you can end up with high PA / DA and still no traffic
Things that can affect you:
- Algorithmic devaluations, where the sites linking to your site are penalised and thus they no longer pass SEO authority to you - making your results go down as well. Not a penalty, just Darwinism in action I am afraid
- An actual penalty on your site
- Poor keyword targeting where your keywords aren't properly used in your content and / or Meta data, stuff like that. Sounds like this one is a real concern for you, as you may have SEO authority but NO relevance!
- Technical issues like an architecture which Google can't (or doesn't want to spend the time to) index, e.g. over-reliance on content generated through JavaScript (which Google can crawl, but it takes them much longer - so if you're a nobody, don't expect them to care much or take that time)
- Technical indexation issues like blocking your own site with Meta no-index directives or robots.txt crawl blocks
- Legal challenges to your business or content in the form of DMCA requests, people filing reports directly with Google to have content removed from your site - there are many other types of legal challenge that can affect SEO
- Content duplication, internal or external
- Spam reports and disavow logs against your website
... there are many other factors; a big one is that your site may lack a value proposition for end users. If other sites doing what you do existed before you - and they're cheaper, have better reviews or tout unique features like free shipping (or click-and-collect services for clothing, etc.) - then your offering itself may just not be competitive (and no matter how good your SEO is, the site was doomed from the business end). Google expects sites to 'add value' to the web
The best thing to do is concentrate on your value proposition and making your site genuinely popular online. It's not easy. Building a successful site is as hard as building a successful business, it's just the digital reflection of what you are and what you do
-
RE: Why My Domain Authority Dropped
See my relevant answer to an older, similar question here:
https://moz.com/community/q/why-did-my-site-s-da-just-drop-by-50
"Keep in mind that PageRank (which is used by Google to weight the popularity and authority of web pages, yes it's still true even after the little toolbars got deleted) does not read, utilise or rely upon Moz's DA metric in any way shape or form. DA is a 'shadow metric'. Since Google took away our view of the simplified PageRank algorithm (TBPR - ToolBar PageRank, which is dead) people working in SEO needed a new metric to evaluate the quality and ranking potential of web pages
Moz stepped in to supply this (in the form of PA / DA) but Google still use PageRank (they just don't show it to us any more). Whilst Moz's PA/DA metrics are a good 'indicator' of success, Google isn't using them (at all) and so they do not directly affect your rankings"
Only someone from Moz can confirm why your DA dropped, but it may have dropped for a reason that wouldn't impact or affect Google rankings in the slightest (so don't panic yet!)
-
RE: Still Need to Write Title & Description Tag?
It's true that Google no longer necessarily has to use your Meta descriptions and Title tags; however, Google is still predisposed to using them if they are written well. Why let Google crawl all your content and let a mechanical brain 'decide' which snippet (or paragraph) to display, when you can still control it with minimal effort?
If Title tags and Meta descriptions are written badly (poor grammar, keyword stuffed) then Google now can take these elements from your content instead. That's a fall-back though, it's not a reason to 'get lazy' and 'not do stuff'
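If you want to sanity-check what's currently in place, a quick audit sketch like this will flag missing or oddly sized titles and descriptions. It assumes the requests and beautifulsoup4 packages and a hypothetical URL list; the length thresholds are rough rules of thumb, not official limits:
```python
# Sketch: flag pages with missing, very short or very long titles / meta descriptions.
import requests
from bs4 import BeautifulSoup

URLS = [                                  # hypothetical pages to audit
    "https://www.example.com/",
    "https://www.example.com/services/",
]

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    if not 10 <= len(title) <= 60:
        print(f"{url}: title is {len(title)} characters - worth reviewing")
    if not 50 <= len(description) <= 160:
        print(f"{url}: meta description is {len(description)} characters - worth reviewing")
```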
In SEO, there are no 'magic bullet' solutions (that's commonly said in our industry). But if there are no magic bullet solutions, that means there are very rarely any single changes that massively increase results. If that is true, it means that most factors in SEO only hold a very small (yet relatively equal) weighting
If you follow me on that, you'll see why the small 'seemingly unimportant' stuff is still critical. If in SEO, all factors under your control only make up a tiny sliver of the whole pie - then by saying 'I won't do the small stuff' what you are really saying is 'I won't do any SEO'. But these small, seemingly unimportant changes - are all part of what gives you an 'edge'. When you come up against a competitor of relatively equal standing (and popularity), you might just get ahead. That's what SEO is really for
Imagine you have two car manufacturers stripping down and gutting out their cars for the rally track (maybe one is Mitsubishi and one is Subaru). They go around the chassis, making tiny reductions here and there - so that their car finishes the track less than a second faster than the other. That has value to them, but if the team said "well each of these tiny changes doesn't do much, so let's sit on our butts and do nothing" - the opposition WOULD beat them
The art of optimisation is small, consistent, fractional weight loss and streamlining. That's what it means, not just in SEO but everywhere
If you have an SEO company which is hesitant to do their own darn job, look elsewhere. You need someone who just 'gets on with it' instead of taking your money and making excuses when you pull them up on stuff (although budget is also a factor there, since I don't know anything in that area - I at least have to say that for them)
Meta descriptions don't influence Google's rankings any more, but a search result which has a nicely written description may draw more traffic through from Google without having its ranking position increased. If people see it and it looks more attractive, they may choose that result over the top one. Although Google can 'generate' SERP snippets from your content - since when has generated crap ever been better than hand-written, targeted, authored text? Do you let Google Ads write all your own Ad-text for PPC? No? Then don't do it in SEO
Page titles can still influence rankings. Not as strongly as they used to, but I see evidence every day that they can still make the difference in some small situations. That's what optimisation is, bundling up all the small stuff into a package and benefiting from it. Someone who says "this is too small to bother with" ISN'T an optimiser. Optimisation is attention to detail for small, cumulative, multiplicative gains which snowball over time
Too many people are still selling SEO as a pure marketing package instead of a B.I. & fat-trimming / competitive edge based product
Final thoughts: Yes, write your own page titles and Meta - and don't let people fob you off when you're paying them your own real money. They should be coming to you with things they have missed (it does happen, people are human after all) not the other way around with you having to pick up on stuff, go to forums, aggregate answers. You're doing the research now - that they should have done before they even opened their doors. Everything I have said is (EXTREMELY) common SEO knowledge
-
RE: Increasing in 404 errors that doesnt exist
It's so annoying when things like that happen! When Google refuses to give the 'linked from' data, it's a real head-test working out where the links are coming from. Did you know that the links could even be coming from other websites, not just your own? When a user follows a link to your site (regardless of where that link is from), Google consider it your error if a valid page isn't returned
Since this error is only occurring in the old area of WMT, it probably doesn't matter much. That being said, one simple fix would be to 301 redirect all the broken links, to the functional article pages. After that you can just bulk mark them all as fixed
Usually I tell people to fix the actual link, but if it's an external link which you have no control over (or if Google can't even be bothered to tell you what the linking page is) then 301 and mark as fixed is probably your best bet. Especially since, these are only individual article pages (it's not like a malformed version of your homepage or something)
If you email me the domain (check my profile page) then I might be able to crawl your site for you to determine whether there are any obviously broken internal links. Regardless, you'd want the 301s as a back-stop anyway
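For reference, this is roughly the kind of check I'd run: a one-level-deep sweep of the internal links on a page, reporting anything that 404s. The start URL is a placeholder and it assumes the requests and beautifulsoup4 packages:
```python
# Sketch: find internal links on one page that return a 404 (one level deep only).
import requests
from urllib.parse import urljoin, urlparse
from bs4 import BeautifulSoup

START = "https://www.example.com/"          # hypothetical page to check from
host = urlparse(START).netloc

soup = BeautifulSoup(requests.get(START, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    url = urljoin(START, a["href"])
    if urlparse(url).netloc != host:
        continue                            # only interested in internal links here
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status == 404:
        print(f"Broken internal link: {url} (anchor text: {a.get_text(strip=True)!r})")
```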
Hope that helps
-
RE: Product Subdomain Outranking "Marketing" Domains
You are very right to be worried about rocking that particular boat. If you de-index a page, it basically nullifies its SEO authority. Since the page which you would nullify is a homepage-level URL (you gave the example 'www.client.com'), this would basically be SEOicide
Most other pages on your site probably get most of their SEO authority and ranking power from your homepage (directly or indirectly, e.g. homepage linking to sub-page vs homepage linking to category, which then links to sub-page)
This is because it's almost certain that your homepage will be the URL which has gained the most links from across the web. People are lazy; they just pick the shortest URL when linking. I'm not saying you don't have good deep links, just that 'most' of the good ones are probably hitting the homepage
So if you nullify the homepage's right to hold SEO authority, what happens to everything underneath the homepage? Are you imagining an avalanche right now? That's right, this would be one of the worst possible ideas in the universe. Write it down, print it out and burn it
Search-console level geo-targeting is for whole sites, not pages or (usually, though there can be exceptions) sections - you know that right? What that does is tell Google which country you want the website (the whole property which you have selected) to rank in. It basically stops that property from ranking well globally and gives minor boosts in the location which has been selected. If you just took your homepage level property and told it that it's US now, prepare to kiss most of your other traffic goodbye (hard lesson). If you were semi-smart and added /US/ as a separate property, and only set the geo targeting to US for that property - breathe a sigh of relief. It likely won't solve your issue but it won't be a complete catastrophe either (phew!)
Really the only decent tool you have to direct Google to rank individual web pages for regions and / or languages is the hreflang tag. These tags tell Google: "hey, you landed on me and I'm a valid page. But if you want to see versions of me in other languages - go to these other URLs through my hreflang links". Hreflangs only work if they are mutually agreed (both pages contain mirrored hreflangs to each other, and both pages do NOT give multiple URLs for a single language / location combination - or language / location in isolation)
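As a simple illustration of what 'mutually agreed' looks like in practice, here's a sketch that prints the full reciprocal hreflang set for each locale variant of a page (the locales and URLs are hypothetical):
```python
# Sketch: print mutually-agreed hreflang tags for every locale variant of one page.
variants = {                                  # hypothetical locale -> URL mapping
    "en-gb": "https://www.example.com/en-gb/widgets/",
    "en-us": "https://www.example.com/en-us/widgets/",
    "x-default": "https://www.example.com/widgets/",
}

for locale, page_url in variants.items():
    print(f"<!-- hreflang block to place on {page_url} -->")
    for hreflang, href in variants.items():   # every variant lists every variant, itself included
        print(f'<link rel="alternate" hreflang="{hreflang}" href="{href}" />')
    print()
```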
The problem is, even if you do everything right, Google really has to believe "yes, this other page is another version of exactly the same page I'm looking at right now". Google can do things like take the main content of both URLs, put each into a single string, then check the Boolean string similarity of both content strings to find the 'percentage' of the content's similarity. Well, this is how I check content similarity - Google does something similar, but probably infinitely more elegant and clever. In the case of hreflangs, string translation is probably also involved
If Google's mechanical mind, thinks that the pages are very different - then it will simply ignore the hreflang (just like Google will not pass SEO authority through a 301 redirect, if the contents of the old and new page are highly dissimilar in machine terms)
This is a fail-safe that Google has, to stop people from moving high rankings on 'useful' or 'proven' (via hyperlinks) URLs (content) - onto less useful, or less proven pages (which by Google's logic, if the content is very different, should have to re-prove their worth). Remember, what a human thinks is similar is irrelevant here. You need to focus on what a machine would find similar (can be VERY different things there)
So even if you do it all properly and use hreflangs, since the nature of the pages is very different (one is functional, helps users navigate, log-in and download something - that's very useful; whilst the other is selly, marketing content is usually thin) - it's unlikely that Google will swallow your intended URL serves
You'd be better off making the homepage include some marketing elements and making the marketing URLs include some of the functional elements. If both pages do both things well and are essentially the same, then hreflangs might actually start to work
If you want to keep the marketing URLs pure sell, fine - but they will only be useful as paid traffic landing pages (like from Google Ads, Pinterest Ads or FaceBook ads) where you can connect your ad to the advertorial (marketing) URLs. People expect ads to land on marketing-centric pages. People don't expect (or necessarily want) that for just regular web searches. The channel (SEO) is called 'organic' for a reason!