Ayup - and to accomplish that means being willing/able to do some business analysis, not just site analysis. Which moves into the realm of web marketing optimization, not just SEO. Which is where the real value in this whole process lies, IMO.
Best posts made by ThompsonPaul
-
RE: Estimating the number of LRD I need to outrank competitor
-
RE: Htaccess - Redirecting TAG or Category pages
The regex in your RedirectMatch doesn't say what you think it says, Jes.
This part (note the (.*) at the end of the expression):
/category/Sample-Category(.*)
doesn't actually say "match the URL that is specifically /category/Sample-Category".
That .* is a wildcard that means "and any additional characters that might occur here".
So what it's really saying is "match the URL /category/Sample-Category as well as any URL that has additional characters after the 'y' in Category" - which is what's catching your -1 variation of the URL (and the -size-30 in your second example).
In addition, because that wildcard is wrapped in brackets it's captured as a variable, which you're then appending to the end of the new URL with the $1 - which I don't think is your intent.
Instead, try:
RedirectMatch 301 /category/Sample-Category https://OurDomain.com.au/New-Page/
you should get the redirect you're looking for, and not have it interfere with the other ones you wish to write.
Let me know if that solves the issue? Or if I've misunderstood why you were trying to include the wildcard variable?
Paul
P.S. You'll need to be very specific whether the origin or target URLs use trailing slashes - I just replicated the examples you provided.
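P.P.S. One more gotcha to be aware of: RedirectMatch treats its pattern as a regular expression and will match it anywhere in the path, so even with the (.*) removed the shorter pattern can still catch the -1 and -size-30 variations on some servers. If that happens, anchoring the pattern makes the match exact - a sketch assuming Apache mod_alias and the exact URLs from your examples:
RedirectMatch 301 ^/category/Sample-Category/?$ https://OurDomain.com.au/New-Page/
The ^ and $ pin the pattern to the whole path, and the optional /? covers either trailing-slash variant.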
-
RE: Probably basic, but how to use image Title and Alt Text - and confusing advice from Moz!
That Moz help page is kinda half-right. Many browsers, in the absence of a title attribute, will display the alt text on hover instead. But if a title attribute is declared, it will be used, as you note.
Keep in mind - image title attributes are not used as ranking factors for regular search, but they are used as ranking factors for Google Image Search. So still well worth optimising them if your site benefits from image search specifically (as a good photographer's site likely would).
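For illustration, both attributes can sit on the same img tag - the filename and text below are made up, but the behaviour is as described:
<!-- alt describes the image for accessibility and image search; title, when present, is what most browsers use for the hover tooltip -->
<img src="/portfolio/sunset-wedding.jpg" alt="Bride and groom silhouetted against the sunset" title="Sunset wedding shoot">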
Paul
-
RE: How do I know if I am correctly solving an uppercase url issue that may be affecting Googlebot?
It was still a good idea to create the redirects for the upper-case versions to help cut down duplicate content issues. Rel-canonical "could" have been used, but I find it's much better to actually redirect.
But that means the lower-case URLs are the canonical URLs, so ONLY they should appear in the sitemap. (Sitemaps aren't supposed to contain any URLs that redirect.) Right now, you're giving the search crawlers contradictory directives, and they don't do well with those.
For additional cleanup, it would be good to have rules added to the CMS so that upper-case URL slugs can't be created in the first place. Also run a check (it can probably be done in the database) to ensure that any internal links on the site have been rewritten NOT to use the uppercase URLs - there's no sense generating unnecessary redirects for URLs you control. (I suspect this is the majority of the cases Screaming Frog is picking up.) You need to ensure all navigation and internal links use the canonical lowercase version.
The more directly the crawlers can access the final URL, the better your indexing will be. So don't have the sitemap sending them through redirects, and don't let your site's internal links do so either.
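If the CMS can't block upper-case slugs at the source, a catch-all server-level rule is one way to keep future upper-case URLs from slipping through - a sketch assuming Apache with mod_rewrite (note that RewriteMap has to live in the main server or virtual-host config, not .htaccess):
# Map function that lower-cases whatever is passed to it
RewriteMap lowercase int:tolower
RewriteEngine On
# If the requested path contains any upper-case letters, 301 it to the lower-cased version
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lowercase:$1} [R=301,L]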
Hope that helps?
-
RE: Google Indexing Of Pages As HTTPS vs HTTP
That's not going to solve your problem, vikasnwu. Your immediate issue is that you have URLs in the index that are HTTPS and will cause searchers who click on them not to reach your site due to the security error warnings. The only way to fix that quickly is to get the SSL certificate installed and the redirect back to HTTP in place.
You've sent the search engines a number of very conflicting signals. Waiting while they try to work out what URLs they're supposed to use and then waiting while they reindex them is likely to cause significant traffic issues and ongoing ranking harm before the SEs figure it out for themselves. The whole point of what I recommended is it doesn't depend on the SEs figuring anything out - you will have provided directives that force them to do what you need.
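For reference, once the certificate is installed the redirect itself is only a few lines - a sketch assuming Apache with mod_rewrite in .htaccess (adjust accordingly if the server is nginx or IIS):
# Send any HTTPS request back to the HTTP version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]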
Paul
-
RE: Image Height/Width attributes, how important are they and should a best practice site include this as std
Image h x w attributes don't affect the actual speed of your page load much, Dan. They do strongly affect the perceived speed to the user.
If the size attributes are included, the browser can leave a correctly-sized space for each image as the page gets rendered, even if the images haven't started to download yet. Then the rest of the page content flows in around the image "placeholders". (Images are always slower than text.)
If no image size attributes are present, the browser essentially ignores the placing of the images until the image files actually download, then redraws the whole page to add the space back in for the images.
This redrawing for the images means that text and other elements will move around on the page until all the images have downloaded and it has finished rendering. This gives the user an impression of a much slower page, since they can't start to read the content until it has stopped moving around. With the size attributes in place, by contrast, the visitor can start reading the top of the page even while the images lower down are still downloading.
So yes, obviously including height and width attributes for images is standard best practice for designing an effective on-page user experience.
Hope that helps?
Paul
P.S. As proof, Google thinks they're such a standard requirement that they have included a check for them as part of the scoring algorithm of their Google Page Speed tool.
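P.P.S. A minimal sketch of what that looks like in the markup - the filename, alt text and dimensions below are placeholders, so substitute each image's real pixel size:
<!-- explicit width/height let the browser reserve the image's slot before the file has downloaded -->
<img src="/images/product-photo.jpg" alt="Product photo" width="800" height="600">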
-
RE: Massive Amount of Pages Deindexed
First thing to confirm - did you recently migrate to HTTPS?
-
RE: Ive been using moz for just a minute now , i used it to check my website and find quite a number of errors , unfortunately i use a wordpress website and even with the tips , is till dont know how to fix the issues.
You've got some work to do, @Dogara. It's essential to realise that just installing SEO plugins doesn't finish the job - they must also be carefully configured. And then the pages themselves must be optimised using the information the SEO plugin provides. Think of the plugin as a tool to make the optimisations easier, not one that will do all the work for you. Here's the task list I would tackle if I were you:
First things first - make certain you have a solid current backup of your website that you know how to recover if things should go sideways.
You currently have two competing SEO plugins active - Squirrly and Yoast Premium - which is definitely not recommended. Since Squirrly doesn't appear to be configured at all, it should be removed. (This assumes you haven't done any customisation work in Squirrly, which is how it looks from a quick scan through your pages, but I didn't do an exhaustive check - if you have done customisations in that plugin, they may need to be exported, then imported into Yoast.)
Your Yoast Premium hasn't been updated in a full year - get it updated, both for security and functionality. (And get all themes and other plugins updated too if they're behind - this is the biggest thing you can do for your website's security. Did I mention you need to have a solid backup first?)
Fix your page layout templates - they are duplicating the page title and featured image.
Work through the Yoast configuration settings and set up sensible defaults for your pages:
-
Turn off the meta keywords functionality (no longer used)
-
Decide what you wish to do with all your redundant archive types that are creating a huge amount of duplicate content and bloat. My recommendations:
-
Since your site only appears to have one author, disable author archives.
-
turn off date-based archives. You're not using them anywhere that I can see, and few people are likely to search by date on the site.
-
no-index the tag archives. These are straight-up massive duplicate content on your site as they are just lists of posts that are also listed elsewhere, like your categories.
-
add a couple of paragraphs of quality introductory text to each of your category pages (this needs WordPress customisation depending on your theme - it may be doable with a plugin). The alternative is to no-index your categories as well, but for a site like yours that probably isn't recommended, since those categories are used as your primary header navigation.
-
NOTE! These recommendations are based on assumptions about how visitors use the site. If you have business reasons for keeping some of these archives, the decisions may be different!
-
write solid custom meta descriptions for your categories (assuming you are going to keep them indexed). Currently, it's these category pages with no meta descriptions that are giving you that high error total in the Moz crawl. Do note that when you fix the meta descriptions, you may start seeing a large number of "duplicate meta description" errors listed in a new Moz crawl. This is because you have a large number of paginated pages for each category, and each will have the same meta description as the main page. This is not an issue, even though Moz may flag it, since the pages already have proper pagination code in place (rel-next and rel-prev in the headers). Note that Google has just this week changed the number of characters allowed in meta descriptions to be much longer - tools may not have caught up to this change yet.
-
while you're editing the category meta description, take the opportunity to write better SEO page titles for each of them as well. They're edited in the same place as the meta descriptions in Yoast, so easy to do at the same time.
-
get the template for your homepage adjusted to include proper rel-next and rel-prev link tags in the head so that its pagination is handled properly.
-
turn off JetPack's XML sitemap functionality and turn on the built-in sitemap tool in Yoast. You'll want to make certain only the appropriate sections of the site are in the sitemap (e.g. any post types/taxonomies you've no-indexed - tags, author archives, etc. - should be marked to exclude from the sitemap). You'll also need to resubmit the new sitemap address in Google Search Console - make sure the property is set up for the HTTPS address and submit the sitemap address https://hipmack.co/sitemap_index.xml
-
the "URLs too long" warning is somewhat arbitrary, but make certain you are rewriting the URLs of your new posts when you create them if they are too long (more than 4 or 5 words) I wouldn't' bother going back to change the old ones at this stage.
-
you are currently using an HTTPS Redirection plugin to manage the internal links of the site after your HTTPS migration. I'd strongly recommend using a Search & Replace tool on your database to properly rewrite these so you don't have a large number of internal redirects - better for speed and more reliable. (See the sketch right after this list.)
-
Moz will tell you which page titles are too long, and you can go into Yoast for each related page/post and rewrite them. Note that Google will still index a "too long" title - it'll just cut off the end of the title when displaying it on search results pages. (So, for example, if it's just the website name getting cut off at the end, it's not a big deal.) This is also a good time to optimise the meta descriptions for those posts, as they're edited in the same spot as the titles.
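Here's one way to do the database rewrite mentioned in the Search & Replace point above - a sketch assuming WP-CLI is available on your host, using your hipmack.co domain from the sitemap address. Run the dry-run first, and keep that backup handy:
# Preview the changes without touching the database
wp search-replace 'http://hipmack.co' 'https://hipmack.co' --skip-columns=guid --dry-run
# Run it for real once the preview looks right
wp search-replace 'http://hipmack.co' 'https://hipmack.co' --skip-columns=guid
And for the homepage pagination point, rel-next/rel-prev is just a pair of link tags in the head of each paginated page - a sketch with made-up page URLs:
<!-- on page 2 of the paginated homepage, for example -->
<link rel="prev" href="https://hipmack.co/">
<link rel="next" href="https://hipmack.co/page/3/">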
Whew! And that's just the start, but if you get those things cleaned up, you'll be well on your way to cleaning up the technical SEO of your site.
Paul
-
-
RE: Website structure - best tools to analyse and plan, visually
Screaming Frog also has a basic, useful site visualisation capability built into it.
-
RE: Should I switch from trailing slash to no trailing slash?
Honestly? I'd spend the time to get the custom CMS fixed to allow trailing slashes in the navigation links. That would eliminate the redirect issue, instead of just trading it off to another set of links that would have to redirect.
It sounds like a code sanitising issue in the CMS. Worth spending a couple of hundred dollars to fix the root cause of the issue instead of spending that money to apply bandaids that cause other problems elsewhere. (And bonus, maybe you can get proper canonicalisation built at the same time.)
Of course, yea, this does depend on having/finding a competent developer and having a test environment that doesn't endanger the live site.
Any chance you could push for this option?
Paul