Posts made by LoganRay
-
RE: Robots.txt & meta noindex--site still shows up on Google Search
Thanks for that clarification CleverPhD, forgot to mention that.
-
RE: Robots.txt & meta noindex--site still shows up on Google Search
Hi,
First things first, it's a common misconception that a Disallow: / rule in robots.txt will prevent indexing. It's only intended to prevent crawling, which is why you don't get a meta description pulled into the result snippet. If you have links pointing to that page and a Disallow: / in your robots.txt, the page is still eligible for indexation.
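Here's roughly what each piece does (the values here are just illustrative):

    # robots.txt - blocks crawling only; a blocked URL can still get indexed if it's linked to
    User-agent: *
    Disallow: /

    <!-- meta robots tag on the page itself - this is what keeps a crawled page out of the index -->
    <meta name="robots" content="noindex">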
Second, it's pretty weird that the noindex tag isn't effective, as that's the only sure-fire way to get de-indexed intentionally. Keep in mind that Google can only see a noindex tag if it's allowed to crawl the page, so the Disallow: / needs to come off for the tag to work. I would recommend creating an XML sitemap of all the noindexed URLs on that domain and resubmitting it in Search Console. If Google hasn't crawled your site since you added the noindex, they don't know it's there. In my experience, forcing a recrawl via XML submission has been effective at getting the noindex noticed more quickly.
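The sitemap itself only needs the bare minimum, something like this (the URLs are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/page-to-deindex-1/</loc>
      </url>
      <url>
        <loc>https://www.example.com/page-to-deindex-2/</loc>
      </url>
    </urlset>

Submit it under Sitemaps in Search Console, and you can remove it again once the pages drop out of the index.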
I would also recommend taking a look at the link profile and removing any links pointing to your noindexed pages where you can; that will reduce the chances of Google picking those pages up again down the road.
-
RE: Merging Pages and SEO
Hi,
Anytime a site redesign occurs, you're likely to lose some traffic. 301 redirects are going to be your best bet to minimize the loss when you flip the switch. Where you're most likely to take a hit, though, is organic: depending on what kind of content condensing you're doing, you might lose out on a lot of rankings. I would dig into Google Analytics and Search Console and see how valuable those pages are in terms of organic traffic before deciding to condense. There are definitely some good cases for this, but there are also a lot of instances where I wouldn't recommend combining 3 pages into 1.
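If you do consolidate, map every old URL to its single new destination. On an Apache setup (just as an example, your server may differ), the .htaccess rules for a 3-into-1 merge could look something like this, with all URLs hypothetical:

    Redirect 301 /old-page-a/ https://www.example.com/new-combined-page/
    Redirect 301 /old-page-b/ https://www.example.com/new-combined-page/
    Redirect 301 /old-page-c/ https://www.example.com/new-combined-page/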
-
RE: Sitemap: unique sitemap or different sitemaps by Country
It depends on how many pages you'll have in each of the country-level sitemaps. If it's only going to be a few pages per country, then it's probably not worth it: it'll take extra time for you to generate and waste some of your crawl budget. If you have a substantial amount of content in each of the country folders, it could help search engines understand your site structure a little more clearly.
Whichever way you go, it isn't going to make or break your SEO campaign.
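If you do split them out by country, you'd tie the individual files together with a sitemap index, roughly like this (domain and file names are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemap-us.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-uk.xml</loc>
      </sitemap>
    </sitemapindex>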
-
RE: Url-delimiter vs. SEO
Hi Samuel,
In general, URLs should not contain any unnecessary folders (delimiters). In your first example, the /b/ is not needed since you've already got a /blog/ folder. In the second example, that page appears to be main site content; you don't need any additional folders unless they specify a general topic under which you'll be adding more specific pages.
You're also burying your keywords one step deeper into the URL than is needed. Google says they don't put too much weight on URL structure, but in my experience, well-planned and logical URL structures perform better. It's not going to have a huge impact on your rankings, but it will help to some degree.
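To make that concrete with made-up URLs (I don't have your exact paths in front of me):

    https://www.example.com/blog/b/my-post-title/   <- the /b/ folder adds nothing
    https://www.example.com/blog/my-post-title/     <- keyword sits one level higher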
-
RE: 301 Redirects, Sitemaps and Indexing - How to hide redirected urls from search engines?
Redirected URLs are generally removed from the index pretty quickly. I just checked and the /solutions URL is not indexed in Google, so you don't need to take additional measures to have that removed.
A good practice is to have no redirects in your sitemap. When I build new sitemaps, I use Screaming Frog, and the first thing I do is remove everything that isn't a 200 status code. Keeping a clean XML sitemap helps your crawl budget and gets bots to focus on the more important parts of your site rather than sending them through unnecessary hops.
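So the finished sitemap should only list final destination URLs; the redirected /solutions URL wouldn't appear in it at all. A stripped-down sketch, with a made-up destination URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- only final, 200-status URLs; the redirected /solutions URL is left out -->
      <url>
        <loc>https://www.example.com/platform/</loc>
      </url>
    </urlset>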
-
RE: Duplicate title while setting canonical tag.
You'll definitely want to keep that canonical tag in place. Some tools don't recognize canonicals, so I wouldn't worry too much about duplicate-title notifications caused by parameters like that. Keep in mind that if you were to noindex that page, it would apply to the base URL as well, not just the parameterized version.
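For reference, on the parameterized version the tag would point back to the clean URL, something like this (URLs are placeholders):

    <!-- on https://www.example.com/page/?utm_source=newsletter -->
    <link rel="canonical" href="https://www.example.com/page/">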