No problem, glad to help! Best of luck with whichever route you go with!
Posts made by LoganRay
-
RE: Looking to remove dates from URL permalink structure. What do you think of this idea?
-
RE: Looking to remove dates from URL permalink structure. What do you think of this idea?
Unfortunately, I don't have any examples for ya. I've never come across this particular situation with a client.
-
RE: Looking to remove dates from URL permalink structure. What do you think of this idea?
Yes, I'm saying you should keep the URLs as they are. I'm always an advocate for not changing URL structure unless there's a really good, highly beneficial reason for doing so. I don't know of a way to change the URL structure for new posts only while keeping the old ones the same, but I'm no WP expert.
-
RE: Looking to remove dates from URL permalink structure. What do you think of this idea?
OK, now that I understand the reasoning...
I believe there's a better, less risky approach. What I would do is write a completely new post based on the information from the old post. At the same time you publish the new post, go back to the old version and add two things: a canonical tag pointing to the new version, and a bit of _very readable_ text at the top linking to the new post. Something like: "Hey, thanks for your interest in our content. Feel free to read on, but we thought you should know we've updated this post, which can be found here: link"
This accomplishes a few important things. It eliminates the need for a risky project that could affect your entire site just for the ability to update posts (which I'm guessing doesn't happen too often; what percentage of posts actually gets updated?). The canonical tag removes the dupe content risk so you're not cannibalizing your own content. And leaving the old post in place gives people the opportunity to discover older content that, while possibly not relevant anymore, still demonstrates you've been a trustworthy source of information for a long time.
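For illustration, here's a rough sketch of what would go on the old post (the URLs are made-up examples, and the exact markup will depend on your theme/CMS):

```html
<!-- In the <head> of the OLD post: canonical pointing to the new version -->
<link rel="canonical" href="https://www.example.com/blog/updated-post/" />

<!-- At the top of the OLD post's body: a very readable note for visitors -->
<p>Hey, thanks for your interest in our content. Feel free to read on, but we
thought you should know we've updated this post, which can be found
<a href="https://www.example.com/blog/updated-post/">here</a>.</p>
```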
-
RE: Looking to remove dates from URL permalink structure. What do you think of this idea?
Jeff,
Based on the traffic you say this blog gets, I'm assuming it's rather large and has hundreds, if not thousands, of posts. Which leads me to one simple question:
Why? This seems like a HUGE amount of risk and a pretty decent amount of work for something that's really not going to provide any benefit.
*edit: It should also be noted that just because Google has recently stated that redirects now pass all link juice doesn't mean you should needlessly add a massive number of redirects. Redirects have other implications, like load time. If you have 1,000 redirect rules (in an .htaccess file, for example), every request can end up being checked against that entire list before the page is served, which adds up.
-
RE: Does a "Read More" button to open up the full content affect SEO?
Hi,
Google used to give 'Read More', tabbed, and accordion content slightly less weight than immediately visible content. However, with their increased emphasis on user experience and their mobile-first mindset, they've since changed their stance on this type of content. More details, including statements from Googlers, can be found here: https://www.seroundtable.com/google-content-tabs-hidden-change-22950.html
-
RE: Absolute vs. Relative Canonical Links
Hi,
I'd definitely recommend using absolute URLs for canonical tags. Part of their benefit is preventing duplication caused by www vs. non-www and https vs. http issues; if you use relative URLs, you don't get to specify the protocol or www preference.
Additionally, you don't want to solve only for Google. They've obviously got the largest share of organic search, but other search engines still need to be able to crawl and index your content correctly.
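To illustrate the difference (example.com is just a placeholder domain):

```html
<!-- Relative canonical: protocol and www preference are left ambiguous -->
<link rel="canonical" href="/blog/some-post/" />

<!-- Absolute canonical: protocol and subdomain are locked in -->
<link rel="canonical" href="https://www.example.com/blog/some-post/" />
```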
-
RE: Best practice for deindexing large quantities of pages
Unfortunately, I don't think there's any easy/fast way to do this. I just ran a test to see how long it takes Google to actually obey a noindex tag, and it took a little over 2 months for them all to be removed. I had 2 WP blogs where I added the noindex tag to all category, tag, and author pages, and I monitored the index count 4 or 5 times per week by running site:example.com inurl:/category/ queries. There was a lot of fluctuation at the beginning, but the noindex eventually took hold after about 2 months. On one of the sites, I also added an XML sitemap containing only the noindexed URLs and submitted it via Search Console, but that didn't seem to have any impact on how quickly they dropped out.
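For reference, this is roughly the tag that ends up in the <head> of each category, tag, and author page (in my case a plugin added it, so your exact output may differ):

```html
<!-- Tells crawlers not to index the page, but still follow its links -->
<meta name="robots" content="noindex, follow" />
```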
See the screenshot below of my plotting of indexed pages per subfolder:
-
RE: What's a good WPM for a copywriter?
Hi,
I wouldn't measure your copywriter based on WPM; I'd focus instead on the quality of the writing. I recommend measuring based on the performance of the content. Is it helping improve rankings? Driving traffic? Nurturing leads? Increasing sales? I don't know what your line of business is, but I would choose some KPIs by which to measure the effectiveness of the content being produced. It doesn't matter if it's written at 1 WPM if that content generates 3x as much revenue as content written at 100 WPM.
-
RE: One domain - Multiple servers
You can find more details about how a reverse proxy works here. Regarding the setup, unfortunately that's outside of my wheelhouse - we had to rely on the tech support team to help out with that.
-
RE: One domain - Multiple servers
Hey there,
Sounds like you need a reverse proxy. I recently had a client that was going to migrate small portions of their site over a few months, and this is what we ended up recommending. The situation was this (it sounds similar to yours): the client was moving a site with about 1,800 pages from one development firm to another, each with their own proprietary CMS and hosting. The reverse proxy was essentially a connection between the 2 servers. When a URL was requested on server 1 (the original developer's), the reverse proxy mapped that URL to content on server 2 (the new developer's). The result was that URLs from server 1 stayed the same, but the content came from server 2 for any URL that had been mapped in the reverse proxy.
NOW FOR A WORD OF CAUTION: This reverse proxy worked great until security settings on one of the servers prevented requests from coming through, which resulted in dead URLs. So if you're going to move forward with this, make sure your servers will play nice with each other and won't block each other's requests.
-
RE: Too many Tags and Categories what should I do to clean this up?
Sounds like you're probably using WP; if so, I'd highly recommend this plugin to handle your category and tag pages. I made the same observation you did not too long ago and went on a mission to figure out the best solution, and noindexing these pages with that plugin is what I landed on.
-
RE: Why add .html to WordPress pages?
I totally understand not wanting to rely on plugins if they're not necessary. 301 redirects generally don't impact the rankings of a site all that much if the redirects are pointing to the same content. So dropping .html by way of a single redirect rule is not likely to ruin your organic traffic numbers.
-
RE: Why add .html to WordPress pages?
There's only one logical reason I can come up with for this: At some point in the past, the site was not on WP and had .html extensions on URLs. When the site was moved to WP, they may have wanted to keep URLs exactly the same, which would require finding a way to add file extensions in WP.
-
RE: How do I fix my portfolio causing duplicate content issues?
James,
They do have that threshold, confirmed by Tawny's comment on this thread from a while back.
-
RE: How do I fix my portfolio causing duplicate content issues?
Hi Rena,
Issues like this pop up frequently in Moz Q&A. Here's what's happening: the only differences on these pages are the name of the product, the metadata, and the URL. That being the case, the majority of the source code for each page is the same once you take into account all the code for the template (header, footer, etc.). Moz sets a 90% threshold for source code match; any 2 pages whose source code is at least 90% identical get flagged as duplicates.
So what can you do about it? It looks like you're using these pages as product detail pages like you'd see on an ecommerce site. Product detail pages typically have descriptions providing details about the particular product. I'd recommend writing some descriptive content for each of these pages. This will increase the uniqueness of the pages and should differentiate them enough to not trigger the dupe content warning. Descriptions on these products will also help your users' experience on the site, so you're not doing this just to appease the bots.
Hope that helps clear things up!
-
RE: Country Code Top Level Domains & Duplicate Content
Jay,
This is pretty common, and as long as you follow international SEO best practices, you'll be alright. Here are a couple of resources that include everything you'll need to do this properly:
The hreflang tag is an annotation, similar in form to a canonical tag, that lets search engines know you have other versions of this content out there intended for different countries and/or languages.
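As a rough illustration (example.com and example.com.au are placeholder domains, and the exact tags will depend on your setup), the <head> of the US page might include something like this:

```html
<!-- Self-referencing canonical for the US version -->
<link rel="canonical" href="https://www.example.com/widgets/" />

<!-- Each language/country version references itself and all alternates -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
```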