Best posts made by JordanLowry
-
RE: Duplicate content hidden behind tabs
I would think you would want to create unique content and either remove the duplicate content or implement rel=canonical tags. I would imagine hiding duplicated content could be seen as manipulative by Google.
-
RE: Page content not being recognised?
I crawled your site, and you have roughly 30 pages with under 300 words on them, and the rest of your pages are under 1,000 words or so. I'd recommend building out the content on these thinner pages.
You also have some paginated URLs with a /page/ folder in the path; you could apply a meta noindex tag to those and remove them from Google's index.
I'd also look at disabling your current SEO plugin, because "Your SEO optimized title" appears on all the pages and is causing your duplicate title tag issue. And while meta descriptions aren't a ranking factor, all of yours are blank, so this is a great opportunity to build them out and drive extra click-throughs from Google search results.
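For reference, the tags in question sit in each page's head and look something like this (the titles, descriptions, and URLs here are placeholder examples, not your actual values):

```html
<head>
  <!-- A unique, descriptive title per page (replaces "Your SEO optimized title") -->
  <title>Example Widgets | Acme Co</title>

  <!-- A filled-in meta description to encourage click-throughs from search results -->
  <meta name="description" content="Shop Acme's example widgets with free shipping on all orders.">

  <!-- On the paginated /page/ URLs only: keep them out of Google's index -->
  <meta name="robots" content="noindex">
</head>
```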
I hope that helps a bit.
-
RE: Google Analytics reporting traffic for 404 pages
I'm not 100% sure this is the cause, but when I look at the source code I see the Google Analytics tracking code on the page. Another possibility is that you have a custom report set up to collect data about 404 pages, but that doesn't appear to be the case judging by the attached image. I'd recommend putting a 301 redirect in place for that page, since it appears to get a lot of traffic.
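If the site happens to run on Apache, the 301 for that one page could be as simple as a single .htaccess rule; both URLs below are placeholders, so swap in the actual old and new paths:

```apache
# Permanently redirect the old 404ing URL to a relevant live page
Redirect 301 /old-page/ https://www.example.com/new-page/
```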
Hope that helps some.
-
RE: OK to change the anchor text of a link?
It's highly unlikely you would be penalized for updating the anchor text in one blog post. I'd be more concerned if you had an alarmingly high number of exact-match anchor text links pointing to your client's site.
-
RE: Page content not being recognised?
The issue is that there are duplicate title tags on the site. You could have a developer remove the extra title tag that says "Your SEO optimized title". Then you would need to work with a developer to correct the body tag issue as well: currently there isn't any content within the tag, just leftover generic text ("page contents"). I'm not a developer, but it appears to be an issue with the theme, and these look like generic default settings. You can check this on any page of your site by pressing CTRL+U on a PC to view the source, then pressing CTRL+F and searching for <title> or <body>, and you should be able to see the code issues.
The Moz report could be incorrect. I looked, and you definitely have more than 50 words on certain pages, though the majority of the pages are thin content. I wouldn't pay too much attention to that report.
Also, just because Moz or another tool can't recognize certain content doesn't mean Google isn't actively crawling and indexing it. To double-check, take a page where Moz is telling you there are 50 words or no content and do a "fetch and render" in Google Webmaster Tools. This will show you how Google renders the page and will be a more accurate representation of what it sees.
I hope that helps clear up the situation a bit more. Like I said, I'm not a developer, so that's my best guess as to what's going on.
-
RE: How do I deal with Negative SEO (Spammy Links)?
Moz has a good blog post addressing negative SEO and the potential effects it may have. Matt Cutts has stated that it isn't impossible for a competitor to harm your site through negative SEO, but it is very rare. So unless you have thousands of spammy links pointing at your site, I wouldn't worry too much. Just make use of the disavow tool and you should be fine.
-
RE: Unable to fully see full spectrum of links built on some backlink checkers
This may be a silly question, but have you checked in Google Webmaster Tools? I've noticed for some clients I work on that Moz may not always show all the backlinks, though it's still a good starting point to see if you have any questionable links pointing to your domain. Search Console (Webmaster Tools), on the other hand, gives a more complete breakdown of all my backlinks.
If you have Search Console set up, download all your backlinks and compare them to the other backlink-checking tools you have.
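One quick way to do that comparison, sketched here with made-up URLs, is to treat each export as a set of referring URLs and take the difference; in practice you'd load the sets from the CSV files each tool exports:

```python
# Hypothetical data: these referring URLs are placeholders, not real exports.
search_console_links = {
    "https://site-a.example/post",
    "https://site-b.example/page",
    "https://spammy.example/links",
}
other_tool_links = {
    "https://site-a.example/post",
    "https://site-b.example/page",
}

# Backlinks Search Console reports that the other tool missed
missing_from_tool = search_console_links - other_tool_links
print(sorted(missing_from_tool))
```

The links only Search Console knows about are often the ones worth reviewing for disavow candidates.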
-
RE: Page content not being recognised?
It's kind of hard to understand what's going on without looking at the source code. But my first question is: did you implement 301 redirects from your Wix site to your WordPress site? I believe Moz looks at the content within the <body> tags on your site. Also, did you set up your WordPress site within Search Console and submit your XML sitemap? That will let you see any issues with your content being indexed.
Back to the thin content issue: if you do indeed have thin content on your pages, it's possible you are being penalized. Google is pretty explicit about thin content that provides little or no value. I'd do an audit of your web pages using Screaming Frog or Moz and review the word count for some of your key pages.
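If you'd rather spot-check a page yourself instead of relying on a crawler's report, a rough word count of the visible text can be done with Python's standard library alone; this is a simplified sketch (it skips script and style contents but ignores other edge cases a real crawler handles):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def word_count(html):
    """Approximate count of visible words in an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

page = "<html><body><h1>Hello</h1><p>Just a few words here.</p></body></html>"
print(word_count(page))  # → 6
```

Feed it the saved source of a key page; anything landing in the low hundreds is a candidate for building out.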
Hope that helps some.
-
RE: Is it possible that Google would disregard canonical tag?
A rel=canonical tag is more of a "hint" or "suggestion". It's entirely possible Google believes the URLs are not equivalent and has decided to ignore the canonical.
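For anyone double-checking their implementation while they're at it, the tag lives in the page's head and looks like this (the URL is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

It helps if the other signals Google weighs, such as internal links and the XML sitemap, point at the same preferred URL, since the canonical tag alone isn't binding.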
I hope that helps some.
-
RE: Page content not being recognised?
Sorry I couldn't attach the screenshot. But I wouldn't panic too much; a lot of the time websites experience a loss in rankings after redesigns, so I'd give it some time and see if rankings improve. In the meantime, I'd look at updating the code to remove the extra title tag and fix the body tags.
-
RE: What is the fastest way disassociate an old URL with a new domain name?
First, I think it will take time for Google to completely disassociate the old domain from the new domain. With that being said, have you tried to disavow all the spammy backlinks on the old domain? You could also either meta noindex the entire site and get it deindexed from Google, or block it from being crawled via its robots.txt file. That's probably the closest you could come to deleting the domain.
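For illustration, the two options look like this; note they don't combine well, because if robots.txt blocks crawling, Google can never see a noindex tag on the page, so pick one:

```
# Option 1 – robots.txt at the site root, blocking all crawling:
User-agent: *
Disallow: /

# Option 2 – a meta tag in each page's <head> (the site must stay
# crawlable, or Google will never see the tag):
<meta name="robots" content="noindex">
```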
I hope that helps some.
-
RE: Page rank and menus
I think the easiest way to figure out whether PageRank (link equity) is properly flowing to those deeper subcategory landing pages is to crawl the site with Screaming Frog.
I don't remember if a Moz crawl pulls an authority metric for every page, but if not, crawl your client's site with Screaming Frog connected to Moz's API. This will pull in Page Authority for every page, and you can sort to find out if there are any issues.
What does the hamburger menu look like on mobile? I'd be curious to see what engagement looks like when the entire menu opens by default.
-
RE: METADATA DESCRIPTION
That specific metadata tag is for Facebook (Open Graph). There's a good chance that if that issue is showing in your audit, you may not have a meta description present.
You can always manually look at the source code for this tag: <meta name="description">.
I wouldn't stress about it too much; updating meta descriptions is usually a low-priority item.
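To make the distinction concrete, the Facebook tag and the standard meta description are two separate tags in the head (the content values below are placeholders):

```html
<!-- Read by Facebook and other Open Graph consumers when the page is shared -->
<meta property="og:description" content="A short blurb for social shares.">

<!-- Read by search engines for the snippet shown in results -->
<meta name="description" content="A short blurb for search result snippets.">
```

An audit tool flagging a missing description is usually looking for the second one, even if the first is present.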
Best of luck!
-
RE: Robots.txt blocked internal resources Wordpress
I would leave all the disallows out except for the /wp-admin/ section. For example, I'd rewrite the robots.txt file to read:
User-agent: *
Disallow: /wp-admin/
Also, you kind of want Google to index your cached content. In the event your servers go down, it will still be able to make your content available.
I hope that helps. Let me know how that works out for you!
-
RE: How to handle images (lazy loading, compressing, caching...) to impact page load and thus SEO?
Image Format
I believe the best-performing format is WebP, but I usually try to use PNG over JPG.
Compressing Images
I think what might work best in your case is a plugin like WP Smush. If you have a ton of images, I'd invest in a tool or plugin that dynamically compresses images as they are uploaded to your site. I like WP Smush because it strips out the metadata associated with images along with compressing them.
If you have a ton of images it could be an attractive solution for you that you can scale.
Outside of another plugin, you could try some sort of cloud-based solution to dynamically compress images before you upload them.
I've tested an open-source image compression tool called Caesium in the past; it reduced some of my images by almost 40%. It performed better than the plugins I was using, but I'm not sure it would be a scalable solution for you.
Out of curiosity, how bad are your load times? Are you currently running into site speed problems or are you trying to make incremental improvements?
-
RE: I've all the things set up, still keywords are not ranking anywhere in Google.
I took a quick look, and it appears your site was ranking a bit at one point. However, it looks as though your rankings then took a nosedive.
You went from roughly 2k+ keywords ranking in some fashion to almost nothing. Without probing further it's hard to be certain what happened, but it could be a penalty, and it might have to do with the forum links you've built or your content.
I'd review your content and make sure it's unique and that it provides value to the searcher. I'd also do some sort of link audit and consider removing any irrelevant forum links that were created.
Best of luck!