Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Posts made by Carson-Ward
-
RE: Backlinks from subdomain, can it hurt ranking?
So there were no external links pointing to the subdomain admin.site.com? If that's the case, you could probably just noindex/nofollow the thing or let it 404. You could write an .htaccess rule to rewrite the domain name, but it's probably not worth it now that I think about it. The exception, of course, is if the subdomain had external links pointing to it.
-
RE: Backlinks from subdomain, can it hurt ranking?
Okay, so the situation here is a little unclear, but the solution should be pretty straightforward.
If admin.site.com was a different domain from the original site, simply noindex/nofollow all of the pages on that domain. I recommend this over a robots.txt rule because it will actually remove them from the index. You can add a disallow-all rule in robots.txt later, once the site is completely deindexed.
If admin.site.com was the same domain, I'd recommend redirecting all of those pages to the new URLs, then launching a noindex/nofollow version blocked with robots.txt, though I'm not sure why it needs to exist for reference. If the subdomain was different from the old site, you could probably just noindex/nofollow all of it without the redirect. It's not best practice, but it's not that big a deal.
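For reference, the page-level directive I'm describing is just a meta tag in the head of every page on the subdomain (or the equivalent X-Robots-Tag HTTP header); a minimal sketch:

```html
<!-- On every page of the subdomain you want dropped from the index -->
<meta name="robots" content="noindex, nofollow">
```

Only after the pages have actually dropped out of the index would you add the blanket robots.txt rule:

```
User-agent: *
Disallow: /
```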
Hope this helps.
-
RE: Referring domain issues
Good answers here - did you get this taken care of? I'd say choose one domain and redirect or forward the others that carry the same content. To explain it to my boss, I'd say:
- It confuses customers to have the same content on two domains. They might not know which company they're dealing with.
- You probably don't want half the traffic going to one site and half to the other, especially if the content and user intent are similar. Every live domain is another analytics profile I have to check on and watch for issues.
- Don't expect any ranking bonus from multiple domains: when content is duplicated, Google will just choose one page to rank.
- Maintaining multiple sites is more work than it's worth. We can get more done by focusing on our core domain unless there's a strong case for creating a new brand. (I wouldn't create a new site unless it was for a distinct brand).
Again, you probably want to 301 redirect to the primary/canonical domain. If the domains have no links and no traffic (as I'd expect), forwarding through the registrar is fine, too.
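A 301 at the server level is straightforward. Here's a sketch for Apache, assuming mod_rewrite is enabled; www.example.com stands in for the primary domain:

```apache
# .htaccess on the duplicate domain: send everything to the primary domain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The path capture (`$1`) preserves deep URLs, so any stray links to inner pages land on the matching page of the primary site rather than the homepage.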
-
RE: No-indexed pages are still showing up as landing pages in Google Analytics
I generally recommend meta noindex without blocking via robots.txt for reasons like this.
First, click Secondary Dimension and check out Source / Medium. Are they coming from Google? If all you did was use the removal tool in Google Webmaster Tools, Bing and Yahoo haven't gotten the message.
Search engines cannot see a noindex tag if they're blocked from crawling the page: they can't access the page to read the tag. So the page can sit around in the index despite not having been crawled for a while (though it's usually removed eventually).
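This interaction is easy to demonstrate with Python's standard-library robots.txt parser: a polite crawler checks robots.txt first, and a disallowed page is simply never fetched, so any noindex tag on it goes unread. (The domain and paths here are made up for illustration.)

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks the section containing the noindexed pages
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /old-jobs/",
])

url = "https://www.example.com/old-jobs/listing-123"
if not parser.can_fetch("*", url):
    # The crawler stops here: it never downloads the page,
    # so it never sees <meta name="robots" content="noindex">.
    print("blocked from crawling:", url)
```

That's why the noindex-first, robots.txt-later ordering matters: the tag can only do its job while the page is still crawlable.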
Also keep in mind that you'll see some landing-page traffic from instances where GA fails to fire on the first pageview. It's usually a VERY tiny percentage, but I often see traffic to some (virtual pageview) popups that can't even load without entering info on our site (i.e. they're not even possible landing pages).
Might I also ask why you removed the job listings from the index? I was thinking this might be a good time to use an expiration directive rather than blocking crawlers outright. That assumes, of course, that you keep listings up for a fixed time. If you know when a job listing is going to expire, you can just tell the search engine. They might even send some traffic to your individual listings while they're live.
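The directive that fits "just tell the search engine" is, I'd assume, Google's unavailable_after robots meta value (the original post doesn't name it); it asks Google to drop the page from results after a given date:

```html
<!-- Google stops showing this listing in results after the given date -->
<meta name="robots" content="unavailable_after: 25-Aug-2015 15:00:00 PST">
```

Until that date the listing stays eligible to rank normally, so you don't lose the traffic it might earn while live.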
-
RE: I added a WP Customer Reviews plugin but nothing seems to appear on Google search
Unfortunately, Google has been scaling back reviews quite a bit. We have several sites where we used to show up with star ratings on every page. Now we're lucky to get them to show up at all.
I have noticed it relies a lot on the query. If you're trying to rank for local queries where Google pulls in a local pack, for example, they're really not likely to show your stars. Maybe this is to prevent confusion; maybe it's to help their own Google Maps results.
Try doing a site: search with the exact URL. In my experience it's most likely to show up in low-competition searches, and nothing is less competitive than a site: search with your own site. For example https://www.google.com/search?q=site%3Ahttp%3A%2F%2Fwww.shopperapproved.com does seem to pull in stars.
The good news: what you have is working.
The bad news: you're at Google's mercy as to when your snippets will show. Page and domain authority (in Google's eyes) does seem to play a part, and the query changes things up. My advice is always to implement it, continue marketing, and wait for Google to realize you're really an awesome company/site with reviews worth highlighting.
Unfortunately, there's no magic number of reviews or count of links. It just depends on competition in the SERP.
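For anyone double-checking their implementation: star snippets depend on structured data along these lines. This is a generic schema.org AggregateRating sketch with placeholder values, not the exact output of any particular plugin:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```

If your markup validates in Google's testing tools and the stars still don't show, that's the "Google's mercy" part above.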
-
RE: Layered navigation and hiding nav from user agent
If you're really worried about indexation, I think that's a fine solution. It's definitely easier to manage, and it'll also be easier to track pageviews in most analytics platforms. The only downside is that if someone emails or links to a category page with filters applied, the recipient won't see the filters. But generally people share products, not category pages, so it's not a big deal. I'd probably go that route.
Also make sure your category pages still update the URL when you go to page 2, or that page 2 is somehow otherwise crawlable. You don't want products going unindexed because the category pages they sit on can't be crawled.
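One simple pattern that keeps page 2 crawlable is plain linked pagination URLs rather than JavaScript-only paging; a sketch with placeholder URLs:

```html
<!-- Each page of the category gets its own crawlable URL -->
<a href="/widgets?page=2">Next page</a>
<link rel="next" href="/widgets?page=2">
```

As long as each page is reachable through an ordinary link, the products deep in the category can be discovered even if the filtered views are hidden from bots.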
-
RE: Layered navigation and hiding nav from user agent
Yes, the bots will crawl the pages, but they will not INDEX them.
There is a concern there, but mostly if the bots get caught in some kind of crawl trap, where they're trying out a near-infinite set of parameter combinations and getting stuck in a loop. Otherwise the spiders should understand the parameters. You can check in Google Webmaster Tools to make sure Google understands them. Instructions for that here:
https://support.google.com/webmasters/answer/6080550?hl=en
Ultimately, Google will not penalize you for having lots of duplicate content on parameterized URLs, but Googlebot might not find all your pages. You can make sure that doesn't happen by checking the indexation of your sitemap.
You could also block any URLs with the parameter in robots.txt. Make sure you get some help with the pattern matching if you plan to do this. My advice, though, is that blocking the parameters in robots.txt is not worth it, as Google should have no problems with them, especially if the canonical tags are working.
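If you did decide to block parameterized URLs, note that robots.txt uses simple wildcards rather than full regular expressions; a sketch, with example parameter names only:

```
User-agent: *
# Block faceted-navigation parameters
Disallow: /*?color=
Disallow: /*&color=
Disallow: /*?price=
```

You need a rule for both the `?` and `&` positions because the parameter can appear first or later in the query string, and a mistake here can block real pages, which is part of why I'd avoid this route.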
Googlebot, at least, is smart enough these days to know when to stop crawling parameterized pages, so I think there are more important on-site things to worry about. Make sure your categories are linked to and optimized, for example.
-
RE: SEO dealing with a CDN on a site.
Very odd, then, that they're being removed from the index. Do you think it's possible the images have different URLs depending on which server they're cached on? That could definitely do it. I'd have a friend across the country pull them up and see if the image URLs change.
I'm assuming the images have some dynamic characters in their URLs, which is pretty common with CDNs under certain configurations. Unfortunately, I've never used MaxCDN. If the image is just cdn.site.com/image.png, I'm afraid I have no idea why it wouldn't be re-indexed. I have similar CDN images that pull in fine.
-
RE: Layered navigation and hiding nav from user agent
If I'm not mistaken, Magento adds canonical tags to category pages by default, so you might be trying to solve an issue that doesn't exist. Take a look at the source code on a faceted-navigation page to confirm, or send me the site and I'll look it over.
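To confirm, view source on a filtered category URL and look for a canonical tag pointing back at the unfiltered page, something like this (placeholder URLs):

```html
<!-- On /widgets?color=blue, a canonical pointing at the clean category URL -->
<link rel="canonical" href="https://www.example.com/widgets">
```

If that tag is present and points at the clean URL, the duplicate filtered variants are already being consolidated for you.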
-
RE: SEO dealing with a CDN on a site.
Hi there,
Could you tell me whether your image URLs are static on the CDN subdomain, or do they change regularly?
-
RE: How do you check the google cache for hashbang pages?
I was actually trying to give you the tools to figure out what's cached and indexed. You can also just run a site: search for the content and look at the cache. For example:
If nothing shows up it's probably not indexed.
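For hashbang pages specifically, Google's old AJAX-crawling scheme indexed them under an _escaped_fragment_ URL, so that's the form worth looking for in the cache. A minimal sketch of the mapping (real crawlers also percent-encode special characters in the fragment, which this skips):

```python
def escaped_fragment_url(url: str) -> str:
    """Map a #! (hashbang) URL to the _escaped_fragment_ form Google crawled."""
    if "#!" not in url:
        return url
    base, _, fragment = url.partition("#!")
    separator = "&" if "?" in base else "?"
    return f"{base}{separator}_escaped_fragment_={fragment}"

print(escaped_fragment_url("https://example.com/page#!reviews"))
# https://example.com/page?_escaped_fragment_=reviews
```

Checking the cache or running a site: search against the escaped form can reveal pages that look missing when you search for the hashbang URL itself.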