Best posts made by becole
-
RE: Is it possible to do guest blogging on moz blog?
Stephanie, Logan is correct--we used to be able to submit blog posts on YouMoz, and posts that were good enough had the potential to be promoted and given more visibility. That program has been paused for now, and I'm looking forward to being able to contribute again.
-
RE: What is the fastest way to deindex content from Google?
Rosemary, in order to remove the content quickly, you have to do several things at once. Google's processes for crawling and for removing content from the index don't all happen at the same time, so it's best to combine several approaches:
-
Remove the content. When visitors or bots visit the URL, return a "410 Gone" server header code rather than just a 404 error.
-
If the content must stay on the site but still needs to be removed from Google's index, consider password-protecting it, putting it behind a paywall, requiring users to log in to see it, and/or adding a meta robots noindex tag to the page.
-
Add a robots.txt file on the subdomain that tells bots to stop crawling. If you use something like dev.yourdomain.com for a development section of the site, make sure there is a robots.txt file at dev.yourdomain.com/robots.txt.
-
Use Google Search Console to remove the content. Once logged in, use the removal tool: https://www.google.com/webmasters/tools/removals?pli=1
Combining these approaches is the fastest way to remove the content.
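To verify the first step, here's a minimal sketch in Python (using the third-party requests library; the URLs are hypothetical placeholders) that checks whether removed URLs actually return a "410 Gone" rather than a 200 or a plain 404:

```python
import requests  # third-party: pip install requests

# Hypothetical examples -- replace with the URLs you've removed.
removed_urls = [
    "https://www.example.com/old-page/",
    "https://dev.example.com/staging-page/",
]

for url in removed_urls:
    # A HEAD request without following redirects shows the raw server response.
    response = requests.head(url, allow_redirects=False, timeout=10)
    if response.status_code == 410:
        print(f"{url}: 410 Gone -- correct")
    else:
        print(f"{url}: returned {response.status_code} -- expected 410")
```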
-
-
RE: Directory Listings no longer counted in Backlinks?
Excal, I've always considered the links that are shown in Google Search Console to be a "snippet" of all of the links to your website. For example, there are good links that I know are pointing to our website--but they come and go in Google Search Console. It doesn't show all the links. So, just as you have seen them go away, there's a good chance that they will come back and be listed again.
I do know that the more often you download the links, the more often the list will be refreshed.
-
RE: Writing unique meta titles for canonicalised products or not?
No, you do not need to write unique meta titles for the others. However, you may want to swap out the name of the color so that the title tag reflects the page the user is on. This is really more for user experience than for the search engines or SEO.
For example, if someone likes one particular color, they may choose to bookmark that page in their web browser. If they bookmark it, then the title tag would appear in the title of the bookmark--so having the correct product information there (the color) would be helpful.
-
RE: Google Indexing Pages with Made Up URL
Brian, when this happens, there is typically one reason: somewhere there is a link with that URL in it. What we've seen before is that those links are often created by hackers or spammers who then try to create content on your site at that URL. For example, when a site is hacked, the hackers will create a page on the site and then link to it.
Without the URL (or the page name without your domain name), it's tough for me to see what might be causing this. But, there has to be a link somewhere to it in order for Google to want to index it.
What I would do is use a server header check tool (such as http://www.rexswain.com/httpview.html) to see if the page returns a "200 OK" server response or a 404 error. Google typically doesn't index pages that deliver 404 errors. It could be that your server is set up to display a "page not found" page that actually returns a "200 OK" in the server header, so Google indexes the page.
Check your site to see if there is a link to the page. If the link exists, then fix it. Then, look at Majestic.com or Open Site Explorer to see if they show any links from other sites to the page. If those links exist, see if you can get rid of those links.
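If you'd rather script the server header check than use a web-based tool, a rough equivalent in Python (using the third-party requests library; the URL is a placeholder) might look like this:

```python
import requests  # third-party: pip install requests

url = "https://www.example.com/made-up-page.html"  # placeholder URL

# A HEAD request, without following redirects, mirrors what a header-check tool shows.
response = requests.head(url, allow_redirects=False, timeout=10)

print(f"Status: {response.status_code} {response.reason}")
for name, value in response.headers.items():
    print(f"{name}: {value}")
```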
-
RE: Google is putting brandname: in title tag
Donnleath, I wouldn't worry too much about this. In fact, Google has been rewriting title tags for about three or four years now.
Google, for whatever reason, will rewrite your title tag when it decides a different title better matches the actual search query being used by the searcher. It's quite possible that your home page's title, for example, will be rewritten for one search query but not for another.
Google will use, from time to time, your brand name, company name, and even elements of your site's navigation to rewrite your title tag. I've seen them take internal links from a site's navigation and breadcrumb trail and use those words in the title tag.
In your case, what you need to decide is whether the title tags that Google is writing are better or worse for your searchers. If they're worse, then you might look at your site's navigation and breadcrumb trail to see if there's something you can fix to influence Google to rewrite them another way.
If you see that Google is rewriting ALL of your title tags, though, on every page, then you might want to take a look at them and consider whether they need to be rewritten, taking what Google is suggesting into account.
-
RE: Hotel SEO / Rank Conundrum
Meisha, this can definitely be frustrating. When it comes to local listings and individual units, keep in mind that every unit should have its own unique unit number, so it would have its own address.
You mentioned this: "He has also taken ownership of the building Google Plus page, Facebook page, etc. He only owns a handful of units in the building. "
If that other person has essentially taken ownership of the entire building--the Google Plus page, Facebook page, etc.--then it sounds as if he is misrepresenting his ownership. Therefore, pressure can be put on him to disclose that he owns only certain units in the building, and you should be able to force him (legally) to represent only the units that he actually owns.
If this is the case, then he would need to update his Google Local listing(s) so that they only show the actual address of the unit(s) that he owns. If it doesn't currently, and it shows that he owns the entire building, then he should be forced to update it.
You should consult a lawyer, but most likely a stern letter asking him to update the website, Google listings, Facebook page, and any other URLs so that they only show the unit numbers he actually owns would go a long way. In the meantime, any listings that you create should reflect the actual units that you own as well.
When it comes to Google's local listings, it's perfectly fine to have multiple "businesses" at one location, as long as they have unique suite numbers. In this case, there are individual unit numbers, so there is an option to create a listing for each unit. It's not okay for this other person to misrepresent his ownership.
-
RE: Mass uploading low quality product pages
Becky, if you are aware that you have a lot of content that's going to be duplicate, then you've already taken the first step--recognizing that the pages are duplicates. Too many people just upload those pages, don't realize they're duplicates, and then wonder after the fact why their site's traffic went down. So for that, I commend you.
In order to deal with this, though, you need to determine which pages are truly going to be duplicates of other pages. Once you've done that, you should use the canonical tag. The canonical tag should be placed on the duplicate pages, pointing back to the main page (the one you want the search engines to treat as the original).
Come up with a strategic, realistic plan for making those pages unique by adding or rewriting the content. And you might want to look at information such as your site's analytics or make a list of your best-selling products and deal with those first.
Adding a noindex tag to pages and removing them from the index really shouldn't be an option, because you DO want those pages indexed--once you add content to them and make them unique, you'll be able to remove the canonical tag. Once you tell the search engines not to index a page, it's much tougher to get it BACK in the index, so I wouldn't do that.
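If you want to spot-check that the canonical tags ended up on the duplicate pages and point where you expect, here's a rough sketch using the third-party requests and beautifulsoup4 libraries (the URLs are hypothetical placeholders):

```python
import requests                # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Hypothetical duplicate page and the main page it should point to.
duplicate_url = "https://www.example.com/widgets-blue/"
expected_canonical = "https://www.example.com/widgets/"

html = requests.get(duplicate_url, timeout=10).text
tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")

if tag is None:
    print("No canonical tag found")
elif tag.get("href") == expected_canonical:
    print(f"OK: canonical points to {expected_canonical}")
else:
    print(f"Mismatch: canonical points to {tag.get('href')}")
```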
-
RE: Sitemap Size effect SEO
Spacecollective, your website's sitemap file(s) don't have any direct impact on search engine rankings. If you have a large website with over 4500 URLs, then most likely you're using a content management system (CMS) that "should" be able to create a sitemap file.
If your website's CMS can't do that, then I would recommend crawling the website yourself and updating the sitemap file. However, keep in mind that if you do it manually then you'll need to update it whenever you add or remove a page on the website.
Typically, if your navigational structure is set up in a way that all pages on the website can be crawled via links on the site, you generally shouldn't have anything to worry about.
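If you end up building the file yourself, here's a minimal sketch of a sitemap.xml generator in standard-library Python (the URLs are placeholders; in practice you'd pull them from a crawl or from the CMS):

```python
import xml.etree.ElementTree as ET

# Placeholder URLs -- in practice, collect these from a crawl or your CMS.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/products/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```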
-
RE: Two companies merging into a new website. How to merge two existing websites into a brand new website and preserve search rankings.
Roy, this is definitely a complex task--one that takes careful planning and organization. The steps outlined in the link you provided are a good start, but they're only a small part of what needs to be done. There are a lot of sub-tasks that need to be taken care of in between those larger tasks.
The steps in that link describe moving one site (A) to another (B), with no site C involved--so just think of it as moving site A to C and then site B to C. Or you could first think about combining both sites: rather than moving site A to B, choose the best content from each and move that to site C.
What's important, though, is to figure out which content and pages are duplicated on both sites and then choose the best page(s) to move to site C. There will be content that exists on only one of the sites, and that can just be moved. The key is to spend plenty of time organizing the content and deciding which content can go away, which needs to be moved, which needs to be combined, and so forth.
There is one major step missing from that list, which is to verify all versions of the sites (http and https, as well as www and non-www) in Google Search Console, set up the 301 redirects, and use the Google Change of Address tool to tell Google that the site has moved.
There is also a mention of rel canonical; since the sites are moving entirely, canonical tags won't be appropriate to use. You'll need to use 301 Permanent Redirects to move the content from one site to another, especially since sites A and B won't exist anymore (they'll be redirected).
-
RE: Switching from Http to Https, but what about images and image link juice?
Shawn124, whenever you move from HTTP to HTTPS, you only need to set up 301 permanent redirects for the pages on the site. Other elements, such as images, external JavaScript files, and .css files, only need to be changed in the code so that they reference the new HTTPS URLs rather than HTTP.
If you load an HTTP element (such as an image that uses the full URL in its reference rather than just the image filename) on an HTTPS page, the browser will give you an error. So generally you need to do two things:
-
Set up 301 Permanent Redirects for the page URLs.
-
Search the entire website for all references to HTTP and change them to HTTPS (unless you're linking out to an external site).
If the site is in WordPress, you can use the Search and Replace plugin to replace it all at once in the database.
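Outside of WordPress, a quick way to find lingering HTTP references is to scan the site's files yourself. A rough sketch in standard-library Python (the directory path and domain are placeholders):

```python
from pathlib import Path

site_root = Path("/var/www/mysite")   # placeholder: path to your site's files
needle = "http://www.example.com"     # placeholder: your own domain on the old scheme

for path in site_root.rglob("*"):
    if not path.is_file() or path.suffix not in {".html", ".php", ".css", ".js"}:
        continue
    for line_number, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
        if needle in line:
            print(f"{path}:{line_number}: {line.strip()}")
```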
-
-
RE: Glossary Page - best practice
Brian, yes, this is the best practice. The canonical tag essentially tells the search engines that the letter page is a duplicate of what's on the other page, so they should give the credit to the other page.
Technically speaking, those letter pages are crawled by the search engines, but since the canonical tag is there, the pages are not indexed.
Again, this is the best practice if you're going to have the content appear in more than one location. Ideally, I would probably split it up into separate pages (a page for each term) if you can write enough content for each term to have its own page. But, given the scenario you're outlining, this is most likely the best practice for your site. I'm assuming that the letter pages are clickable in your site's navigation and that users can click on them easily.
-
RE: Looking at google shopping results from other country
Dieter, one other way you can do this is to use a proxy or VPN service. There are several out there, such as SurfEasy, Strong VPN, or Hola, that allow you to surf the web as if you're actually browsing from another country. You choose the country, and the service routes your traffic so that the sites you visit see you as a visitor from that country.
-
RE: SERPs started showing the incorrect date next to my pages
smmour, we've actually noticed this as well this past week. One site in particular that I'm familiar with shows a date from February 2012 on the home page, even though the Google cache date shows that the page was cached just the other day.
Google typically takes the pub-date from a site and uses that, especially if it's in the code of a site using WordPress. However, what you're describing sounds more like a Google problem than a problem with your site in particular. Based on the fact that we've noticed this as well this past week, it doesn't appear to be something you've done.
What intrigues me is that your domain name wasn't registered and the site wasn't live in 2010, the date that Google is showing.
-
RE: Need advice: How to replace a high-ranking pdf with a landing page -- without dropping much in rank?
John, that's a good question. Depending on the competitiveness of the keyword, I would hesitate to set up a 301 redirect--you may still see ranking changes if you replace the PDF with a landing page.
I suspect the reason it's ranking is that other sites are linking directly to the PDF file itself. If you were to remove that PDF, they might stop linking to it.
One option would be to edit the PDF file and make the PDF itself the landing page (still in .pdf format, at the same URL).
-
RE: Would you recommend changing image file names retroactively?
Generally speaking, in our experience, it's not worth the time to change filenames (and thus change URLs) unless you absolutely have to. If you're still using the same CMS, just changing the URL might not be worth it, as you'd have to take the time to set up a 301 redirect from each old URL to the new one.
What would be worth doing is moving to a flat URL structure, such as domain.com/category/page/ or domain.com/page/. In the long run, you generally won't have to change URLs if you move to another CMS.
If those pages are ranking well with the current URL, you may not want to change the filename. But, if they aren't ranking on the first page, it should be fine to change the filename. Don't forget to set up 301 redirects from the old URL to the new URL.
-
RE: Is having a site map page necessary?
Myles92, recently (in the past few months; I don't recall specifically when) Google did give some recommendations that included having an HTML sitemap page on your website. For a good user experience, it's recommended that you have a good navigation structure as well as an HTML sitemap. The HTML sitemap page allows users to see the overall structure of the website and click through to a certain page or section of the site.
-
RE: Need some strategy advice for Real Estate Attorneys in competitve locations
Donald, when it comes to local SEO, we typically recommend making sure that you first have the local citations taken care of--and then focusing on the content on the website. The Name, Address, and Phone (NAP) should be consistent across the board.
You can manage the local citations, submissions, fixing of duplicates, etc. yourself manually, or you can use one of the services out there, such as Moz Local, Yext, or Advice Local.
Then, as far as the on-site issues are concerned, schema.org markup is a must on the site as well.
It sounds as if you are already handling the content on the site, so continue with that. The local citations and local listings sound like the missing piece here.
-
RE: How do I improve site visibility and keyword ranking for new product site
Sharon, the site looks great--I haven't taken the time to go through it like I normally would during an SEO audit, but since you mentioned that Moz only flagged a few issues, it sounds okay. I would take care of the issues that it points out, though.
I took some time to look at your site's backlinks, though, and noticed that while you have a few, it just isn't nearly enough to make a difference. A site in your industry should have a lot more links--look at your competitors and you'll see that they have far more.
I would spend some time working on your site's links, as I believe that's going to be the issue you're having, especially given the topic of your site.
-
RE: Disavow links from legit sites but have spammy link profiles?
Godard, that's a good question. What I recommend is that you look at each link individually and determine whether it was created naturally or in a spammy way. If it's a legitimate link on a real business's website, then the link should be okay. But if the link was created in order to try to manipulate Google's rankings, then you should disavow it.
Many sites get links just because they're ranking well--and that's typical. It doesn't mean that you necessarily have to disavow them or try to get them removed.
If you are using Moz's Open Site Explorer to see the Domain Authority and Page Authority of the sites linking to you, that is one way to judge their quality. If a linking site is, in fact, very low quality because it doesn't have many good links pointing to it, then its Domain Authority and Page Authority will be low.
When disavowing and working on getting links removed, take a look at Google's Webmaster Guidelines and keep them in mind--if the link wouldn't pass their guidelines, then you should disavow it.
-
RE: Is this a panda penalty
Aaron, looking at the site's links and the type of links the site has (targeted anchor text and the kinds of sites linking), it looks like an issue with the links. If you've already cleaned up the site's on-page content and looked for duplicate content, then I tend to think it's the links.
Regardless, I would spend some time cleaning up (actually getting toxic and unnatural links removed) as well as disavowing the links that you cannot get removed.
-
RE: 301 vs 302
Paul, that's a good question. Whenever you use a 302 redirect, that's actually a "temporary" redirect, and Google deals with those redirects differently than they do 301 Permanent Redirects.
302 Temporary Redirects should really only be used when you're temporarily redirecting a URL to another one--and you plan on removing the redirect later. So, if a site is down for the weekend, you might 302 redirect certain pages elsewhere and then remove the redirects afterward.
If you're moving your site to another location, you're permanently moving it, so you'd use a 301 redirect. Google typically passes all or most of the "link juice" from one URL to another through a 301 redirect. So, you'll want to use a 301 redirect when you move to a new location.
For more details, see Google's help page here: https://support.google.com/webmasters/answer/93633?hl=en
And if you're moving from one domain to another, then you'll want to learn about the Google Change of Address Tool: https://support.google.com/webmasters/answer/83106?hl=en
To answer your question, though, most likely you'll want to use a 301. For a permanent move, there aren't really any reasons why you'd not want to use a 301 redirect.
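If you want to check which type of redirect a URL is currently returning, you can do it without following the redirect. A small sketch using the third-party requests library (the URL is a placeholder):

```python
import requests  # pip install requests

url = "http://www.example.com/old-page/"  # placeholder URL

# Don't follow the redirect -- we want to see the redirect response itself.
response = requests.get(url, allow_redirects=False, timeout=10)

if response.status_code in (301, 302, 307, 308):
    kind = "permanent" if response.status_code in (301, 308) else "temporary"
    print(f"{response.status_code} ({kind}) -> {response.headers.get('Location')}")
else:
    print(f"No redirect: {response.status_code}")
```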
-
RE: 8th October - anything drastic happen?
phero, according to the Moz Algo Change history, real-time Penguin did roll out a few weeks prior to October 8th. But, as we have heard time and time again from various sources, even Google, it has continued to roll out. So, I wouldn't be surprised if you were actually hit by a Penguin-related update.
What I would do is take a look again at the site's links to see if there are any that you need to disavow or get rid of completely. Keep in mind that Penguin isn't just about links; there are other issues that can affect your site's rankings as well.
-
RE: Ecommerce Canonical Question
DSCarl, taking care of the duplicate content that the site appears to be generating is a big deal, so it definitely needs to be fixed--and it's good that you've identified it.
Ideally, you really do need to be able to canonical the size variations of the dress, for example, to the product page ("Green Lace Maxi Dress"), assuming that you will have a unique page (along with a unique product description) written for the Green Lace Maxi Dress that's different from, say, a Red Lace Maxi Dress.
There are generally two ways to deal with duplicate content like this. One is the canonical tag. But before we had the canonical tag, we certainly had duplicate content--and we dealt with it using the robots.txt file. So you can deal with this issue with either a canonical tag or robots.txt.
With the robots.txt file, you would identify which pages are duplicates (for example, by looking at your URL parameters) and stop the search engines from crawling URLs with certain parameters in them. This is pretty easy to do if you understand your site structure and the parameters in your URLs (or how those are set up as folders in the URLs) and can add those to the robots.txt file. Using robots.txt sounds like it would be the cheaper option for you (rather than spending $1,000 on plugins or add-ons for your CMS).
Alternatively, the canonical tag is the way to go if you can get it to work properly. Oftentimes if it's not working properly you can contact the developer of the plugin or add-on and see if they'll help you install it or get the settings right so that it works properly on your site.
Either way, it's definitely an issue that you need to deal with, as it can have a dramatic effect on your site's rankings. The canonical tag option is probably preferred if you can get it to work properly, as all of the "link juice" and other "credit" will be passed on to the page you're canonicaling to.
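If you go the robots.txt route, it helps to test your Disallow patterns against real URLs before deploying them. Google supports * and $ wildcards in robots.txt; here's a rough sketch that approximates that matching with a regex (the rule and URLs are hypothetical):

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Google treats '*' as "any sequence of characters" and '$' as end-of-URL.
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile("^" + regex)

disallow_rule = "/*?size="  # hypothetical rule blocking size-parameter URLs

test_paths = [
    "/green-lace-maxi-dress/?size=small",  # should be blocked
    "/green-lace-maxi-dress/",             # should be allowed
]

rule = robots_pattern_to_regex(disallow_rule)
for path in test_paths:
    print(f"{path}: {'blocked' if rule.match(path) else 'allowed'}")
```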
-
RE: Second Store URL
Hi Rillik, that's a good question. Since you have already established a brand, I wouldn't go and create a new one. Your current customers, even though they only know you online, are familiar with your first brand, and it would take time to create a new brand and get it established.
The best option would be to use your current brand and current URL (domain name) and add a section to that site. If you wanted to separate the in-store pickup, then I recommend setting up a subdomain on your current domain, for example instore.domain.com, store.domain.com, or a city name such as dallas.domain.com.
Taking the time to create a new brand and establish that new brand is not really a good idea--you should capitalize on the brand that you have already established.
-
RE: Is the Google results serp broken?
Michael, the site: command typically has not "worked" or provided accurate results for years now. It's just the way it is--and I don't expect Google to give us the actual number of pages indexed anytime soon.
That said, I typically recommend using the site:hobbydb.com command to see the pages, and then clicking through to the final page of results (if you can). I'm currently seeing about 766,000 pages indexed.
-
RE: Should an internal link open in a new tab or in the same window?
Based on our experience, we generally think that as long as you're staying on the same website, links should open in the same window, not a new one. If you are going to open a link in a new window, then you should notify the user before they click the link.
For example: this is a link (opens in a new window)
This is a general user experience issue. While I don't necessarily have any specific stats to show, it just makes sense.
-
RE: Having possible problems with rankings due to development website
Zakkyg, as previously mentioned, the problem may be the building of backlinks through guest posts and profile links, and the type of links that you're building to the site. I would take it one step further and run the site's links through Link Research Tools' Link Detox to make sure that the links to the site are good and that you're not building toxic links.
I would also immediately take down the development site completely. Make it deliver 404 errors for all URLs; there's no need to set up redirects at all. Since it's an old dev site, it really shouldn't have any traffic or links pointing to it--so I would just remove it.
-
RE: Best use of an old domain?
Before you go redirecting 6,000 links from another domain to the site, you need to be very aware that redirecting those could have disastrous results. I'm not saying it will, but before you redirect one domain to another, you should look at those links and determine whether they're good, bad, or toxic.
Depending on the type and topic of those links, you may want to clean them up before redirecting the domain. For example, you may want to remove the toxic links (ask the site owner to remove them) and/or disavow them.
Don't just redirect the domain name without vetting those links first.
-
RE: When To And When To Not Use AMP
Mark, for our WordPress client sites and our own sites, we're implementing AMP on all pages. What we know is that Google's implementation (adding AMP pages to the results) is changing, and we don't know what the rollout plan in search is. We first saw it only on news sites, but it has since been expanded.
I do recommend making AMP versions of pages available if doing so doesn't take away from the content. So, if the content of a page relies heavily on images, for example, it might not be appropriate to have an AMP version, because it really isn't going to be a mobile-friendly page. But if a page is primarily content- or text-based, then you would want AMP to be available for it.
-
RE: OK to change the anchor text of a link?
Zakkyg, if the link was in an article and Google has already crawled it, then keep in mind that they DO know what the anchor text is and have noted it. If they crawl the article again and notice that the anchor text has changed, it could raise a red flag. Typically, when an article is written, it's complete--if the anchor text changes or a link is added later, it could look like a paid or "optimized" link, which is essentially what you're doing.
Generally speaking, you have a limited window to get the anchor text updated... if it's an older post, then it may not make a difference.
-
RE: Duplicate content on Places to Stay listings pages
Nikki, if you hide the unsuitable places from the Moz crawler, you'll also be hiding them from other search engine crawlers, such as Googlebot and Bingbot. So, when it comes to duplicate content, we typically recommend using canonical tags to tell the search engines (and crawlers like the Moz crawler) that the content is duplicate. This way the pages are essentially still recognized, but their "link juice" and any other "value" is passed to the page you're canonicaling to.
-
RE: How does Google handle read more tags in Wordpress
Gabriel, the "read more" URL is going to be the first URL that Google indexes, as that URL typically appears on the main blog page of your blog. Since you update that page on a regular basis (by adding more posts) and the page has been around for a while, Google monitors it and updates it fairly quickly.
It takes more time for the blog post itself to get indexed, and socializing the post (such as tweeting the URL) will get it indexed faster than just waiting for Google to crawl it. There are a lot of factors that Google uses to determine indexing speed and which pages to add to the index, though.
Using "read more" is a good thing, though--generally you won't run into duplicate content issues, because only a snippet of the post appears on the site's main blog page (and category pages, etc.) rather than the full post.
-
RE: Error 404 Search Console
This could be a Google issue. It takes some time for Google to "forget" about URLs they know about, so they may continue to crawl old URLs.
If you have redirected these URLs and they are not showing a 404 error, then you shouldn't have anything to worry about. I would still mark them as fixed in Google Search Console and then see if they come back again. I would also test those URLs randomly using the Googlebot user agent.
One thing you can do, however, is crawl those URLs yourself using Screaming Frog or a similar spider tool. Set the user agent to Googlebot so that you're seeing what Google might potentially see. When you crawl, you should see the redirects; if not, you'll need to look into why a 404 error is still being returned.
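If you want to script that check alongside a tool like Screaming Frog, here's a small sketch using the third-party requests library that fetches a URL as Googlebot and prints the redirect chain (the URL is a placeholder):

```python
import requests  # pip install requests

url = "https://www.example.com/old-url/"  # placeholder: one of the reported URLs

# Identify as Googlebot so the server responds as it would to Google.
headers = {
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}

response = requests.get(url, headers=headers, timeout=10)

# Each hop in the redirect chain, then the final response.
for hop in response.history:
    print(f"{hop.status_code} {hop.url} -> {hop.headers.get('Location')}")
print(f"Final: {response.status_code} {response.url}")
```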
-
RE: Homepage not indexed - seems to defy explanation
Marcus, I know this is frustrating. I've checked several things, and looked at many of the possibilities that you've already brought up. I don't have access to the Google Search Console, so I cannot comment about any of that data. I'm assuming that you don't have a manual action on the site or any other messages from Google.
What I've seen in the past is issues with schema markup, especially when it comes to reviews and how they're handled on sites. I'm not saying that this is the issue--but I've seen Google have problems with these (especially because the word "hidden" is there in the code). So, you might look into that some more.
The issue could also be related to links--look at the links to the site's home page to see if there is an issue with low quality links pointing to that page or other unnatural links.
If someone has copied the page, added a canonical tag, and then added a "meta noindex tag" to their page, it's possible that they could have taken your page out of the index. This has happened before.
-
RE: Exclude sorting options using nofollow to reduce duplicate content
Ben, it sounds like it could work--but there are methods I would try before nofollow. Ideally, you should use the canonical tag on those pages, not the nofollow attribute. With nofollow, if someone were to link to one of those sorting pages, no credit would be passed on to the main category page. A canonical tag accomplishes what you're trying to do and has greater benefits in the long run.
Another option would be to get rid of the sorting options entirely and not allow anyone to sort (removing the links to those sorting options as well). If the sorting options don't exist, then there won't be any duplicate content generated.
-
RE: Please Help me! I need advice for my website
Alexa, having two domain names with the same exact content is not recommended--at least for search engine ranking purposes. You have several options, and I would choose the one that best fits your overall strategy for the website, based on the type of visitors you're trying to attract.
One option is to leave both sites up and running just as they are--and stop the search engines from indexing one of the domains. That can be done via the robots.txt file.
Another option is to add a geo-redirect to the site. If the visitor is coming from Australia, show them the .com.au site; if they're not from Australia, redirect them with a 301 redirect to the .com site. This would solve the duplicate content issue, and it works out fine with the search engines if set up properly.
Another option is to redirect the .com.au site to the .com site with a 301 Permanent Redirect, as the .com would typically work for both US and Australian visitors. A .com.au typically wouldn't go over very well with US visitors, so only using the .com site would be the better option.
You could also take the time to rewrite the content and add an Australian and a US address to their respective sites; then you'd be showing visitors and search engines that the .com is for the US and the .com.au is for Australia. Typically that works well, as the search engines would show the .com.au site to visitors from Australia and the .com site to other searchers, such as those from the US. I would also add the appropriate geo-related meta tags so the search engines know which version to show to which searchers.
Whatever you decide, you need to deal with the fact that there are two duplicate copies of the site out there, which is not a good thing. You'll continue to battle search engine ranking issues if you have both copies out there being indexed by the search engines.
-
RE: Best way to handle URLs of the to-be-translated pages on a multilingual site
Lomar, you definitely don't want the duplicate content issue. However, there is a third option: use the canonical tag--put a canonical tag on /fr/page1.html pointing to /en/page1.html. You would simply remove the canonical tag once the content is translated and the page is unique.
Alternatively, I would use this option and remove the 301 redirect when it's translated:
Leave the naming scheme intact and set up a 301 redirect so that /fr/page1.html redirects to /en/page1.html
-
RE: Disadvantages of Migrating Website to New URL
Matt, there are really a lot of reasons why a site won't seem to gain any ground when it comes to ranking in Google. It could be links, it could be content, it could be lack of social media, it could be a lack of local links and citations, or it could be something else. It could be just that you're in a competitive market and that the competition is doing more (or has been around longer).
Regardless, it's most likely NOT the URL. Every domain name has an equal chance of ranking in the search results, so whether or not your keyword is in the URL really doesn't make any difference. That said, it could still be a URL or domain name issue--for example, if you have other domain names redirecting to your site, or if you previously used another domain name with the same content on it.
Without specifics, it's tough to pinpoint a specific reason why the site's not ranking well.
-
RE: What to do with large number of old/outdated pages?
The first thing I would do is look at Google Analytics for the past year (or more) to see if those pages get any traffic. You mentioned that you looked at that, but didn't say how long a period it covered.
Typically, we recommend redirecting those pages to the most appropriate current page on the website. If those pages don't have anything to do with what you're currently selling or offering, then you might want to simply serve up a 404 error (ideally a 410 Gone).
I would also look at the links, such as by using Open Site Explorer or Majestic.com to make sure there aren't any external sites linking to those pages. If there are external links to certain pages, you'll want to redirect them.
-
RE: Google Indexing Pages with Made Up URL
Brian, that's definitely an issue. If your site isn't delivering a 404 error when you go to a non-existent page, that's the problem. I could theoretically go to yourdomain.com/aslksjdltkjlkjalskdj.html, create a link to it, and Google would index the page.
Check with your web developer to see how you can make sure that "page not found" pages actually deliver a 404 error in the server header.
There are lots of ways that Google will discover new URLs (even someone browsing with Google Chrome might allow Google to discover a new URL and then crawl it). So, you'll want to make sure that you have this fixed on your site.
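One easy way to test for this "soft 404" behavior is to request a URL you know doesn't exist and confirm the server returns a real 404. A minimal sketch using the third-party requests library (the domain is a placeholder):

```python
import random
import string

import requests  # pip install requests

domain = "https://www.example.com"  # placeholder: your own domain

# Request a path that almost certainly doesn't exist.
random_path = "".join(random.choices(string.ascii_lowercase, k=20))
response = requests.get(f"{domain}/{random_path}.html", allow_redirects=False, timeout=10)

if response.status_code == 404:
    print("Good: non-existent pages return a real 404.")
else:
    print(f"Soft-404 risk: server returned {response.status_code} instead of 404.")
```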
-
RE: What to do with blog content that is no longer relevant to our business
What you do with the content really depends on how much content you're talking about. If it's fewer than 50 posts, you may want to remove the content entirely and 301 redirect to a new post--one that explains you're no longer providing those services and then highlights the services that you do offer.
I would, though, look at your site's Google Analytics to see how much traffic those posts are getting. If there is a lot of traffic but it's irrelevant, consider moving the posts to another domain and 301 redirecting them from your domain to the new one. You could then advertise your other services on that site with banner ads or as a "sponsor." That way, others may see the posts and then click through to your main site because they're interested in the services you do offer.
If the posts just don't get much traffic, then it may be worth it to just delete the posts entirely (keep a backup of them, though) and that will allow Google to re-evaluate the content and topic of your site.
-
RE: SEO website migration gone wrong - noticed too late?
Luke, generally speaking, the others are right--you'll want to get going as quickly as possible to recover the lost traffic. Most likely they didn't set up 301 Permanent Redirects from the old URLs to the new ones, and that's what I would concentrate on first. I'd recommend looking at Google Search Console's crawl errors.
If you can get ahold of the site's log files and analyze the site's 404 errors for traffic, then you'll want to set up 301 redirects first for the pages that have traffic coming to them.
You'll also want to crawl the site and look for site issues, as someone who didn't know how to migrate a site properly may well have missed major SEO-related issues when they built the site.
Finally, looking at the site's links and which pages those links are pointing to will be helpful, as that may "save" some link juice. If you can, get some of those links changed or updated so they point directly to the new page and not to a page that redirects.
-
RE: Do I submit a sitemap for a highly dynamic site or not? If so, what's the best way to go about doing it?
Welcome to Moz! It looks like the site has about 169,000 pages indexed in Google currently. So, if that's the number of pages you have on your site, then Google is crawling and indexing it just fine.
Since you did bring up the fact that you're dealing with dynamic pages, or dynamic URLs, it is important that you have a sitemap (probably multiple sitemaps) available so that Google can quickly crawl and have the proper URLs indexed.
You currently don't have a sitemap file at https://jane.com/sitemap.xml, which is where it should reside. I also recommend listing the sitemap file(s) in your robots.txt file at https://jane.com/robots.txt.
Your site's web development team will need to auto-generate the sitemap files, which currently isn't happening. I recommend having up to 50,000 URLs in each file (that's the limit of the sitemap protocol), so you'll need multiple files over that number. If you're able to generate the files based on certain criteria (such as main pages on the site, categories, or something else), that would be helpful as well.
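As a sketch of how the development team might split a large URL set, here's standard-library Python that chunks URLs into 50,000-URL sitemap files and writes a sitemap index pointing at them (the domain and URL list are placeholders):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000
base = "https://www.example.com"  # placeholder domain

# Placeholder URLs -- in practice, generate these from the product database.
urls = [f"{base}/product/{i}/" for i in range(169_000)]

index = ET.Element("sitemapindex", xmlns=NS)
for n, start in enumerate(range(0, len(urls), MAX_URLS), start=1):
    urlset = ET.Element("urlset", xmlns=NS)
    for url in urls[start:start + MAX_URLS]:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    filename = f"sitemap-{n}.xml"
    ET.ElementTree(urlset).write(filename, encoding="utf-8", xml_declaration=True)
    ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = f"{base}/{filename}"

ET.ElementTree(index).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```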
-
RE: Does replacing of external redirects impact SEO?
vtcrm, I'm not sure what you mean by "external redirects," although I suspect you're referring to links from other websites that point to a page on your site that doesn't exist, so you are redirecting that page to another URL on your site.
If you are able to create content at a URL that's appropriate for the link pointing to it, then great--that should actually help your site's overall SEO, and you'll be able to recover some traffic as well. So, anytime you can get rid of a redirect from an external site and replace it with content, that's a good thing.
You may also want to look at your crawl errors in Google Search Console; there may be pages with traffic and links that you can also create content on--if you previously removed those pages from your site.
-
RE: DIsavow links even without a penalty?
There have been some great responses so far--overall, you should be proactive with your clients (and even your own sites) when it comes to links. If you do see spam links or the types of links that Patrick has suggested, be proactive and disavow them.
If you do feel there are links that need to be removed, though, then I would go ahead and try to get those removed.
-
RE: High total links, but very few root domains?
Welcome to the Moz community, theguildedteapot! Happy that you're here.
When it comes to links, you should look at the number of root domains, not the number of links from each domain. We typically only "count" one or two links from each domain. Even if you have 1,000 links from one site, you really need to look at it as one link.
Personally, I'd rather have 10 links from 10 domains than 100 links from 3 domains.
-
RE: Advice for rapidly declining ranking-- can an old indexed sitemap cause this?
Bruce, whenever you lose rankings, there are a few things that you should check. First, I would crawl your own website using Screaming Frog, Deep Crawl, or another crawler to see if you can identify any issues. I would also look at the links to the site--have you lost links or gained any "shady" or unnatural links recently? Have you done anything related to a disavow file that might have caused this?
There are so many potential issues that could have been the culprit--but I would first crawl your own website and see if it can be crawled properly. Then look at the links.
As for the sitemap, I don't think it's the issue if your pages are being indexed. But if the sitemap doesn't reflect the current pages on your site and is out of date, I would fix it.
-
RE: Do CTR manipulation services actually work to improve rankings?
Generally speaking, these types of schemes don't work, as Google is quite aware of people trying to do this. With the Google algorithm updates of the past few years, it has become tougher and tougher for black hat or gray hat SEO to succeed by manipulating clicks and faking traffic.
We are aware of other techniques that manipulate Google Suggest, though, and while that doesn't directly influence rankings, it can lead people to search for keywords that they wouldn't normally search for.