Job Title: SEO Consultant
Company: eSizemore.com
Favorite Thing about SEO
Constant change and challenges - and community.
Hi Leowa,
As long as it comes before, you're fine.
Thanks for the clarification on the platform, Suarezventures.
I have worked with plenty of brands that have a similar setup on Shopify. They usually put the blog on a subdomain because Shopify's content management system - let's see, how do I say this nicely... sucks. These clients put up WordPress on a subdomain. Some also put up a landing page platform like HubSpot or Unbounce and send paid traffic there.
Your plan to put the eCommerce site on a subdomain has some benefits in that the content side won't be affected by future platform migrations on the eCommerce site. However, the content side will benefit the most from being at the main level with the homepage and most of the backlinks. Thus, organic search traffic to the eCommerce site could be harmed by this move. I normally wouldn't recommend it for that reason (because the business is eCommerce, which is what pays for the content) but in your case, it sounds like the eCommerce site doesn't bring in much traffic as it is.
Good luck. Let us know how it turns out.
Hi Suarezventures,
I typically draw the subdomain vs. top-level domain line at whether the two sites' experiences and purposes are vastly different. For example, a site like Blogspot that hosts different websites on subdomains, or a brand that has a forum community on a subdomain because it runs on a different server and has a much different purpose than the main domain.
Ideally, if you're moving to WordPress you'd have the content and the store on the same site (e.g. https://site.com). If this isn't possible for them, having one or the other on a subdomain would be better than having them on (Squarespace?).
What about putting the new site on a subdomain so you don't have to deal with migrating the existing site? Can't you leave it where it is and put up store.site.com on WP?
Hi Christian,
I don't see any evidence of the site being deindexed now. Here are some things I checked for you, along with a few observations:
Nothing in the robots.txt file, robots meta tags, or X-Robots-Tag HTTP header responses would keep these pages from being indexed by Google
The rel=canonical tags appear to be functioning properly
The home page is indexed and not duplicated by other indexed pages
Google has about 86 pages indexed from your domain
Hreflang tags appear to be implemented properly
There are only about 50 links going into the domain from other sites; the ones from Moz are the best of the few that aren't just random scraper sites (harmless, but annoying).
Sometimes Google ranks a brand higher when it first comes out because it's a chicken-or-egg situation: how else can they collect data for their machine to chew on unless some traffic is sent to a new site? We used to call this phenomenon "the Google sandbox" a long time ago, and in its effect this is essentially the same thing. We do it ourselves with A/B testing and paid advertising. You have to spend some budget to gain enough data to know what's working and what isn't.
I don't think you have a technical SEO problem here. I think you need to continue building a brand and producing useful, rich content. Good luck!
Hello Ross,
The spam comments below have been reported.
To your questions:
I don't know of any way to "restore" data that you never exported or saved, but there are several documented processes for automating a monthly export. For example, I recommend the Search Analytics for Sheets add-on: https://gsuite.google.com/marketplace/app/search_analytics_for_sheets/1035646374811
This should also take care of your other question about getting more rows of data than what's provided in the GSC interface.
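If you'd rather script the export than use a Sheets add-on, the Search Console API exposes the same data. Below is a minimal Python sketch, assuming you've already set up OAuth credentials for the API; the property URL, token file, and date range are placeholders.

```python
# Minimal sketch: pull a month of Search Analytics rows from the
# Google Search Console API and save them to CSV for safekeeping.
# Assumes google-api-python-client and google-auth are installed and
# that token.json holds authorized OAuth user credentials.
import csv

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # your verified GSC property
creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2019-01-01",
        "endDate": "2019-01-31",
        "dimensions": ["query", "page"],
        "rowLimit": 25000,  # far more rows than the GSC UI export allows
    },
).execute()

with open("gsc-2019-01.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["query", "page", "clicks", "impressions", "ctr", "position"])
    for row in response.get("rows", []):
        writer.writerow(row["keys"] + [row["clicks"], row["impressions"],
                                       row["ctr"], row["position"]])
```

Run something like that on a monthly schedule and you'll always have your own archive, no matter how far back GSC lets you look.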
Can you provide more information? What is the site? What is the TLD? What is the target country in GSC? Where are most of the links from? Do you have hreflang tags? Etc...
I have seen no evidence or documentation that local GMB listing reviews have any impact on non-localized search results for a brand's non-local homepage.
However, picking apart each little ranking factor is short-sighted. These things do not work in a vacuum. I have a client with a single location who does mostly eCommerce, but they allow people to come into the warehouse and buy directly if they happen to be in the area. This means we can get a GMB listing, and reviews for the location. I fully support this strategy, even if it doesn't help the homepage rank better.
Julian,
Will you be translating the content into other languages?
And/or customizing it for the location? For example, changing US English spelling to the rest of the world English spelling?
To answer your question: no, you don't have to worry as much about duplicate content if it only exists across different ccTLDs. The biggest issue with duplicated content is that Google has to choose one version to show. In this case, that decision becomes easy: show the one in the right country.
Hreflang tags, and setting the target country in GSC, are helpful hints as well. I recommend using this tool by Aleyda Solis and team to build out the tags: https://www.aleydasolis.com/english/international-seo-tools/hreflang-tags-generator/
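For reference, the generated tags end up looking something like this (the domains are placeholders for a US and a UK version of the same page):

```html
<!-- Placed in the <head> of BOTH versions; each page lists every
     alternate, including itself, plus an x-default fallback. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/" />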
Hello Sam09,
Was your question answered by Jeroen? Were you asking about the "keyword" meta tag?
I can see right away that not everyone agrees on how to optimize a category page. This is just my opinion based on experience, but I really don't think that 500 words, or several paragraphs, is necessary or even desirable on a category page. And I don't recommend putting the copy down at the bottom either.
If you use the term "SEO copy" to describe a chunk of text, that should be a red flag right away. If you claim that copy is good for users, why would you put it way down at the bottom? Because it kills your conversion rate? Well then it's not good for users, is it? What's good for users will improve your conversion rate. Remember that.
In my experience a category page only needs about two to three useful sentences with appropriate keywords to rank. And by "useful" I don't mean just saying "This is our Green Widgets page where you'll find the best green widgets in the world. We have big green widgets, small green widgets and cheap green widgets."
I mean something like "Find the best green widget for your needs by using our filtering options above. Choose to see green widgets sorted by price, color and popularity, or simply browse the offerings below. Call 1-800-Green-Widgets or click the 'Chat' button to the right if you have any questions."
If it is a complicated topic, like something scientific or technical, you may consider adding more copy for the users to help them choose the right brand / product. In this case, a drop-down "Read More" type of div works well, as does a link to a larger "guide" on another page. If they don't know what they want yet, maybe your category page is too far down in the funnel, in which case sending them to a separate "guide" could be beneficial.
I'll leave this topic open for discussion for a while since others may find some very good reasons to disagree. In the end, what you really need to do is test out a few different options on different category pages and go with whatever works best for your users and your site.
I want to make sure everyone, including myself, understands you Alex. Correct me if I'm wrong, but you're saying that the website is totally new (a start-up) and nothing (at least nothing owned by the company you're with) has ever been on that domain name. While building the site the previous guy accidentally allowed the development version of the site to be indexed, and/or allowed product pages that you don't want on the site at all to be indexed. Since it is a brand new site those "old" pages that were deleted didn't have any external links, and didn't have any traffic from Google or elsewhere outside of the company.
IF that is the case, then you can probably just let those pages stay as 404s. Eventually, since nobody is linking to them, they will drop out of the index on their own.
I wouldn't use the URL removal tool in this case. For one thing, it is a dangerous tool and if you don't have experience with this sort of thing it could do more harm than good. It should only take a few weeks for those URLs that were briefly live and indexed to go away if you are serving a 404 or 410 http header response code on those URLs.
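If you want to send a stronger "gone for good" signal than a 404, Apache's mod_alias makes a 410 a one-liner per URL. A minimal sketch, assuming an Apache server and made-up paths:

```apache
# .htaccess - return "410 Gone" for the dev URLs that were
# briefly indexed (the paths below are placeholders).
Redirect gone /dev-product-1
Redirect gone /dev-product-2
```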
I hope this helps. Please let us know if we have misinterpreted your problem.
You may want to consider rewriting those URLs. The character escape codes shared by Thomas above may meet your needs, but I'd rewrite URLs like that. Assuming you are on a Linux server (as opposed to Windows, which would require ISAPI), you'll do this by editing the .htaccess file. I'd advise getting a developer to consult with you on this, as rewriting URLs often involves writing regular expressions that are easy to goof up.
Here's a good Moz article on the topic: http://moz.com/blog/down-and-dirty-write-your-own-url-rewrite
And here's another great introductory resource: http://coding.smashingmagazine.com/2011/11/02/introduction-to-url-rewriting/
If you want to try doing it yourself feel free to post your website, with the example URLs, so other members can help you with the rewrite code. I strongly suggest working with a developer on this though.
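To give you a feel for what's involved, here's a hedged sketch of the kind of rule a developer might add to .htaccess, assuming Apache with mod_rewrite enabled and a made-up URL pattern:

```apache
# Hypothetical example: 301 /product.php?id=123 to /products/123/
# Requires mod_rewrite; test carefully on staging before going live.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^product\.php$ /products/%1/? [R=301,L]
```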
I've dealt with similar issues with robots.txt blocks of the entire site, as well as robots meta noindex tags. You should be fine now that you've taken the noindex tag off, and the old pages are redirecting. It may take longer for Google to update their index with the new URLs, but otherwise I don't think you need to worry too much. Maybe resubmit the sitemap and do another fetch on key pages.
Good luck!
Hello Geraldine,
I'm going to rephrase your questions here just to make sure I understand them correctly.
1. Your Sort-By URLs are using self-referencing rel=canonical tags.
When I looked into this issue I found that some of these pages no longer exist and produce a 404 error page. Example:
https://www.tidy-books.co.uk/childrens-bookcases-shelves/sort-by/position/sort-direction/desc/l/letters:5
I did find some that I was able to access and verified the problem above. You should contact MageWorks to ask if there is a way you can fix this with the SEO plugin you're using. If not, I suggest adding a disallow statement in the robots.txt file for the /sort-by/ folders to keep them from being accessed and indexed. Then I would go into Google Search Console (GSC) and remove that entire directory from the index, as described here: https://support.google.com/webmasters/answer/1663419?hl=en .
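A minimal robots.txt sketch of that disallow, assuming the sort-by URLs always contain a /sort-by/ path segment as in your example above (Google honors the * wildcard here):

```
# robots.txt - keep crawlers out of the faceted sort-by URLs
User-agent: *
Disallow: /*/sort-by/
```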
2. You have duplicate title tags, mostly caused by the blog and the sort-by URLs discussed above.
If you're not using the Yoast SEO plugin for WordPress you should be. Without seeing the report and knowing exactly which URLs have duplicate title tags, I can't answer this question very well. Common examples are Tag pages and paginated Category pages. I usually advise blog owners to disallow the Tag pages in the robots.txt file and remove the directory from Google SERPs using the GSC URL Removal Tool, as described above.
3. Image directory pages are causing mobile usability errors in GSC and you have blocked them from being indexed.
Because those index pages do not need to be accessed in order to render the page, I don't see any problem with blocking those in the robots.txt file. However, you do NOT want to block anything that needs to be accessed in order to render the page so I would advise you to use the Robots.txt tester tool in GSC to make sure your Robots.txt code is not inadvertently blocking those image files as well, such as https://www.tidy-books.co.uk/skin/frontend/tidybooks/us/images/eco.png . As long as you aren't blocking access to the actual image files, only to the directory pages, your solution is OK.
It would be better to serve an error code on those pages though. Best practice is to not allow folder index pages to be loaded like that. Otherwise, it creates a bit of a security risk. Often webmasters will show a "403: Forbidden" code on those URLs.
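On Apache the usual fix is a single directive in .htaccess (or the vhost config), which makes the server answer directory URLs with a 403 instead of a file listing:

```apache
# Disable auto-generated directory listings; Apache returns
# "403 Forbidden" for folder URLs that have no index file.
Options -Indexes
```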
I hope this helps. Please feel free to respond with any updates or clarifications. Hopefully others will chime in now that your issues are broken up into component parts. In the future, you'll get better responses by asking three different questions instead of lumping them all into one. This is because someone may know how to help with one of your questions but not the others, and be less inclined to answer the one they can help with because they can't answer the rest.
Good luck!
As I said before, a 301 redirect will pass PageRank. Even if it goes to a blocked folder, that's still domain-level benefit coming into your site from "paid" links.
The best solution, in my opinion, is for sites to run their affiliate program through another domain first, and 302 (temporary) redirect the user to the main site.
Affiliate links to www.YourAffiliateDomain.com/?afflink-id=123, which has a domain-wide robots.txt disallow. The ?afflink-id=123 part tells the system where to redirect the user to on the primary domain. The user goes from that URL through a 302 redirect to the appropriate URL on your primary domain.
No PageRank is passed, and you can kill off the domain if you ever need to, at which point those redirects stop coming into the site.
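A hedged sketch of the two pieces on the intermediate domain, assuming Apache and the made-up afflink-id parameter above (a real setup would map IDs to landing pages in an application or a RewriteMap rather than hard-coding them):

```apache
# On www.YourAffiliateDomain.com
#
# robots.txt - block crawling of the whole redirect domain:
#   User-agent: *
#   Disallow: /
#
# .htaccess - 302 (temporary) redirect the visitor to the mapped
# page on the primary domain. Hard-coded here for illustration only.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^afflink-id=123$
RewriteRule ^$ https://www.yourprimarydomain.com/some-landing-page/? [R=302,L]
```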
If you are unable to do all of this you can submit a disavow file for all non-compliant affiliate domains after asking them to nofollow their links. I think the limit is supposed to be 2,000 domains, but I've heard of people doing as much as 4,000 with no problem. Just give it a try and see what happens.
The way the page is set up now that content is NOT indexable. The good news is that it isn't indexed on any other domains either. Sometimes these review companies will allow your reviews to be indexed on their domains, but not yours. They claim ignorance, but I think that type of arrangement is downright low. They'd be stealing your content and your traffic.
What I would advise you to do, if you're going to stay with the company, is ask them to populate a <noscript> element with the content contained in the script. This is what BazaarVoice / PowerReviews does, and I've seen it work first-hand. It is a 100% honest and correct use of the noscript tag. The thing with BV/PR is you have to disallow the reviews subdomain from being indexed or you'll have duplicate content on your own subdomain.
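A sketch of the pattern with made-up markup; the point is that the same review text the script renders is also present as plain, crawlable HTML:

```html
<!-- Script-rendered reviews widget (not reliably indexable) -->
<div id="reviews-widget"></div>
<script src="https://reviews-vendor.example.com/widget.js" async></script>

<!-- Crawlable fallback: the same review content as static HTML -->
<noscript>
  <div class="review">
    <p>"Great product, arrived quickly." - Jane D. (5 stars)</p>
  </div>
</noscript>
```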
Hello Micey123,
I think the link provided went to a phenomenal post, but there may have been a misunderstanding about what the post was instructing. From what I could tell, it was about tracking your own internal site predictive search, and not Google's.
Assuming you can get the full referrer path, including query string, in GA or via log files, one way to approach it would be to separate the queries from last to first: the last is probably the original query that was "assisted" (or "interrupted", depending on how you look at it), and the first one in the URL was the autocomplete suggestion that was chosen. Here are a few examples.
This is the URL from my address bar while searching on Google for “I’m searching for” without quotes, and selecting the suggestion for “I’m searching for something”.
First query in the URL string (I'm searching for something):
q=i%27m+searching+for+something
(q=i%27m = I’m)
Second query in the URL string (I'm searching for):
q=i%27m+searching+for
This is the URL from my address bar while searching for “baby pandas are” without quotes, and selecting the autocomplete suggestion for “baby pandas are ugly”. I agree Google. Hideous creatures.
First query in the URL string:
q=baby+pandas+are+ugly
Second query in the URL string:
q=baby+pandas+are
URL from address bar while searching for “typing in full” and selecting the autocomplete suggestion for “typing in full sentences”.
Same pattern. One more: searching for “this is the first day of my life” and selecting the suggestion for “this is the first day of my life lyrics”.
First query:
this is the first day of my life lyrics
Second query:
this is the first day of my life
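If you can capture those referrer URLs, here's a hedged Python sketch of splitting the two queries apart; the parameter names and the sample URL are assumptions based on the examples above:

```python
# Sketch: pull the query values out of a Google referrer URL and pair
# the chosen autocomplete suggestion with the partially typed query.
# Assumes both queries survive in the URL, as in the examples above.
from urllib.parse import urlparse, parse_qsl

referrer = ("https://www.google.com/search"
            "?q=baby+pandas+are+ugly&oq=baby+pandas+are")  # hypothetical

params = parse_qsl(urlparse(referrer).query)
queries = [value for key, value in params if key in ("q", "oq")]

if len(queries) >= 2:
    suggestion_chosen, typed_query = queries[0], queries[-1]
    print(f"typed {typed_query!r} -> selected {suggestion_chosen!r}")
```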
I hope this helps. But there may be an easier way. I'll ask around for you if you'd like, but I want to make sure I understand your needs first. Do I?
Becky,
Your competitor uses a JavaScript framework of some sort to change the products on the page when you click pagination links. There is only ONE page as far as the URL is concerned (the only thing that changes is the addition of #2 to the URL, which isn't considered a new "page" by Google).
There are pros and cons to this approach, but the cons can be alleviated with solutions like those discussed in Built Visible's guide to JavaScript framework SEO.
On the other hand, your site uses standard pagination in which each paginated set is a new URL. There are pros and cons to this as well, but the way you have it set up magnifies the cons. For example, each paginated URL has its own self-referencing rel=canonical tag. This is fine, but if you're going to let them represent themselves as completely new pages, you need to get rid of that SEO text at the bottom of all but the very first page. Otherwise it's duplicate content.
Furthermore, I would mark all paginated pages with a robots noindex,follow meta tag so they don't bloat Google's index.
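That tag is one line in the <head> of each paginated page (page 2 and beyond):

```html
<!-- On paginated URLs (page 2+): keep crawlers following links,
     but keep these thin pages out of the index -->
<meta name="robots" content="noindex, follow" />
```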
Hello Icourse,
The keywords can be tweaked later by your SEO if needed. At this point, other than the top one or two keyword phrases for that product, they should focus on providing visitors with all of the information they will need to make a purchasing decision.
I like to check with customer service to find out what questions people have been asking about various products. For example, if they tell me a lot of customers call in about whether Item A will work with Item D, I will be sure to include "Works with Item D" in the product description, perhaps under an FAQ or Specs tab on the product page. If it is a new site, or if you don't have a customer service department, Q&A sites like Yahoo Answers often provide some good ideas.
In my experience, the more detail the better as long as it is presented well. You don't want one long page of text, but if you can provide all of the specs, options, instructions and details on the product page it will not only help your rankings, but will probably save you money in call center costs.
The more instructions you give a good copywriter the worse their writing will be, in my opinion. If they are professionals (you get what you pay for with copywriting!) and are writing high quality copy about luxury products I would just let them at it in small chunks. You can have your SEO make minor adjustments and recommendations as they go.
I will leave this question open for discussion.
10/3/2024 Your step-by-step guide for using vector embeddings to identify internal linking opportunities at scale so you can confidently apply these techniques to your SEO strategy.
9/26/2017 You should do a mobile/desktop parity audit if content is added, removed, hidden, or changed between devices without sending the user to a new URL. When two or more versions of a website are available on the same URL a parity audit will crawl each version, compare the differences, and look for errors.
3/22/2017 Learn how to do content audits for SEO in this Moz guide by Everett Sizemore, including tips for crawling large websites, rendering JavaScript content, and auditing dynamic mobile content.
9/7/2016 How much dead weight is bringing your site down in rankings? Armed with compelling case study data, Everett Sizemore shares his single most effective and scalable tactic for improving your site's overall level of quality, in most situations.
1/12/2016 Learn how eCommerce catalog content pruning can increase your revenue in 2016. Everett Sizemore shows you how to prune responsibly, and why you should start doing it today.
4/14/2015 If eCommerce businesses hope to remain competitive with Amazon, eBay, big box brands, and other online retail juggernauts, they'll need to learn how to conduct content marketing, lead generation, and contact nurturing as part of a comprehensive inbound marketing strategy. Learn how to add lead generation and contact nurturing to your inbound marketing toolkit.
8/22/2012 Mozzers just can't get enough of e-commerce. On July 31st, Everett did a webinar for us about tips and tricks for making your e-commerce sites SEO-friendly. Attendees had a ton of questions -- not all of which we could answer in our limited time together -- so we wanted to make sure they all were answered.
3/29/2012 A lot of things can go wrong when you change most of the URLs on a website with thousands or millions of pages. But this is the story of how something went a little too "right", and how it was fixed by doing something a little bit "wrong". On February 28, 2012, FreeShipping.org relaunched with a new design and updated site architecture. The site'...
12/14/2010 An e-commerce SEO's conundrum: The pages that generate revenue are the ones you hope to rank highly, yet they're the ones to which nobody wants to link. Here are a few ideas to help you get links into product pages on an e-commerce site.
10/1/2009 One of my favorite blog posts of all time was called Do It F*&king Now, written by Quadszilla. That post lifted my productivity levels tenfold - for a few weeks. Maybe I should go back and read it again. But first, I want to share another way to boost productivity. Reading blog posts, forums, t...
2/5/2007 Personalized Search: The Next SEO Frontier. There is a reason folks from Yahoo were discussed in The Wisdom of Crowds. I doubt anyone involved in determining the future of a major search engine's modus operandi has not read that book. If the crowd is truly wiser, one would assume that Google would use data gathered from mil...
I have over a decade of experience in search, focusing on technical and eCommerce SEO. I'm a Moz Associate, conference speaker, and former Director of a leading eCommerce marketing agency. Hire me for the hard stuff. That's what I do.